Meta has released Llama 3.3 70B, a new large language model that achieves GPT-4 level performance while running on high-end consumer laptops. The breakthrough was documented by developer Simon Willison, who tested the model on a 64GB MacBook Pro M2 and found capabilities comparable to much larger models, including Meta's own Llama 3.1 405B. The model still demands significant system resources: it consumes 42GB of storage and needs most of the machine's 64GB of RAM to operate effectively.
The model's abilities were independently assessed through LiveBench testing, where it ranked 19th overall among AI models and scored particularly well on instruction-following tasks. Users can run the model locally using Ollama, an open-source tool, or Apple's MLX library, as sketched below. Meta says this development represents a significant advance in making powerful AI accessible for local execution on personal computers.
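As a rough illustration of the MLX route, here is a minimal Python sketch using the mlx-lm package (`pip install mlx-lm`). The model identifier `mlx-community/Llama-3.3-70B-Instruct-4bit` is an assumption based on the MLX community's usual naming for quantized builds; substitute whichever build fits your hardware.

```python
# Minimal sketch: running Llama 3.3 70B locally via Apple's MLX library.
# The 4-bit community build named below is an assumption; pick the
# quantization that fits your available RAM.
from mlx_lm import load, generate

# Downloads the weights on first use (tens of GB) and loads them into memory.
model, tokenizer = load("mlx-community/Llama-3.3-70B-Instruct-4bit")

prompt = "Summarize the key differences between Llama 3.3 70B and Llama 3.1 405B."
response = generate(model, tokenizer, prompt=prompt, verbose=True)
print(response)
```

With Ollama, the equivalent is a single terminal command such as `ollama run llama3.3`, which pulls the model on first run and opens an interactive chat session.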