Nvidia introduces “desktop AI supercomputer” Project Digits for $3,000

At CES 2025 in Las Vegas, Nvidia announced Project Digits, a compact desktop AI supercomputer aimed at researchers, data scientists, and students. The device, scheduled for release in May 2025 and priced at $3,000, represents the company's effort to bring powerful AI computing to individual desks.

At the core of Project Digits is the GB10 Grace Blackwell Superchip, developed in collaboration with MediaTek. The system combines a Blackwell GPU with a 20-core Grace CPU based on Arm architecture. According to Nvidia, the device delivers up to one petaflop of AI performance at FP4 precision, while operating from a standard power outlet. The hardware includes 128GB of unified memory and up to 4TB of NVMe storage.

According to Nvidia, a single unit can run AI models with up to 200 billion parameters, and two Project Digits machines can be linked together to handle models with up to 405 billion parameters. The device runs Nvidia's Linux-based DGX OS and supports common tools including PyTorch, Python, and Jupyter notebooks.
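
To give a sense of what local prototyping on such a machine could look like, the sketch below loads a language model with PyTorch and the Hugging Face transformers library. The model name, precision, and generation settings are illustrative assumptions, not a documented Project Digits workflow.

```python
# A minimal local-prototyping sketch, assuming the Hugging Face
# transformers and accelerate packages are installed; the model name
# and settings are illustrative, not a documented Project Digits setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # hypothetical model choice

# Load weights in bfloat16 and let device_map place them on the GPU;
# large unified memory is what lets bigger models fit on one machine.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Explain what unified memory means for local AI development."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```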

Nvidia CEO Jensen Huang presented the device during a press conference, describing it as a complete platform that runs “the entire Nvidia AI stack.” The company positions Project Digits as a development platform that allows users to prototype locally and then deploy their work to cloud services or data center infrastructure using the same Grace Blackwell architecture.

Users will have access to Nvidia’s AI software library, including development kits, orchestration tools, and pre-trained models through the Nvidia NGC catalog. The system supports the Nvidia NeMo framework for model fine-tuning and RAPIDS libraries for data science workflows. When moving from development to production, users can access enterprise-grade security and support through the Nvidia AI Enterprise license.
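
As a rough illustration of the RAPIDS-style data science workflows mentioned above, the snippet below uses cuDF to run a GPU-accelerated group-by. The file name and column names are hypothetical.

```python
# A rough cuDF sketch of the GPU-accelerated data preparation that
# RAPIDS enables; the file name and columns are hypothetical.
import cudf

# Read a CSV directly into GPU memory and compute a group-by summary.
df = cudf.read_csv("experiment_runs.csv")
summary = df.groupby("model_name")["validation_loss"].mean().sort_values()
print(summary.head())
```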

Industry leaders have expressed support for the initiative. Meta's head of GenAI, Ahmad Al-Dahle, highlighted the device's potential for local development with Llama models, while Hugging Face's head of product, Jeff Boudier, emphasized its capabilities for edge computing. Salesforce's chief scientist, Silvio Savarese, noted the significance of having 128GB of memory in such a compact form factor.

Sources: TechCrunch, The Verge, VentureBeat
