Fastino launches CPU-based AI models for enterprise tasks

Fastino, a San Francisco-based startup, has unveiled task-optimized AI models that run efficiently on standard CPUs, without requiring expensive GPUs. As reported by Sean Michael Kerner, the company has secured $7 million in pre-seed funding from investors including M12, Microsoft’s venture fund, and Insight Partners. Unlike general-purpose large language models, Fastino’s models focus on specific enterprise functions such as data structuring, RAG pipelines, and JSON response generation. The company claims that by reducing the number of matrix multiplication operations, its technology delivers responses in milliseconds and can run even on devices like a Raspberry Pi. Although the models are not yet generally available, Fastino is already working with major clients in the financial services, e-commerce, and consumer-device sectors, offering on-premises deployment for data-sensitive industries.
