New AI architecture STAR reduces model cache size by 90 percent

MIT spin-off Liquid AI has developed a new AI framework called STAR (Synthesis of Tailored Architectures) that significantly improves on traditional transformer models. As Carl Franzen reports for VentureBeat, the system uses evolutionary algorithms to automatically generate and optimize AI architectures. The STAR framework achieved a 90% reduction in cache size compared to traditional …


MIT spin-off Liquid AI shows its highly efficient models

An MIT spin-off called Liquid AI has unveiled new AI models that are not based on the usual transformer architecture. According to a company statement, these “Liquid Foundation Models” (LFMs) already outperform comparable transformer-based models in both capability and efficiency. Instead of transformers, the developers used approaches from the theory …
