Transformer models not suitable for AI agents

AI21 CEO Ori Goshen argues that Transformer models are unsuitable for AI agents, telling VentureBeat in an interview that they are too expensive and inefficient. Alternatives such as Mamba and AI21's Jamba offer faster inference and longer context windows, which makes them better suited to agents that need to call multiple models. Goshen attributes the unreliability of current AI agents largely to the Transformer architecture. Although AI agents are growing in popularity, he says they are not yet ready for widespread enterprise use.
