A new open-source model called OLMoE has been released by the Allen Institute for AI (AI2) in collaboration with Contextual AI. As Emilia David reports for VentureBeat, the model aims to be both powerful and inexpensive. OLMoE uses a mixture-of-experts (MoE) architecture with 7 billion parameters, of which only 1 billion are active per input token. Unlike many other MoE models, OLMoE is completely open source, including training data and code. According to AI2, OLMoE outperformed comparable models in benchmark tests and came close to the performance of larger models such as Mistral-7B or Llama 3.1-8B. The project aims to give researchers broader access to powerful AI models.
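
To make the "active parameters" idea concrete, here is a minimal sketch of a sparse mixture-of-experts layer with top-k routing in PyTorch. It only illustrates the general MoE principle that each token is processed by a few experts while the rest stay idle; the layer sizes, expert count, and routing scheme are illustrative assumptions, not OLMoE's actual implementation.

```python
# Generic sparse MoE layer: many expert parameters exist, but only the
# top-k experts chosen by the router run for each token.
# Illustrative sketch only; not OLMoE's architecture or hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=256, d_ff=512, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (num_tokens, d_model)
        scores = self.router(x)                           # (num_tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts are evaluated per token; unselected ones do no work.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(4, 256)     # four token embeddings
layer = SparseMoELayer()
print(layer(tokens).shape)       # torch.Size([4, 256])
```

With 8 experts and top-2 routing, roughly a quarter of the expert parameters do work for any given token, which is the same trade-off OLMoE exploits at scale: a large total parameter count with a much smaller active share per token.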