AI company Cohere has introduced Command R7B, a new compact language model designed for enterprise applications. According to VentureBeat reporter Taryn Plumb, the model supports 23 languages and specializes in retrieval-augmented generation (RAG). Cohere reports that Command R7B outperforms similarly sized models from competitors such as Google, Meta, and Mistral on mathematics and coding tasks. The model features a 128K-token context window and can run on standard consumer hardware, including MacBooks. Cohere CEO Aidan Gomez says the model is optimized for speed and cost-efficiency, and the company prices the service at $0.0375 per million input tokens and $0.15 per million output tokens. Cohere positions the model for enterprise tasks such as technical support, HR inquiries, and financial information processing, and it supports tool integration with external systems such as search engines and APIs.
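To give a sense of how the RAG focus and the quoted pricing fit together in practice, here is a minimal sketch that grounds a query in retrieved documents via the Cohere Python SDK's chat endpoint and then estimates a per-request cost from the prices above. The model identifier, document fields, and sample text are illustrative assumptions, not details from the article; verify them against Cohere's current documentation.

```python
import cohere

# Prices quoted in the article, in USD per million tokens.
INPUT_PRICE_PER_M = 0.0375
OUTPUT_PRICE_PER_M = 0.15

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the quoted rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + (
        output_tokens / 1_000_000
    ) * OUTPUT_PRICE_PER_M

# RAG-style call: retrieved passages are passed alongside the question so the
# model grounds its answer in them. The model id, field names, and snippets
# below are placeholders for illustration.
co = cohere.Client("YOUR_API_KEY")
response = co.chat(
    model="command-r7b-12-2024",  # assumed identifier; check Cohere's model list
    message="What is the refund window for enterprise customers?",
    documents=[
        {"title": "Support FAQ", "snippet": "Enterprise refunds are processed within 30 days."},
        {"title": "Billing policy", "snippet": "Refund requests are filed through the billing portal."},
    ],
)
print(response.text)

# Roughly 100K tokens of retrieved context (most of the 128K window) plus a
# ~1K-token answer comes to well under a cent per request at these rates.
print(f"Estimated cost: ${estimate_cost(100_000, 1_000):.4f}")  # ~$0.0039
```

The cost arithmetic follows directly from the published rates: input and output token counts are each divided by one million and multiplied by their respective prices, which is why even context-heavy RAG requests remain inexpensive at this tier.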