Anthropic has launched a new, lower-cost batch processing API for businesses. According to a VentureBeat report by Michael Nuñez, the new Message Batches API lets companies submit up to 10,000 queries per batch for asynchronous processing within a 24-hour window, with both input and output tokens priced 50% lower than standard, real-time API calls. Anthropic aims to make its large AI models more accessible and cost-effective for enterprises working with big data, sharpening its competition with OpenAI. The API is available for the Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku models, while batch processing for Claude on Amazon Bedrock is already available and support on Google Cloud's Vertex AI is planned.
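
To make the workflow concrete, here is a minimal sketch of what submitting a batch of queries might look like with Anthropic's Python SDK. It is based on the announced shape of the Message Batches API; the exact method path (`client.messages.batches.create`), the model identifier string, and the response fields shown are assumptions and may differ in the shipped SDK.

```python
# Minimal sketch: submit a batch of asynchronous requests to the Message Batches API.
# Assumptions: method path, model ID, and response fields may differ from the shipped SDK.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Each request in the batch carries its own custom_id plus standard Messages API params.
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": f"doc-{i}",  # used to match results back to inputs
            "params": {
                "model": "claude-3-5-sonnet-20240620",  # illustrative model ID
                "max_tokens": 512,
                "messages": [{"role": "user", "content": f"Summarize document {i}."}],
            },
        }
        for i in range(3)  # a batch can hold up to 10,000 requests
    ]
)

# The batch is processed asynchronously; results are returned within 24 hours
# at the discounted (50% lower) token rates.
print(batch.id, batch.processing_status)
```

Because the work is queued rather than answered in real time, this pattern suits large, non-urgent workloads such as bulk summarization, classification, or data extraction, where the 24-hour turnaround is an acceptable trade for the halved token prices.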