Kagi Assistant now available to all users with no price increase

Paid search engine Kagi has announced that its AI assistant feature is now available across all subscription plans at no additional cost. According to Kagi’s announcement, the Assistant combines access to leading large language models (LLMs) with optional integration of Kagi Search results. Previously exclusive to Ultimate subscribers, the tool is now included in every customer’s existing subscription.

The company emphasizes that privacy remains central to the Assistant’s design. Conversations are private by default and automatically expire based on user settings, and interaction data is not used to train AI models, either by Kagi or by third-party providers.

Users can choose from various AI models depending on their subscription level. All plans include access to models such as GPT-4o mini, GPT-4.1 mini, Gemini 2.5 Flash, and Llama 3.3 70B. Ultimate subscribers gain additional access to more powerful models including GPT-4o, Claude 3.5 Haiku, Claude 3.7 Sonnet, and Gemini 2.5 Pro Preview.

Kagi describes the Assistant as a research aid that enhances rather than replaces its core search functionality. Users can enable web access to ground the AI’s responses in search results, which will respect personalized domain rankings and work with Kagi’s Lenses feature to narrow search scope. Alternatively, users can chat directly with the models without web access or upload files for additional context.

The announcement also introduces a fair-use policy that ties AI usage limits to the value of each subscription. For example, a $25 monthly plan allows up to $25 worth of raw token cost across all models, with a 20% margin reserved for search provision, development, and infrastructure. According to Kagi, its usage statistics suggest that 95% of users will never hit this limit.
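The mechanics of such a cap are straightforward: track the cumulative raw token cost for the billing cycle and compare it against the plan price. The sketch below is a hypothetical illustration of that bookkeeping, not Kagi’s actual implementation; the names and the simple cap-equals-plan-price rule are assumptions based on the example given in the announcement.

```python
# Hypothetical sketch of a plan-priced fair-use check.
# Assumption: the usage cap equals the plan's monthly price,
# as in the $25-plan example from the announcement.

PLAN_PRICE_USD = 25.00  # monthly plan price, which doubles as the usage cap


def remaining_allowance(spent_token_cost_usd: float) -> float:
    """Raw token cost (in USD) still available this billing cycle."""
    return max(0.0, PLAN_PRICE_USD - spent_token_cost_usd)


def over_limit(spent_token_cost_usd: float) -> bool:
    """True once cumulative raw token cost reaches the plan price."""
    return spent_token_cost_usd >= PLAN_PRICE_USD
```

For instance, a user who has consumed $10 of raw token cost would have $15 of allowance left, while one at $25 would have hit the limit and be offered an early subscription renewal.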

For those who exceed their allocation, Kagi offers the option to renew their subscription cycle immediately, with plans to introduce credit top-ups for additional flexibility in the future.
