Mistral unveils small models for laptops and smartphones

French AI company Mistral has introduced new generative AI models for laptops and smartphones. Known as “Les Ministraux,” the models are optimized for various applications such as text generation or collaboration with more powerful models. Kyle Wiggers reports for TechCrunch that two variants are available: Ministral 3B and Ministral 8B, both with a context window …

Read more

Why researchers are betting on local AI

According to a report in Nature, researchers are increasingly running small AI models on their own computers instead of relying on the offerings of large AI services like ChatGPT. Freely available models such as Llama or Phi-3 allow scientists to run AI locally, which offers advantages: cost savings, data protection, and reproducibility. The local AI models are becoming …

Read more

Updates from Google and Meta

Google wants to improve the accuracy of its AI models. To avoid “hallucinations,” the company is working with partners such as Moody’s, Thomson Reuters, and ZoomInfo, which will feed the AI systems with up-to-date information. A new “confidence score” is also supposed to indicate how confident the AI is that its answer is correct. With …

Read more

Researchers work on better local AI

Researchers are making great strides in developing 1-bit LLMs that can achieve similar performance to their larger counterparts while using significantly less memory and power. This development could open the door to more complex AI applications on everyday devices such as smartphones, as they require less processing power and energy.
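The memory savings come from replacing each full-precision weight with a single sign bit plus a shared scaling factor. Below is a minimal sketch of this idea in NumPy, assuming a simple sign-based scheme with one scale per matrix (actual 1-bit LLMs use more refined per-group scaling and quantization-aware training):

```python
import numpy as np

def quantize_1bit(W):
    """Sign-based 1-bit quantization sketch.

    Each weight is replaced by its sign, scaled by the mean absolute
    value of the matrix. Storage then needs only 1 bit per weight
    plus a single floating-point scale.
    """
    alpha = np.abs(W).mean()          # shared scaling factor
    Wq = alpha * np.sign(W)           # every entry becomes +alpha or -alpha
    return Wq, alpha

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))           # stand-in for a weight matrix
Wq, alpha = quantize_1bit(W)
```

A 7-billion-parameter model stored at 16 bits per weight needs roughly 14 GB; at 1 bit per weight (plus scales) it shrinks to under 1 GB, which is what makes on-device inference plausible.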

New AI kit for the Raspberry Pi

The new $70 Raspberry Pi AI Kit allows users to implement AI applications for visual tasks with the microcomputer. The kit enables real-time object recognition, segmentation, and pose estimation with low power consumption, which could make the Raspberry Pi 5 suitable for many local AI applications.

Intel’s “Lunar Lake” is a laptop chip for AI

Intel introduced the “Lunar Lake” processor, a completely redesigned laptop chip for AI applications that eliminates the need for separate memory modules and instead integrates up to 32GB of LPDDR5X memory directly into the chip package. Thanks to numerous optimizations, Intel promises up to 14 percent more CPU performance at the same clock speed, 50 …

Read more

Jolla Mind2 is a local AI assistant

Jolla, known for its Sailfish operating system, presents a new approach to AI assistants with Jolla Mind2. This personal server, which will be available in the fall for 699 euros (plus a monthly subscription fee), is designed to perform AI-based tasks without sending data to the cloud.

Perplexica is an open source AI search engine

Perplexica is an open source search engine with AI support, similar to Perplexity. It provides answers with references. Once installed, it can use local language models such as Llama 3 or Mixtral.

Zyphra Zamba brings AI to more devices

Zyphra introduces Zamba, an open-source model with 7 billion parameters designed to bring artificial intelligence to more devices. Its decentralized approach and smaller model size aim to provide a more cost-effective and personalized alternative to large, centralized AI models.

Intel’s new chips are supposed to deliver better AI performance

Intel confirms that Microsoft’s Copilot AI assistant will soon run locally on PCs powered by its chips. Intel’s Lunar Lake processors, due out at the end of the year, are expected to deliver three times the AI performance of current Meteor Lake chips, meeting the needs of the next generation of AI PCs.