Why researchers are betting on local AI

Researchers are increasingly running small AI models on their own computers instead of relying on large hosted services like ChatGPT. As Nature reports, freely available models such as Llama or Phi-3 allow scientists to run AI locally, which brings clear advantages: lower costs, better data protection, and reproducibility.

Local AI models are becoming increasingly capable and can already rival older, larger models. Researchers use them for a variety of tasks, such as summarizing scientific texts or analyzing protein structures. Experts expect the trend to grow as computers become faster and models more powerful. For most applications, local AI on laptops or mobile devices may soon be sufficient.
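
To make the summarization use case concrete, here is a minimal sketch of how such a model might be run entirely on a local machine, using Hugging Face's transformers library. The model ID, prompt, and generation settings below are illustrative assumptions, not details from the report; any small instruct model that fits on the machine would work similarly.

```python
# Minimal sketch: summarizing a text with a small local model via
# Hugging Face transformers. Model ID and settings are illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",  # a small (~3.8B parameter) open model
    device_map="auto",  # uses a local GPU if available, otherwise the CPU
)

abstract = "..."  # paste the text to be summarized here
messages = [
    {
        "role": "user",
        "content": f"Summarize the following abstract in two sentences:\n\n{abstract}",
    }
]

# Everything runs on the local machine: no data leaves the computer,
# and pinning the model weights keeps the setup reproducible.
result = generator(messages, max_new_tokens=120)
print(result[0]["generated_text"][-1]["content"])
```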
