Researchers have found a way to dramatically improve the energy efficiency of large language models without sacrificing performance. Using their system, a language model with billions of parameters can be run on as little as 13 watts. The researchers have also developed proprietary hardware that further maximizes energy savings.