Revolutionizing On-Device AI with Apple's New Open LLMs

Published On Sat Apr 27 2024

Apple releases eight new open LLMs - SD Times

Apple has released eight new small LLMs as part of CoreNet, the company’s library for training deep neural networks. The models, called OpenELM (Open-source Efficient Language Models), come in eight variants: four pre-trained and four instruction-tuned, each available in sizes of 270M, 450M, 1.1B, and 3B parameters.


Because of their small size, the models should be able to run directly on devices rather than having to connect back to a server to perform computation. According to Apple, the goal of OpenELM is to “empower and enrich the open research community by providing access to state-of-the-art language models.” The models are currently available only on Hugging Face, and Apple has also released the source code.

“The reproducibility and transparency of large language models are crucial for advancing open research, ensuring the trustworthiness of results, and enabling investigations into data and model biases, as well as potential risks. To this end, we release OpenELM, a state-of-the-art open language model… This comprehensive release aims to empower and strengthen the open research community, paving the way for future open research endeavors,” the Apple researchers wrote in a paper.
