Apple releases open-source models to power on-device AI
Tim Cook's tech giant aims to enhance the consumer experience
Apple has released OpenELM (Open-source Efficient Language Models), a family of open-source large language models (LLMs) intended to power "on-device" AI capabilities.
Rather than relying on cloud services, the Cupertino-headquartered tech giant plans to run these models natively on its devices, according to Money Control.
The OpenELM models are available on Hugging Face, a platform for sharing AI models and code.
Introducing the models, Apple stated: "We introduce OpenELM, a family of Open-source Efficient Language Models. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy."
In other words, Apple says OpenELM improves accuracy by distributing its parameter budget unevenly across the transformer's layers rather than giving every layer the same capacity.
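The idea behind layer-wise scaling can be illustrated with a short sketch. The function below is an assumption based on the quoted description, not Apple's actual code: it grows the number of attention heads and the feed-forward width multiplier linearly from the first transformer layer to the last, so early layers are narrow and later layers get more of the parameter budget. The parameter names and ranges are hypothetical.

```python
def layer_wise_scaling(num_layers, min_heads=4, max_heads=16,
                       min_ffn_mult=0.5, max_ffn_mult=4.0):
    """Return a (heads, ffn_multiplier) tuple per layer, growing with depth.

    A hedged illustration of layer-wise scaling: instead of a uniform
    width, each layer's capacity is interpolated between a small
    configuration (first layer) and a large one (last layer).
    """
    configs = []
    for i in range(num_layers):
        t = i / (num_layers - 1)  # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        configs.append((heads, ffn_mult))
    return configs

for layer, (heads, ffn) in enumerate(layer_wise_scaling(8)):
    print(f"layer {layer}: heads={heads}, ffn_mult={ffn:.2f}")
```

Under this scheme the total parameter count stays comparable to a uniform model, but capacity is shifted toward the layers where it helps accuracy most.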
The OpenELM family is said to comprise eight models across four parameter sizes: 270M, 450M, 1.1B, and 3B. Every model was trained on publicly available datasets.
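Four sizes yielding eight models suggests two variants per size (for example, a base and an instruction-tuned checkpoint). The snippet below enumerates that family; the repository IDs are assumptions for illustration, so check the "apple" organisation on huggingface.co for the actual listings.

```python
# Four parameter sizes, each assumed to ship as a base model and an
# instruction-tuned variant, giving the eight models described above.
SIZES = ["270M", "450M", "1_1B", "3B"]

repo_ids = [f"apple/OpenELM-{size}{suffix}"
            for size in SIZES
            for suffix in ("", "-Instruct")]  # base + instruction-tuned

for rid in repo_ids:
    print(rid)
print(f"total: {len(repo_ids)} models")  # 4 sizes x 2 variants = 8
```

Any such repository ID could then, in principle, be passed to the Hugging Face `transformers` library's `AutoModelForCausalLM.from_pretrained` to download and run a checkpoint locally, which is what "on-device" inference amounts to in practice.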
Separately, Apple is reported to be developing its own large language model (LLM) to power on-device generative artificial intelligence (AI) features for its new iPhone series.
According to Bloomberg, Apple's AI model may run entirely on-device, which essentially means Apple's first generative AI features would work offline, without requiring an internet connection.