Qualcomm, Meta team up to enable on-device AI applications using Llama 2
California-based chipmaker Qualcomm is working with Meta to enable on-device Artificial Intelligence (AI) applications using the Mark Zuckerberg-headed company’s Llama 2 large language models (LLMs).
“The ability to run generative AI models like Llama 2 on devices such as smartphones, PCs, VR/AR headsets, and vehicles allows developers to save on cloud costs, and to provide users with private, more reliable, and personalized experiences,” Qualcomm said in a statement Wednesday.
As a result, Qualcomm said it plans to make on-device, Llama 2-based AI implementations available to enable the creation of new AI applications.
This is expected to enable Qualcomm’s customers, partners, and developers to build use cases such as intelligent virtual assistants, productivity applications, content creation tools, and entertainment, among others.
These new on-device AI experiences, powered by Snapdragon, can work in areas with no connectivity or even in airplane mode, as per the chipmaker.
“We applaud Meta’s approach to open and responsible AI and are committed to driving innovation and reducing barriers-to-entry for developers of any size by bringing generative AI on-device,” said Durga Malladi, senior vice president and general manager of technology, planning and edge solutions businesses, Qualcomm Technologies, Inc. “To effectively scale generative AI into the mainstream, AI will need to run on both the cloud and devices at the edge, such as smartphones, laptops, vehicles, and IoT (Internet of Things) devices.”
Qualcomm is scheduled to make Llama 2-based AI implementations available on Snapdragon-powered devices starting in 2024.
However, starting today, developers can begin optimising applications for on-device AI using the Qualcomm AI Stack.