Saturday, March 9, 2024

Google API brings LLMs to Android and iOS devices


Google has launched an experimental API that enables large language models to run fully on-device across Android, iOS, and web platforms.

Launched March 7, the MediaPipe LLM Inference API was designed to streamline on-device LLM integration for web developers, and supports web, Android, and iOS platforms. The API provides initial support for four LLMs: Gemma, Phi-2, Falcon, and Stable LM.

Google warns that the API is experimental and still under active development, but says it gives researchers and developers the ability to prototype and test openly available models on-device. For Android, Google noted that production applications with LLMs can use the Gemini API, or Gemini Nano on-device through Android AICore, a system-level capability introduced in Android 14 that provides Gemini-powered solutions for high-end devices, including integrations with accelerators, safety filters, and LoRA adapters.

Developers can try the MediaPipe LLM Inference API via a web demo or by building sample demo apps. An official sample is available on GitHub. The API allows developers to bring LLMs on-device in just a few steps, using platform-specific SDKs. Thanks to significant optimizations targeting both the CPU and the GPU, the API can deliver state-of-the-art latency on-device across multiple platforms, Google said. The company plans to expand the API to more platforms and models in the coming year.
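To illustrate the "few steps" flow on the web platform, here is a minimal sketch using the `@mediapipe/tasks-genai` package's `FilesetResolver` and `LlmInference` classes. The WASM CDN path, model file name, and generation parameters below are illustrative placeholders, not values taken from the article; a real app would host its own converted model checkpoint.

```javascript
// Sketch: running an LLM on-device in the browser with the
// MediaPipe LLM Inference API (@mediapipe/tasks-genai).
import {FilesetResolver, LlmInference} from '@mediapipe/tasks-genai';

async function runOnDeviceLlm(prompt) {
  // Step 1: resolve the WebAssembly assets that back the GenAI tasks.
  const genaiFileset = await FilesetResolver.forGenAiTasks(
      'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-genai/wasm');

  // Step 2: load a locally hosted model, e.g. a Gemma checkpoint
  // converted for MediaPipe (placeholder path).
  const llm = await LlmInference.createFromOptions(genaiFileset, {
    baseOptions: {modelAssetPath: '/models/gemma-2b-it-gpu-int4.bin'},
    maxTokens: 512,   // generation settings shown for illustration
    temperature: 0.8,
    topK: 40,
  });

  // Step 3: generate a response entirely on-device; no server round trip.
  return llm.generateResponse(prompt);
}
```

Because inference happens in the browser, the prompt and the generated text never leave the device, which is the core appeal of the API for latency- and privacy-sensitive applications.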

Copyright © 2024 IDG Communications, Inc.


