Matti Saarinen’s Post

Last year, Apple released open-source tools for training and inference on Apple silicon. Yesterday, they provided more details about their foundation models running on Apple silicon. While Apple's on-device models currently lag behind GPT-4 in capability, their strategy of leveraging the computing power already present in personal devices offers clear advantages: low latency, high energy efficiency, no additional compute cost for Apple, and data that never leaves the device. Even the relatively low cost of using OpenAI models can accumulate over time and rule out some use cases. It remains to be seen whether Apple's on-device models will become capable enough for widespread applications.

Introducing Apple’s On-Device and Server Foundation Models

machinelearning.apple.com

ari saarinen

neurologist, retired

6mo

Why is Musk so angry?


