Introduction


Machine learning and other artificial intelligence systems have advanced rapidly in the last few years. Many tasks that were previously difficult or impossible, even on enterprise hardware, can now run on a device that fits in your hand. Few of these technologies have attracted as much hype and controversy as Large Language Models (LLMs). A traditional LLM demands massive computational power, memory, and other resources, so only well-funded companies could train and run these models.

A recent development in LLMs is the deployment of local models: models optimized and simplified to run on the devices everyday users already own. Starting with iOS 26, iPadOS 26, macOS 26, and the other version 26 operating systems, Apple provides its own local model, optimized for use in apps, called Apple Foundation Models. Apple Foundation Models are Apple’s on-device AI models, designed to protect privacy while helping with tasks like writing text, summarizing information, and organizing data on supported devices. Because all data remains on the device, you don’t need an internet connection, latency is reduced, and you avoid the privacy risks of sending data to third-party services.
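To make this concrete, here's a minimal sketch of what talking to the on-device model looks like with the FoundationModels framework. It checks model availability, opens a session, and requests a response; the instructions string and the `summarize` function name are illustrative choices, not part of the framework.

```swift
import FoundationModels

// A minimal sketch: ask the on-device model to summarize some text.
// Requires a device and OS version that support Apple Intelligence.
func summarize(_ text: String) async throws -> String {
    // The system model may be unavailable (unsupported device,
    // Apple Intelligence disabled, model still downloading, etc.).
    guard SystemLanguageModel.default.availability == .available else {
        return text
    }

    // A session holds the conversation context; instructions steer
    // the model's behavior for every prompt in this session.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )

    // Everything runs on-device: no network request is made.
    let response = try await session.respond(to: text)
    return response.content
}
```

You'll work with session-based APIs like this throughout the lesson when wiring the chat app to Foundation Models.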

In this lesson, you will take a chat-style app and expand it to interact with Foundation Models.

Download course materials from GitHub