Developer Arthur-Ficial (franze) launched Apfel on April 3, 2026, opening up Apple's proprietary on-device LLM, which ships with every Mac running macOS 26+ but is otherwise restricted to Siri and system features. The project gained 398 points on Hacker News and reached 625 GitHub stars, giving developers and users direct access to a capable language model that Apple has already installed on their machines.
Three Interfaces for Apple's FoundationModels Framework
Apfel provides developers and users with three ways to access Apple's on-device LLM through the FoundationModels framework:
- CLI tool: Supports pipe operations, file attachments, and JSON output
- OpenAI-compatible HTTP server: Runs at localhost:11434 for SDK integration and drop-in replacement in existing projects
- Interactive chat: Configurable context management strategies for the constrained token window
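Because the server speaks the OpenAI wire protocol, any HTTP client can talk to it. Below is a minimal sketch using only the Python standard library; the `/v1/chat/completions` path follows the usual OpenAI convention, and the model name `"apple-on-device"` is an assumption, so check Apfel's documentation for the identifier its server actually expects.

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "apple-on-device") -> dict:
    """Build an OpenAI-style chat completion request body.

    The model name is a placeholder assumption, not confirmed by Apfel's docs.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(prompt: str, base_url: str = "http://localhost:11434") -> str:
    """POST the request to the server's OpenAI-compatible endpoint."""
    body = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Usage (requires the apfel server running locally):
# reply = chat("Summarize this repo in one sentence.")
```

The same base URL can be plugged into any OpenAI SDK as a custom endpoint, which is what makes the drop-in replacement workflow possible.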
Technical Specifications and Implementation Details
The tool runs exclusively on Apple Silicon, leveraging the Neural Engine and GPU; the model itself is fixed at 3 billion parameters. Key specifications include:
- 4,096-token context window (approximately 3,000 English words)
- Mixed 2/4-bit quantization
- 100% on-device processing with zero cloud dependency or API costs
- Support for 11 languages: English, German, Spanish, French, Italian, Japanese, Korean, Portuguese, Chinese (Simplified and Traditional), and Cantonese
- Five context trimming strategies for managing the token window
- Native file attachment support and tool calling with schema conversion
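The five trimming strategies are not detailed in the summary above, but the simplest plausible one is to drop the oldest turns until the conversation fits the budget. The sketch below assumes a rough 4-characters-per-token estimate, which is only a heuristic and not Apple's actual tokenizer.

```python
# Hypothetical drop-oldest trimming strategy for a 4,096-token window.
# The ~4 chars/token estimate is a rough heuristic, not Apple's tokenizer.
CONTEXT_TOKENS = 4096

def estimate_tokens(text: str) -> int:
    """Crude token estimate: about one token per four characters."""
    return max(1, len(text) // 4)

def trim_oldest(messages: list[dict], budget: int = CONTEXT_TOKENS) -> list[dict]:
    """Keep the most recent messages that fit inside the token budget."""
    kept, used = [], 0
    for msg in reversed(messages):          # walk newest -> oldest
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order
```

Other obvious strategies in this family include summarizing the dropped turns or pinning the system prompt while trimming the middle; which of these Apfel implements is not stated here.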
Built in Swift 6.3 with strict concurrency safety, Apfel comprises three targets: ApfelCore (pure logic library), apfel (CLI/server executable), and apfel-tests (48 unit tests plus 51 Python integration tests). Version 0.6.25 ships with 114 commits and requires only the Command Line Tools, not a full Xcode installation.
Privacy-Preserving Alternative to Cloud LLMs
The innovation lies in providing a zero-cost, privacy-preserving alternative to cloud LLMs for Mac users. As the creator states, 'every token generated locally on your Apple Silicon...nothing leaves your machine.' The MIT-licensed open source project wraps Apple's SystemLanguageModel API with token counting, context management, and OpenAI protocol translation.
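The protocol translation step can be pictured as flattening OpenAI-style message arrays into the form a single-prompt session API expects. This is a speculative sketch of that shape; how Apfel actually maps messages onto Apple's SystemLanguageModel sessions is not documented here, and the role-prefix format below is an assumption.

```python
# Speculative sketch of OpenAI -> single-prompt translation; the real mapping
# onto Apple's SystemLanguageModel API may differ.
def flatten_messages(messages: list[dict]) -> tuple[str, str]:
    """Split OpenAI-style messages into (instructions, prompt).

    System messages become session instructions; the remaining turns are
    joined into one role-prefixed transcript.
    """
    instructions = "\n".join(
        m["content"] for m in messages if m["role"] == "system"
    )
    turns = [
        f'{m["role"]}: {m["content"]}'
        for m in messages
        if m["role"] != "system"
    ]
    return instructions, "\n".join(turns)
```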
Hacker News discussion highlighted praise for the zero-cost aspect and privacy benefits, developer excitement about OpenAI API compatibility, and acknowledgment of the 4K context window limitation compared to cloud models.
Key Takeaways
- Apfel unlocks Apple's 3 billion parameter on-device LLM embedded in macOS 26+ but restricted to Siri and system features
- Gained 398 points on Hacker News and 625 GitHub stars following launch on April 3, 2026
- Provides CLI tool, OpenAI-compatible HTTP server at localhost:11434, and interactive chat interfaces
- Runs 100% on-device with zero cloud dependency, supporting 11 languages and 4,096 token context window
- MIT licensed and built in Swift 6.3 with 114 commits across version 0.6.25