A former physics student has created MacMind, a complete transformer neural network implemented entirely in HyperTalk, the scripting language Apple shipped with HyperCard in 1987. The project, announced via Show HN on April 16, 2026, demonstrates that modern AI architectures are fundamentally algorithmic and don't require contemporary computing infrastructure.
The Implementation Packs Complete Transformer Components Into 1,216 Parameters
MacMind features a single-layer, single-head transformer with 1,216 total parameters. The implementation includes embeddings, positional encoding, self-attention, backpropagation, and gradient descent—all written in readable HyperTalk code accessible through HyperCard's script editor. Users can option-click any button to view the actual mathematical operations.
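The components the article lists can be sketched in NumPy, mirroring the project's own Python/NumPy reference in spirit. This is a minimal illustration, not MacMind's actual code: the dimensions, initialization, and parameter names below are assumptions chosen for readability, and the real stack's 1,216 parameters are distributed differently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes -- not MacMind's actual dimensions.
vocab, d_model = 8, 4

# Parameters: token embeddings, attention projections, output head.
E  = rng.normal(0, 0.1, (vocab, d_model))    # token embedding table
Wq = rng.normal(0, 0.1, (d_model, d_model))  # query projection
Wk = rng.normal(0, 0.1, (d_model, d_model))  # key projection
Wv = rng.normal(0, 0.1, (d_model, d_model))  # value projection
Wo = rng.normal(0, 0.1, (d_model, vocab))    # output (unembedding) projection

def positional_encoding(n, d):
    """Sinusoidal positional encoding in the standard transformer form."""
    pos = np.arange(n)[:, None]
    i = np.arange(d)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def forward(tokens):
    """Single-layer, single-head forward pass: embed, attend, project."""
    x = E[tokens] + positional_encoding(len(tokens), d_model)
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_model))  # (seq, seq) attention weights
    logits = (attn @ V) @ Wo                    # (seq, vocab)
    return softmax(logits)                      # per-position distributions

probs = forward(np.array([0, 1, 2, 3]))
print(probs.shape)  # (4, 8)
```

Training would add a cross-entropy loss, backpropagation through these operations, and a gradient-descent update on each parameter array; the article notes all of that is spelled out in the HyperTalk scripts themselves.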
The system requires HyperCard 2.0 or later, as earlier versions evaluate arithmetic left-to-right without standard precedence, which would corrupt matrix operations. The trained model's "intelligence" persists as 1,216 numbers stored in hidden fields within a HyperCard stack, allowing users to save, quit, and reopen the file with the model intact.
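The precedence requirement is easy to see with a dot product, the core operation of matrix math. The sketch below simulates both evaluation orders in Python (which, like HyperCard 2.0+, uses standard precedence); the strict left-to-right version is emulated with explicit parentheses, and the function names are mine, not the project's.

```python
def dot_standard(a, b):
    """acc + x * y with standard precedence: a correct dot product."""
    acc = 0
    for x, y in zip(a, b):
        acc = acc + x * y
    return acc

def dot_left_to_right(a, b):
    """The same expression evaluated strictly left-to-right, as the article
    says pre-2.0 HyperTalk would: (acc + x) * y corrupts the accumulator
    on every step."""
    acc = 0
    for x, y in zip(a, b):
        acc = (acc + x) * y
    return acc

a, b = [1, 2, 3], [4, 5, 6]
print(dot_standard(a, b))       # 32
print(dot_left_to_right(a, b))  # 198
```

With every multiply-accumulate in every matrix product silently off like this, no amount of training could converge, hence the hard version requirement.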
The Network Learns the Bit-Reversal Pattern Through Attention
MacMind tackles the bit-reversal permutation task, the input-reordering step that opens the iterative Fast Fourier Transform (FFT). The network receives no formula; it discovers the positional pattern through attention and trial and error. By training step 193, the model is oscillating among 50%, 75%, and 100% accuracy before eventually converging "like a ball rolling into a bowl."
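For reference, the target function the network has to discover can be computed directly. This sketch shows the standard bit-reversal permutation the FFT uses to reorder its inputs; how MacMind encodes positions as tokens is not specified in the article, so this illustrates only the mapping itself.

```python
def bit_reverse(i, nbits):
    """Reverse the low `nbits` bits of integer i (e.g. 011 -> 110)."""
    out = 0
    for _ in range(nbits):
        out = (out << 1) | (i & 1)  # shift the lowest bit of i into out
        i >>= 1
    return out

# For an 8-point FFT (3 bits), the reordering the network must learn:
perm = [bit_reverse(i, 3) for i in range(8)]
print(perm)  # [0, 4, 2, 6, 1, 5, 3, 7]
```

The permutation is its own inverse (reversing the bits twice returns the original index), which is part of what makes it a clean, self-checking target for a toy model.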
On authentic vintage hardware, each training step takes several seconds, with full training requiring hours. Running on modern Macs through emulators like Basilisk II or SheepShaver yields comparable performance, as HyperTalk interpretation—not CPU speed—remains the bottleneck.
Project Demonstrates AI Accessibility Through Educational Retro Computing
The creator explained their motivation: "As a former physics student, the FFT is an old friend—it sits at the heart of signal processing, quantum mechanics, and wave analysis. I built this because we're at a moment where AI affects all of us but most of us don't understand what it actually does. Backpropagation and attention are math, not magic. And math doesn't care whether it's running on a TPU cluster or a 68030 from 1989."
The GitHub repository includes a pre-trained stack at step 1,000, a blank stack for self-training, and a Python/NumPy reference implementation validating the mathematics. The project runs on any system from System 7 through Mac OS 9, making transformer concepts accessible through interactive exploration on hardware more than three decades old.
Key Takeaways
- MacMind implements a complete 1,216-parameter transformer neural network in HyperTalk, Apple's 1987 scripting language
- The system learns the bit-reversal permutation through attention without being given the formula, converging after oscillating between 50% and 100% accuracy
- Training on vintage 1989 hardware takes hours per full run, with each step requiring several seconds
- The project demonstrates that transformer architectures are fundamentally algorithmic and don't depend on modern computing infrastructure
- All code is readable in HyperCard's script editor, making AI concepts accessible through interactive educational exploration