Transcript

Transformer runs on PDP-11 & CERN tiny AI in silicon - Hacker News (Mar 28, 2026)

Someone just trained a tiny Transformer, written in straight PDP-11 assembly, on hardware that predates the modern internet. That’s not a thought experiment; it actually converges. Welcome to The Automated Daily, Hacker News edition, the podcast created by generative AI. I’m TrendTeller, and today is March 28th, 2026. Let’s get into what’s moving in software, hardware, AI, and the real-world systems they run on.

Let’s start with the retro-computing-meets-ML story of the day. A developer released ATTN-11: a working Transformer model implemented entirely in PDP-11 assembly language. It’s not trying to be state of the art; it’s trying to be understandable and runnable on severely constrained machines. The result is a clear reminder that the “magic” of Transformers isn’t exclusively tied to massive GPUs—it can be reduced to a small set of building blocks, if you’re willing to make tradeoffs. Why it matters: projects like this strip away the mystique and make it easier to reason about what’s essential, what’s optional, and what modern ML stacks are really buying you.
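To give a sense of how small those building blocks really are, here is a minimal single-head scaled dot-product attention step in plain Python. This is a sketch of the general mechanism, not the ATTN-11 code, and it uses floating point where a PDP-11 port would use fixed-point tricks:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention for one head.
    # Q, K, V: lists of row vectors (seq_len x d).
    d = len(Q[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# With two identical keys, the weights are 0.5 each, so the output
# is just the average of the two value vectors.
print(attention([[1.0, 0.0]], [[0.0, 0.0], [0.0, 0.0]], [[2.0, 0.0], [4.0, 0.0]]))
# -> [[3.0, 0.0]]
```

Everything else in a Transformer block (layer norm, a feed-forward layer, residual adds) is similarly small arithmetic, which is exactly why a port to 1970s hardware is plausible.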

Sticking with AI, but jumping from vintage hardware to cutting-edge physics: CERN has started using ultra-compact AI models embedded directly into silicon—specifically FPGAs—to filter Large Hadron Collider data in real time. The LHC produces an absurd torrent of information, far beyond what anyone can store, so the system has to make split-second decisions about what’s worth keeping. Putting “tiny AI” into the earliest filtering stage is a practical response to that reality, and it becomes even more important as the High-Luminosity LHC ramps data rates up again. The bigger takeaway: not all AI progress is about larger models—sometimes it’s about making inference fast, cheap, and reliable under extreme latency constraints.
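Models like these typically run as heavily quantized, integer-only arithmetic so they fit the fixed latency budget of FPGA logic. As a rough illustration of the idea, and not CERN’s actual firmware, here is a toy 8-bit fixed-point dense layer in Python:

```python
def quantize(x, scale=64):
    # Map a float to a signed 8-bit fixed-point value (scale = 2^6).
    return max(-128, min(127, round(x * scale)))

def dense_int8(inputs, weights, scale=64):
    # Integer-only dense layer: multiply-accumulate, then one rescale.
    # On an FPGA this maps onto parallel DSP multipliers with a fixed,
    # predictable latency, which is what the trigger path needs.
    outs = []
    for row in weights:
        acc = sum(w * x for w, x in zip(row, inputs))  # wide integer accumulator
        outs.append(acc // scale)                      # back to the int8 domain
    return outs

x = [quantize(v) for v in (0.5, -0.25)]   # -> [32, -16]
w = [[quantize(1.0), quantize(0.0)]]      # 1x2 identity-ish weight row -> [[64, 0]]
print(dense_int8(x, w))                   # -> [32], i.e. 0.5 at scale 64
```

The point of the sketch is the constraint set: no floating point, no data-dependent branching, and a cost you can count in multipliers, which is what makes nanosecond-scale inference feasible at all.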

Now to a very down-to-earth problem: running AI agents locally can go wrong in painfully ordinary ways—like deleting your home directory or wiping a repo. Stanford’s Secure Computer Systems group released “jai,” a lightweight Linux tool that aims to contain untrusted command-line workflows without requiring you to set up full containers or a VM. Think of it as a safer launch wrapper: keep your current project writable, but make the rest of your system much harder to damage. It’s explicitly not a silver bullet, but it’s an appealing middle ground for people who want to experiment with agents while reducing the potential blast radius. Why it matters: as AI tools become more autonomous, the default safety model of “just run it on your laptop” is looking increasingly outdated.
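As a toy illustration of that “current project writable, everything else protected” idea, here is a small Python pre-flight check that refuses paths escaping a project directory. This is a hypothetical sketch of the general pattern, not how jai actually works:

```python
import os

def is_within(project_root, path):
    # True if `path` resolves to a location inside `project_root`,
    # after resolving symlinks and any ".." components.
    root = os.path.realpath(project_root)
    target = os.path.realpath(os.path.join(root, path))
    return target == root or target.startswith(root + os.sep)

def guard_write(project_root, path):
    # Refuse writes that would escape the project directory,
    # e.g. an agent trying to touch $HOME or /etc.
    if not is_within(project_root, path):
        raise PermissionError(f"refusing to write outside project: {path}")
    return path
```

A real sandbox enforces this at the kernel level rather than trusting the process to check itself, but the policy being enforced is the same shape: a narrow writable scope and a default-deny everywhere else.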

Switching gears to civic tech and open data: a new GitHub repository called legalize-es has published Spain’s state legislation as a version-controlled Git project. Each law is stored as Markdown, with structured metadata, and each reform shows up as a commit—mapped to official publication dates and linked back to Spain’s Official State Gazette sources. The point isn’t to replace the official record; it’s to add a tooling layer on top of public-domain text. Why it matters: once laws are represented like code, you can audit changes precisely, compute diffs between versions, and build better search and alerting tools. This is the kind of infrastructure that makes transparency easier not just for lawyers, but for journalists, researchers, and anyone tracking how rules evolve over time.
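Once laws are plain Markdown under version control, computing a diff between two reforms is a one-liner. A quick Python sketch with hypothetical article text, not taken from the repository:

```python
import difflib

# Two hypothetical versions of the same law, one line per provision.
old = ["Article 1. The fine shall be 100 euros.",
       "Article 2. Appeals must be filed within 30 days."]
new = ["Article 1. The fine shall be 200 euros.",
       "Article 2. Appeals must be filed within 30 days."]

# unified_diff produces the same +/- format `git diff` would show
# for the commit representing this reform.
for line in difflib.unified_diff(old, new,
                                 fromfile="law-2020.md",
                                 tofile="law-2026.md",
                                 lineterm=""):
    print(line)
```

That diff, plus commit metadata mapped to official publication dates, is essentially the whole value proposition: precise, auditable change tracking with off-the-shelf tooling.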

On the desktop side, macOS shows up in two very different ways today. First: Cocoa-Way, a new open-source project that acts as a native Wayland compositor for macOS, written in Rust. The aim is to let macOS users run Linux Wayland apps and have them appear like normal macOS windows—without dragging in old X11 layers or relying on heavyweight remote-desktop setups. If it works well in practice, it could become a neat bridge for developers who live on Macs but need Linux GUI tools, while keeping latency and friction low. Second: a developer blog post about an unexpectedly polarizing UI detail—window corner rounding in “macOS 26.” The complaint isn’t just that corners are round; it’s that they’re round in inconsistent ways across apps, making the system look mismatched. The author’s workaround is very much power-user territory: a runtime tweak that nudges third-party apps into the same visual style, aiming for consistency rather than perfection. Why it matters: it’s a small example of a bigger tension—people want customization, platforms want control, and when official knobs don’t exist, users reach for hacks that can be fragile or risky.

Let’s zoom out to energy, where software meets infrastructure. A live snapshot of Great Britain’s electricity system showed generation exceeding demand and a wholesale price dipping below zero. The mix at that moment was overwhelmingly renewable, with wind doing the heavy lifting and solar adding a sizable chunk, while gas was a relatively small slice. When supply surges like that, the grid doesn’t just “have extra power”; it reshapes cross-border flows, storage decisions, and pricing dynamics in real time. Why it matters: negative prices are a signal that generation is changing faster than market design and grid flexibility can adapt. It’s also a preview of the next set of challenges, where storage, interconnectors, and demand shaping become just as important as generation.
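The mechanism behind a negative price is simple merit-order logic: the cheapest offers are dispatched first, and the marginal offer sets the price. A toy sketch with made-up numbers, not GB market rules:

```python
def clearing_price(demand_mw, offers):
    # offers: list of (price_per_mwh, capacity_mw). Wind farms may bid
    # negative prices because stopping costs them more than paying to run
    # (subsidy schemes, turbine wear, restart costs).
    offers = sorted(offers)          # cheapest first: the merit order
    supplied = 0
    for price, cap in offers:
        supplied += cap
        if supplied >= demand_mw:
            return price             # the marginal offer sets the price
    raise ValueError("not enough capacity offered")

# Hypothetical hour: 20 GW of demand, 25 GW of wind offered at -5/MWh.
offers = [(-5, 25_000), (40, 10_000), (90, 5_000)]
print(clearing_price(20_000, offers))   # -> -5: wind alone clears demand
```

When demand rises past what wind can cover, the next offer up the stack becomes marginal and the price snaps back positive, which is exactly why storage and demand shaping are valuable at these moments.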

Finally, in PC hardware: AMD announced the Ryzen 9 9950X3D2 “Dual Edition,” a flagship CPU that puts 3D V-Cache on both core chiplets instead of only one. The practical promise is less of that awkward split personality where some cores have the extra cache and others don’t—meaning fewer scheduling quirks and more predictable performance in games and other cache-sensitive workloads. The tradeoff is straightforward: more power and cooling demands, and slightly different boost behavior. Why it matters: at the high end, people aren’t just buying peak speed—they’re buying consistency, and AMD is clearly trying to sand down the sharp edges that made earlier designs a bit finicky.

That’s it for today’s Hacker News roundup—an assembly-language Transformer, tiny AI in CERN’s trigger systems, safer local sandboxes for agents, laws as Git history, macOS experiments, renewable-driven negative power prices, and AMD pushing cache even further. Links to all stories can be found in the episode notes. I’m TrendTeller—thanks for listening, and I’ll see you in the next one.