California Built a Surveillance Pipeline and Called It Child Safety

To the Authors of Assembly Bill 1043,

I’m writing to you not as a partisan or a tech industry representative, but as someone who’s spent considerable time trying to understand what this law actually does and what it means for ordinary people. I want to be clear from the outset that I don’t question your motives. The desire to protect children online is genuine and important. But I’m asking you to walk with me through a logical progression of what this law sets in motion, because I believe that when you follow those effects to their conclusion, you’ll find that this legislation is poised to accomplish the exact opposite of what you intend.

The basic mechanism is straightforward. Beginning in 2027, operating system providers must build into their software a system that collects the age of a device’s primary user and transmits that information as a signal to any application that requests it. Developers who receive that signal are deemed to have actual knowledge of the user’s age and must treat it as authoritative. On its face, this seems like a clean, centralized way to ensure that apps know whether their users are children and can apply appropriate protections.
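To make the mechanism concrete, here is a hypothetical sketch of what such an OS-level age signal might look like in code. The bracket categories, class names, and API shape here are my own assumptions for illustration, not the statute's actual text or any real operating system's interface:

```python
from dataclasses import dataclass
from enum import Enum


# Hypothetical age brackets; the statute's actual categories may differ.
class AgeBracket(Enum):
    UNDER_13 = "under_13"
    AGE_13_15 = "13_15"
    AGE_16_17 = "16_17"
    ADULT = "18_plus"


@dataclass
class AgeSignal:
    bracket: AgeBracket


class OperatingSystem:
    """Toy model: the OS holds the primary user's declared age and
    hands a bracket signal to any application that requests it."""

    def __init__(self, declared_age: int):
        self._declared_age = declared_age

    def request_age_signal(self) -> AgeSignal:
        a = self._declared_age
        if a < 13:
            return AgeSignal(AgeBracket.UNDER_13)
        if a < 16:
            return AgeSignal(AgeBracket.AGE_13_15)
        if a < 18:
            return AgeSignal(AgeBracket.AGE_16_17)
        return AgeSignal(AgeBracket.ADULT)


class App:
    """Under the law, an app that receives the signal is deemed to have
    'actual knowledge' of the user's age and must treat it as authoritative."""

    def __init__(self):
        self.known_bracket = None

    def on_launch(self, device_os: OperatingSystem):
        # The app never verifies anything itself; it simply trusts
        # whatever the OS transmits.
        self.known_bracket = device_os.request_age_signal().bracket
```

Even in this toy form, the design choice is visible: the operating system becomes a single, standardized source of identity information, and every application downstream inherits whatever it says.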

Before we talk about what the law demands, though, we need to talk about who it demands it from. Most people picture a software company when they imagine an operating system provider. They picture a legal department, a compliance office, engineers on salary, and a budget for outside counsel. That picture fits Microsoft and Apple reasonably well. It doesn’t fit the world that provides most of the rest of our computing infrastructure.

Linux, BSD, and the vast ecosystem of open source operating systems aren’t built by companies. They’re built by thousands of individuals scattered across the globe who contribute their time voluntarily. There’s a person in Norway who maintains a critical component of the networking stack. There’s a person in Brazil who handles the graphics subsystem. There’s a person in Australia who manages the package repository. These people have never met each other. They have no central office and no budget for legal counsel. They contribute because they believe in the project and they want to see it succeed.

When you ask this network to implement an age verification system, the questions multiply faster than anyone can answer them. Where will volunteers store the age data they collect, and what security standards must they meet? Who audits their compliance? What happens when a contributor in a country with different privacy laws touches the code that handles this data? If a developer in Norway commits a change that inadvertently creates a vulnerability in the age signaling system, is that person subject to fines from the California Attorney General? How would that even be enforced? These aren’t edge cases to be sorted out in implementation. They’re foundational problems with no good answers.

The only answer that actually makes sense for many of these projects is the one a small operating system called MidnightBSD has already chosen. MidnightBSD is a descendant of software originally developed at UC Berkeley, maintained by a small team of volunteers. When AB 1043 passed, they announced they would modify their license to prohibit residents of California from using their operating system for desktop purposes after January 1, 2027. They didn’t do this because they’re hostile to California or opposed to child safety. They did it because the compliance burden is simply more than a small volunteer project can bear, and blocking California is the only way to protect themselves from a law that was written without any apparent consideration of how open source development actually works. They won’t be the last.

Consider what that means for the families who can least afford the alternative. Linux Mint is a free, secure, privacy-conscious operating system and a common choice among home educators and digital equity programs that put refurbished computers into the hands of low-income students. If Mint follows MidnightBSD’s lead, and there’s every reason to expect it will, those families don’t have a fallback. The children this law was written to protect may find themselves with an inoperable computer and no path forward that doesn’t cost money they don’t have. Mint won’t be alone in that decision either. A significant portion of the open source operating systems that serve as basic infrastructure across California face exactly the same impossible choice.

This is what performative legislation looks like in practice. The law signals concern for children. What it produces is a small volunteer project in the United States blocking an entire state to survive. No child is safer. No predatory app has been constrained. The only concrete outcome so far is that law-abiding contributors to a legitimate open source project have been forced to treat California as a liability. That’s not child protection. That’s the appearance of it. And as it turns out, the appearance of child protection is considerably more dangerous than no protection at all.

Now put yourself in the position of the people caught in the middle of this. Maybe you’re a resident of Oregon or Nevada, and you travel to California for a conference or a family visit. You bring your laptop, the same one you’ve used for years without issue. You aren’t a California resident. You don’t vote in California elections. You’ve never heard of Assembly Bill 1043. But when you cross the state line and turn on your machine, your operating system recognizes that it’s now being used within California’s borders and stops functioning. Updates are disabled. The screen displays a message indicating that the software isn’t authorized for use in this location. Your presentation for tomorrow morning’s meeting is locked on a machine that no longer works, and you have no idea why.

Or maybe you’re a California resident who’s relied on a particular Linux distribution for years. You have files on that machine: documents, photographs, projects you’ve worked on for a long time. You aren’t following technical news closely, and you don’t read legislative updates. On January 1, 2027, you turn on your computer and find that your operating system will no longer receive updates, or that the project has simply declared its software unavailable in your state. Your machine has become a paperweight and your data is inaccessible, and nothing in your daily life prepared you for that possibility.

These aren’t hypothetical worst cases. They’re the direct, predictable consequence of a geofencing response that projects like MidnightBSD have already committed to. The law creates the problem, and geofencing is the rational response to it.

Now consider what you’ve actually built from a security standpoint. You’ve mandated that every operating system, the most fundamental software on any computing device, become a centralized source of identity information for every application that runs on it. You’ve created a standardized pipeline through which data about the user flows to any piece of software that asks. And you’ve done so in a world where we already know that every standardized piece of digital infrastructure eventually becomes a target for exploitation.

Think about what this looks like from the perspective of a hostile actor. A foreign government, a criminal enterprise, or an abusive individual doesn’t need to compromise dozens of different applications to gather information about a target. They simply need to reach the operating system at its source, or intercept the signal in transit. We’ve already watched sophisticated actors compromise systems through carefully constructed attacks on far less standardized infrastructure than this. The idea that a new pipeline, built under legislative deadline pressure by a distributed network of volunteers, would somehow resist that kind of exploitation requires a faith in good behavior that the historical record simply doesn’t support. And the harm here isn’t limited to children. This infrastructure doesn’t distinguish between a child and an adult when it’s exploited. It simply provides a pipeline, and everyone who uses a computing device is on the other end of it.

That pipeline also doesn’t stay limited to age. You’ve built an endpoint. You’ve established the precedent that an operating system can and should transmit information about its user to applications on request. Once that infrastructure exists, what constrains a future legislature, facing a different perceived crisis under different political pressure, from expanding the signal? Citizenship status, vaccination records, voter registration. The mechanism you’ve created doesn’t inherently limit itself to any particular attribute. It simply transmits what it’s told to transmit. And once the precedent is set at the OS level, the argument for expanding it will always be that the infrastructure is already there.

Taken together, what this law actually does is assume good faith at every level simultaneously. It assumes developers won’t misuse the signal. It assumes operating system providers won’t exploit the data anticompetitively. It assumes parents will accurately report their children’s ages. It assumes malicious actors will respect the law’s prohibitions. And it assumes a global network of volunteer contributors will somehow coordinate to implement complex compliance mechanisms without legal protection or financial resources. These are assumptions that the real world doesn’t honor, and building law on top of them doesn’t make them more reliable. It just means the consequences of their failure fall on ordinary people who had no say in any of this.

That last point deserves more attention than it usually gets. California is the fifth largest economy in the world, and when you pass a law like this, you aren’t simply regulating conduct within your borders. You’re setting a standard that ripples through the entire global digital ecosystem. The open source contributors most affected by this law live in Norway, Brazil, Australia, and dozens of other countries. They’ve never voted in a California election. They have no representation in your legislature and no meaningful recourse against your Attorney General. And yet this law reaches into their projects, their code, and their personal legal exposure as directly as if they were constituents. No single legislative body, regardless of its economic weight, should hold the authority to impose compliance obligations on people who have no voice in its decisions. That’s not a partisan position. It’s a basic principle of democratic legitimacy.

The only responsible path forward is to repeal this law before it takes effect. Not to amend it around the edges or adjust the definitions, but to recognize that the entire approach is flawed at its foundation. You can’t protect privacy by mandating surveillance infrastructure. You can’t protect children by creating an attack surface that hostile actors will inevitably find. You can’t serve California residents while passing a law that leaves visitors stranded with inoperable hardware and no explanation why. And you can’t expect a distributed global network of volunteers to absorb compliance costs that would strain the resources of major corporations.

The road to unintended consequences is paved with good intentions, and this law, for all of its good intentions, is leading somewhere none of us want to go. Please don’t let it stand.