Malus.sh: The Satirical 'Clean Room as a Service' That Exposes Open Source's Existential Crisis
Malus.sh presents itself as a service that uses AI robots to recreate open source code without license obligations. It's satire—but the legal and technical questions it raises about AI-driven clean room engineering are deadly serious.
It took Simon Willison, one of the most respected voices in the open source community, a moment to confirm it was a joke. That alone tells you everything about where we are in 2026.
Malus.sh bills itself as a "Clean Room as a Service" platform that uses "proprietary AI robots" to recreate open source projects from scratch, delivering "legally distinct code with corporate-friendly licensing." The pitch is as brazen as it is absurd: "No attribution. No copyleft. No problems."

The name itself is the first clue—Malus is Latin for "bad" or "harmful." The website's testimonials include gems like "I used to feel guilty about not attributing open source maintainers. Then I remembered that guilt doesn't show up on quarterly reports." The CEO letter signs off with: "We owe you a debt, but we have no intention of repaying it."
It's satire. Brutal, cutting, uncomfortably close-to-reality satire. And the Hacker News community responded with over 1,400 upvotes and 500 comments, suggesting it struck a nerve that runs deeper than any joke.
The Legal Fiction That Makes Malus.sh Almost Believable
The satire works because it's built on a real legal foundation. The website references Baker v. Selden (1879), a U.S. Supreme Court case that established that copyright protects expression but not ideas. This principle gave rise to "clean room engineering"—a technique where one team studies the original product and distills it into a functional specification, while a second team, completely isolated from the original code, implements from that specification alone.
Phoenix Technologies used this method in 1984 to clone the IBM BIOS. It took months. It required careful documentation, strict isolation protocols, and significant engineering resources.
Malus.sh claims to do the same thing in minutes using AI. Upload your package.json. Their "robots" analyze documentation and API specs. Different "robots" implement from those specs. The result is code that functions identically but carries none of the license obligations.
The joke, of course, is that this isn't actually a service you can buy. The deeper joke—and this is what makes people uncomfortable—is that it's describing something AI tools are increasingly capable of doing right now.
The Real Controversy That Inspired the Satire
Malus.sh didn't appear in a vacuum. It's a direct response to a genuine dispute that erupted in early March 2026 involving chardet, a widely-used Python library for detecting text encoding.
The maintainer of a new version claimed to have reimplemented the library using Anthropic's Claude without directly referencing the original source code. Therefore, they argued, the LGPL didn't apply—and they relicensed the code under MIT, which doesn't require source code disclosure.
The original author pushed back, arguing that even AI-reimplemented code based on knowledge of the original still violated the spirit if not the letter of the LGPL. The debate split the community. Some developers defended the move as legitimate clean room practice. Others saw it as license laundering—exploiting a technicality to strip away copyleft protections.
Malus.sh takes this tension and pushes it to its logical, absurd conclusion. What if a company offered this as a service? What if you could systematically "liberate" yourself from every open source obligation for a per-kilobyte fee?
Why the Satire Lands So Hard
The Malus.sh website is meticulously designed to mirror the language and aesthetics of actual enterprise SaaS companies. There's pricing (pay per KB of unpacked npm size). There's an SLA guarantee with an asterisk about offshore legal jurisdictions. There's a comparison table showing how Malus is better than "giving credit to maintainers."
One Hacker News commenter captured the unease perfectly: "I understand this is satire, but in six months it might not be so far from reality."
The site's blog post, titled "Thank You for Your Service: On the Obsolescence of Open Source," is particularly ruthless. It begins with genuine-sounding gratitude to open source maintainers—the unpaid volunteers who answer GitHub issues at 2 AM, who maintain the infrastructure that powers Fortune 500 companies, who have built "a miracle of human cooperation."
Then the pivot: "It is also, from a fiduciary standpoint, completely insane."
The post catalogs real open source supply chain failures: the left-pad deletion that broke thousands of builds in 2016, the Log4Shell vulnerability that ruined Christmas 2021 for engineers worldwide, the colors.js sabotage where a maintainer deliberately introduced infinite loops to protest corporate exploitation, the node-ipc package that embedded file-wiping geopolitical payloads targeting Russian IP addresses.
Each incident illustrates the same uncomfortable truth: companies have built critical infrastructure on software maintained by volunteers who have no contractual obligations, no security teams, and occasionally very strong opinions about how their work is used.
The AI Factor: When Clean Room Becomes Trivial
The core technical claim underlying Malus.sh—that AI can perform clean room engineering—is already being tested in practice.
The chardet dispute wasn't theoretical. Someone actually used Claude to reimplement a library and claimed the new license terms applied. The legal question—whether AI-generated code based on understanding of existing implementations constitutes a derivative work—remains largely untested in courts.
What's changed is the cost structure. Traditional clean room engineering required months of careful human work. Malus.sh claims (satirically, but pointedly) that their "robots" can replicate the Phoenix BIOS effort in about an hour, or left-pad in ten seconds.
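To see why "left-pad in ten seconds" is barely an exaggeration, consider what a from-spec reimplementation actually looks like. The sketch below is written purely from left-pad's one-line documented behavior (pad a string on the left with a fill character, defaulting to a space, until it reaches a target length), never from its source; the function name and signature here are illustrative, not copied:

```javascript
// Spec-only reimplementation of left-pad's documented behavior:
// prepend `ch` (default: a space) to `str` until its length
// is at least `len`. Inputs are coerced to strings first.
function leftPad(str, len, ch = ' ') {
  str = String(str);
  ch = String(ch);
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}

console.log(leftPad('17', 5, '0')); // "00017"
console.log(leftPad('abc', 2));     // "abc" (already long enough)
```

Any competent developer—or any LLM—can produce something like this from the description alone. Whether the result is legally "clean" when the author (human or machine) has previously seen the original is precisely the question the satire forces.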
If AI makes clean room engineering trivially easy, what happens to copyleft? The entire enforcement mechanism of licenses like GPL and AGPL depends on the assumption that copying code is the primary way to derive value from it. If you can achieve functional equivalence without technically copying, the legal foundation starts to look like Swiss cheese.
As one Hacker News commenter noted: "It's interesting that the focus is just on open source licenses. If one can strip licenses from source code using LLMs, then surely a Microsoft employee could do the same with the Windows source code!"
The Existential Question for Open Source
Malus.sh is ultimately asking a question that the open source community has been reluctant to confront: what if the generosity model is unsustainable precisely because it's been too successful?
The website's fake CEO, "Mike Nolan," puts it bluntly: "Many companies use open-source software without contributing anything. We are simply charging those companies a fee in exchange for releasing the license."
The uncomfortable reality is that plenty of companies already treat open source this way. They consume without contributing. They strip attribution when they think they can get away with it. They complain about copyleft requirements while building billion-dollar products on volunteer labor.
Malus.sh isn't really offering a new service. It's simply making explicit what's already implicit in how many companies behave. The satire works because it removes the polite fiction that everyone is playing by the same rules.
What Happens Now
The Malus.sh website includes one detail that blurs the line between satire and reality: if you actually upload a manifest and pay the fee, you reportedly receive a custom-built package in return. As one user noted: "If you're actually providing the service you're satirizing, can you still call it satire?"
This ambiguity is probably the point. The creators want us to sit with the discomfort of not knowing exactly where the line is.
The site is based on a presentation given at FOSDEM 2026 titled "Let's end open source together with this one simple trick." The presentation, like the website, was clearly satirical—but it highlighted a technical capability that is very real and growing more accessible by the day.
The open source community now faces a choice. It can pretend that AI-driven clean room engineering is still too difficult to threaten the license ecosystem. Or it can acknowledge that the rules may need to evolve, that new enforcement mechanisms might be necessary, that the social contract underlying open source—already strained by corporate exploitation—may need to be rewritten entirely.
Malus.sh won't be the last word on this topic. But it might be the most honest one.
Malus.sh describes itself as "Liberating corporations from open source obligations since 2024." The service is satirical, but the questions it raises are very real.