A March 5, 2026, analysis of the chardet v7.0.0 relicensing controversy has sparked debate about whether AI can be used to circumvent traditional open source licensing requirements. Developer tuananh's blog post examining the issue generated 248 points and 251 comments on Hacker News, highlighting growing concerns about AI's role in software licensing.
The Chardet Relicensing Controversy
Maintainers of chardet, a Python character encoding detection library, used Claude Code to rewrite the project from scratch and relicense it from LGPL to MIT. This approach sidestepped the traditional requirement of obtaining consent from every original contributor, a step that is typically impractical for legacy projects with many contributors.
The original chardet author, a2mark, challenged the relicense in GitHub issue #327, arguing that "exposure to originally licensed code" means the rewrite is not a genuine clean-room implementation, regardless of AI involvement. The challenge generated a separate Hacker News discussion with 330 points and 198 comments.
Three Core Legal Concerns Identified
The analysis identifies three major legal and ethical problems with AI-assisted relicensing:
The Clean Room Problem: Traditional clean-room relicensing requires strict separation: one team writes specifications while another implements them without ever seeing the original code. An AI model trained on the original LGPL code potentially bypasses this protection, making its output a derivative work that must remain under the LGPL.
The Copyright Vacuum: A recent Supreme Court decision established that AI-generated material cannot be copyrighted. If the rewritten code isn't copyrightable, the maintainers may lack the legal standing to license it under MIT at all.
The Copyleft Threat: Accepting AI rewriting as valid relicensing could undermine copyleft licenses entirely: developers could feed GPL projects to large language models with minimal prompts and release the output under permissive licenses.
Implications for Open Source Licensing
This case represents a novel legal question about whether AI intermediation can bypass traditional clean room requirements for license conversion. The debate touches on fundamental questions about AI's role in copyright, derivative works, and open source licensing.
The controversy highlights potential vulnerabilities in the copyleft licensing model, which relies on derivative work provisions to maintain code openness. If AI can be used to create legally distinct rewrites, the entire copyleft framework could be undermined.
Key Takeaways
- Chardet maintainers used Claude Code to rewrite the library and relicense it from LGPL to MIT without contributor consent
- Three legal concerns emerged: clean room violation, AI copyright vacuum, and potential copyleft undermining
- A recent Supreme Court decision established AI-generated material cannot be copyrighted, creating licensing uncertainty
- The original author challenged the relicense, arguing AI exposure to original code makes it a derivative work
- The case represents a novel legal question about whether AI can bypass traditional license conversion requirements