On January 22, South Korea’s AI Basic Act went into effect. It consolidated 19 separate AI bills into a single comprehensive law. Only the EU AI Act came first.
I build AI products in Seoul. This law applies directly to everything I ship. So I read the whole thing.
The short version
If you build AI that affects people’s lives (hiring decisions, health recommendations, content moderation, emotional support), you now need to:
- Tell users they are interacting with AI (active notification, not buried in ToS)
- Conduct and document impact assessments
- Implement ongoing risk management (not a one-time audit)
- Be transparent about training data
- Appoint a domestic representative if you operate in Korea without a local address
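For teams tracking these obligations internally, the checklist above could be sketched as a simple record. This is purely my own illustration: the Act prescribes no schema, and every field name and threshold here is a hypothetical choice, not anything from the law.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of the obligations listed above.
# Field names and the "two reviews = ongoing" proxy are my own
# assumptions, not definitions from the AI Basic Act.

@dataclass
class ComplianceRecord:
    active_ai_disclosure: bool = False            # told up front, not buried in ToS
    impact_assessment_doc: Optional[str] = None   # path to the documented assessment
    risk_reviews: list[str] = field(default_factory=list)  # ongoing, not a one-time audit
    training_data_summary: Optional[str] = None   # training-data transparency
    domestic_rep: Optional[str] = None            # needed only without a Korean address

    def gaps(self, has_korean_address: bool) -> list[str]:
        """Return the obligations not yet satisfied."""
        missing = []
        if not self.active_ai_disclosure:
            missing.append("active AI disclosure")
        if not self.impact_assessment_doc:
            missing.append("impact assessment")
        if len(self.risk_reviews) < 2:  # crude proxy for "recurring"
            missing.append("recurring risk reviews")
        if not self.training_data_summary:
            missing.append("training data transparency")
        if not has_korean_address and not self.domestic_rep:
            missing.append("domestic representative")
        return missing
```

A product with only the disclosure banner in place, and no Korean address, would still show four open gaps, including the domestic representative.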
A National AI Committee, chaired by the president, oversees enforcement. An AI Safety Research Institute handles the technical side.
What most coverage gets wrong
The Western tech press covered this as “another regulation that stifles innovation.” That framing misses the point.
Korea is not trying to slow down AI. Korea is trying to make sure AI companies cannot hide behind “move fast and break things” when the things they break are people.
The requirements are not unreasonable. Tell people they are talking to AI. Document what your system does. Have a plan for when things go wrong. This is basic engineering discipline, the kind that every safety-critical industry has followed for decades.
The fact that tech considers these requirements burdensome says more about tech than about the regulation.
What I think about it
I have been building software for three decades. In every other engineering discipline, safety standards are not controversial. Nobody argues that bridges should not be inspected. Nobody claims that drug trials stifle pharmaceutical innovation.
AI got a free pass for years because it was “new.” That free pass is ending. Korea moved. The EU moved. The US is still debating.
As a builder, I prefer clear rules to no rules. Clear rules mean I know what I need to do. No rules mean I am one lawsuit away from discovering what I should have done.