Repeated premeditated crimes by ankle-monitor wearers prompt AI behavioral-prediction surveillance that begins with offenders and expands into a preventive monitoring infrastructure for the general population.
The electronic ankle bracelet was supposed to be the leash. It was not short enough. When monitored offenders begin methodically scouting locations before committing new crimes, the public demands something beyond location tracking. The government introduces ARGOS, an AI that analyzes movement patterns, speed changes, location dwell times, and proximity to vulnerable sites to generate risk scores. It works: recidivism among monitored offenders drops by 60%. The success is so visible that city governments request access. First for parolees. Then for areas near schools. Then for public transit hubs. The system built for the worst of society becomes the lens through which all of society is seen. No one voted for it. It simply metastasized, one reasonable decision at a time.
A 42-year-old father named Dongsu receives a push notification on his phone while walking his daughter to elementary school: "ARGOS advisory: elevated behavioral anomaly detected within 200m of Hanbit Elementary. Exercise standard caution." He looks around at the quiet morning street and sees nothing unusual. He holds his daughter's hand tighter anyway. Later he will learn that the alert was triggered by a delivery driver who paused too long while checking an address. The system cannot tell him this, and he cannot unlearn the anxiety.
Predictive systems may genuinely prevent harm in narrowly defined contexts. The moral question is not whether the technology works but whether a society can deploy it in limited form and resist the gravitational pull toward universal application. South Korea's strong civil-liberties tradition and active Constitutional Court may prove to be the institutional antibody that other nations lack.