As repeated stalking murders expose the limits of reactive state policing, private AI-powered threat prediction and physical intervention services create a parallel security market that challenges the state's monopoly on public safety.
Another stalking murder dominates headlines. The victim had filed police reports, obtained a protection order, and done everything the system asked. The system failed anyway. A wave of grief and rage follows. Into this moment steps AegisAI, a startup offering a subscription service: AI algorithms analyze communication patterns, location data, social media activity, and public records to predict escalation risk. When risk crosses a threshold, the system dispatches contracted private security personnel within minutes. The service costs 290,000 won per month — roughly the price of a gym membership. Within six months, 180,000 subscribers sign up, overwhelmingly women aged 20 to 40.
The National Police Agency objects that privatized security undermines the state's duty to protect citizens. Constitutional scholars debate whether safety has become a commodity that only the paying can access. AegisAI's investors quietly double down. A class-action lawsuit argues the service constitutes illegal private policing. The court delays its ruling. Meanwhile, the subscriber count climbs past 400,000.
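The essay does not describe AegisAI's internals, but the mechanism it names — signals aggregated into an escalation-risk score, with a dispatch trigger when the score crosses a threshold — can be sketched in miniature. Every signal name, weight, and threshold below is an assumption for illustration; a real system would presumably use a trained model rather than a hand-tuned linear score.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    # Hypothetical behavioral signals; the essay only names the broad
    # categories (communications, location, social media, public records).
    contact_attempts_per_day: float   # messages/calls from the flagged person
    proximity_events_7d: int          # device detections near the client this week
    night_activity_ratio: float       # share of events between 00:00 and 04:00
    prior_police_reports: int

def risk_score(s: Signals) -> float:
    """Toy weighted score in [0, 1]; weights are invented for illustration."""
    score = (
        0.25 * min(s.contact_attempts_per_day / 10, 1.0)
        + 0.35 * min(s.proximity_events_7d / 5, 1.0)
        + 0.25 * s.night_activity_ratio
        + 0.15 * min(s.prior_police_reports / 3, 1.0)
    )
    return round(score, 3)

DISPATCH_THRESHOLD = 0.6  # illustrative cutoff for notifying the response team

def should_dispatch(s: Signals) -> bool:
    return risk_score(s) >= DISPATCH_THRESHOLD

# An escalating case: frequent contact, repeated night-time proximity.
case = Signals(contact_attempts_per_day=8, proximity_events_7d=3,
               night_activity_ratio=0.7, prior_police_reports=2)
print(risk_score(case), should_dispatch(case))
```

The point of the sketch is the policy shape, not the numbers: the controversial design decision is that crossing a numeric threshold directly triggers physical intervention, with no human magistrate in the loop.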
Lee Eunji, a 31-year-old graphic designer in Bundang, wakes at 3:17 AM on a Tuesday in January 2028 to a vibration pattern she has memorized: three short, two long. Her AegisAI app shows an amber alert — her ex-boyfriend's phone has been detected within 800 meters of her apartment for the third time this week, between midnight and 4 AM. A message reads: 'Security response team notified. ETA 6 minutes. Remain inside. Recording activated.' She hears a car pull up outside. She does not know if it is the security team or him. She does know that the police protection order in her drawer has never once made a car appear outside her window at 3 AM.
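The amber alert in the vignette follows a recognizable rule: a flagged device detected within 800 meters of the client's home, during the midnight-to-4-AM window, for the third time in a week. A minimal sketch of that rule, assuming the function names, coordinates, and the rolling seven-day window (the vignette says only "this week"):

```python
from datetime import datetime, timedelta
import math

RADIUS_M = 800            # proximity radius from the vignette
NIGHT_START, NIGHT_END = 0, 4   # local hours: midnight to 4 AM
MIN_EVENTS, WINDOW_DAYS = 3, 7  # "third time this week" read as a rolling week

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def amber_alert(home, pings, now):
    """home: (lat, lon). pings: list of (timestamp, lat, lon) for the flagged device."""
    cutoff = now - timedelta(days=WINDOW_DAYS)
    hits = [
        t for t, lat, lon in pings
        if t >= cutoff
        and NIGHT_START <= t.hour < NIGHT_END
        and haversine_m(home[0], home[1], lat, lon) <= RADIUS_M
    ]
    return len(hits) >= MIN_EVENTS

# Illustrative data only: Bundang-area coordinates and three night-time detections.
home = (37.3595, 127.1052)
now = datetime(2028, 1, 11, 3, 17)
pings = [
    (datetime(2028, 1, 7, 1, 30), 37.3600, 127.1050),
    (datetime(2028, 1, 9, 2, 10), 37.3590, 127.1060),
    (datetime(2028, 1, 11, 3, 5), 37.3598, 127.1049),
]
print(amber_alert(home, pings, now))
```

Even this toy version makes the legal tension concrete: evaluating the rule requires continuous location tracking of a person who has not been convicted of anything, which is precisely what the class-action lawsuit contests.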
The state's monopoly on security has always been aspirational rather than actual. Police cannot be everywhere, and protection orders are pieces of paper that depend on the willingness of the perpetrator to obey them. If private technology can genuinely predict and prevent violence that the state cannot, opposing it on principle means accepting preventable deaths for the sake of institutional symmetry. The real scandal is not that a private service exists, but that the public system's failure created the demand for it. The correct response is not to ban AegisAI but to make its capabilities a public service available to everyone regardless of ability to pay.