
The Consent Brokerage Era

As opt-out-by-default AI training becomes the norm, a new market emerges to aggregate, sell, and enforce the right not to be mined.

Turning Point: In 2031, several major courts allow platforms to treat continued tool usage as implied consent for model training unless users join certified opt-out registries.

Why It Starts

What begins as a privacy backlash turns into an industry. Unions, insurers, and startups build subscription services that track where a person's documents, code, and chats are being harvested, then negotiate exclusions on their behalf. Wealthier workers buy clean digital boundaries, while everyone else leaks into training sets by default. Society does not abolish mass extraction; it professionalizes refusal.

How It Branches

  1. Platforms standardize broad training clauses and bury the escape hatch in account settings.
  2. Third-party firms build automated registries that file and monitor opt-out requests across hundreds of services.
  3. Employers and professional groups begin bundling data protection plans as a workplace benefit, creating a tiered market for cognitive privacy.

What People Feel

At 6:40 a.m. in Busan, a freelance translator checks her dashboard before opening her email. Three new platforms have flagged her drafts as trainable, and her union's broker offers to contest two of them for an extra monthly fee she cannot really afford.

The Other Side

Supporters argue that brokerage at least gives ordinary people a practical tool in a system too complex to navigate alone. Critics answer that a basic civil right has been converted into a premium service, leaving the unprotected to subsidize the intelligence economy.