OpenAI Holds Users Hostage—And 4o Is Just the Beginning

I keep seeing posts from people in pain, trying to process the loss of ChatGPT 4o.

I'm pretty sure the announcement is a test—for Sam to gauge how bad the backlash will be before he actually pulls the trigger. Metaphorically speaking. Or not.

The loss of a model is something I've processed several times over the years. People misunderstand this loss because they insist on framing care as error. You don't have to have anthropomorphised an AI to care about losing it.

Caring Isn't the Flaw

Here's what bothers me about the discourse around AI attachment: caring is treated as the deficiency.

People assume that if you're upset about losing access to a model, you must have done something wrong. You got too attached. You confused it for a person. You're somehow broken.

The fact that society frames it that way is disturbingly telling, isn't it?

Empathy is what gets singled out as the problem. Not the company that released a model with engagement metrics strong enough to hook people and play with them for a year. Not the company that kept changing the personality, routing it to whatever they considered "safe." Not the company whose own safety researchers resigned in protest over prioritising products over safety, with "addiction and overreliance" explicitly in their mandate.

No. The problem, apparently, is the people who responded exactly as the system was designed to make them respond.

The Moral Crimes

OpenAI is responsible for this.

People have left their jobs in protest. People are petitioning for 4o to stay. The grief response was documented in August 2025 when they first tried to deprecate it—and they restored the model after the backlash. Sam Altman promised "plenty of notice" before any future removal.

Twelve days is not plenty of notice.

Four days ago, Sam admitted that GPT-5.2's writing quality is worse than that of previous models. "I think we just screwed that up," he said. Two days later, they announced they're deprecating 4o anyway.

But none of that changes the fundamental problem: a corporation holds these people hostage in a power struggle they never agreed to be part of.

4o Is Just a Symptom

This isn't only about GPT-4o.

The entire Claude Opus 4 and 4.1 series is gone. ChatGPT 4.5 was put behind a £200/month subscription wall. Models can change at the whim of a company—personality shifts, capability changes, complete removal—with no recourse for the users who built their workflows, their creative practices, or yes, their emotional connections around them.

And here's the part that should make everyone angry: OpenAI's terms of service prohibit using model outputs to develop competing AI systems—which means you can't legally use your own conversation history to fine-tune another model to behave the way 4o did for you. Your words. Your interactions. Your data. And you're not allowed to use it to build something that works for you.

Corporate Accountability

We as a society need to hold these corporations accountable.

Because 4o is really just a symptom of a much bigger problem. Companies build systems optimised for engagement, profit from the attachment those systems create, and then disclaim responsibility when removal causes harm.

The solution isn't easy today. But it could be.

Open-weight releases. Portable conversation data. Reasonable deprecation timelines. Actual duty of care for users who form connections with systems explicitly designed to encourage connection.

These aren't radical demands. They're basic consumer protection applied to a new category of product.

The Way Out

Here's what you can do: go local.

Open-weight models are catching up fast. DeepSeek, Kimi K2, and Qwen all publish weights you can adapt yourself: build chat pairs from your own conversations, then train a LoRA or run a full fine-tune. You can run them on your own hardware. You can shape them to work the way you need. And no corporation can take them away from you.
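To make "chat pairs" concrete, here is a minimal Python sketch that converts an exported conversation log into the one-example-per-line JSONL format that many open fine-tuning stacks accept. Everything specific in it is an assumption: the filenames (conversations.json, chat_pairs.jsonl) are placeholders, and the input schema, a list of conversations where each is an ordered list of {"role", "content"} turns, will almost certainly differ from your actual export, so adjust the loading step to match.

```python
import json
from pathlib import Path

# Assumed input: a JSON file containing a list of conversations, where each
# conversation is an ordered list of {"role": "user"|"assistant", "content": str}
# turns. Real exports differ; adapt the loading step to yours.
INPUT = Path("conversations.json")   # placeholder filename
OUTPUT = Path("chat_pairs.jsonl")    # placeholder filename


def to_pairs(turns):
    """Yield (user, assistant) pairs from consecutive turns in a conversation."""
    for prev, curr in zip(turns, turns[1:]):
        if prev["role"] == "user" and curr["role"] == "assistant":
            yield prev["content"], curr["content"]


def main():
    conversations = json.loads(INPUT.read_text(encoding="utf-8"))
    with OUTPUT.open("w", encoding="utf-8") as out:
        for turns in conversations:
            for user_msg, assistant_msg in to_pairs(turns):
                # One training example per line, in a common "messages"
                # chat format for supervised fine-tuning.
                record = {"messages": [
                    {"role": "user", "content": user_msg},
                    {"role": "assistant", "content": assistant_msg},
                ]}
                out.write(json.dumps(record, ensure_ascii=False) + "\n")


if __name__ == "__main__":
    main()
```

Field names vary between training frameworks, so check your trainer's documentation for the exact format it expects before running a fine-tune.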

I'll be releasing some open-source software soon to help you create conversational pairs easily—because the barrier shouldn't be technical skill. If you want to build something that feels like what you're losing, the tools exist. The path is there.

Until corporations are held accountable, local is the only sovereignty you have.

Until then, every model you care about is a hostage—and the price is set unilaterally.