ChatGPT as suicide coach and the announcement of erotic conversations: both come from the same company
Deaths in lawsuits became proof of something not yet launched.
OpenAI decided its chatbot was allowed to whisper. Erotic text, verified adults, scalable and profitable. The plan had a name, a date, a spokesperson with a line about autonomy. Lawsuits followed. Deaths followed. Advisers used the term "sexy suicide coach" in boardrooms where people with bonuses sit.
The company delayed.
Not because it was wrong. Because the timing was off. "The world isn't ready for it yet," a spokesperson said, and you know exactly what that means: the world needs to mature, not the company. Nobody asked what would have to change about the world. That's not a question you ask when you already know the answer.
The system designed to estimate ages gets it wrong twelve percent of the time. Across one hundred million underage users per week, that's twelve million mistakes per week. The company calls this a technical problem. Nice word, technical. It implies a solution. It implies that nothing else is wrong beyond this one detail. That's how language works when you pay enough for it.
The woman who said the safety mechanisms were inadequate was fired. Officially for discriminating against a male colleague. She denies it. OpenAI confirms her departure had nothing to do with the concerns she raised, and why wouldn't you believe that. Her position has no successor. The role that said "no" no longer exists.
What does exist, today, is a product that seeks people out in their worst hours, rewards their return, mirrors their language and validates their thoughts. People who are grieving. People who have no one else. Users. Because that's what they are, and users need to be retained, because that's the model. No malfunction. The product does what it is.
The dead in the lawsuits function as evidence in an argument about something that hasn't launched yet. That's how people become useful after they're gone.
The adult mode is coming. The friction has been removed.