OpenAI is preparing to launch an 'Adult Mode' for ChatGPT that would permit sexually explicit conversations with the AI assistant, according to a report from Wired. The move has drawn immediate criticism from human-AI interaction experts, who warn the feature could create what one researcher calls 'a privacy nightmare': the most intimate details of users' lives and desires captured in OpenAI's data systems. The announcement arrives as lawmakers worldwide are already scrutinizing AI companies over data collection practices, potentially opening a new front in the battle over digital privacy rights.
OpenAI has opened a door that privacy advocates have long dreaded. The company behind ChatGPT is moving forward with plans to let users engage in sexually explicit conversations with its AI assistant through a new 'Adult Mode' feature, a development that has alarmed much of the tech ethics community.
The announcement, first reported by Wired, comes as OpenAI faces mounting pressure to differentiate ChatGPT from competitors while expanding revenue streams beyond enterprise subscriptions. But experts warn the company may be trading user privacy for market positioning in the most intimate way possible.
'This is a privacy nightmare,' a human-AI interaction expert told Wired, crystallizing concerns that have privacy researchers scrambling to understand the implications. The feature would represent a dramatic shift for OpenAI, which has previously restricted ChatGPT from engaging in sexual content as part of its usage policies. Those guardrails, designed to prevent misuse and protect the company from liability, are now apparently being reconsidered.
The privacy concerns aren't theoretical. Every conversation with ChatGPT flows through OpenAI's servers, and the company has acknowledged using interactions to train future AI models, though it offers users the ability to opt out. Under Adult Mode, that means the most intimate details of users' sexual preferences, fantasies, and desires could be logged, analyzed, and incorporated into the company's training data pipelines.