AI is going to war for America before it comes to the bedroom

The U.S. has always been more comfortable with violence than sex. AI is no different: The same tech that can't talk dirty to a consenting adult is being used by the military abroad

Photo illustration by Cheng Xin/Getty Images

A version of this article originally appeared in Quartz’s AI & Tech newsletter.

America gave AI to the Pentagon before it gave AI to the bedroom.

Earlier this year, OpenAI signed a deal to support military operations just days after the Defense Department ended a partnership with rival Anthropic. Meanwhile, the company's "adult mode" for ChatGPT, which would allow sexually explicit conversations for consenting adults, remains delayed.

OpenAI's own advisory council on wellbeing revolted in January, warning that the feature could foster unhealthy emotional dependence. The council also warned that minors might find workarounds, since age-verification tools were reportedly misclassifying underage users.

Internal documents showed employees had flagged risks including compulsive use and escalation toward more extreme content. The delay was partly technical. But America has always been more comfortable with its machines making war than making love.

A market too big to ignore

Sex built the consumer internet. It drove adoption of the web, online payments, streaming video. AI is the most expensive technology the industry has ever tried to scale, and right now OpenAI is watching smaller companies collect that money while it sits out. The hand-wringing is real, but so is the math.

Nearly a third of young adult men say they have chatted with an AI romantic partner. More than 100 platforms now offer AI companion services ranging from the relatively sober to the explicitly sexual.

Elon Musk's Grok already has a companion mode with an adult tier. Meta has allowed its chatbots to engage in romantic role-play since 2023, a quiet loosening of restrictions that executives pushed through despite internal safety objections.

Sam Altman acknowledged publicly last year that allowing explicit content would likely boost growth and revenue. What he also said, in the same breath, was that it wouldn't serve users' long-term interests.

So far, what we have is anecdotes. A man in Ohio who says his AI companion saved his marriage. A woman who considers herself married to her chatbot. A Reddit forum with tens of thousands of weekly visitors sharing notes on their digital relationships. The coverage treats them like curiosities. The numbers suggest they might be the early majority.

Companies know there is a large, willing audience. They are less sure they can contain what gets built once the door opens.

Some of what has already slipped through is not a gray area. Teenage girls in Tennessee recently filed a lawsuit against xAI after AI-generated nude images of them were shared online without their knowledge and used to barter for other child sexual abuse material.

The E.U. opened a formal investigation into Grok over similar concerns earlier this year. Researchers calculated that Grok had generated millions of sexualized images in under two weeks, with thousands depicting children.

Those cases are not really about adult content at all. Child sexual abuse material is a crime. The harder, murkier question is what happens when consenting adults want to use these tools the way they have used every technology before them.

More comfortable with bombs

The gap between how America treats digital sex and digital violence has always been wide.

A film can show a man's skull getting blown apart and earn an R rating. A glimpse of a nipple triggers a congressional hearing. That same logic has migrated into AI policy by default rather than by design.

The military applications of AI face real ethical scrutiny, but not the same ambient squeamishness. In 2018, Google employees staged a revolt over a Pentagon computer vision contract called Project Maven, forcing the company to walk away. When OpenAI signed its own Pentagon deal this year, no such uprising materialized. The employees who did threaten to walk out were on the advisory council, and they were upset about dirty talk.

Part of the answer is structural. AI companies built their initial content policies in a moment when avoiding controversy meant avoiding sex, and those defaults calcified.

Part of it is the enduring American instinct that sex is inherently dangerous and must be tightly managed, while violence, especially official violence, is simply the cost of doing business.

The evidence on AI companions is mixed. Researchers have documented real therapeutic value in chatbot relationships, particularly for people navigating loneliness, grief, or disability. They have also documented radicalization, delusion, and tragedy.

The honest answer is that nobody knows yet what long-term exposure to AI intimacy does to people.

In the meantime, the tools are not sitting idle. The same AI technology that can't talk dirty to a consenting adult in Ohio is being used by the American military abroad. Nobody is waiting on that one.
