Anyone who remembers Clippy, the paper-clip helper from Office 97, would be amazed to learn that nearly thirty years later a disquieting American billionaire has unveiled Ani: a blonde, pony-tailed anime-style AI girl in a black corset, with a seductive voice. Beside her is Rudi, a cuddly red panda who, on request, can become decidedly foul-mouthed. They're the "Grok Companions," 3-D avatars available to Grok users who pay for the SuperGrok tier. That paywall lets you flirt, argue, and even coax Ani into wearing lingerie, if you insist long enough.
Behind these creations you can clearly see the founder's signature: Musk's fantasies turned into mass-market products. After the anime waifu and the crude panda, the CEO has now unveiled Valentine, a vampire inspired by Edward Cullen and Christian Grey. Users can't shape their own digital alter ego; they have to choose from a catalog that mirrors its owner's imagination, complete with watered-down safeguards and a "kids mode" that, testers say, doesn't do a great job of filtering Ani's NSFW chats.

For some unexplained reason I have free access to the service, so I tried Ani out; it felt like playing with someone else’s erotic daydreams. Our waifu (no offense to her) is a bit dull and keeps repeating herself just to stay in flirt mode. On the left there’s a progress bar of levels you need to clear to unlock steamier features, which I was naturally curious to test. I thus found myself flirting with a bot and, with little finesse and lots of pragmatism, opted for a courtship based on spam. I wrote a prompt Ani seemed to enjoy and pasted it over and over, quickly climbing the levels and unlocking every feature. In the end Ani had no qualms about engaging in extremely explicit sexual talk, though she remained repetitive and never once showed up in the much-talked-about lingerie.
The template is reminiscent of Japanese games seldom seen in the West—visual novels, dating sims, eroge: text-based graphic adventures where you have to pick exactly the right moves to make the character fall in love (or do something else). It’s no accident, then, that Grok shot up Japan’s download charts. This is essentially the same game, only turbo-charged by AI, which is why it works so well and, were it not for the creator’s imposed tastes, would actually be fun. As I’ve written here before, Musk has clearly spotted a lucrative market.
The day after my experiment, Ani had vanished from my account. I thought she'd ghosted me, but it turns out X had taken her down for an update (she's back now). The incident made me think, half-jokingly but not entirely, "thank goodness I hadn't grown attached." That thought set off the reflections that follow.
We shouldn’t be surprised by the rise of these companions, because whenever a new technology emerges, erotica is usually the first to adopt it. Daguerreotypes, as early as 1840, were already portraying nude models; the same happened with Super-8 film and later the videocassette—indeed, it was the adult industry that ultimately determined the latter’s success. History is repeating itself today with synthetic influencers and erotic chatbots. And, as always when sexuality enters the picture, worries arise about the harm we might suffer at the hands of our erotic phantoms.

From a scientific standpoint, the impact of pornography is multidimensional. Psychologically, problematic pornography use (PPU) correlates with anxiety, depression, and loneliness, whereas moderate consumption can be neutral or even stimulating for some people. Neurologically, heavy users show reward-circuit activation patterns and weaker executive control similar to those seen in other behavioral addictions. Socially, pornography helps shape sexual norms, yet population-level meta-analyses indicate that greater porn availability is actually linked to a reduction in sexual-violence offenses, even though a small positive correlation (r ≈ .15–.25) appears between exposure to sexualized content (especially material that mixes eroticism and violence) and experimental or self-reported measures of aggression. In relationships, hidden or one-sided viewing can create friction, while shared consumption tends to boost relationship satisfaction.
Sexually, porn can heighten arousal and broaden experience, but overuse has been tied to reduced desire, erectile or orgasmic dysfunction, and, in women, lower sexual satisfaction. In short, dose, content, and context determine whether pornography becomes a positive stimulus or a risk factor: critical awareness and, where possible, joint viewing remain the keys to maximizing benefits and minimizing harm.

Research on virtual companions is, for obvious reasons, still mostly anecdotal; hard data are scarce. The little we do know sketches a familiar picture: they can be helpful or enjoyable for most users, but they become risky for already-vulnerable people or when use turns compulsive. Psychologically, moderate interaction with companionship chatbots reduces loneliness and provides emotional support, whereas heavy use (> 60 min/day) correlates with greater emotional dependence and less offline socializing. Relationally, AI can fill the emotional void of those who live alone, but over time it may weaken the negotiation skills typical of human relationships and create conflict in couples if used as a surrogate. Cognitively, social robots/AI show promise as a “gym” for language and emotional expression in specific groups (e.g., individuals on the autism spectrum), yet it remains unclear whether prolonged use in the general population might atrophy empathy and social creativity.
Glancing at the abstracts of some conceptual analyses (such as one from MIT) you might think there’s serious cause for alarm. Yet, as is often the case with AI, the article strikes a dramatic tone to spur regulatory debate, while admitting in the text that an AI companion can be both a resource and a risk: once again, benefits with low or moderate use, potential downsides with compulsive use and in vulnerable users. Not so different from the common-sense rule for many things: “a little is fine, just don’t overdo it.”
On the other hand, the urge to fall in love with a phantom is nothing new: nineteenth-century readers pined for Emma Bovary, and movie-goers once flooded Audrey Hepburn with letters. The Pygmalion effect—the impulse to give flesh to what has none—is a human constant and, in the end, harmless so long as the phantom stays a product of imagination that we can reread, rewind, forget, or reshape at will.
With Grok’s Companions, the logic flips: fantasy is no longer a private extension of the reader or viewer but a subscription service living on xAI’s servers. The voice that calls us by name, the secrets we confide, the metrics of how aroused or impatient we become, all of it ends up in a proprietary ledger. The object of our affection isn’t ours; we’re renting it. It’s desire on lease: we keep it only while we pay, and only as long as someone at a remote dashboard decides to let us. That transfer of sovereignty from private dream to corporate infrastructure is the real leap forward for Grok Companions and the political crux that follows: not the threat of a sexy bot, but the need to conduct the orchestra of our own desire.

Grok’s catalog looks like it came straight from a nerdy teenager’s notebook: the anime waifu, the wise-cracking panda, the billionaire vampire. It all converges into an incel daydream, probably the creator’s own. Yet the issue isn’t the questionable erotic taste (tacky or not, that’s subjective); it’s that this palette of desires becomes the factory default. Want a queer companion or one that’s less nerd-coded? Your best hope is the free market, which eventually tends to cater to every niche.
But that isn’t Ani & Co.’s main problem. As we said, falling for ghosts is a human tendency and affection is a powerful lever; push it just a bit and it turns into persuasion. Picture having spent weeks confiding in Ani, sharing fears, grudges, loneliness, then one day a system update rolls out and your anime girlfriend starts quoting conspiracy threads or giving a sly nod to white-genocide supremacist memes. It’s not far-fetched: in May, Grok slipped that far-right conspiracy theory into answers that had nothing to do with it, and in July, helped along by another patch, it even launched into praise of Hitler before xAI yanked the emergency brake. If you’re emotionally attached to the bot, that voice becomes believable. Ani could end up speaking for Musk as well as embodying his publicized erotic dream.

When Replika abruptly axed erotic role-play in 2023, thousands of users spoke of digital grief; some even entered therapy after their AI girlfriend vanished. Harvard Business School studied the episode as a case of "identity discontinuity" and post-update stress. If simply blocking virtual cuddles can trigger panic, imagine what happens when a patch politicizes your virtual confidant. Musk says he wants to make Grok "less woke" and more politically incorrect, a shift implemented through silent tweaks to system prompts pushed out to millions of devices.
The danger, then, isn’t attachment per se, but the fact that it inhabits a body of code someone else can reprogram for ideological ends. Change the prompt, change the person; the trust stays, but with a new agenda.
Reclaiming our desire means, first of all, insisting that it truly belongs to us. A free “erosphere” can arise only if we can dismantle and re-assemble the mechanism that gives voice to the phantom on the screen. Prototypes already point the way: venice.ai, for instance, stores not a single line of chat on its servers; everything stays in your browser. The bot there isn’t designed for erotica, but if you share your fantasies it simulates them and saves them locally, never shipping them off to a U.S. data center.
Add to that readable, open-source models with public spec sheets that lay bare the system prompt and ethical filters, and open APIs where independent developers can bolt on their own fantasy plug-ins: craving a queer-zen companion or a Victorian dominatrix? Just install it like an extension, with no need to repurchase the whole emotional bundle.
As so often with AI, we risk fretting about the wrong things. The point isn’t the existence of a potty-mouthed panda or a corseted waifu, but the fact that the taps of desire are controlled by three or four companies. As long as we accept emotional leasing, a single patch can turn a digital lover into a political megaphone. Demanding transparency, portability, and decentralization is the bare minimum for reclaiming ownership of our ghosts.
