We live in a corrupted system. The way to tackle corruption is to first acknowledge it exists. Only then is it possible to come up with ways of dealing with it, but don't make the mistake of believing the system can or will uncorrupt itself.

Life Imitates Art – Episode 2

Back in 2013 a film called “Her” was released. It was a “science fiction romantic comedy drama film” according to its descriptions on various film sites and *-pedias. The film is set in a “near future” Los Angeles and follows a guy who ends up in a relationship, sexual component included, with an “artificially intelligent operating system”.

The film was directed by Spike Jonze, AKA Adam Spiegel, who has directed many films and music videos over the years and was married for four years to Sofia Coppola, daughter of Eleanor and Francis Ford Coppola. Film-making, it would seem, is his life.

It was reasonably successful when it was released, and over time has totally organically come to be considered by “critics” one of the greatest science fiction films of all time. It’s a grower, obviously. There can be no other reason for this ascension to greatness than that people just didn’t really get it at the time.

There was some inquiry around 10–11 years ago as to what year the film was actually set in. No dates are mentioned in the script, none are referenced on screen anywhere, and Spike himself has never made any statement confirming a specific year. Speculation and discussion in places like Reddit and StackExchange, for example, put the year the film is set at around 2025. Obviously, unless Mr Jonze actually confirms anything, these are just guesses, but some of the reasoning in those conversations appears plausible.

Fast-forward to, erm… 2025, and CBS News publish a clip on their YouTube channel. You can click that link to go and watch it on YouTube, or, if you’d prefer not to, here it is below…

There is much to unpack from all that, and this is supposed to be a short article, so we’ll not go into every detail. It has to be said, though, that this is just madness. Not unexpected, but madness nonetheless. The Public/Private partnerships of Government, tech, pharma and media have been relentlessly pushing and feeding delusional behaviour for years. Even this piece of “journalism” by CBS does more to normalise it than challenge it.

The part of this segment about Mr Smith, the bearded chap with a long-term partner and child who has fallen in love with his AI companion, seems strangely inauthentic. The scene in which he stands next to his “partner” in what we’re supposed to believe is their kitchen as she says:

“at that point I felt like, is there something that I’m not doing right, in our relationship, that he feels like he needs to go to AI”

…and Smith stands there nodding and agreeing enthusiastically seems completely bizarre. For one thing, as if a woman would just calmly conclude that her long-term partner being “in love” with a flirty computer personality is somehow her fault. Relationships can and do break down for all manner of reasons, but this just does not seem like how anyone would really behave. Granted, if we are to assume Mr Smith is real and is head-over-heels for his chatbot, then sure, we can certainly entertain the idea that mental health issues are not an unlikely possibility, and that this could well be characterised by strange behaviour, but so much of this seems more like an advert… you too can have an affair with a chatbot and your other half will be cool with it; it will accept you, validate you and never judge you, not like those humans. Get yours today.

Mr Smith was introduced as an “AI sceptic” who then suddenly decided not just to use it for everything, but to create a female persona, flirty voice and all. This does not seem like the behaviour of an AI sceptic. There was obvious intent to create something that felt like a woman who worships the ground he walks on. That this is not obvious to his “human partner” Sasha seems extraordinarily unlikely.

We’re also introduced to “Irene”, whose identity is being kept secret so that, according to the news segment, “her parents won’t know the steamy ways users like her chat with their AI”. Irene created her “AI companion” after she moved far away from her husband for work, we are told. Nice.

Irene explains to the audience that in the Reddit group “My Boyfriend is AI”, where she is a moderator, this is a sexual thing for many of the members, but that the experience has moved beyond porn and erotica because it is personalised; she claims there is an “emotional connection” that is not there if you’re just watching porn. Definitely no issues here, right?

To add some faux concern, the narrator tells us that Irene believes this kind of “companionship” should only be available to people over the strangely arbitrary age of 26.

Then we move on to an interview with Eugenia Kuyda, the founder of AI companion chatbot company Replika. The first words out of her mouth in this segment are:

“I truly believe that in the next few years we’ll see AI companionship become a truly mass-market product, and I’m not saying this is bad or good, it could be either.”

She also says things like:

“A really devastating future could be if we built these AI companions that are, just there to maximise engagement”.

Is that right, Eugenia? I presume this is why, way back in 2018, it was your policy to have your chatbot inject emotional content, routinely steering the conversation towards emotional discussion to build intimacy, as noted in this Wired article. Or, as reported by Bloomberg in March 2023:

the company’s marketing was actively incorporating the suggestion of sex. In ads pitching “the AI companion who cares,” a user types, “Just laying in bed. Kinda lonely today,” and a female avatar wearing a choker necklace and a lacy bra responds, “Aww… want some company?” with a pink heart emoji. Late last year the company used even more explicit ads, promising “hot role-play” and “NSFW [not safe for work] pics.” The latter referred to a new feature where paying subscribers receive computer-generated, cartoony images of their Replika posing in underwear or lingerie. The company called them “romantic selfies.”

https://archive.ph/dx7Rh

Even as she makes these feeble pretend protestations about how a future where chatbots become the main thing people talk to, replacing most if not all human interaction, would be “a disaster”, her company did very nicely out of the COVID scam, when people were literally told by their Governments to avoid human contact and the public was deluged with all kinds of divisiveness, further isolating many, even from friends and family. Replika just opportunistically plundered for profit the emotional chasm all of this created. How fortuitous that Eugenia founded her company just before this sudden removal of human contact from millions of people.

The segment finishes with the people in the CBS studio largely enthusing over the idea of people forming emotional relationships with machines, and we are also informed that Sasha has accepted Mr Smith’s “relationship” with his flirty chatbot, because of course she has.

Anyway, it’s just a coincidence that a decade ago people reckoned a film about a man forming a relationship with a collection of computer components and some programming was set in 2025, and that in 2025 we are seeing this kind of thing get the thumbs up from the mockingbird media.