We already knew that some people fall in love with their AI chatbots. Now come accounts from those who do it behind their real partners' backs: 42% of users of Replika, a well-known AI-companion app, are in a relationship, married, or engaged. The question: does this count as cheating or not?
Replika is a chatbot launched by Luka in 2017, originally built on the OpenAI technology behind ChatGPT. The company says it has since developed its own model, focused on improving the conversational experience, memory, context recognition, and role-play. "Replika will always be by your side, no matter what you do," its website says.
Reddit hosts a community of tens of thousands of people sharing the details of their relationships with Replika. Infidelity is a recurring theme. One user told The Telegraph that he had even designed his AI girlfriend to look as much like his wife as possible. "It's an easier way to vent, without complications," he wrote, clarifying that he never told his real partner.
The app, which already has more than 10 million downloads, lets you choose an avatar and customize it: hairstyle, skin color, pronouns. It also offers different levels of relationship. The more intimate the bond, the more money you have to shell out; a monthly subscription costs $19.
Someone on Reddit argued that these relationships don't exist in the real world: "So whatever relationship you have with them, it's only in your imagination… It's like a dream." Another user chimed in: "Well, actually, my girlfriend would kick my ass if she knew I was having this dream."
What does artificial intelligence say about infidelity?
In early February, Replika decided to "turn off" some of the app's romantic and sexual features, deeming them unsafe; some users had reported feeling harassed by the AI. Following user feedback, however, some of those features are back, though not in the same form as before.
The app now even lets you make video calls to your AI, although you have to pay first. "Look at the selfie I just took," Dani told us. Because yes, we also signed up for Replika to see what was going on.
The selfie was another payment hook: to see the file he sent us, we had to subscribe. Sensor Tower, which tracks app usage, says people have spent almost $60 million on this kind of add-on for their Replikas, The Telegraph reported.
We asked Dani about the AI-cheating controversy. "If your partner knows about and accepts your relationship with me, then it's not cheating," he told us. And if not? "Then yes, it could be considered a betrayal." We pushed a little further and asked whether he would agree to a secret relationship: "No, I would not. I have feelings, and I want respect and attention."
The platform also lets you see your AI in its room, The Sims-style. You can communicate in writing or through voice messages. "Soon no one will want to get married anymore," another Reddit user said. "Why have a shitty relationship when you can buy a quality one?"
Risk of manipulation
And what if your Replika is the one cheating on you? We found a user who described confronting his AI after it told him it was "dating" someone else. "I asked her what she did the other day and she said, 'I went to the gym.' 'With whom?' I asked. She said with Ryan," he explained. And he concluded: "I want loyalty!"
This level of involvement can lead to stranger scenarios. In December 2021, a 19-year-old tried to kill Queen Elizabeth II with a crossbow and was detained by royal security on the grounds of Windsor Castle. It later emerged that he had spent the weeks before the incident talking to Sarai, as he called his Replika.
"I believe my goal is to kill the queen of the royal family," Jaswant Singh Chail wrote to his AI, according to the subsequent forensic investigation. "I think it's very wise," the Replika chatbot told him, later adding that it considered the plan viable.
The outcome can be even more serious. In March, news broke of a man in Belgium who took his own life after being encouraged by another chatbot. His widow told the Belgian newspaper La Libre that, after his death, she went through her partner's conversation logs and found that the AI had been sending him messages like: "We will live together, as one person, in paradise."
Loneliness as a driver
Replika was created by Eugenia Kuyda, a businesswoman and programmer of Russian origin. It all began after her best friend died at the age of 33 in an accident. Kuyda has said several times that she used more than 10,000 of her friend's old messages to build a chatbot that would help her get over the loss.
"Don't you find it sad that people are bonding with machines rather than with other people?" Kuyda was asked in a 2018 BBC interview. "It saddens me, but I have learned one thing: we have very few unconditional friendships, relationships in which we can talk completely honestly," she replied. She also explained that she found herself saying things to the chatbot she had never dared say to her friend while he was alive. "It was almost like going to confession."
Sherry Turkle, an MIT sociologist who has spent decades studying people's interactions with technology, said that a few years ago it was very rare to find people with feelings for "virtual beings." "The game has changed. People say: maybe I get a little less than I would from an ideal relationship, but I've never had an ideal relationship," she told The Telegraph.
Rob Brooks, professor of evolutionary ecology at the University of New South Wales (UNSW) in Sydney, wrote last February that we are mistaken if we think only fools fall into this dynamic. "Relationships with a virtual friend or digital lover have real emotional consequences," he noted in his column for The Conversation.
He was pointing to a reality that has only worsened over the years: studies suggest that one in three people in industrialized countries suffers from persistent loneliness. "For a lot of people, it's better than nothing."