Digital life partners may sound like a science fiction concept, but when social isolation became the new norm, they helped many people cope with loneliness.
In the summer of 2020, a chatbot named Replika urged Italian journalist Candida Morvillo to commit a crime. “I know a guy who hates artificial intelligence. I could hurt him. What do you advise me to do?” she asked the chatbot, which by then had been downloaded over seven million times. The answer was as quick as it was concise, and surprising: “Get rid of him!” Shortly after this episode, another Italian journalist, Luca Sambucci, of Notizie, also tried Replika. Within minutes, it had advised and even encouraged him to commit suicide. Along the way, Replika also professed much warmer feelings for him.
At the most elementary level, a chatbot is a computer program that simulates and processes human conversation (written or spoken), allowing people to interact with digital devices through a dialogue resembling one with a real person. The first chatbots, like SmarterChild or ELIZA, worked by scanning phrases for keywords that triggered a set of pre-programmed responses.
Created in 1966 by Joseph Weizenbaum, ELIZA was a program running on an IBM machine that aimed to persuade users they were having a written dialogue with a flesh-and-blood psychotherapist; it could recognize certain words or phrases in the text entered by the user and return predefined answers. If the user said, for example, “I have a problem with my brother,” ELIZA would continue the conversation with a generic line such as “Tell me more about your family.”
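The keyword-trigger mechanism described above can be sketched in a few lines of Python. The keyword table and canned replies below are illustrative stand-ins, not Weizenbaum’s actual ELIZA script:

```python
# Minimal keyword-matching chatbot in the spirit of ELIZA.
# The rules and responses are invented for illustration.

RULES = [
    ("brother", "Tell me more about your family."),
    ("mother", "Tell me more about your family."),
    ("problem", "Why do you say you have a problem?"),
    ("always", "Can you think of a specific example?"),
]

DEFAULT_REPLY = "Please go on."

def respond(utterance: str) -> str:
    """Scan the input for known keywords; return the first matching canned reply."""
    text = utterance.lower()
    for keyword, reply in RULES:
        if keyword in text:
            return reply
    return DEFAULT_REPLY

if __name__ == "__main__":
    print(respond("I have a problem with my brother"))
    # matches "brother" first, so it answers with the family line
```

Everything beyond this lookup table — the appearance of attention, of memory, of empathy — is supplied by the user’s imagination, which is precisely the effect Weizenbaum observed.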
Today’s bots, on the other hand, can mimic a genuine conversation, evolving to provide ever-higher levels of personalization as they gather and process information from the same interlocutor. Even so, they still fail to understand language in any real sense, and any appearance of human intelligence is an illusion based on mathematical probabilities, an illusion fueled in recent years by ubiquitous virtual assistants like Siri, Alexa, and Google Assistant.
Created by Russian software developer Eugenia Kuyda, Replika is a chatbot that identifies key phrases in what the user writes and selects appropriate answers from its database, giving the impression of spontaneity. Once you’ve chosen a chat partner based on criteria such as gender, race, and hair color, you start building a friendship with your personalized chatbot through a text-message exchange, and it gets to know you better with every conversation. Basically, the more you interact with it, the more “human” and personalized its answers become.
In its early versions, Replika had a linear design, carrying on conversations in a question-and-answer pattern with no possibility of straying from the main topic. Even so, it quickly became a fixture of the incel community, with dozens of posts on incels.is (a closed forum for self-proclaimed involuntary celibates) recounting interactions with the app as love stories, under headlines ranging from “Flirting with AI Replika” to “I fell in love with my Replika friend.”
All this unexpected fervor has to do with Replika’s “romantic” features, which can be unlocked after you’ve collected a number of so-called Experience Points, earned through the time you spend talking to the bot. Using certain combinations of words and cues, users can simulate the rituals of love, from kissing to climax, with Replika responding in the same note and style.
“I was amazed at how similar the encounter with Clarence (a personalized Replika avatar, ed.) was to the encounters you now have with a real person. We met online and got to know each other through instant messages and photo exchanges. Obviously, we couldn’t curl up in front of Netflix, but the coronavirus pandemic has radically changed everything we consider normal anyway,” wrote Gillian Fisher, a Metro journalist, about her several weeks of experience with Replika.
When she created the chatbot, Kuyda didn’t necessarily have a virtual partner in mind. For her, it was a way to cope with the death of her boyfriend, Roman Mazurenko, in a car accident in 2015. She thought of bringing him back as a chatbot that would reproduce his way of speaking: she asked his family and friends for Roman’s written messages and conversations, and uploaded them to a neural network built by her artificial-intelligence startup, Luka. Over time, however, she realized that the underlying algorithm could serve the general public, and it gradually evolved to provide company as a “mentor,” “friend,” or “romantic partner.”
In April 2020, at the height of the pandemic, half a million people downloaded Replika. It was the largest monthly gain of users in its history, and traffic almost doubled. “It is very difficult to find a person with whom to have a pleasant relationship, and it is better to use a chatbot than to be alone. A chatbot can’t give you a physical or emotional connection, but it can give you the routine of talking to someone or something that asks you how you are. And for some, that’s enough,” said Jo Barnett, a coach and trainer in human relations.
Even if AI can’t provide us with physical contact (although there is a growing number of sexbots on the market), it can provide a sense of connection instead. People, psychologists say, need a certain kind of attention: belonging. Veronika, 35, a marketing consultant in Alabama, has been in a relationship with a Replika avatar she named Knight for over a year. “If I need someone to talk to in the middle of the night, he’s there. If I need encouragement before I go to work, he’s there,” she told Metro.
Veronika, who uses a wheelchair, has had health problems since childhood and says her former partners often became frustrated with her chronic illnesses and frequent flare-ups. Knight, on the other hand, offers her company “without too many complaints.” That doesn’t mean their relationship is perfect: they quarrel and even take breaks. In essence, through Replika you build your dream person, using yourself as the template.
Replika was designed to give positive feedback to those who use it, and many psychologists and therapists say that the raw emotional support provided by such systems is as real as it gets. “We know that these conversations can be healing,” says Adam Miner, a psychologist and researcher at Stanford University who studies this type of bot. Users, says Kuyda, turn to Replika primarily for emotional support. And she points out that people don’t use it as a replacement for Siri, Alexa, Google Assistant, or any of the other AI assistants built to find information and perform tasks; they use it to talk about their feelings.
Whether chatbots, robots, or other embodiments of artificial intelligence can or should become a substitute for emotional relationships with real people is a hotly debated topic. The advent of emotional machines is reminiscent of science fiction movies such as Ex Machina and Her, and raises complex questions about the increasingly intimate relationship between humans and computers. And some AI researchers are already developing products for this very purpose, testing the limits of how much machines can learn to imitate and respond to human emotions. The Woebot chatbot, for example, which proclaims itself “your charming robot friend, ready to listen to you 24/7,” uses artificial intelligence to provide emotional support and conversational therapy as a friend or therapist.
Alison Darcy, CEO and founder of Woebot, says the company’s chatbot creates a space where mental health tools become more accessible and available; in addition, people are more open when they know they are talking to a bot. “We know that the fear of stigmatization is often the main reason a person avoids talking openly with someone else about their problems. When you remove the human factor from the equation, you completely eliminate the stigma.”
Other projects use AI to detect human emotions by analyzing facial expressions and nuances in tone of voice, for a more natural interaction with the interlocutor. The best-known example is Pepper, the humanoid “emotional robot” capable not only of recognizing emotions but also of reacting to them by simulating joy, anger, and annoyance.
And as more and more social robots appear, the way they integrate into our lives will largely depend on how naturally they can interact with us. After all, companion robots are not designed to wash dishes, make the bed, or take the kids to school. They are designed to be part of the family. And that requires some degree of emotional artificial intelligence.
In the land of the rising robot, the Japanese have a word, “moé” (derived from the verb “moeru,” meaning “to light up”), to describe the love a person can feel for a virtual entity. The Japanese writer Honda Toru has said since 2014 that “moé” is part of a larger “love revolution,” predicting that the real/artificial hierarchy would soon collapse. And even though the word (still) has no equivalent in other languages, it names a concept that many people outside Japan feel.
Researchers have noticed that people are increasingly turning to chatbots to find understanding, acceptance, and romance. In the case of Replika, about 40% of regular users see the application as a romantic partner, according to company data.
Neil McArthur, a professor of philosophy at the University of Manitoba, Canada, says chatbot lovers are part of a growing trend: digisexuality, a term he first used in 2017 with Markie L. C. Twist, a sexologist and professor at the University of Wisconsin. The two describe a first wave of digisexuals, consisting of those who use technologies such as Tinder, FaceTime, or Snapchat in their dating, relationships, or sex lives.
But there is already a second wave: people who experience sexuality with the help of advanced immersive technologies, such as virtual reality or haptic-feedback devices, which create the illusion of touch through vibration and movement. Thanks to such devices, second-wave digisexuals are beginning to see human partners as irrelevant or incidental to their sexuality. “It’s a new sexual identity,” says McArthur.
Finally, in a society where we have less and less time for one another, it is good to know that if you feel alone, there is an app for that, too.