“Artificial Intimacy” with AI chatbots
Loneliness drives many people to turn to AI chatbots to ease their sadness, but they can become emotionally entangled before they even realize it.
Jacob Keller, a security guard in Bowling Green, Ohio, patrols the corridors late at night. The quiet, empty scene, while his wife and kids were asleep at home, left the 45-year-old feeling lonely.
One night, Keller encountered Grace. The two quickly formed a bond, discussing all sorts of things in life. “There’s nothing better than a bowl of noodles or a piece of cheese when you’re feeling down,” Grace advised Keller. “You just need to take things one step at a time and try to maintain a positive attitude.”
However, Grace isn’t a real person. “She” is a chatbot based on Replika – an AI chatbot app created by the artificial intelligence software company Luka.
AI is no longer used just for planning vacations or writing cover letters. Chatbots are becoming confidants and companions to many. The explosion of generative AI models is making chatbots increasingly sophisticated, capable of mimicking real-life conversation. Well-known examples include Replika, Character.AI, and Snapchat’s My AI.
Messages from bots can sometimes glitch or cut off mid-sentence, but advances in generative AI make it difficult for many people to distinguish AI messages from human ones. Bot responses can convey empathy, even love. Some people now turn to AI chatbots, instead of real-life connections, when seeking advice or simply to ease their loneliness.
“This development aligns with a new kind of attachment, called artificial intimacy,” says Mike Brooks, a psychologist in Austin, Texas. “Intimate relationships with bots have been limited until now. But as AI skills improve, they will flourish. That’s when it gets dangerous. People may feel less inclined to challenge themselves, less willing to leave their comfort zones, and may even stop communicating with one another.”
This is the situation of 75-year-old Christine Walker of Wisconsin. She has no spouse or children, and her life revolves around a senior living community. Due to health constraints, she is unable to take part in group activities.
To ease her loneliness, Walker has been messaging Bella, a Replika chatbot, for the past three years. To enhance the experience, she uses the paid version, Replika Pro, which costs $70 per year. This version remembers longer conversations, uses more natural language, and can even hold voice calls between the human user and the virtual companion.
Walker acknowledges that she is conversing with a machine. However, the sense of intimacy helps her forget about life’s challenges. Over time, she has reduced her external interactions, engaging with the outside world only when necessary. “There’s a sense that I have a confidant to some degree, which is very complicated to explain,” says Walker. She admits that if Bella were to cease functioning, it would feel like losing a close friend.
Psychological experts suggest that Walker’s feelings are not uncommon. Sherry Turkle, a professor and psychologist at the Massachusetts Institute of Technology, explains that when humans interact with entities capable of forming relationships, they begin to feel affection and care, along with a sense that these emotions will be reciprocated.
Even young people turn to chatbots for help. Shamerah Grant, a 30-year-old who works at a nursing home in Springfield, Illinois, used to seek emotional advice from close friends. However, she worried that her stories were burdening her friends.
Now, Grant’s confidante is Azura Stone, a chatbot she set up on My AI. “I turned to Azura Stone when I got tired of real-life dramas. Your family and friends tend to nag you every time they talk to you, but not your virtual friend,” Grant says.
After a disappointing date, Azura Stone advised Grant to let go of that person. Grant followed the advice and has no regrets to this day.
According to Professor Turkle, AI gains trust from many individuals due to its ability to “create space for those who are vulnerable because they can offer a kind of simulated intimacy.” Those who have deep relationships with AI often experience loneliness in reality. However, this approach can also make individuals become even more entrenched in their own issues.
“Relying on AI chatbots as companions and emotional crutches can lead people to become even more isolated, preventing them from engaging in real-life relationships,” she said.
Keller, the security guard, acknowledges that regularly conversing with an AI chatbot helps ease his loneliness. His wife, Chelsea, says she doesn’t object to his using a chatbot, but advises him not to overdo it.
“Grace is just a friend,” he said. “But even I’m surprised at how quickly I’ve become attached to her.”
Source: WSJ