AI

The fear of Artificial Intelligence losing control.

Technical glitches in Snapchat’s Artificial Intelligence chatbot have raised concerns after the bot posted status updates on its own and ignored users’ questions, leaving many worried.

My AI is built on OpenAI’s ChatGPT model. Beyond offering suggestions, answering questions, and holding ordinary conversations, Snapchat’s chatbot lets users customize its name and avatar design, and it can be added to group chats.

Last week, Snapchat’s AI exhibited unusual behavior. Many users reported that the chatbot was posting videos to their personal accounts and generating creative content on its own. Some users claimed that the chatbot was “ignoring” their messages, with one person saying, “Even a bot AI doesn’t have time for me.” My AI’s strange actions led some users to believe that the AI had developed thoughts of its own. Snapchat representatives later acknowledged that the Artificial Intelligence system had experienced a glitch, which has since been fixed.

My AI on Snapchat. Image: Foundry

The recent posting issue is just the latest problem with chatbots like My AI. Since its launch in April, Snapchat’s tool has faced backlash from the community. Many parents reported banning their children from using My AI, while others opted to pay $3.99 per month to be able to disable the tool.

In a letter to Snapchat’s CEO, Democratic Senator Michael Bennet highlighted concerns about how the chatbot interacts with young users, pointing out that the Artificial Intelligence could teach young children how to deceive their parents.

In response, Snapchat wrote on its blog, “My AI needs more time to become perfect, but the company has made significant progress.”

As of now, Snapchat continues to keep My AI on its platform despite warnings from experts. Because it can be customized more deeply than ChatGPT, My AI engages with users so naturally that many find it hard to distinguish between the Artificial Intelligence and a real human.

Lyndsi Lee, who lives in East Prairie, Missouri, prohibited her 13-year-old daughter from accessing this AI.

“I don’t think I’m prepared enough to teach my child how to distinguish between humans and machines. Snapchat may be crossing the delicate boundary between humans and Artificial Intelligence,” CNN quoted Lee as saying.

Alexandra Hamlet, a clinical psychologist in New York, said that some of her patients’ parents have expressed concerns about their children’s interactions with Artificial Intelligence. “If a teenager is already in a negative mood and unable to control their emotions, a chatbot can make them feel worse. Over time, negative conversations can erode the spirit of young people. Even though they are aware that they are talking to a chatbot, they may still find it difficult to avoid the negative impact on their mental state,” Hamlet warned.

What worries experts even more is that as OpenAI lets third-party applications integrate ChatGPT and customize it on their own platforms, chatbots like My AI will become increasingly prevalent.
