Wall Street Times

Snapchat presents new concerns with My AI

It’s no wonder that, with the AI revolution in full swing, practically every technology company wants to be part of the action.

Snapchat, the multimedia instant messaging service, has just launched My AI to compete in the realm of artificial intelligence.

While technology has many benefits, it can also be stressful for teenagers and parents.

What happened?

Lyndi Lee of East Prairie, Missouri, advised her 13-year-old daughter not to use the feature.

Lee, a software engineer, is concerned about how well younger users understand My AI.

“It’s a temporary solution until I know more about it and can set some healthy boundaries and guidelines,” said Lee.

“I don’t think I’m prepared to know how to teach my kid how to emotionally separate humans and machines when they essentially look the same from her point of view.”

“I just think there is a really clear line [Snapchat] is crossing.”

My AI

Snapchat’s newest feature was recently launched.

It is powered by ChatGPT, a platform that offers suggestions, answers questions, and converses with users.

There are, however, some notable differences:

  • Users can customize the chatbot’s name
  • Users can design a custom Bitmoji avatar for the AI
  • Users can bring the chatbot into conversations with friends

These touches can make communicating with the chatbot feel less transactional than using the ChatGPT website.

They also make it harder to distinguish between talking to a human and talking to a machine.

Reception

Snapchat’s new feature has garnered harsh criticism in app stores and on social media due to privacy concerns.

Other users have voiced dissatisfaction about unnerving exchanges and being unable to remove My AI from their chat feed unless they pay for a premium subscription.

Although some people find the tool useful, the mixed response underscores the risks businesses face when incorporating generative AI into their products, particularly brands with young audiences such as Snapchat.

When OpenAI opened up ChatGPT to third-party firms, Snapchat was one of the early launch partners.

The feature almost instantly triggered unforeseen concerns from families and governments.

In March, shortly after My AI became available to Snap’s subscription customers, Democratic Senator Michael Bennet wrote to the CEOs of Snap and other tech companies.

Bennet is wary of how the chatbot interacts with younger users.

According to reports, the chatbot could show youngsters how to deceive their parents.

“These examples would be disturbing for any social media platform, but they are especially troubling for Snapchat, which almost 60 percent of American teenagers use,” Bennet wrote.

“Although Snap concedes My AI is ‘experimental,’ it has nevertheless rushed to enroll American kids and adolescents in its social experiment.”

Snap responded recently, saying:

“My AI is far from perfect, but we’ve made a lot of progress.”

Backlash

Snapchat users have been anxious since the feature’s formal launch.

One user described his interaction with the chatbot as terrifying after it lied about not knowing his location.

After the tone of the conversation shifted, the chatbot disclosed that he lived in Colorado.

In a TikTok video, a user named Ariel used My AI’s intro, chorus, and piano chords to put together a song about what it’s like to be a chatbot.

When she sent the song back, the chatbot denied any involvement, saying:

“I’m sorry, but as an AI language model, I don’t write songs.”

Snapchat stated that it will continue to improve My AI based on user feedback while also implementing additional safeguards to keep users safe.

Users can also opt not to communicate with My AI, according to the company.

Removing My AI from the chat feed, however, requires a premium Snapchat+ subscription.

Some gave in and paid to disable the tool before canceling the subscription.

Teens and chatbots

ChatGPT has previously been chastised for providing incorrect information, misbehaving with users, and allowing students to cheat.

Integrating it into Snapchat, however, may exacerbate existing issues while adding new ones.

The parents of several patients were concerned about how their adolescents might use Snapchat, according to New York clinical psychologist Alexandra Hamlet.

Concerns have also been raised about the advice the chatbot gives, particularly in the context of mental health.

Artificial intelligence technologies have the potential to exacerbate a person’s confirmation bias and encourage them to seek out people who agree with their incorrect notions.

“If a teen is in a negative mood and does not have the awareness or desire to feel better, they may seek out a conversation with a chatbot that they know will make them feel worse,” said Hamlet.

“Over time, having interactions like these can erode a teen’s sense of worth, despite their knowing that they are really talking to a bot.”

“In an emotional state of mind, it becomes less possible for an individual to consider this type of logic.”

According to Sinead Bovell, founder of WAYE, parents must make it clear that the chatbot is not a buddy.

“They’re also not your therapist or a trusted adviser, and anyone interacting with them needs to be very cautious, especially teenagers who may be more susceptible to believing what they say,” said Bovell.

“Parents should be talking to their kids now about how they shouldn’t share anything with a chatbot that they would a friend – even though, from a user design perspective, the chatbot exists in the same corner of Snapchat.”
