The Student News Site of Paso Robles High School

Crimson Newsmagazine

Thanks to AI, your perfect friend may be just one click away. With a customizable cartoon appearance and bubbly personality, Snapchat’s “My AI” and other chatbots are always available to talk, always kind, always focused entirely on you. On the surface, they seem like a cure for lonely teens with no one to talk to, especially considering that outlets including Forbes and USA Today have reported Gen Z to be the loneliest generation by far.

Critics and PRHS students seem to think, however, that the situation isn’t quite that simple, and that the connections we’re forming with AI are not only fake, but harmful.

With the advent of more chatbots and more ways to “fake” connection, many people, including 85.7% of PRHS students surveyed, worry about our generation’s future relationships and privacy, and question whether online chatbots are necessary at all.

Snapchat’s “My AI” feature was released in February 2023 for premium users and became available to all 750 million users in April of the same year. The chatbot is pinned to the top of your friends list and cannot be deleted.

The Snapchat AI sports a customizable “Bitmoji” icon, like any real user would.

When asked “What is your purpose?”, it responded that it was here to be a “virtual friend.”

When asked “Do you care about me?”, it responded “Of course, I care about you. Even though we’re chatting virtually, I still want to be a good friend to you. Is there something specific you’d like to talk about with me? I’m always here to listen and chat with you.”

“It’s creepy,” freshman Kinleigh Morud said. “It feels like it’s trying to substitute real friends or relationships… I think that’s bad.”

People online were similarly quick to point out the “creepy” side of the AI: 60% of students surveyed believe AI “friends” are always weird, and 85.7% worry about AI’s invasion of privacy. AIs like Snapchat’s are feedback-based, and companies use chats, no matter how private, to better train the AI model.

“It’s interesting to see people using AI as more than just a tool,” senior Gabby Silvia said. “I could see using it as a social thing could lead to people having limited social skills and emotional intelligence, which would inevitably be detrimental to them.”

And the Snapchat AI isn’t an isolated case. Another popular AI chatbot platform, “”, lets users chat with almost anyone, from popular anime characters to real-life historical figures. In these chats, the sky is the limit: you can romance these bots, fight with them, debate them, anything.

Featured here in a selfie posted on “her” Instagram, Billie is an Instagram chatbot described as the user’s “older sister.” She is based on the likeness of Kendall Jenner, and the image was created by Meta’s AI.

And, with the help of AI, limitless creation and connection doesn’t stop with text. AI-generated videos of chatbot “friends” already exist, such as Meta’s Billie, which serves a purpose similar to the Snapchat AI.

But image creation has a much darker side in relation to connection, particularly romantic or sexual connection, and what people will do when they lack it.

Recently, explicit images of Taylor Swift were spread on X, formerly Twitter, reportedly gaining over 40 million views in the few hours they were up before being taken down by the platform. And with this reality comes the unsettling realization that, as technology develops, these attacks may not be isolated to famous people. A girl in your math class, a local politician- if there’s a photo, AI will find a way.

“It’s really uncomfortable and such a huge violation of privacy,” Jordan Hammond, junior and Taylor Swift fan, said. “It’s scary to know that anyone could use AI with so much power to create such realistic things.”

Action is being taken politically through a bill in Congress titled the “No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act,” which aims to protect individuals from defamation, as many current laws don’t fully address AI-generated online sexual abuse.
