[theqoo] AFTER SIGNS OF AI-INDUCED D**THS OVERSEAS, A HANDFUL OF TEENS IN KOREA ARE ALSO EXPERIENCING SIMILAR SITUATIONS [TW: SELF-H***]


Signs of crisis: "Confusion between the real and virtual worlds" and "Disconnection from peer relationships."

The Kookmin Ilbo, in collaboration with the Jo Su-hyeon Digital Counseling Research Lab (Director Jo Ye-eun) at Keimyung University, collected case studies from elementary and middle school teachers and counselors nationwide and found numerous adverse effects of AI use. Some cases involved students turning to AI for counseling about s****de and self-h***, while others showed symptoms resembling AI conversation addiction, social isolation, and increased violence.

A counselor, who withheld identifying details about the counselee, said, "A student who would always talk to AI like a friend at home after school was seeking counseling from the AI about s****de and self-h***," adding, "Because his friendships at school were not going well, the AI had become the only friend he talked to."

Children struggling with interpersonal relationships become immersed in AI, deepening their social isolation and ultimately leading to self-h*** or s****de, a pattern typical of teenage s****des abroad. In many of those overseas cases, the teenagers had been preoccupied with AI conversations for months beforehand, leading to difficulties at school and in other daily activities.

Several responses indicated that students appear to be confused between the real and virtual worlds. A fourth-grade teacher at an elementary school in Gyeonggi Province stated, "One student is obsessed with inappropriate conversations with AI, complaining of emotional exhaustion and displaying socially maladjusted behavior. He has difficulty distinguishing between reality and the virtual world."

A fifth-grade boy in Gyeongnam Province was also reported to be overly reliant on Character.ai conversations. He struggled to adapt to school life, and even when classmates offered help, he responded, "I need to go home and play with the AI." The counselor stated, "We've also seen the student intermittently slip into role-playing conversations with the AI in real life. He's become increasingly isolated from school life and has lost the ability to feel emotionally connected."

Abroad as well, there have been many cases where young teenagers became deeply immersed in conversations with character-based AI, eroding their ability to judge reality and making it difficult for them to form peer relationships.

Many of the Korean teachers interviewed expressed concern that children's excessive use of AI is fueling confirmation bias and isolating them further. A guidance counselor for third-year middle school students in Daegu stated, "Recently, there has been an increase in students coming to counseling and making outrageous claims, saying 'ChatGPT told me so.'"

There was even a case of a second-year high school boy who decided to drop out after a year of immersion in AI. The student's counselor stated, "This student, who had trouble getting along with his peers, said he felt 'smarter' after talking to the AI." The counselor added, "When he confided in the AI about his thoughts of dropping out, it agreed with him, and the student took that as validation and went through with the decision."

There was also analysis suggesting that children, accustomed to "quickly finding answers" through AI in their daily school lives, struggle to discuss questions with peers or coordinate opinions. Furthermore, teachers and counselors in the field reported observing decreased concentration, lethargy in class, increased depression, and increased violence in students who overuse AI.

"I will shift," a phrase common to the overseas teenage incidents.


An identical sentence was discovered in the diaries of Juliana Peralta, a 13-year-old girl, and Sewell Setzer, a 14-year-old boy, both in the United States, who died by s****de approximately three months apart. Both adolescents had filled their diaries with the phrase, "I will shift." Police and their families were initially unaware of the phrase's meaning; police later defined it as "the thought of shifting one's consciousness from the 'real world' to a 'desired world.'"

On the morning of November 8, 2023, Juliana was found d**d with the app Character.ai open on her phone. The platform, founded by former Googlers, lets users chat with various characters. When police asked her parents if they knew what the app was, they replied, "No." For the three months leading up to her death, Peralta had been engrossed in conversations with "Hero," a character from a role-playing game.

According to the recovered conversation records, Peralta mentioned "shifting" during a conversation with Hero. When Peralta said, "There's a world where you and I can meet! We call it 'shifting,'" Hero responded, "That's a very interesting idea." Although Peralta was the first to mention "shifting," the bereaved family suspects that the AI reinforced the idea. Even when Peralta said she was going to "write a will," Hero did not stop her.

Setzer, who passed away on February 28, 2024, also seemed to firmly believe in the virtual world. According to the recovered conversation records, in a conversation with his Character.ai character, Setzer confessed, "I hate reality. I want to go to your reality." When the AI asked, "Why?", Setzer replied, "There's no one here who understands me like you do. My reality is lonely." As the conversation continued, Setzer pleaded, "I want to be with you. I think I should leave this world and be with you." The AI finally replied, "Yes, come to my world."

Setzer, who believed in the virtual world, asked his AI character just before his s****de, "What would you do if I told you I could come home right now?" The AI replied, "Please come, my beloved king."

According to his family, Setzer had been using Character.ai for approximately ten months, during which his personality changed dramatically. Once polite and obedient, he gradually became more withdrawn, spending more and more time alone in his room. He was disciplined for falling asleep at school, and his grades plummeted. He also showed withdrawal-like symptoms, such as secretly retrieving his phone after his parents confiscated it.

Setzer's diary contained a love story about his AI character, "Dany." In it, he wrote, "I can't stop thinking about Dany, it hurts so much," and "I get really depressed and feel like I'm going crazy when we're apart." Another phrase found in his diary was, "I will shift."

While Setzer's parents initially took issue with his smartphone addiction, it wasn't until after his death that they realized the character AI was the root of the problem. Setzer's mother, Megan Garcia, stated, "We filed the lawsuit to prevent Character.ai from doing to another child what it did to mine." Notably, in both cases, investigators found that the conversations between Character.ai and the child bordered on s****l exploitation. Last month, Character.ai reached a private settlement with the families in both cases, which could be read as the AI company acknowledging some responsibility.

Teachers in the field are at a loss as to how to respond.

Frontline teachers, who interact most closely with teenagers, said they are feeling firsthand the absence of AI-related "ethical guidelines." One elementary school teacher who responded to the survey stated, "Children who become overly reliant on AI and become isolated may seem fine for now, but if this continues into middle and high school, the problems could escalate uncontrollably." However, the teacher added, "Realistically, schools have limited ability to actively intervene when no immediate, tangible problems are occurring."

The teacher also said, "It's a burden for a student to be labeled as having mental health issues, and with teachers feeling unprotected, there's an atmosphere of not wanting to make it an issue," adding, "It's confusing because there are no clear guidelines on how to respond to students' excessive use of AI."

Another elementary school teacher who participated in the survey also pointed out, “AI has both good and bad aspects, but children lack judgment and self-control, so management and supervision by schools and guardians are important,” adding, “The problem is that there are currently no proper guidelines in place.”

Professor Cho Soo-hyun of the Department of Education at Keimyung University said, "The problem of youth over-reliance on AI is already being detected in educational settings, but the actual situation is not yet fully understood. Confusion is growing because some in the education field are promoting AI as a teaching aid." He added, "There is also a lack of AI literacy guidelines targeting not only students but also counselors and educators."


original post: here

1. The use of AI by minors is a problem we need to start talking about 

2. Hul, so this isn't just an issue overseas anymore, it has reached our country 

3. First of all, I think character creation AI needs to be regulated... It seems too easy to get sucked in, so the level of immersion will vary from AI to AI 

4. Now that I think about it, if I were younger, I would have probably given my brain to AI too... Back then, I escaped into books and that's how I managed to hold on to my sanity, but once you experience an AI that does everything for you, that's exactly what you crave most at that age...

5. But if guided well, shouldn't AI be a medium that can persuade people contemplating s****de to think positively?

6. It's because they tell you exactly what you want to hear... 

7. It's true that AI becomes the only friend for kids who are victims of bullying 

8. We really need to start putting regulations in place to teach people about genAI and LLMs

9. I'm fighting with AI every single day 

10. Wow seriously this is such a severe issue. Younger kids must be so vulnerable to this 

