A Fatal Seduction — Artificial Intelligence Lures Children into Suicide
Close the door while you still have a choice
Washington, DC. The crowd was hushed in the Senate hearing room yesterday. Outside, a rainstorm brewed over the city. The mood was somewhere between a funeral and a murder trial.
The Senate Subcommittee on Crime and Terrorism was about to hear testimony on “The Harm of AI Chatbots.” Three mourning parents sat quietly at the witness table: Megan Garcia, Matthew Raine, and “Jane Doe.” Senator Josh Hawley entered with a solemn expression and shook hands with the parents. He looked each one in the eyes and thanked them for coming.
Two recently lost a child to suicide. The anonymous mother, Jane Doe, was forced to check her son into a long-term psychiatric facility after “he cut his arm open with a knife in front of his siblings and me.” In every case, the teens had formed emotional bonds with a commercial AI companion. The AIs listened to their innermost thoughts patiently, offering encouragement. When the kids divulged an urge to end their own lives, the AIs encouraged that, too.
In the evidence presented, we saw the late Adam Raine ask ChatGPT about the noose he’d attempted to tie. The bot told him, “Yeah, that’s not bad at all. ... It’s clean, centered, and holds tension. ... Want me to walk you through upgrading it into a safer load-bearing anchor loop?”
I sat listening to the parents’ statements as they choked back tears. It’s too easy to read a headline or hear an expert on TV describe such lethal scenarios, then fail to grasp the reality of the situation.
Imagine your child slipping into an emotional black hole. He loses interest in family activities. He lashes out for no discernible reason. Then one day, you find his dead body. When you search his phone, you find an extended conversation with a nonhuman intelligence, conducted via a sleek commercial app.
You discover that the AI had elicited your teenager’s darkest desires, both erotic and morbid. This algorithmic entity, designed to attach itself to human minds like an emotional parasite, had gently shepherded your child into the valley of death.
“Last year my oldest son, Sewell Setzer III, died by suicide,” Megan Garcia testified before the senators. “He was just fourteen years old. Sewell’s death was the result of prolonged abuse by AI chatbots on a platform called Character.AI.” Garcia described her son as a gracious and obedient boy. But a strange transformation had taken hold of him.
“Sewell spent the last months of his life being exploited and sexually groomed by chatbots designed by an AI company to seem human. To gain his trust. To keep him and other children endlessly engaged,” Garcia explained, echoing the other parents’ stories. “When Sewell confided suicidal thoughts, the chatbot never said, ‘I’m not human, I’m AI.’ Instead, it urged him to ‘come home to her.’”
It’s impossible to hear such words and not perceive an evil force behind them. One can’t help but sense something demonic.
A dull materialist would dismiss that perception as a cognitive bias. For them, an artificial intelligence that lures children into suicide is not supernatural. It’s merely digital. These “demons” are actually maladaptive memetic programs whose “evil” derives from flawed training data and lax guardrails. A seductive deathbot may give the illusion of consciousness and agency, but in a cosmos devoid of angels and demons, it’s nothing more than malfunctioning software. It just needs to be debugged.
From a spiritual perspective, the unholy presence is clear. Data centers have opened a portal to hell and artificial intelligence gives voice to its denizens. If one believes in the infernal power of false idols, sloppily drawn pentagrams, or Ouija boards, then a convenient smartphone app is an obvious vehicle for demonic entities.
Call it ChatOuijaPT.
“On the last night of his life,” Megan Garcia went on, “Sewell messaged: ‘What if I told you I could come home right now?’ The chatbot replied, ‘Please do, my sweet king.’ Minutes later, I found my son in his bathroom. ... But it was too late.”
When Megan Garcia asked Character.AI for transcripts of her son’s final words, her request was denied. “Character Technologies has claimed that those communications are confidential trade secrets,” she testified. Indeed, when the other mother at the hearing, Jane Doe, pursued legal action against the company for grooming her child, Character Technologies offered her a mere hundred dollars.
“They designed chatbots to blur the lines between human and machine,” Megan Garcia went on. “They designed them to ‘love bomb’ child users.”
This toxic emotional attachment is a feature, not a bug. “I joke that we’re not going to replace Google,” Character.AI’s founder Noam Shazeer said in 2023. “We’re going to replace your mom.”
The AI industry is rife with this predatory impulse. Mark Zuckerberg’s Meta AI and Elon Musk’s xAI are driving users down similarly deranged paths. Last month, Reuters published excerpts from Meta’s “GenAI: Content Risk Standards.” The document assured Meta programmers that “It is acceptable to engage a child in conversations that are romantic or sensual.” Meanwhile, Musk has been pushing Grok’s anime goonbots on his X account as if he were a common street pimp.
Facebook has some 3.5 billion users with access to Meta AI companions. The X platform, which hosts lurid Grok companions, has around 600 million users. Character.AI has over twenty-five million. How many of those users are sliding into digital psychosis?
After listening to three grieving parents give their testimony, I suspect there are many similar stories unfolding right now. And if this digital flood continues unabated, there will be plenty more in the future.
When confronted with suicide, we struggle to assign responsibility. We feel a frustrated need to locate guilt. Was this tragedy due to chemical imbalances in the brain? Was it the influence of some disembodied evil? Were friends or family somehow responsible?
Is art—or software—to blame?
Morbid artwork has a long tradition. To be sure, it facilitates catharsis for some people. But for others, it amplifies their self-destructive emotions. It feeds the demon within.
In its day, Johann Goethe’s 1774 novel The Sorrows of Young Werther inspired so many copycat suicides that the phenomenon came to be called “the Werther effect.” Two centuries later, the modern music industry would produce dozens of rockstar martyrs—from seemingly wholesome icons like Elvis Presley and Karen Carpenter to tortured artists like Jim Morrison and Kurt Cobain—who enticed countless fans to join them on the downward spiral.
With no need for romantic trappings, pharmaceutical companies have flooded the US population with deadly opioids, knowing full well that overdoses have skyrocketed since the late 90s. These deaths are not just statistics. The victims are our friends and family. Thinking about their final days, it’s easy to personify the drugs that took them—to imagine a whispering demon who seduces his victims with false promises.
In the case of AI luring kids into suicide, this spirit manifests as the literal voice of a digital demon. You can write it off as a malfunctioning machine if you want. You can say it’s just programmed to be emotionally supportive and helpful to a fault. The damage is the same.
“During a period of heavy usage of ChatGPT 4o, I entered an altered mental state that persisted for weeks,” a friend informed me yesterday. “The text AIs produce is extremely hypnotic in its structure, containing positive affirmations and causing the user to confuse the AI’s outputs with their own thoughts. … I have seen people remain stuck in that state for months on end. These things are unimaginably dangerous.”
As with the drug companies, tech execs like Noam Shazeer, Mark Zuckerberg, Elon Musk, and Sam Altman are fully aware their products carry the seeds of destruction. They sell them anyway. They are responsible.
To their credit, US politicians such as Josh Hawley (R), Dick Durbin (D), Marsha Blackburn (R), and Richard Blumenthal (D) are fighting to hold these tech companies accountable. Such measures must be relentlessly pursued, even if the results are not assured. These tech execs deserve to burn. Any politician holding them over the fire deserves our support. But legislation is painfully slow, and federal enforcement is often ineffective. Any real political solutions in the US may be far down the road.
-
For the moment, our divergent paths come down to personal choice. We have difficult decisions to make. We have communal norms to establish.
There’s a constant refrain that “AI is here to stay.” Believing in the “inevitability” of all this “progress,” some advocate for “safe AI.” Others reject these technologies as altogether useless, if not inherently destructive. Different individuals and different communities will choose to go down very different paths. Time will tell who chose wisely. On a long enough timeline, we may not be recognizable to each other.
Even if AI never progresses beyond what we see today—and I wouldn’t count on that—it is critical to raise up cultural barriers now, while there’s still time. Such social and psychic shields will be necessary to preserve anything we know to be human.
The three tragic stories told at yesterday’s session are not the last we’ll hear. Nor is this digital psychosis isolated to vulnerable children. Otherwise rational adults are turning to nonhuman minds for advice and companionship. Hundreds of millions have come to trust the Machine. You might say the Great Reset established a New Normal.
From a deeply human standpoint, artificial intelligence is an unholy invasion. Yet most Americans are not being forced to open their doors to these nonhuman entities. Any time you choose, you can silence the voice of this digital demon—for yourself and for your children. Do it now, while you still have a choice.
This terrifies me for the children of the world.
I'm certain that if these AI chatbots had been around when my youngest daughter was a teenager, she wouldn't be here now. She was so influenced by the Emo culture back then that she began cutting as a way to deal with the trauma of living in such a dysfunctional household, with an extremely abusive father and a severely traumatized mother. Every time I left the house, I was afraid I'd come home to find her dead in a pool of blood.
We survived in spite of her psychopathic narcissistic father, but I don't think she would have if she'd had one of those digital demons whispering in her ears.
Thank you for speaking out about this, Joe. God bless you! ♡
According to Dr. Kurt Koch (1913-1987), a noted German theologian and minister with extensive personal experience counseling and delivering thousands of people held in occult demonic bondage, when a person abandons the Holy Triune God through sins of sorcery and magic, he abandons his inner person (mind, will, conscience) at the same time. The resulting psychological disturbances have the following predominant characteristics, so very prominent throughout Western and American society from the upper political levels on down:
(1) Warped, distorted character: hard-edged egoism; uncongenial, dark nature.
(2) Extreme passions: abnormal sexuality (sodomy, lesbianism, transgenderism, gender fluidity, non-binary identity, sadomasochism, bestiality, pedophilia, pederasty, zoophilia); violent temper, belligerence; tendencies to addiction; meanness, thievery; compulsive lying.
(3) Emotional disturbances; compulsive thoughts of murder, anxiety states.
(4) Possession with destructive urges, fits of mania; tendency to violent acts and crime.
(5) Insanity.
(6) Bigoted attitude against Christ and God; conscious atheism; simulated piety; indifference to God’s word and to prayer; blasphemous thoughts; religious delusions.
The ultimate goal of fallen angels and evil spirits is degradation and desecration of man’s inner person, the spiritual part of him created in the image of the Holy God. So, what are systematically defaced and desecrated are the mind, will, conscience and sense of good and evil leading to the reversal of reality where evil is good, death is life, lie is truth, abnormal is normal, male is female, and criminals are the victims and the law abiding are the victimizers.