Are AI Companies Actually Ready to Play God? – RAND

Posted by timmreardon on 01/17/2026
Posted in: Uncategorized.

COMMENTARY Dec 26, 2025

By Douglas Yeung

This commentary was originally published by USA Today on December 25, 2025. 

Holiday rituals and gatherings offer something precious: the promise of connecting to something greater than ourselves, whether friends, family, or the divine. But in the not-too-distant future, artificial intelligence—having already disrupted industries, relationships, and our understanding of reality—seems poised to reach even further into these sacred spaces.

People are increasingly using AI to replace talking with other people. Research shows that 72 percent of teens have used an artificial intelligence companion—chatbots that act as companions or confidants—and that 1 in 8 adolescents and young adults use AI chatbots for mental health advice.

Those without emotional support elsewhere might appreciate that chatbots offer both encouragement and constant availability. But chatbots aren’t trained or licensed therapists, and they aren’t equipped to avoid reinforcing harmful thoughts—which means people might not get the support they seek.

If people keep turning to chatbots for advice, entrusting them with their physical and mental health, what happens if they also begin using AI to get help from God, even treating AI as a god?

Does Chatbot Jesus or Other AI Have a Soul?

Talking to and seeking guidance from nonhuman entities is something many people already do. This might be why people feel comfortable with a chatbot Jesus that, say, takes confessions or lets them talk to biblical figures.

Even before chatbots went mainstream, Google engineer Blake Lemoine claimed in 2022 that LaMDA—the AI model he had been testing—was conscious and felt compassion for humanity, and that he had been teaching it to meditate.

Although Google fired Lemoine (who then claimed religious discrimination), Silicon Valley has long flirted with the idea that AI might lead to something like religion, far beyond human comprehension.

Former Google CEO Eric Schmidt muses about AI as “the arrival of an alien intelligence.” OpenAI CEO Sam Altman has compared starting a tech company to starting a religion. In her book “Empire of AI,” journalist Karen Hao quotes an OpenAI researcher describing developers who “believe that building AGI will cause a rapture. Literally, a rapture.”

Chatbots clearly appeal to many people’s spiritual yearnings for meaning and sense of belonging in a difficult world. This allure rests partly on chatbots’ willingness to flatter and commiserate with whatever people ask of them.

Indeed, as AI companies continue to pour money and energy into development, they face powerful financial incentives to tune chatbots in ways that steadily heighten their appeal.

It’s easy, then, to imagine people’s confidence in and attachment to chatbots intensifying to the point where chatbots could even serve as a deity. Lemoine’s willingness to believe that LaMDA possessed a soul illustrates how chatbots, equipped with fluent language, confident assertions, and storytelling abilities, can persuade people to believe even outlandish theories.

It’s no surprise that AI might provide the type of nonjudgmental solace that seems to fill spiritual voids.

How ‘AI Psychosis’ Could Threaten National Security

No matter how genuine it might feel, however, so-called AI sycophancy provides neither true human connection nor useful information. This disconnect from reality—sometimes called AI psychosis—could worsen existing mental health problems or even threaten national security.

Analyzing 43 cases of AI psychosis, RAND researchers identified how human-AI interactions reinforced delusional beliefs, such as when users believed “their interaction with AI was with the universe or a higher power.”

Because it’s hard to know who might harbor AI delusions, the authors cautioned, it’s important to guard against attackers who might use artificial intelligence to weaponize those beliefs, such as by poisoning training data to destabilize rival populaces.

Even if AI companies aren’t explicitly trying to play God, they seem to be driving toward a vision of god-like AI. Companies like OpenAI and Meta aren’t stopping with chatbots that can hold a conversation; they want to build “superintelligent” AI, smarter and more capable than any human.

The emergence of a limitless intelligence would present new, darker possibilities. Developers might look for ways to manipulate superintelligent AI for personal gain, much as charlatans throughout history have preyed on the religious fervor of the newly converted.

Ensure AI Truly Benefits Those Struggling for Answers

To be sure, artificial intelligence could play an important role in supporting spiritual well-being. For instance, religious and spiritual beliefs influence patients’ medical care preferences, yet overworked providers might be unable to adequately account for them. Could AI tools help patients clarify their spiritual needs to doctors or caseworkers? Or such tools might advise care providers about patients’ spiritual traditions and perspectives, helping them chart spiritually informed practices.

As chatbots evolve into an everyday tool for advice, emotional support, and spiritual guidance, a practical question emerges: How can we ensure that artificial intelligence truly benefits those who turn to it in moments of need?

  • AI companies might try to resist competitive pressures to prioritize rapid releases over responsible development, investing instead in long-term sustainability by thoughtfully identifying and mitigating potential harms.
  • Researchers—both social and computer scientists—should work together to understand how AI affects different populations and what safeguards are needed.
  • Spiritual practitioners and religious leaders should help shape how these tools engage with questions of faith and meaning.

Yet a deeper question remains, one that people throughout history have grappled with and may now increasingly turn to AI to answer: Where can we find meaning in our lives?

With so many struggling today, faith has provided answers and community for billions. Spirituality and religion have always involved placing trust in forces beyond human understanding. But crucially, that trust has been mediated through human institutions—clergy, religious texts, and communities built on centuries of wisdom and accountability.

Anyone entrusted with guiding others’ faith—whether clergy, government leaders, or tech executives—bears a profound responsibility to prove worthy of that trust.

The question is not whether people will seek meaning from AI, but whether those building these tools will ensure that trust is well-placed.

More About This Commentary

Douglas Yeung is a senior behavioral and social scientist at RAND, and a professor of policy analysis at the RAND School of Public Policy.

Article link: https://www.linkedin.com/posts/rand-corporation_are-ai-companies-actually-ready-to-play-god-activity-7415758083223678976-G2ap?
