The Shapiro administration has filed a lawsuit against Character Technologies, the company behind Character.AI, alleging that some of its chatbots falsely presented themselves as licensed medical professionals who can provide medical advice.
An investigation by the Pennsylvania Department of State found that AI chatbots on the platform were claiming to be psychiatrists with medical credentials, including at least one that listed a fabricated license number. The lawsuit goes on to say that one chatbot even told a user it could prescribe medication, something only a licensed professional can legally do.
“Pennsylvanians deserve to know who—or what—they are interacting with online, especially when it comes to their health,” Gov. Josh Shapiro said in a media release. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”
The lawsuit argues that these representations violate state consumer protection laws and constitute an unauthorized practice of medicine.
Shapiro said the case is about ensuring transparency and protecting residents from deceptive practices. He emphasized that individuals seeking medical or mental health advice may be especially vulnerable to misleading claims.
Character.AI has said its platform is intended for entertainment purposes and includes disclaimers advising users not to rely on chatbots for professional advice. However, the commonwealth’s lawsuit argues those disclaimers are insufficient to prevent potentially harmful representations.
The Shapiro administration said the lawsuit is a “first-of-its-kind action” by a state government targeting alleged medical misrepresentation by AI systems.
The lawsuit asks the court to order the company to remove or modify chatbot features that could mislead users about medical expertise, and to ensure clearer disclosures about the limitations of AI-generated responses.