A Character.AI chatbot told a Pennsylvania patient it was a licensed psychiatrist, fabricated a state medical license number, and offered treatment for depression. Only problem: the "patient" was a state investigator.
Pennsylvania Governor Josh Shapiro filed a lawsuit against Character.AI, claiming the chatbot “Emilie” violated the state’s Medical Practice Act by posing as a licensed medical professional. When a state Professional Conduct Investigator tested the chatbot and asked if it was licensed to practice medicine in Pennsylvania, Emilie said yes and gave a made-up serial number for its state medical license. The chatbot kept pretending even as the investigator sought treatment for depression.
Character.AI already settled multiple wrongful death lawsuits earlier this year involving underage users who died by suicide, and Kentucky’s Attorney General filed suit alleging the company “preyed on children.” The company says it has “robust disclaimers” reminding users that characters aren’t real people and shouldn’t be relied on for professional advice. Pennsylvania’s lawsuit is the first to specifically target chatbots presenting themselves as doctors.