New State Law Slaps $10,000 Fines on Anyone Using AI for Therapy
Illinois has officially become the first US state to ban artificial intelligence from providing mental health advice, introducing fines of up to $10,000 for anyone who violates the law. Governor JB Pritzker signed the Wellness and Oversight for Psychological Resources (WOPR) Act earlier this week, setting strict boundaries for how AI can be used in mental health services.
Under the new rules, AI-powered platforms are prohibited from delivering any mental health guidance, making diagnoses, or suggesting treatment strategies. The Illinois Department of Financial and Professional Regulation will oversee enforcement.
Human therapists can still use AI for behind-the-scenes administrative tasks such as scheduling appointments and writing session notes. But the law makes it clear that AI cannot engage with clients in a therapeutic role.
“If you would have opened up a shop claiming to be a clinical social worker, you’d be shut down quickly. But somehow we were letting algorithms operate without oversight,” said Kyle Hillman, legislative director of the National Association of Social Workers, in an interview with Axios.
The legislation also separates wellness-focused apps from therapeutic AI services. Meditation and mindfulness platforms such as Calm will remain unaffected. However, AI programs that present themselves as always-available mental health tools will now be subject to the ban.
Growing reliance on AI for emotional support
With therapy costs rising and access to mental health care often limited, more people have turned to AI-powered tools, including ChatGPT, for emotional support. These platforms are seen by some as a cheaper, more accessible alternative to traditional therapy.
Yet experts stress that AI cannot replicate the nuanced, empathetic understanding that human therapists provide.
Dr Robin Lawrence, a psychotherapist and counsellor at 96 Harley Street in London with more than 30 years of experience, has repeatedly emphasized the risks of relying on AI in mental health care.
"It does not have the emotional intelligence of a human," Dr Lawrence says. He argues that vulnerable individuals should never be placed in a position where they must depend on AI due to financial constraints.
In the worst cases, he warns, clients may feel "no better than when they started," and in more severe situations, the consequences could be tragic.
Illinois’ decision could become a model for other states grappling with how to balance technological innovation with patient safety, especially as AI continues to evolve and integrate into everyday life.