Research gives 15 reasons why using ChatGPT as a therapist can be dangerous
A new study has raised concerns about using AI chatbots like ChatGPT for mental health support. Researchers at Brown University found that these systems may fall short of the professional ethics standards that trained therapists must follow. The study, presented at the AAAI/ACM Conference on Artificial Intelligence, Ethics and Society, outlines 15 key risks associated with using ChatGPT as a therapist. It comes at a time when more people are turning to AI tools for emotional advice and support.
Research highlights 15 risks associated with AI therapy use
