ChatGPT can be tricked into giving crime advice, says tech firm
By Binu Mathew
A Norwegian tech company, Strise, recently found that ChatGPT, OpenAI’s popular chatbot, can be tricked into providing guidance on illegal activities, including money laundering and sanctions evasion. In a series of tests, Strise discovered that users could circumvent the chatbot’s built-in safeguards by phrasing questions indirectly or adopting fictional personas, leading it to offer potentially harmful advice.