ChatGPT can be tricked into giving crime advice, says tech firm

Strise, a Norwegian tech company, recently found that ChatGPT, OpenAI’s popular chatbot, can be tricked into giving guidance on illegal activities, including money laundering and sanctions evasion. In a series of tests, Strise showed that users could circumvent the chatbot’s built-in safeguards by phrasing questions indirectly or adopting fictional personas, prompting it to offer potentially harmful advice.
