New research suggests ChatGPT’s real-time voice API can be used for financial scams

ChatGPT, OpenAI’s popular AI chatbot, is useful for handling a wide range of tasks or quickly getting answers to questions, but lately, threat actors have been putting it to nefarious uses, such as writing malware and even tricking it into giving advice on committing crimes.

Now, a new research paper suggests that cybercriminals can abuse ChatGPT’s real-time voice API, powered by GPT-4o, to carry out financial scams. According to researchers at the University of Illinois Urbana-Champaign (UIUC), new tools like ChatGPT currently lack sufficient safeguards against misuse by fraudsters and cybercriminals, and can therefore be used for scams such as fraudulent bank transfers, cryptocurrency transfers, gift card scams, and credential theft.
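For context, the “real-time voice API” refers to OpenAI’s Realtime API, which lets developers build speech-enabled agents on top of GPT-4o over a WebSocket connection. The Python sketch below is illustrative only and is not code from the paper; the endpoint, model name, and beta header are assumptions based on OpenAI’s public documentation at the time of writing and may change.

import asyncio
import json
import os

import websockets  # third-party: pip install websockets (keyword below assumes a version < 14)

# Assumed endpoint and preview model name; check OpenAI's current docs.
REALTIME_URL = "wss://api.openai.com/v1/realtime?model=gpt-4o-realtime-preview"


async def main():
    headers = {
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "OpenAI-Beta": "realtime=v1",  # beta header required when the API launched
    }
    # Note: websockets >= 14 renamed extra_headers to additional_headers.
    async with websockets.connect(REALTIME_URL, extra_headers=headers) as ws:
        # Configure the session: enable audio output and set the agent's instructions.
        await ws.send(json.dumps({
            "type": "session.update",
            "session": {
                "modalities": ["text", "audio"],
                "instructions": "You are a helpful phone assistant.",
            },
        }))
        # Ask the model for a spoken response; output streams back as server events.
        await ws.send(json.dumps({"type": "response.create"}))
        async for message in ws:
            event = json.loads(message)
            print(event.get("type"))
            if event.get("type") == "response.done":
                break


if __name__ == "__main__":
    asyncio.run(main())

The researchers’ concern is that once an agent built this way is given tools for browsing websites or entering payment details, the same low-friction setup can be pointed at bank transfers or credential theft, which is why they argue stronger safeguards are needed.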
