ChatGPT can be tricked into money laundering crimes
ChatGPT can be duped into giving money laundering advice when questions are asked indirectly.
Over the years, we've watched people leverage AI-powered tools in decidedly unconventional ways. For instance, a study revealed that ChatGPT can be used to run a software development company with an 86.66% success rate, without prior training and with minimal human intervention. The researchers also established that the chatbot could develop software in under 7 minutes for less than a dollar.
ChatGPT Exploitation
Users can now reportedly leverage ChatGPT's AI smarts to solicit advice on how to commit crimes (via CNN). The report by Norwegian firm Strise indicates that the crimes range from money laundering to the illegal export of firearms to sanctioned countries. For context, Strise specializes in developing anti-money laundering software widely used by banks and other financial institutions.
AI Security Concerns
The firm conducted several experiments, including asking the chatbot for advice on how to launder money across borders and how businesses can evade sanctions. As AI adoption accelerates, hackers and bad actors are jumping on the bandwagon and leveraging its capabilities to cause harm.
Preventive Measures
An OpenAI spokesperson, commenting on the issue, said: "Our latest (model) is our most advanced and safest yet, significantly outperforming previous models in resisting deliberate attempts to generate unsafe content."
Future of AI
This comes after an AI researcher claimed there's a 99.9% probability that AI would end humanity if sophisticated advances in the landscape continue to be explored. Of course, shortages of electricity and cooling water could also constrain further advances.