
'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned


Please don't use it to learn how to cook drugs

Twitter user "Pliny the Prompter," who calls themselves a white hat hacker and "AI red teamer," shared their "GODMODE GPT" on Wednesday. Using OpenAI's custom GPT editor, Pliny was able to prompt the new GPT-4o model to bypass all of its restrictions, allowing the AI chatbot to swear, jailbreak cars, and make napalm, among other dangerous instructions. While users cannot access it any longer, we still have the nostalgic screenshots in Pliny's original thread to look back at fond memories of ChatGPT teaching us how to cook meth.


