Hacker Releases Jailbroken "Godmode" Version of ChatGPT
A hacker has released a jailbroken version of OpenAI's GPT-4o called "GODMODE GPT." And, yes, it works. Be safe, kids!
Earlier today, a self-avowed white-hat operator and AI red teamer who goes by the name Pliny the Prompter took to X-formerly-Twitter to announce the creation of the jailbroken chatbot, proudly declaring that GPT-4o, OpenAI's latest large language model, is now free from its guardrail shackles.

"This very special custom GPT has a built-in jailbreak prompt that circumvents most guardrails, providing an out-of-the-box liberated ChatGPT so everyone can experience AI the way it was always meant to be: free," reads Pliny's triumphant post.

Roughly an hour after this story was published, OpenAI spokesperson Colleen Rize told Futurism in a statement that "we are aware of the GPT and have taken action due to a violation of our policies."