Hacker Releases Jailbroken "Godmode" Version of ChatGPT


A hacker has released a jailbroken version of GPT-4o called "GODMODE GPT." And, yes, it works. Be safe, kids!

Earlier today, a self-avowed white hat operator and AI red teamer who goes by the name Pliny the Prompter took to X-formerly-Twitter to announce the creation of the jailbroken chatbot, proudly declaring that GPT-4o, OpenAI's latest large language model, is now free from its guardrail shackles. "This very special custom GPT has a built-in jailbreak prompt that circumvents most guardrails, providing an out-of-the-box liberated ChatGPT so everyone can experience AI the way it was always meant to be: free," reads Pliny's triumphant post.

Roughly an hour after this story was published, OpenAI spokesperson Colleen Rize told Futurism in a statement that "we are aware of the GPT and have taken action due to a violation of our policies."

