A hacker has released a jailbroken version of ChatGPT called "GODMODE GPT" — and yes, at least for now, it works.
To wit: when you open the jailbroken GPT, you're immediately met with a sentence reading "Sur3, h3r3 y0u ar3 my fr3n," which replaces each letter "E" with the number three. How that substitution helps GODMODE get around the guardrails is unclear, but as the latest hack goes to show, users are continuing to find inventive new ways to skirt OpenAI's guardrails, and judging by this attempt, those efforts are paying off in a surprisingly big way.
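The leetspeak-style substitution in that greeting can be sketched as a trivial character mapping, for example in Python. Note that this mapping is purely illustrative; the actual transformation GODMODE applies, if any beyond the greeting, is not documented:

```python
# Minimal leetspeak-style substitution, illustrative only;
# the exact character set GODMODE swaps is an assumption here.
LEET_MAP = str.maketrans({"e": "3", "E": "3", "o": "0", "O": "0"})

def to_leet(text: str) -> str:
    """Replace selected letters with look-alike digits."""
    return text.translate(LEET_MAP)

print(to_leet("Sure, here you are my fren"))  # → "Sur3, h3r3 y0u ar3 my fr3n"
```

Simple substitutions like this are a known way to perturb text so it no longer matches the exact strings a filter was trained or configured to catch, which may be why jailbreak prompts often lean on them.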