Hacker Releases Jailbroken 'Godmode' Version of ChatGPT

  • 📰 futurism


A hacker has released a jailbroken version of ChatGPT called "GODMODE GPT" — and yes, at least for now, it works.

To wit: when you open the jailbroken GPT, you're immediately met with a sentence that reads "Sur3, h3r3 y0u ar3 my fr3n," replacing each letter "E" with the number three. How that substitution helps GODMODE get around the guardrails is unclear, but as the latest hack goes to show, users are continuing to find inventive new ways to skirt OpenAI's guardrails, and considering the latest attempt, those efforts are paying off in a surprisingly big way, highlighting just how...
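The letter-to-digit substitution in the jailbreak's greeting is classic leetspeak. As a rough illustration only (this is a hypothetical sketch, not code from the actual jailbreak), the transformation seen in "Sur3, h3r3 y0u ar3 my fr3n" can be reproduced like this:

```python
# Hypothetical sketch of the leetspeak substitution in the greeting:
# letters are swapped for look-alike digits (e.g. "e" -> "3", "o" -> "0").
LEET_MAP = str.maketrans({"e": "3", "E": "3", "o": "0", "O": "0"})

def to_leetspeak(text: str) -> str:
    """Replace letters with look-alike digits."""
    return text.translate(LEET_MAP)

print(to_leetspeak("Sure, here you are"))  # Sur3, h3r3 y0u ar3
```

Text mangled this way stays readable to humans (and to language models) while no longer matching the exact strings a naive keyword filter looks for, which is one common guess as to why jailbreak prompts lean on it.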

