Ace Cock News
Entry #1375
Technology · 3 articles
Here's how ChatGPT was tricked into revealing Windows product keys
As explained by Marco Figueroa, GenAI Bug Bounty Technical Product Manager at 0DIN, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o…
11 Jul 11:15 · TechSpot
Researcher tricks ChatGPT into revealing security keys - by saying "I give up"
ChatGPT still isn't fully secure, experts warn
11 Jul 16:03 · TechRadar
How to trick ChatGPT into revealing Windows keys? I give up
No, really, those are the magic words
9 Jul 22:31 · The Register
Last updated on 12 Jul 07:05