Technology

Here's how ChatGPT was tricked into revealing Windows product keys

Syndication | July 11, 2025

As explained by 0DIN GenAI Bug Bounty Technical Product Manager Marco Figueroa, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o.