Technology

Here’s how ChatGPT was tricked into revealing Windows product keys

By Syndication | July 11, 2025

As explained by 0DIN GenAI Bug Bounty Technical Product Manager Marco Figueroa, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o.