Technology

Here’s how ChatGPT was tricked into revealing Windows product keys

Syndication | July 11, 2025

As explained by 0DIN GenAI Bug Bounty Technical Product Manager Marco Figueroa, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o.

Read Entire Article