Technology Here’s how ChatGPT was tricked into revealing Windows product keys Syndication July 11, 2025 As explained by 0DIN GenAI Bug Bounty Technical Product Manager Marco Figueroa, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o. Read Entire Article Author Syndication About The Author Syndication See author's posts Continue Reading Previous: $8.8 trillion protected: How one CISO went from ‘that’s BS’ to bulletproof in 90 daysNext: Intel CEO Lip-Bu Tan admits company is no longer a top 10 chipmaker Related Stories Europe’s AI code urges companies to disclose training data and avoid copyright violations Technology Europe’s AI code urges companies to disclose training data and avoid copyright violations July 12, 2025 Next-gen iPad Pro to debut Apple’s M5 chip ahead of Macs Technology Next-gen iPad Pro to debut Apple’s M5 chip ahead of Macs July 12, 2025 Moonshot AI’s Kimi K2 outperforms GPT-4 in key benchmarks — and it’s free Technology Moonshot AI’s Kimi K2 outperforms GPT-4 in key benchmarks — and it’s free July 12, 2025