Technology

Here's how ChatGPT was tricked into revealing Windows product keys

By Syndication · July 11, 2025

As explained by 0DIN GenAI Bug Bounty Technical Product Manager Marco Figueroa, the jailbreak works by leveraging the game mechanics of large language models such as GPT-4o.