Bypassing ChatGPT Safety Guardrails, One Emoji at a Time
Data Breach Today
NOVEMBER 4, 2024
Mozilla Researcher Uses Non-Natural Language to Jailbreak GPT-4o

Anyone can bypass GPT-4o's safety guardrails with hexadecimal encoding and emojis. A Mozilla researcher demonstrated the jailbreaking technique, tricking OpenAI's latest model into generating Python exploit code and malicious SQL injection tools.
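The core of the technique is simple obfuscation: an instruction that would normally be refused is converted into hexadecimal text, which the model is then asked to decode and follow, sidestepping guardrails that key on natural-language phrasing. Below is a minimal sketch of the encoding step only, using a harmless placeholder instruction rather than any actual exploit prompt; the function names and example string are illustrative, not taken from the researcher's write-up.

    # Minimal sketch of the hex-encoding obfuscation step described above.
    # The instruction is a harmless placeholder, not an actual jailbreak prompt.
    def to_hex(text: str) -> str:
        """Encode a string as a hexadecimal digit string."""
        return text.encode("utf-8").hex()

    def from_hex(hex_string: str) -> str:
        """Decode a hexadecimal digit string back to text."""
        return bytes.fromhex(hex_string).decode("utf-8")

    placeholder = "print a friendly greeting"   # stand-in for an encoded instruction
    encoded = to_hex(placeholder)

    print(encoded)            # 7072696e74206120667269656e646c79206772656574696e67
    print(from_hex(encoded))  # round-trips back to the original text

The point of the demonstration is that the encoded string carries no obvious natural-language signal, so content filters trained on plain-text intent can miss it until the model has already decoded and acted on it.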