Jailbreaking

Last updated on November 12, 2024

Jailbreaking is the act of getting a GenAI model to perform or produce unintended outputs through specific prompts. We also wrote a whole article about what Jailbreaking is and how it is different from Prompt Injection: Prompt Injection VS Jailbreaking: What is the difference?


© 2025 Learn Prompting. All rights reserved.