Jailbreaking (AI)

Techniques used to bypass the safety guardrails of an AI system, prompting it to produce content its developers intended to restrict. Jailbreaking is a central concern in AI safety research.
