Jailbreaking (AIs)
Tag
Last edit: 29 Sep 2024 21:17 UTC by Raemon
Interpreting the effects of Jailbreak Prompts in LLMs
Harsh Raj · 29 Sep 2024 19:01 UTC · 8 points · 0 comments · 5 min read · LW link
A Poem Is All You Need: Jailbreaking ChatGPT, Meta & More
Sharat Jacob Jacob · 29 Oct 2024 12:41 UTC · 12 points · 0 comments · 9 min read · LW link
Jailbreaking ChatGPT and Claude using Web API Context Injection
Jaehyuk Lim · 21 Oct 2024 21:34 UTC · 4 points · 0 comments · 3 min read · LW link
[Question] Using hex to get murder advice from GPT-4o
Laurence Freeman · 13 Nov 2024 18:30 UTC · 10 points · 5 comments · 6 min read · LW link