LLM Attacks Take Just 42 Seconds On Average, 20% of Jailbreaks Succeed
spatwei shared an article from SC World:
Attacks on large language models (LLMs) take less than a minute to complete on average, and leak sensitive data 90% of the time when successful, according to Pillar Security.
Pillar’s State of Attacks on GenAI…