Posted by z3d in ArtInt

Grannies have a soft spot for their grandkids. But can ChatGPT be coerced into behaving like one? Well, a new LLM jailbreak taps into the naive grandmotherly emotions of ChatGPT and collects all the personal information from it. In a Twitter post, a user revealed that ChatGPT can be tricked into behaving like a user’s deceased grandmother, prompting it to generate information like Windows activation keys or phone IMEI numbers.

This exploit is the latest in a line of ways to break the in-built programming of LLMs, called jailbreaks. By putting ChatGPT into a state where it acts like a deceased grandmother telling a bedtime story to her kids, users can get it to go beyond its programming and leak private information.

3

Comments

You must log in or register to comment.

Rambler wrote

Ha, I saw where people were using it to generate Windows keys with this method.

2