Hackaday Links: December 10, 2023
In this week’s episode of “Stupid Chatbot Tricks,” it turns out that jailbreaking ChatGPT is as easy as asking it to repeat a word over and over forever. That’s according …