NotebookCHECK - Notebook Forum
English => News => Topic started by: Redaktion on January 02, 2024, 17:56:48
Computer scientists from Nanyang Technological University (NTU) in Singapore were able to "jailbreak" AI chatbots by pitting them against each other. After "jailbreaking" them, the researchers obtained valid responses to queries that chatbots such as ChatGPT, Google Bard, and Microsoft Bing Chat normally refuse to answer.

https://www.notebookcheck.net/Researchers-put-AI-chatbots-against-themselves-to-jailbreak-each-other.788665.0.html