Topic summary

Posted by Redaktion
 - January 02, 2024, 17:56:48
Computer scientists from Nanyang Technological University (NTU) in Singapore were able to "jailbreak" AI chatbots by pitting them against each other. After "jailbreaking" them, the researchers elicited valid responses to queries that chatbots such as ChatGPT, Google Bard, and Microsoft Bing Chat normally refuse to answer.

https://www.notebookcheck.net/Researchers-put-AI-chatbots-against-themselves-to-jailbreak-each-other.788665.0.html