Topic summary

Posted by The Werewolf
 - Yesterday at 20:34:37
First, the actual paper doesn't address the issue of hallucinations, and the term is being misused here anyway. A hallucination is output generated by an AI neural network when it has no input, input that is outside the scope of its training, or input that is too vague to trigger strong enough matching.

The gist of the paper is that if the training material for the LLM is styled a certain way - having a specific tone, say formal or polite - then prompts written in a different style (terse, informal, or lacking in politeness) will cause these problems, as the tokens won't induce a strong enough trigger in the neural network.
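To make the idea concrete, here is a toy sketch (my own illustration, not anything from the study): a crude word-overlap score standing in for how strongly a prompt's wording matches the style of the training material. Real LLMs compare learned token embeddings, not raw word overlap, so treat this purely as an analogy.

```python
# Toy proxy for "style matching": what fraction of the prompt's words
# also appear in a sample of the (hypothetical) training material?
# This is an illustration only - actual models work on embeddings.

def style_overlap(prompt: str, training_sample: str) -> float:
    """Fraction of the prompt's words that also occur in the training sample."""
    prompt_words = set(prompt.lower().split())
    training_words = set(training_sample.lower().split())
    if not prompt_words:
        return 0.0
    return len(prompt_words & training_words) / len(prompt_words)

# Hypothetical training sample written in a formal, polite register.
training_sample = (
    "could you please explain how the experiment was conducted "
    "and summarise the results in detail"
)

polite = "could you please explain the results in detail"
terse = "explain results now"

# The politely styled prompt overlaps more with the training sample
# than the terse one does.
print(style_overlap(polite, training_sample))
print(style_overlap(terse, training_sample))
```

The point of the toy: the same request phrased in the training material's register scores higher than the terse phrasing, which is the flavour of mismatch the paper describes.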

However, much like a layperson talking to, say, a PhD physicist, unless the prompter has enough understanding of the genAI's training material - which would be difficult and time-consuming to acquire - it would be hard to adjust prompt style to maximise matching.

It would also introduce a kind of self-bias, where the prompter tends to pre-filter the AI's output to match what they want, rather than treating it as a source of external knowledge.

So in the end, the users are NOT the real cause of hallucinations: they are an intrinsic feature of any neural-network-based system, even the human brain.
Posted by heffeque
 - Yesterday at 18:10:48
Yeah... Nope.
Posted by Redaktion
 - Yesterday at 14:20:36
A recently published study shows that the prompts given to AI assistants play a major role in the occurrence of so-called AI hallucinations. This is good news for users, as it suggests they can actively reduce false or fabricated responses through more effective prompt design.

https://www.notebookcheck.net/Study-finds-Users-are-the-real-cause-of-AI-hallucinations.1141622.0.html