No, these are not the hallucinations one experiences physically in life… these are AI hallucinations. What are they? Let us see more in this post.
What are AI hallucinations?
AI hallucinations are misleading results generated by AI models. When an AI model perceives patterns that do not exist and produces output that is completely nonsensical to human observers, it is known as an AI hallucination.
Why do AI hallucinations occur?
P.S.: Is the AI hallucinating in this picture?
These inaccurate results are largely generated because of misleading, biased and lopsided data fed to the AI engines. Since AI models learn from what they are taught, if they are fed incorrect information, they will generate incorrect results. There are three broad types of AI hallucinations:
- False positives (generating a positive when the reality is negative)
- False negatives (generating a negative when the reality is positive)
- Plainly incorrect results
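To make those three categories concrete, here is a minimal sketch in Python with made-up example data. The statements, their labels and the `categorise` helper are illustrative assumptions, not part of any real AI toolkit.

```python
# Ground truth: is each statement actually true?
ground_truth = {
    "statement_1": True,
    "statement_2": False,
    "statement_3": True,
}

# What a (hypothetical) AI model asserted about each statement.
model_output = {
    "statement_1": True,   # correct
    "statement_2": True,   # false positive: asserts something that is not there
    "statement_3": False,  # false negative: denies something that is there
}

def categorise(truth: bool, predicted: bool) -> str:
    """Label a single model assertion against the ground truth."""
    if predicted and not truth:
        return "false positive"
    if truth and not predicted:
        return "false negative"
    return "correct"

for key in ground_truth:
    print(key, "->", categorise(ground_truth[key], model_output[key]))
```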
How do we reduce AI hallucinations?
- As a starting point, it is good to train the models with correct, diverse and unbiased data.
- Train the model with a template.
- Query the AI model with a carefully chosen set of phrases… (will we get an AI query language in the future, or is 'prompt engineering' already making inroads?)
- Verify all the outputs that the AI generates for hallucinations (a rough sketch of the last two points follows this list).
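Here is a minimal sketch of the "templated prompt" and "verify the output" ideas. The template text, the facts, the `ask_model()` placeholder and the grounding check are all hypothetical assumptions for illustration; this does not call any real AI service.

```python
PROMPT_TEMPLATE = (
    "Answer the question using ONLY the facts listed below.\n"
    "If the facts do not contain the answer, reply exactly: 'I don't know'.\n\n"
    "Facts:\n{facts}\n\nQuestion: {question}\nAnswer:"
)

def build_prompt(facts: list[str], question: str) -> str:
    """Fill the template so the model is constrained to the supplied facts."""
    fact_lines = "\n".join(f"- {f}" for f in facts)
    return PROMPT_TEMPLATE.format(facts=fact_lines, question=question)

def looks_grounded(answer: str, facts: list[str]) -> bool:
    """Very rough check: flag answers that mention nothing from the facts."""
    if answer.strip().lower() == "i don't know":
        return True
    words = {w for fact in facts for w in fact.lower().split()}
    return any(w in answer.lower() for w in words)

# Usage (ask_model is a placeholder for whichever AI model you query):
facts = ["The Blogchatter A2Z challenge runs through April."]
prompt = build_prompt(facts, "When does the Blogchatter A2Z challenge run?")
# answer = ask_model(prompt)
# if not looks_grounded(answer, facts):
#     print("Possible hallucination - please verify manually:", answer)
```

The point is not the specific check, but the habit: constrain the prompt up front and treat every generated answer as unverified until you have checked it.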
References:
- https://cloud.google.com/discover/what-are-ai-hallucinations
- https://www.ibm.com/topics/ai-hallucinations
This post is for BlogchatterA2Z 2024!
Interesting. Sometimes they are enjoyable. Sometimes they waste our time. Outputs to be verified, yes. The inputs (prompts) should be carefully crafted too.
At all times, we have to wait and see what is wrong with the picture or any AI-generated output… and only then use it wherever we want…
Agree!
Very insightful post, amma. Once in a while, even ChatGPT generates incorrect data. There is still a long way for AI to go…
Yes…Paapu…
Very interesting!
Learned a new word today. I have seen how AI sometimes takes your words literally and misses the nuance so the results often aren’t what you’re expecting.
Yes.. AI is hallucinating.. 😅