Artificial intelligence may well save us time by finding information faster, but it is not always a reliable researcher. It frequently makes claims that are not backed up by the sources it cites. A study by Pranav Narayanan Venkit at Salesforce AI Research and colleagues found that about one-third of the statements made by AI tools like Perplexity, You.com and Microsoft's Bing Chat were not supported by the sources they provided. For OpenAI's GPT-4.5, the figure was 47%.
A new study finds AI tools are often unreliable, overconfident and one-sided