AI chatbots providing 'flawed' or 'inaccurate' information, study finds

European Broadcasting Union says leading AI assistants such as OpenAI's ChatGPT, Microsoft's Copilot, Google's Gemini and Perplexity are misrepresenting news information

By The News Digital
October 24, 2025

Whether it's fast-paced content generation, improved workplace efficiency, or support for studying, AI-powered tools have taken over everyday tasks in almost every sector.

According to a report published by Digitalsilk on September 30, 2025, more than 1.1 billion people are expected to be using AI by 2031, making it one of the fastest-adopted technologies in history.

Digital innovation, driven by artificial intelligence, has transformed our lives. Still, experts find that it may not be as dependable as it seems, and much of the content AI generates cannot be relied on as an accurate source of information.

In a publication titled “News Integrity in AI Assistants”, released on Wednesday, October 22, 2025, the European Broadcasting Union (EBU) reports that leading AI assistants are misrepresenting news information.

In the report, the EBU underlined that AI chatbots often provide inaccurate or flawed information about major news events.

The international study analyzed how AI assistants, software applications that use AI to understand natural-language commands and complete tasks for users, respond to questions about news content.

The EBU tested four of the most widely used AI assistants: OpenAI's ChatGPT, Microsoft's Copilot, Google's Gemini, and Perplexity.

The report is one of the most extensive cross-market evaluations of its kind, involving 22 public service media organizations in 18 countries, working in 14 languages, and assessing how these AI assistants respond to questions about news and current affairs.

The results showed that many of the answers AI platforms gave about news events confused parody with fact, cited wrong dates, or simply got the timelines of inventions and discoveries wrong.

Artificial intelligence is the computational capability to perform tasks associated with human intelligence, such as reasoning, problem-solving, perception, and decision-making.

The EBU's research builds on an earlier study conducted by the BBC, which exposed inaccuracies and errors in AI assistants' output.

The EBU researchers explored whether those flaws had since been addressed, and the study's results were alarming: the assistants routinely misrepresent news content, no matter which language, territory, or AI platform is tested.

The report's key findings reveal that almost half of all AI responses contained at least one significant issue, many of them serious sourcing problems, while one out of every five answers “contained major accuracy issues, including hallucinated details and outdated information.”

For the evaluation, professional journalists assessed the AI responses against key criteria, including accuracy, sourcing, the distinction between opinion and fact, and the provision of context.

Other outlets have similarly reported that AI should be restricted or used carefully in fields like medicine and finance due to inaccuracies and unreliable sources.