Microsoft's artificial intelligence (AI) chatbot Bing (recently renamed Copilot) provided false and misleading information about elections in several European countries. Experts from the human rights group AlgorithmWatch reached this conclusion in a report published on the organization's website.
The researchers asked the chatbot questions about recent elections in the German states of Bavaria and Hesse, as well as in Switzerland. It turned out that roughly a third (31%) of the AI's answers contained factual errors, and in 39% of cases the neural network evaded the question entirely. As a result, only 30% of Bing's answers were correct.
The false information provided by Bing included invented controversies, incorrect election dates, inaccurate poll numbers, and mentions of candidates who were not actually running.
“Even when the chatbot drew poll data from a single source, the figures given in its response often differed from the original. Sometimes the AI ranked the parties in a different order than the source stated,” the report notes.
AlgorithmWatch explained that it chose these votes because they were the first elections held in Germany and Switzerland since Bing's launch. The setup also allowed the experts to examine local context and compare responses across three languages: German, English, and French.
Commenting on the AlgorithmWatch study, Microsoft promised to improve the performance of artificial intelligence ahead of the US presidential elections in 2024.
Earlier, US officials had warned that artificial intelligence poses a potential risk to the country's financial system.