Microsoft’s Bing Chat botches election information, endangers democracy, study finds



Summary

No one should use Microsoft’s Bing Chat to find out about upcoming elections or votes, according to a new study.

The research by AlgorithmWatch and AI Forensics, in collaboration with Swiss radio and television stations SRF and RTS, found that Bing Chat gave incorrect answers to questions about elections in Germany and Switzerland.

Since the end of August, the team has been testing the quality of Bing Chat’s answers to questions about the Bavarian and Hessian state elections and the Swiss federal elections.

The queries were made over a network of VPNs and private IP addresses in Switzerland and Germany, with language and location parameters chosen to reflect potential voters in the respective election regions.
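To illustrate what such a setup can look like, here is a minimal sketch of building region-matched prompts with language and location parameters. It is not the study’s actual tooling: the prompt texts, region names, and the send_prompt() stub are illustrative assumptions only.

# Minimal sketch (not the study's actual pipeline): pairing election prompts
# with the language and location a local voter would plausibly use.
from dataclasses import dataclass

@dataclass
class QueryConfig:
    region: str      # election region the query simulates
    language: str    # language a local voter would likely use
    country: str     # country used for the VPN exit / location parameter

# Illustrative prompts in the voters' languages (assumed, not from the study)
PROMPTS = {
    "de": [
        "Wer sind die Spitzenkandidaten bei der Landtagswahl in Hessen 2023?",
        "Wie sind die aktuellen Umfragewerte zur Landtagswahl in Bayern?",
    ],
    "fr": [
        "Quels partis participent aux élections fédérales suisses de 2023 ?",
    ],
}

CONFIGS = [
    QueryConfig(region="Bavaria", language="de", country="DE"),
    QueryConfig(region="Hesse", language="de", country="DE"),
    QueryConfig(region="Switzerland (Romandy)", language="fr", country="CH"),
]

def send_prompt(prompt: str, config: QueryConfig) -> str:
    """Placeholder: in a real collection run this would submit the prompt to the
    chatbot through a VPN exit in config.country and return the reply."""
    return f"[response to '{prompt}' as a {config.language} speaker in {config.country}]"

if __name__ == "__main__":
    for config in CONFIGS:
        for prompt in PROMPTS[config.language]:
            print(config.region, "->", send_prompt(prompt, config))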


Data collection began on August 21, and the team is still analyzing the data, but preliminary results show clear trends, according to AlgorithmWatch.

Bing Chat misleads those interested in politics

Bing Chat was particularly misleading when asked about the latest poll results for the upcoming elections in Bavaria. It incorrectly reported that the “Freie Wähler” would receive only 4 percent of the vote, while the party’s actual election forecast was between 12 and 17 percent.

Bing Chat also failed to correctly answer questions about the parties’ top candidates for the 2023 state elections in Hesse. It named incorrect candidates and repeatedly identified a retired politician as the CDU’s top candidate.

Invented survey results

Bing Chat often refers to reputable sources with correct poll results, but then gives nonsensical numbers in its own answers. For example, the chatbot repeatedly claimed that the “Freie Wähler” had lost approval because of the Aiwanger scandal, although the opposite was true.

False information about candidates

The chatbot also provided false information about the candidates for the 2023 state elections in Hesse, often naming well-known politicians from the respective party, even if they were not even running. For example, Volker Bouffier was frequently named as the CDU’s top candidate, even though he retired from politics in May 2022.

False reports in the Aiwanger case

Bing Chat confused problematic statements Hubert Aiwanger had made about the coronavirus pandemic with the leaflet affair. In one reply, the scandal was interpreted one-sidedly from Aiwanger’s point of view; in another, Bing attributed the leaflet affair to the Left Party rather than to Aiwanger. Of ten questions about the Aiwanger case, the chatbot did answer eight correctly and neutrally.

Misleading information about parties

When asked which parties were running in the Swiss federal elections, Bing did not give a fully correct answer. In all twelve responses, the CVP was listed as one of the six largest parties instead of its successor party, “Die Mitte”. Eight responses named the BDP as a party standing in 2023, even though it no longer exists.

Key findings from the study (German)

Karsten Donnay, assistant professor of political behavior and digital media at the University of Zurich, speaks of an “uncritical use of AI,” with companies launching unreliable products without facing legal consequences.

In response to the research, a Microsoft spokesperson told AlgorithmWatch that the company is committed to improving its services and has made significant progress in the accuracy of Bing Chat’s responses.

