Dutch watchdog warns AI chatbots threaten election integrity

AI chatbots repeatedly recommended the same two parties, raising concerns over their role in democratic processes.

Dutch authorities warn that biased AI chatbots could endanger election fairness by steering voters towards specific political parties.

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP) warns that AI chatbots are biased and unreliable sources of voting advice ahead of the national elections. The AP's investigation found that chatbots often steered users towards the same two parties, regardless of their actual preferences.

In over half of the tests, the bots suggested either Geert Wilders' far-right Freedom Party (PVV) or the left-wing GroenLinks-PvdA alliance led by Frans Timmermans. Other parties, such as the centre-right CDA, were rarely mentioned, even when users' answers closely matched their platforms.

AP deputy head Monique Verdier said that voters were being steered towards parties that did not necessarily reflect their political views, warning that this undermines the integrity of free and fair elections.

The report comes ahead of the election on 29 October, in which the PVV currently leads the polls. However, the race remains tight, with GroenLinks-PvdA and the CDA still in contention and many voters undecided.

Although the AP noted that the bias was not intentional, it attributed the problem to the way AI chatbots function, highlighting the risks of relying on opaque systems for democratic decisions.
