German watchdog demands Meta stop AI training with EU user data

A German consumer watchdog says Meta must ask for consent, not merely offer an opt-out.


The Verbraucherzentrale NRW, the consumer protection organisation of the German state of North Rhine-Westphalia, has issued a formal warning to Meta, urging the tech giant to stop training its AI models on data from European users.

The watchdog argues that Meta’s current approach violates EU privacy law and warns that legal action may follow if the practice is not halted. Meta recently announced that it would use content from Facebook, Instagram, WhatsApp, and Messenger, including posts, comments, and public interactions, to train its AI systems in Europe.

The company claims this will improve the performance of Meta AI by helping it better understand European languages, culture, and history.

However, data protection authorities from several EU countries, including Belgium, France, and the Netherlands, have expressed concern and encouraged users to act before Meta’s new privacy policy takes effect on 27 May.

The Verbraucherzentrale NRW took the additional step of sending Meta a cease-and-desist letter on 30 April. Should Meta ignore the request, legal action could follow.

Christine Steffen, a data protection expert at the Verbraucherzentrale NRW, said that once personal data has been used to train an AI model, its use is nearly impossible to reverse. She criticised Meta’s opt-out model and insisted that meaningful user consent is legally required.

Austrian privacy advocate Max Schrems, head of the NGO Noyb, also condemned Meta’s strategy, accusing the company of ignoring EU privacy law in favour of commercial gain.

‘Meta should simply ask the affected people for their consent,’ he said, warning that failure to do so could have consequences across the EU.
