Digital Omnibus: How Big Tech Lobbying Is Gutting the GDPR

Last week we at EFRI wrote about the Digital Omnibus leak and warned that the European Commission was preparing a stealth attack on the GDPR.

Since then, two things have happened:

  • The Commission has now officially published its Digital Omnibus proposal.

  • noyb (Max Schrems’ organisation) has released a detailed legal analysis and new campaigning material that confirms our worst fears: this is not harmless “simplification”, it is a deregulation package that cuts into the core of the GDPR and ePrivacy. 

What noyb has now put on the table

On 19 November 2025, noyb published a new piece with the blunt headline: “Digital Omnibus: EU Commission wants to wreck core GDPR principles”.

Here’s a focused summary of the four core points from noyb’s announcement, in plain language:

New GDPR loophole via “pseudonyms” and IDs

The Commission wants to narrow the definition of “personal data” so that much data under pseudonyms or random IDs (ad-tech, data brokers, etc.) might no longer fall under the GDPR.

This would mean a shift from an objective test (“can a person be identified, directly or indirectly?”) to a subjective test (“does this company currently want or claim to be able to identify someone?”).

Therefore, whether the GDPR applies would depend on what a company says about its own capabilities and intentions.

Different companies handling the same dataset could fall inside or outside the GDPR.

For users and authorities, it becomes almost impossible to know ex ante whether the GDPR applies – endless arguments over a company’s “true intentions”.

Schrems’ analogy: it’s like a gun law that only applies if the gun owner admits he can handle the gun and intends to shoot – obviously absurd as a regulatory concept.

Weakening ePrivacy protection for data on your device

Today, Article 5(3) ePrivacy protects against remote access to data on your devices (PCs, smartphones, etc.) – based on the Charter right to the confidentiality of communications.

The Commission now wants to add broad “white-listed” exceptions for access to terminal equipment, including “aggregated statistics” and “security purposes”.

Max Schrems finds the wording of the new rule extremely permissive: it could effectively allow extensive remote scanning or “searches” of user devices as long as they are framed as minimal “security” or “statistics” operations – undermining the current strong protection against device-level snooping.

Opening the door for AI training on EU personal data (Meta, Google, etc.)

Despite clear public resistance (only a tiny minority wants Meta to use their data for AI), the Commission wants to allow Big Tech to train AI on highly personal data, e.g. 15+ years of social-media history.

Schrems’ core argument:

People were told their data is for “connecting” or advertising – now it is fed into opaque AI models, enabling those systems to infer intimate details and manipulate users.

The main beneficiaries are US Big Tech firms building base models from Europeans’ personal data.

The Commission relies on an opt-out approach, but in practice:

  • Companies often don’t know which specific users’ data are in a training dataset.

  • Users don’t know which companies are training on their data.

  • Realistically, people would need to send thousands of opt-outs per year – impossible.

Schrems calls this opt-out a “fig leaf” to cover fundamentally unlawful processing.

On top of training, the proposal would also privilege the “operation” of AI systems as a legal basis – effectively a wildcard: processing that would be illegal under normal GDPR rules becomes legal if it’s done “for AI”. The result is an inversion of the normal logic: the riskier technology (AI) gets lower, not higher, legal standards.

Cutting user rights back to almost zero – driven by German demands

The starting point for this attack on user rights is a debate in Germany about people using GDPR access rights in employment disputes, for example to prove unpaid overtime. The German government chose to label such use as “abuse” and pushed in Brussels for sharp limits on these rights. The Commission has now taken over this line of argument and proposes to restrict the GDPR access right to situations where it is exercised for “data protection purposes” only.

In practice, this would mean that employees could be refused access to their own working-time records in labour disputes. Journalists and researchers could be blocked from using access rights to obtain internal documents and data that are crucial for investigative work. Consumers who want to challenge and correct wrong credit scores in order to obtain better loan conditions could be told that their request is “not a data-protection purpose” and therefore can be rejected.

This approach directly contradicts both CJEU case law and Article 8(2) of the Charter of Fundamental Rights. The Court has repeatedly confirmed that data-subject rights may be exercised for any purpose, including litigation and gathering evidence against a company. As Max Schrems points out, there is no evidence of widespread abuse of GDPR rights by citizens; what we actually see in practice is widespread non-compliance by companies. Cutting back user rights in this situation shifts the balance even further in favour of controllers and demonstrates how detached the Commission has become from the day-to-day reality of users trying to defend themselves.

EFRI’s take: when Big Tech lobbying becomes lawmaking

For EFRI, the message is clear: the Commission has decided that instead of forcing Big Tech and financial intermediaries to finally comply with the GDPR, it is easier to move the goalposts and rewrite the rules in their favour. The result is a quiet but very real redistribution of power – away from citizens, victims, workers and journalists, and towards those who already control the data and the infrastructure. If this package goes through in anything like its current form, it will confirm that well-organised corporate lobbying can systematically erode even the EU’s flagship fundamental-rights legislation. That makes it all the more important for consumer organisations, victim groups and digital-rights advocates to push back – loudly, publicly and with concrete case stories – before the interests of Big Tech are permanently written into EU law.
