by
Yug Raman Srivastava
CyJurII Scholar
on 24 February 2024
Citation: Case C-252/21, EU:C:2023:537 (4 July 2023)
Case C-252/21 Meta Platforms Inc. and Others v Bundeskartellamt marks a pivotal moment at which data protection law and competition law converged in a way that is reshaping the regulation of digital marketplaces. The Court of Justice of the European Union set out new and far-reaching principles that allow GDPR compliance to be enforced as part of competition law. The ruling concludes a near-decade-long effort to define data-driven market power, recognising the collection and use of individuals' data both as a marker of market dominance and as a means of entrenching it. It also confirms that competition authorities have jurisdiction to review GDPR compliance, effectively dismantling the historical divide between the two fields and giving regulators a more comprehensive view of the digital marketplace.
The Genesis of the Case
The Bundeskartellamt's 2019 Intervention
The dispute began with an innovative and controversial decision by Germany's Federal Cartel Office (Bundeskartellamt, or BKA), which announced on 6 February 2019 that Meta Platforms (then Facebook Inc.) could not condition the use of its social media platform, Facebook.com, on the collection and aggregation of personal data generated by users outside the Facebook service ("off-Facebook data") unless users had given informed and voluntary consent to that aggregation. Off-Facebook data comprised all data collected by Meta from the other services it controls, including Instagram, WhatsApp, Messenger, and Oculus, as well as data collected by Meta or third parties through Facebook Business Tools such as the Like button, Share button, Facebook Login, and Facebook Analytics. The BKA had opened its inquiry into Meta in March 2016 on the suspicion that the terms governing Meta's data processing amounted to an abuse of a dominant position in violation of Section 19 of the GWB: in a properly functioning market, no user would have to accept a take-it-or-leave-it ("all-or-nothing") proposition requiring them to surrender data about their activity across the entire internet in exchange for social connection. Because Meta operated in the German market for social networks without meaningful competitive pressure, it was able to impose on its users privacy-flouting terms that violated fundamental principles of the General Data Protection Regulation (GDPR).
Market Definition and Dominance Assessment
The BKA's decision rested on a detailed definition of the relevant market and a thorough assessment of Meta's market power. The BKA defined the product market as private social networking services, in which people connect with friends and family to share their daily lives. Notably, it excluded professional networks such as LinkedIn and Xing from this market, because they offer only a limited subset of the functions of a private social network and are tailored to distinct user segments. The BKA found that Meta was dominant in Germany, with a share of at least 95% of daily active users, a position reinforced by strong network effects and by significant barriers to entry facing potential competitors. The average user, meanwhile, experiences a "lock-in" effect: a real or perceived economic barrier that prevents them from leaving the platform without losing their entire social graph and digital history. In assessing whether Meta had violated German competition law, the BKA concluded that Meta bore a "special responsibility" towards its users, applying this competition-law concept systematically for the first time to the enforcement of data protection standards.
Procedural Path: From Düsseldorf to the Grand Chamber
In August 2019 the OLG Düsseldorf granted suspensive effect to Meta's appeal against the BKA's decision, but on 11 June 2020 the BGH overturned that order and restored the BKA's decision pending a final ruling. Given the significant implications for EU law, the OLG Düsseldorf referred questions to the CJEU covering three main areas of concern: (a) whether a national competition authority may determine, in the course of an antitrust procedure, that a breach of the GDPR has occurred; (b) whether an undertaking's dominant position affects its ability to rely on the legal bases for processing personal data in Article 6 GDPR, in particular the validity of user consent; and (c) whether sensitive data within the meaning of Article 9 GDPR can be derived from users' personal browsing activity, bringing that activity within Article 9's scope. Given its constitutional significance for the European internal market, the case was heard by the Grand Chamber of the CJEU.
The Jurisdictional Breakthrough: NCAs as Data Protection Enforcers
On 4 July 2023 the Court of Justice of the European Union (CJEU) held that a national competition authority (NCA) may, in the course of an investigation into abuse of dominance, determine whether an undertaking's processing of personal data falls short of the requirements of the General Data Protection Regulation (GDPR). This finding reflects the fact that, in today's digital economy, data protection has become an important parameter of competition between undertakings.
To preserve uniformity and enable the coherent application of the GDPR across Member States, the CJEU required NCAs and Data Protection Authorities (DPAs) to cooperate with one another. Under the duty of sincere cooperation in Article 4(3) of the Treaty on European Union (TEU), NCAs must consult and cooperate with DPAs in pursuit of the GDPR's objectives.
When conducting investigations into alleged GDPR violations, NCAs must:
- Follow the decisions of either a competent DPA or the CJEU that previously ruled on identical conduct.
- Consult and seek cooperation from the competent DPA prior to commencing an assessment of a potential violation of the GDPR if it has not yet been ruled on by a DPA.
- Proceed with their own assessment where the DPA has not responded within a reasonable time frame or has expressly declined to open an investigation.
In light of the CJEU's findings, NCAs may treat a GDPR violation as evidence in establishing an abuse of dominance, although this does not replace the enforcement function of the DPAs. These complementary powers enable action against data-related abuses even where DPAs decline to pursue such violations or lack the resources to investigate them, provided the conduct falls within the NCA's own competition-law jurisdiction.
Substantive Analysis: The Collapse of "Contractual Necessity" and "Legitimate Interest"
The CJEU's interpretation of Article 6(1) GDPR severely undermined the legal bases on which Meta had sought to justify its use of off-Facebook data for its targeted advertising model. Meta had historically argued that this processing was covered either by Article 6(1)(b), on the ground that it was necessary for the performance of its contract with Facebook users, or by Article 6(1)(f), on the ground of the company's legitimate interests.
Rejection of Contractual Necessity (Article 6(1)(b))
The Court interpreted "necessity" narrowly and functionally: for processing to be "necessary" for the performance of a contract, it must be "objectively indispensable" to a purpose that is "integral" to the service contracted for by the data subject. The Court found that although some users may find personalised content beneficial, it is not integral to the core purpose of providing a social network, which can equally function on a non-personalised basis. The vast pool of data assembled by combining different services is therefore not objectively indispensable to a user's access to, or use of, the network.
Limitation of Legitimate Interest (Article 6(1)(f))
The Court accepted that Meta's legitimate interests in processing the data relate primarily to monetising its "free" service through targeted advertising. It held, however, that those interests could not outweigh individuals' rights to privacy and other fundamental rights and freedoms, given the sheer volume of data Meta processes and its highly detailed and invasive character. The power imbalance created by Meta's failure to give individuals any meaningful control over what is collected about them "off Facebook", combined with the potential for extensive profiling, constitutes a serious interference with privacy rights that commercial interests cannot justify.
The Article 9 Revolution: Indirect Inferences as Sensitive Data
The CJEU's judgment is transformative in its broad interpretation of how Article 9 GDPR applies to "special categories of personal data". The Court considered whether an individual's browsing of websites associated with sensitive topics (e.g., political parties, healthcare portals, and dating applications) attracts the protections of Article 9 even where the individual has never expressly stated a political view or sexual orientation on Facebook. It concluded that Article 9 prohibits not only the processing of expressly sensitive data but also the processing of any data from which sensitive conclusions may be drawn, even where those conclusions are reached through an "intellectual operation of collation or deduction". Accordingly, whenever Meta tracks an individual's visit to a sensitive website through one of its integrated business tools, it is "processing" sensitive information, because the visit may "indirectly disclose" the individual's health status, religious beliefs, or sexual orientation. Several important legal consequences follow:
(1) Broad Prohibition: the prohibition applies even where the controller has no intention of collecting or inferring sensitive information. Meta's intent, like the accuracy of any inferred information, is irrelevant, because the prohibition is absolute and purpose-neutral.
(2) The "Whole Pool" Principle: Meta may not combine non-sensitive data with inferred sensitive data in such a manner that it would be impossible to separate the sensitive data from non-sensitive data. If this occurs, then the entire dataset, including inferred sensitive data, must be treated as if it were all subject to Article 9.
(3) Explicit Consent Required: since the legal bases under Article 6 do not generally apply to special-category data, anyone processing such indirect inferences requires the explicit consent of the user.
(4) Rejection of "Manifestly Made Public": The Court rejected Meta’s assertion that an individual “manifests” his or her sensitive data by simply visiting a website or clicking a “Like” button. The Court held that, in order to qualify for this exception, an individual must have specifically chosen, through granular choices, to make his or her data publicly accessible.
The Impact of Dominance on Consent: The "Consent Paradox"
The CJEU held that a controller's dominant position does not automatically invalidate users' consent, but it is an important factor in assessing whether consent was freely given, since dominance can create a clear imbalance of power between the platform and the data subject. For consent to be valid, users must be able to refuse it without being forced to abandon the service altogether. Dominant platforms must therefore offer a "comparable" alternative: users who do not consent to the combination of their personal data for direct-marketing purposes must be able to continue using the service, if necessary for an appropriate fee. This requirement of an equivalent, consent-free alternative serves as a bridge between the BKA's post hoc investigation of anti-competitive behaviour and the ex ante regulation of digital service providers under the DMA.
Implementation and Settlement: The 2024 Finalization
The final chapter of the case closed on 10 October 2024, when the BKA officially ended its proceedings against Meta after years of discussions over how Meta would comply with the 2019 decision. By withdrawing its appeal before the Düsseldorf Higher Regional Court, Meta acknowledged the changed regulatory landscape, however reluctantly.
Key Implementation Measures (2024)
To give users genuine control over the collection and combination of their personal data, Meta implemented several measures designed to ensure transparency, create meaningful choice architecture, and eliminate "deceptive design patterns" (dark patterns) that steer users towards more intrusive privacy options. While acknowledging significant improvement, the BKA noted that considerable ambiguity still surrounds terms such as "cookies", and that the risk of "choice fatigue" among users persists, so further enhancements remain necessary. The settlement marks a significant shift from an "absolute prohibition" on data combination to "effective choice architecture": the combination of data cannot be banned outright, but it must be placed under the user's affirmative control.
Transition to the Digital Markets Act (DMA): Codifying the Ruling
The Digital Markets Act (DMA), whose obligations became fully applicable to designated gatekeepers in March 2024, was directly inspired by Meta v Bundeskartellamt. Whereas under the previous regime the burden of proof lay with regulators, under the DMA designated 'gatekeeper' companies (undertakings providing 'core platform services') bear the burden of compliance with the specific conduct rules the DMA prescribes, which in substance transpose the Bundeskartellamt's eight years of investigative work into directly applicable obligations.
In addition, the DMA sets a minimum standard for consent to use personal data from multiple core platform services. The DMA prohibits gatekeepers from combining or cross-utilizing a user's personal data from two or more different core platform services unless the user has been given the opportunity to make a "specific choice" and has provided GDPR-compliant consent. As part of these provisions, the DMA implements the CJEU's mandate that gatekeepers offer a "less personalized but equivalent alternative" to users who do not provide consent.
The DMA is a significant development in digital regulation:
• Efficiency: The previous approach to regulating conduct through years of litigation under Article 102 TFEU, which deals with abuse of dominance, has now been replaced by a clear regulatory rule under the DMA.
• Enforcement Intensity: The European Commission will be able to impose fines of up to 10% (20% for repeat violations) of total worldwide turnover for breaches of the DMA. This far exceeds the potential sanctions available to the DPAs.
• Polycentric Oversight: The DMA creates a "dual-track" enforcement system in which the European Commission enforces the DMA's rules for digital markets while national DPAs continue to enforce the GDPR.
Current Developments (2025-2026): The "Consent or Pay" Contention
Attention in 2025 and 2026 has turned to Meta's "Consent or Pay" (or "Pay for No Ads") model, introduced in late 2023 in response to the CJEU's "appropriate fee" language. Meta contends that by offering an ad-free alternative for a payment it satisfies the equivalent-alternative requirement established by both the CJEU and the DMA.
The Commission does not share this view. On 22 April 2025 it adopted its first non-compliance decision under the DMA against Meta, imposing a fine of €200 million and concluding, in line with EDPB Opinion 08/2024, that a binary choice between paying and consenting is not sufficient to satisfy the requirement of "freely given" consent. The Commission further stated that gatekeepers must offer individuals a third option: a free service with less personalised (i.e., contextual) advertising.
2025-2026 Regulatory Landscape and Major Fines
From April 2025 into early 2026, regulators across jurisdictions exerted increasing pressure on Big Tech companies such as Meta and Apple. In April 2025 the European Commission fined Meta €200 million for non-compliance with the DMA over its controversial "pay or consent" model. Shortly thereafter, the Commission fined Apple €500 million for DMA violations arising from its improper anti-steering restrictions. Later in 2025, the Commission widened its scrutiny by opening a formal investigation into Meta's AI policies over allegations concerning the scraping and use of WhatsApp user data for AI training. In early 2026, internal industry documents began to surface revealing sustained lobbying by major technology companies for a "Digital Omnibus" package aimed at diluting the strength of various digital regulations, while unrelated litigation over targeted advertising practices in the United States exposed internal Meta practices that heightened global concern about unchecked data exploitation in the tech industry. Notably, this period also crystallised a conflict between paragraph 150 of the CJEU's judgment, which contemplates an appropriate fee for a non-personalised alternative, and the Commission's DMA enforcement, which demands a free-of-charge alternative; Meta has challenged the Commission's enforcement action, alleging that the Commission is "rewriting the GDPR" in conflict with the CJEU's case law.
Broader Implications: AI Training and the "Digital Omnibus"
Meta v Bundeskartellamt also anticipated a third generation of markets built around generative AI platforms. These platforms train large language models on user-generated data such as posts, comments, and profile pictures, so the "purpose limitation" and "sensitive data inference" rules established in Case C-252/21 remain directly relevant.
In April 2025, interim proceedings before the Higher Regional Court of Cologne (OLG Köln) addressed whether Meta's use of first-party data for AI training violated Article 5(2)(b) of the DMA by combining data across services. Meta argued that because the data it used was partially de-identified, it was not being combined with Meta's other data and therefore did not infringe the DMA. Critics counter that, under the CJEU's Article 9 reasoning, social media data may not be used to develop AI without the individual's explicit and active consent, and that even de-identified first-party data used for AI training therefore remains subject to Article 9's restrictions.
The 2026 political climate has seen big technology companies attempt to "roll back" the GDPR's definition of personal data through legislation such as the proposed "Digital Omnibus", which would narrow the definition to cover only data that can be used to identify an individual. Critics warn that such legislation would also allow big technology companies to build "cascading monopolies" over AI using data obtained in violation of the spirit of the GDPR.
Economic Analysis: Data Protection as a Parameter of Competition
The CJEU ruling rests on a key economic insight: data protection is a parameter of consumer welfare and product quality, especially in zero-price markets, where traditional antitrust enforcement centred on price effects has proved inadequate for digital platforms. When Meta lowers its privacy protection standards, it is in effect degrading the quality of its product for consumers. This "quality degradation" theory of harm bridges competition authorities' duty to prevent monopolistic exploitation and the protection of consumers' fundamental right to privacy. Where a firm can sustain its user base despite privacy violations only because high switching costs and network effects bind users to it, those violations are a direct exercise of market power. The ruling thus supplies a clear legal framework for treating privacy-invasive practices as exclusionary or exploitative abuses of dominance. The progression from the BKA's 2019 decision to the regulatory environment of 2026 reflects a marked shift in regulatory focus from "consent" as a mere procedural exercise to "control" as a substantive condition of participation in the marketplace. Case C-252/21 Meta Platforms v Bundeskartellamt is the cornerstone of that transition, providing legal certainty as to both the jurisdiction of regulators and the substantive obligations of dominant platforms.
The ongoing "Pay or Consent" debate continues to test the boundary between fundamental rights and commercial autonomy. What is clear, however, is that EU law has evolved: data protection, once viewed as an "ancillary" element, is now an indispensable component of fair competition in the EU's digital internal market. The incorporation of GDPR protections into the competition analysis conducted under the Digital Markets Act (DMA) sets a new constitutional standard for data accumulation and for individuals' right to determine how their data is used. Many issues remain around the regulation of AI and other data-intensive technologies, but the only way to prevent a cascade of monopolies in the twenty-first century is the "sincere cooperation" of antitrust authorities and data protection regulators.
Sources:
• Court of Justice of the European Union (CJEU), Case C-252/21, Meta Platforms Inc. and Others v Bundeskartellamt: The Grand Chamber ruling establishing that competition authorities can assess GDPR compliance and defining strict limits on data combination and sensitive data inferences.
• CJEU, Case C-21/23, Lindenapotheke : A significant judgment that extends the "Meta Model" to national unfair competition law, allowing competitors to sue for GDPR violations and providing a broad interpretation of sensitive health data.
• CJEU, Case C-307/22, FT : Confirms that data subject access rights are fundamental and cannot be restricted by a controller's motivations.
• Higher Regional Court of Cologne (OLG Köln) : Interim proceedings regarding Meta’s use of "first-party data" for AI training under the Digital Markets Act.
• Bundeskartellamt (BKA), Case B6-22/16 (Facebook), Decision of 6 February 2019: The original German intervention against Meta’s cross-service data aggregation.
• Bundeskartellamt (BKA), Implementation and Closure of Proceedings (October 2024): Official finalization of the 2019 order following Meta’s withdrawal of its appeal and implementation of user control measures.
• European Commission, DMA Non-Compliance Decision (April 2025): The first major enforcement under the Digital Markets Act, fining Meta €200 million for its "binary" pay-or-consent subscription model.
• European Data Protection Board (EDPB), Opinion 08/2024: Binding guidance on the "Consent or Pay" model, requiring gatekeepers to offer a "third option" of a free version with less personalized advertising.
• EDPB, Guidelines 1/2024: Updated standards for the use of "legitimate interest" as a legal basis under Article 6(1)(f) of the GDPR.
• Grafenstein, M. von (2024/2025): "From Consent to Control by Closing the Feedback Loop," scholarly work and expert reports focusing on technical-organizational measures to restore consumer autonomy.
• Brook, O., and Eben, M. (2024): "Another Missed Opportunity? Case C-252/21 Meta Platforms v Bundeskartellamt," Journal of European Competition Law & Practice, critiquing the ruling’s failure to clarify jurisdictional boundaries under Regulation 1/2003.
• Barczentewicz, M. (2025): "A First Take on the European Commission's DMA Decision Against Meta," analysis of the legal and economic reasoning behind the Commission’s first gatekeeper fines.
• Corporate Europe Observatory & LobbyControl (2026): "Digital Omnibus: How Big Tech Shaped EU’s Roll-back of Digital Rights," a report detailing 2026 lobbying efforts to weaken GDPR definitions of personal data.
• Tech Policy Press (2026): "Reviewing European Antitrust Activity in 2025," providing context on current investigations into WhatsApp AI policy and Google’s AI Overviews.