Strange Bedfellows: Data Protection, Privacy and Competition – Competition Law International, April 2017

Giovanni Buttarelli

 

Introduction

For many antitrust practitioners, data privacy is the after-dinner speaker who doesn’t know when to wrap up and resume their seat. A topic that, until recently, was little more than a niche question for academia, is now the subject of in-depth studies by prominent competition authorities and keynote speeches by European Commissioners. The urgency of discussions can be attributed to the speed and complexity with which markets are adjusting their behaviour to data-driven technology, and unease at the lack of trust and sense of powerlessness felt by people who must navigate a new digital reality.[1]

The European Data Protection Supervisor (EDPS) has sought to deepen and inform this debate from the perspective of the individual and her rights and interests, whether as a consumer, citizen or data subject. In a series of policy papers and speeches since 2014, we have argued that competition and privacy/data protection norms[2] in Europe, as well as in the United States, share common roots and remarkably similar objectives, namely fairness, ensuring freedom and choice and the prevention of harm.[3] Regulators responsible for the separate legal regimes, however, have tended to work in silos. We have urged more dialogue between regulators on common challenges and on cases where cooperation could make the work of these agencies more effective. This article reviews the state of discussions so far and the role that our proposed Digital Clearing House could play in bringing together existing knowledge and expertise.

A decade of concentration in digital markets

It is now a decade since the Federal Trade Commission (FTC) cleared the acquisition of DoubleClick, an ad-serving company, by Google, the dominant internet search engine. There was one dissenting statement, in which Commissioner Jones Harbour set out several objections to the merger on the grounds that it would harm not only future competition but also privacy: ‘The truth is’, she contended, ‘we really do not know what Google/DoubleClick can or will do with its trove of information about consumers’ Internet habits. The merger creates a firm with vast knowledge of consumer preferences, subject to very little accountability.’[4]

More recently, in 2014, the merger of Facebook and WhatsApp Messenger presented an even starker challenge for regulators, with one of the tech giants acquiring a startup whose annual turnover was only about US$10m[5] but which, with around 1 billion users, was valued at over US$20bn. These companies, like Google and DoubleClick, were not direct competitors, but the merger implied the combination of massive personal data sets – names, contact details, messages and connections – supplied on the basis of consent to terms very different from those signed up to by Facebook users. Once more the FTC, acting as a competition authority, cleared the merger, but in parallel the FTC’s Bureau of Consumer Protection considered it necessary to remind WhatsApp of its obligation to honour the promises contained in its privacy policy, failing which both companies would be susceptible to enforcement action.[6] In the EU, the European Commission as competition authority also cleared the merger, and there were only a few isolated, cautionary statements from European data protection authorities.[7]

In 2016, WhatsApp announced that it would link user phone numbers with Facebook user identities, a practice that Facebook had told the European Commission in 2014 would not be possible.[8] The company is now under investigation by the Commission and a number of national data protection authorities, with the Article 29 Working Party (the group of EU data protection supervisory bodies) writing formally to urge it to suspend its intended linking of data.[9]

Both of these ‘big data mergers’ are emblematic of an unprecedented level of concentration in the ICT and other sectors over the past ten years, a trend that if anything is only accelerating.[10] October 2016 set a record, with almost half a trillion US dollars’ worth of mergers and acquisitions.[11] Acquisitions of artificial intelligence (AI) start-ups increased sevenfold between 2011 and 2015,[12] and already there are reports that a handful of companies have dominant control over the talent and intellectual property behind AI.[13]

The policy debate

Since Pamela Jones Harbour registered her concerns, the theme of data and competition has become the stock-in-trade for almost every programme in the antitrust conference calendar. Across this vigorous debate a clear fault line can be detected between experts and regulators on the implications of the remarkable digital turn in the economy and society in the past 20 years. Crudely put, there are two schools of thought, with perhaps a third, which is equivocal and non-committal.[14]

The first school of thought holds that ‘big data’ is just the latest fad in public policy and that, just like past fads, competition principles and enforcement practice will once again prove robust enough to prevail. Insofar as behaviour in the market indicates a privacy violation, it should be dealt with by privacy and consumer enforcers, not by competition authorities, who lack the tools and legal competence to assess such a violation. The legal regimes are distinct and it is essential for competition law to maintain its integrity and avoid contamination by other public policy concerns, however legitimate they may be. This was affirmed by the Court of Justice of the European Union in the Asnef-Equifax case, where the Court held that ‘since any possible issues relating to the sensitivity of personal data are not, as such, a matter for competition law, they may be resolved on the basis of the relevant provisions governing data protection’.[15] The Court in the AstraZeneca case also made clear the criterion for any case to be brought on the grounds of a competition law violation: ‘the classification as an abuse of a dominant position of conduct… which consists in the use of regulatory procedures without any basis in competition on the merits, requires at the very least evidence that, in view of the economic or regulatory context surrounding that conduct, that conduct is such as to restrict competition’.[16] A few examples of such cases exist, mainly concerning former state monopolies, which have been forced to disclose to their competitors the customer databases they had acquired.[17]

According to the second school of thought, competition enforcers – whether the regulatory bodies or the courts – are judged to have lost sight of the founding objective of antitrust in the late 19th and early 20th centuries. Originally conceived as a means of breaking up overbearing monopolies deemed to pose a threat to democracy and freedom, competition enforcement in the US and the European Union has now been captured by the theoretical models developed by the Chicago School in the 1970s. As a result, monopolies are no longer considered to be intrinsically bad; on the contrary, they could be a good thing for consumers, so long as they keep innovating and driving down prices. Worse, it is argued, antitrust economists are uncomfortable with or unwilling to consider hard-to-quantify competition parameters other than price, such as quality or choice, despite the fact that web-based services are typically provided at very low or zero price on condition that personal information can be gathered through monitoring of online activity. These services, in effect, therefore escape proper regulatory scrutiny, leaving consumers open to malpractice and abuse. This situation endangers basic values from dignity to privacy, and even democracy itself, and demands a change in how these services are regulated, not only sector-by-sector but also by antitrust authorities.

There is a third school, which takes a more agnostic, strictly empirical position, accepting that there may be reason for competition enforcers to be concerned with privacy violations, but that a concrete case has yet to present itself.[18]

Digital markets: the good, the bad and the ugly

The reality, of course, lies somewhere between the ‘nothing new under the sun’ position and the digital dystopia posited by the opposing camps. Revolution and evolution are not mutually exclusive, and much of the time most people do not realise the longer-term significance of changes occurring around them. Nevertheless, the impact of the digital turn on freedom and privacy may be compared to human threats to the environment and climate change, which tend to remain abstract and unimportant to most people until they are themselves directly affected, through severe air pollution, erosion of coastal habitats and so on. Recent scholarship and economic modelling indicate that, in the advertising environment at least, platform intermediaries may gain disproportionately at the expense of individual consumers and firms.[19] People tend to be unaware of the constant tracking that they undergo whenever they are connected to the internet and as a condition for using web-based services, until something goes wrong, in the form of a data breach or, more recently, concerns about ‘fake news’ and ‘filter bubbles’. This explains why the UN Human Rights Council adopted a resolution in 2016 affirming that people’s privacy must be safeguarded online as well as offline.[20]

Over the past 12 months or so, big data has been eclipsed as a trending phenomenon by AI. AI relies on big data. It is the most intriguing opportunity for monetising the extraordinary volumes of data that have been farmed from individuals or that are the by-product, the exhaust, the ‘digital breadcrumbs’ of the rampant digitisation of existing markets and the creation of new ones: social interaction, for instance, was not mediated so lucratively by technology before the digital era. An algorithm-driven economy presents the possibility of perfect discrimination between consumers on the basis of data that discloses willingness to pay to only one side of the transaction. Machines trained to maximise profits would likely have an unassailable advantage over consumers. So, just like the ‘free’ services enabled by big data, AI should not escape proper regulation and accountability. Whoever sets the business objectives for machines should be held responsible.[21]
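
To make the intuition concrete, the following minimal sketch (in Python, with entirely hypothetical names and figures) contrasts a single posted price with algorithmic pricing at each consumer’s estimated willingness to pay. It is an illustration of the economic point above, not a description of any actual company’s system.

    # Purely illustrative: hypothetical willingness-to-pay (WTP) figures, in euros.
    consumers = {"A": 10.0, "B": 7.0, "C": 4.0}
    cost = 2.0            # seller's marginal cost
    uniform_price = 7.0   # one posted price that every consumer sees

    def outcome(price_for):
        """Return (seller profit, total consumer surplus) under a pricing rule."""
        profit, consumer_surplus = 0.0, 0.0
        for name, wtp in consumers.items():
            price = price_for(wtp)
            if wtp >= price:                 # a consumer buys only if WTP covers the price
                profit += price - cost
                consumer_surplus += wtp - price
        return profit, consumer_surplus

    # Uniform pricing: the seller cannot observe individual WTP.
    print(outcome(lambda wtp: uniform_price))   # (10.0, 3.0) – consumers keep some surplus
    # 'Perfect' discrimination: the algorithm prices each consumer at their estimated WTP.
    print(outcome(lambda wtp: wtp))             # (15.0, 0.0) – the seller captures it all

On these hypothetical numbers, the seller extracts all of the surplus that consumers would otherwise have retained, which is precisely the asymmetry described above when only one side of the transaction knows the willingness to pay.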

Common competition and privacy challenges

There has been a debate about competition and privacy law because each of the principles of public life that they seek to preserve has been disrupted by the behaviour of companies and individuals as they explore and exploit the possibilities opened up by technology. Increasing concentration does not necessarily point to under-enforcement of existing rules, but it certainly indicates growing disparities between individuals and powerful companies. With many people living their lives almost always online, concentration of ownership of the infrastructure and the means of provision of essential services should worry all regulators and policy-makers, not only those working in the area of human rights. This concerns not only the companies exercising market power themselves, but also the entities that own those companies, where perhaps an even more remarkable degree of concentration has occurred: in those digital sectors where concentration is starkest, even the few competitors that exist have common owners.[22]

From a purely antitrust perspective, concentrations tend to lead to higher prices.[23] Where web-based services are offered at low or zero prices, it is logical to assume an equivalent effect on the actual price being extracted from customers, who have little ability to know or understand the nature of the transaction. This real price may take the form of additional personal data disclosure, limitations on freedom of expression or on the choice of content to consume, or limitations on the freedom to download and port elsewhere content that they have uploaded to platforms. All regulators need to understand the value of personal data in digital markets, especially where services are offered for ‘free’; in the EU, moreover, there is an implicit obligation in the Charter for government to foster privacy-enhancing technologies.[24] The issue is not data ownership but individuals’ control over what happens to data as a projection of their personalities. Enforcers need to take a longer-term perspective: in the jargon of data protection, it is time to weigh the potential impact of current trends on the dignity and freedom of individuals; for competition enforcers, it is time to assess the impact on aggregate consumer welfare of collusion, abuse of dominance and mergers in the big data space.

Part of a response: the Digital Clearing House

Competition law has a limited but powerful range of tools for addressing structural socio-economic problems. Some of these tools may be underutilised. For example, the notion of abuse of dominance through exploitation of consumers, which exists in EU law but not in the US, is only rarely tested or invoked in practice. Addressing market behaviour that may harm individuals is therefore left to the more recent and – until now – less harmonised and weaker regulatory arms of consumer protection and personal data protection.[25] Consequently, eyebrows are raised at the disparity between the highest sanctions applied for antitrust violations – running to billions of euros – and those applied for data protection violations – rarely reaching €1m. Violations of data protection rules other than data breaches – such as unlawful processing or the imposition of unfair or misleading terms and conditions – may be less visible but are probably equally harmful, yet have typically attracted even smaller sanctions.[26] These imbalances may lessen with the General Data Protection Regulation, which provides for sanctions of up to four per cent of annual global revenue for violations, as compared to a maximum of ten per cent for offences under competition rules. Commissioner Vestager has recently asserted her willingness to intervene in cases of excessive pricing under Article 102 of the Treaty on the Functioning of the European Union (TFEU).[27] Meanwhile, the German competition authority is currently investigating whether Facebook’s terms and conditions amount to an abuse of market power.[28]
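
To put the two ceilings side by side, the short calculation below (in the same illustrative Python style as above, with an entirely hypothetical turnover figure) compares the caps cited in the text for a firm with €10bn in annual worldwide revenue.

    # Hypothetical firm: EUR 10bn annual worldwide revenue; percentage caps as cited in the text.
    annual_global_revenue = 10_000_000_000

    gdpr_cap = 0.04 * annual_global_revenue          # GDPR: up to 4%  -> EUR 400m
    competition_cap = 0.10 * annual_global_revenue   # EU competition rules: up to 10% -> EUR 1bn

    print(f"GDPR ceiling:        EUR {gdpr_cap:,.0f}")
    print(f"Competition ceiling: EUR {competition_cap:,.0f}")

Even the lower of these ceilings would be hundreds of times higher than the largest data protection fines imposed to date.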

This fragmented approach indicates the need for a space for dialogue about general themes and specific cases, and for learning lessons from past actions. In 2016 the EDPS launched the Digital Clearing House. This network of willing agencies will share information and collaborate within the boundaries of their legal competences and the confidentiality of their respective investigatory activities. The aim is to combine the expertise of regulatory bodies to make their activities more effective, and to help avoid silo-thinking by considering, for example, joint guidance on assessing whether specific cases should be dealt with under one legal regime or another.[29] The Digital Clearing House will also be able to look at the long-term implications of proposed mergers for the fairness of data processing and their impact on fundamental rights.[30]

Conclusion

A decade ago, Commissioner Jones Harbour referred to the risks of a ‘database of intentions’ containing every individual’s ‘desires, needs, wants, and likes that can be discovered, subpoenaed, archived, tracked, and exploited to all sorts of ends’.[31] In December 2016, an opinion piece in The Economist remarked that ‘The most striking business trend today is not competition but consolidation.’[32] This juxtaposition of market, informational and algorithmic power is among the biggest challenges facing society. Our values in the EU imply the need for a common space on the web, free of tracking, where basic freedoms can be guaranteed; something that does not yet exist, and that the market alone is unlikely to deliver.

In the next decade, regulators cannot allow themselves to drift into a situation where an individual wants to exercise their right to the information held about them,[33] but a company cannot meet the request because the information is controlled by an algorithm. A person must always be accountable. Antitrust, just like data protection law, recognises this. The sustainability of our digital future may depend, in part, on how well the two can work together to achieve their common objectives.

About the author

Giovanni Buttarelli was appointed European Data Protection Supervisor (EDPS) on 4 December 2014 by a joint decision of the European Parliament and the Council for a term of five years.

Before his appointment as EDPS, he served as Secretary General of the Italian Data Protection Authority, a position he held between 1997 and 2009. A member of the Italian judiciary with the rank of Cassation judge, he has worked on many initiatives and committees on data protection and related issues at international level.

Buttarelli’s experience in data protection includes participation in many bodies at European Union level (including the Article 31 Committee under Directive 95/46/EC and TAIEX programmes) and at the Council of Europe (in particular, also as a consultant, the T-PD, CJ-PD, DH-S-Ac and the Venice Commission). He has also contributed to hearings, meetings and workshops as well as to books, journals and papers. He currently teaches privacy at Luiss University, Rome.



[1]     See for example, Flash Eurobarometer 443 on ePrivacy (December 2016); Special Eurobarometer 431 on Data Protection (June 2015); and Pew Research Panel Survey January 2014 on Public Perceptions of Privacy and Security in the Post-Snowden Era.

[2]     Data protection and privacy are related but distinct rights in the European Union. For a brief analysis, see, for example, EDPS Guidelines on data protection in EU financial services regulation (November 2014) 6–9.

[3]     EDPS Preliminary Opinion on Privacy and competitiveness in the age of big data: The interplay between data protection, competition law and consumer protection in the Digital Economy (March 2014); EDPS Opinion 8/2016, Coherent Enforcement of Fundamental Rights in the Age of Big Data. Other documents can be found at: https://secure.edps.europa.eu/EDPSWEB/edps/Consultation/big_data, accessed 13 January 2017.

[4]     In the Matter of Google/DoubleClick, FTC File No 071-0170, Dissenting Statement of Commissioner Pamela Jones Harbour (2007).

[5]     Source: www.statista.com/statistics/346269/whatsapp-annual-revenue, accessed 13 January 2017.

[6]     Letter From Jessica L Rich, Director of the Federal Trade Commission Bureau of Consumer Protection, to Erin Egan, Chief Privacy Officer, Facebook, and to Anne Hoge, General Counsel, WhatsApp Inc, 10 April 2014.

[7]     See, for example, comments from the head of the data protection authority of the German state of Schleswig-Holstein: Weichert, ‘Am schlimmsten ist die Kombination’ [‘The worst thing is the combination’], www.shz.de/regionales/schleswig-holstein/meldungen/weichert-am-schlimmsten-ist-die-kombination-id5788916.html, accessed 13 January 2017.

[8]     European Commission – Press release ‘Mergers: Commission alleges Facebook provided misleading information about WhatsApp takeover’ (20 December 2016).

[10]   OECD, Data-driven innovation: Big data for growth and well-being (2015) 23–25.

[11]   ‘October Smashes Merger Records as Companies Turn to Megadeals’ (31 October 2016) Bloomberg.

[14]   For a comprehensive list of literature on both sides of the debate, see Cyril Ritter, Bibliography of Materials Relevant to the Interaction of Competition Policy, Big Data and Personal Data (29 September 2016), available at SSRN: https://ssrn.com/abstract=2845590 or http://dx.doi.org/10.2139/ssrn.2845590.

[15]   C-238/05, Asnef-Equifax, Servicios de Información sobre Solvencia y Crédito, SL v Asociación de Usuarios de Servicios Bancarios (Ausbanc) [2006] ECR I-11125.

[16]   T-321/05, AstraZeneca AB and AstraZeneca plc v European Commission [2010] ECR II-02805, at para 845.

[17]   See 14-MC-02, Mesure conservatoire du 9 septembre 2014 relative à une demande de mesures conservatoires présentée par la société Direct Energie dans les secteurs du gaz et de l’électricité; Auditoraat Beslissing No BMA-2015-P/K-28-AUD van 22 september 2015, Zaken MEDEP/K-13/0012 en CONC-P/K-13/0013, Stanleybet Belgium NV/Stanley International Betting Ltd en Sagevas SA/World Football Association SPRL/Samenwerkende Nevenmaatschappij Belgische PMU SCRL t Nationale Loterij NV. For a discussion of these cases, see EDPS Opinion 8/2016, 8–9.

[18]   See, for instance, the speech by Commissioner Vestager, ‘Competition in a big data world’ (17 January 2016): ‘We continue to look carefully at this issue, but we haven’t found a competition problem yet. This certainly doesn’t mean we never will’; https://ec.europa.eu/commission/2014-2019/vestager/announcements/competition-big-data-world_en, accessed 13 January 2017.

[19]   See preliminary draft article by Veronica Marotta, Kaifu Zhang, and Alessandro Acquisti, ‘Who Benefits from Targeted Advertising?’ presented to FTC PrivacyCon 2017.

[20]   UN Human Rights Council, Resolution on ‘The promotion, protection and enjoyment of human rights on the Internet’ (2016).

[21]   See Ariel Ezrachi and Maurice E Stucke, ‘Artificial Intelligence & Collusion: When Computers Inhibit Competition’ (2015) Oxford Legal Studies Research Paper No 18/2015; University of Tennessee Legal Studies Research Paper No 267.

[22]   José Azar, Martin Schmalz and Isabel Tecu, ‘Anti-competitive effects of common ownership’ (July 2016) Ross School of Business Paper No 1235.

[23]   See M E Stucke and A P Grunes, Big Data and Competition Policy (OUP 2016) 223–224; ‘Too much of a good thing’ (26 March 2016) The Economist.

[24]   Von Hannover v Germany, Application No 59320/00, ECHR 2004; KU v Finland, Application No 2872/02, paras 43 and 48, ECHR 2008.

[25]   The European Union is in the process of a major reinforcement of data protection, privacy and consumer law: the General Data Protection Regulation was adopted in May 2016, and in 2017 the European Commission intends to propose reform of the rules on the confidentiality of communications (ePrivacy) and of the consumer law acquis.

[26]   To date, the highest fines applied in the EU for violations appear to be: €2.9bn under Article 101 (cartel behaviour) against a truck cartel in 2016; €1bn under Article 102 (abuse of dominance) against Intel in 2009; and £3m imposed by the Financial Services Authority – which had stronger sanctioning powers than the data protection authority – for a data breach committed by HSBC in 2009. For a violation solely of data protection rules, the highest fine appears to be €1m, imposed by the Italian DPA in the Google Street View case.

[27]   Speech by Commissioner Vestager, ‘Protecting consumers from exploitation’ 21 November 2016: http://ec.europa.eu/commission/2014-2019/vestager/announcements/protecting-consumers-exploitation_en, accessed 13 January 2017.

[28]   Press release about the German investigation into Facebook is available at: www.bundeskartellamt.de/SharedDocs/Meldung/EN/Pressemitteilungen/2016/02_03_2016_Facebook.html?nn=3591568.

[29]   The EDPS suggested five tasks for the Digital Clearing House: (1) discussing (but not allocating) the most appropriate legal regime for pursuing specific cases or complaints related to services online, especially for cross-border cases where there is a possible violation of more than one legal framework, and identifying potential coordinated actions or awareness initiatives at European level, which could stop or deter harmful practices; (2) using data protection and consumer protection standards to determine ‘theories of harm’ relevant to merger control cases and to cases of exploitative abuse as understood by competition law under Article 102 TFEU, with a view to developing guidance similar to what already exists for abusive exclusionary conduct; (3) discussing regulatory solutions for certain markets where personal data is a key input as an efficient alternative to legislation on digital markets, which might stifle innovation; (4) assessing the impact on the digital rights and interests of the individual of sanctions and remedies which are proposed to resolve specific cases; and (5) generally identifying synergies and fostering cooperation between enforcement bodies and their mutual understanding of the applicable legal frameworks, including through more informal and formal contact between the European Competition Network, the Consumer Protection Cooperation Network and the Article 29 Working Party (in 2018 to be replaced by the European Data Protection Board). EDPS Opinion 8/2016.

[30]   In an interview with Politico Pro Technology (reported 1 October 2016), the outgoing President of the Autorité de la Concurrence said of the effectiveness of regulatory scrutiny: ‘We do not spend enough time checking if consumers benefit from our solutions. When looking at the purchase of WhatsApp by Facebook, or LinkedIn by Microsoft or many other recent deals, you are dealing with fast-moving industries, and making a bet on what the future will look like in four or five years. Shouldn’t we spend time after to assess how it has worked and correct or adapt if the results are not as expected?’

[31]   See n4 above.

[32]   ‘Management theory is becoming a compendium of dead ideas’ (17 December 2016) The Economist.

[33]   General Data Protection Regulation Arts 13 and 14.