Data’s ever-increasing value

Tom Wicker
Thursday 4 February 2021

Interest in user data – and in protecting it – is greater than ever. Global Insight examines how data’s value is driving litigation, new regulatory approaches and innovation.

In September 2020, a father lodged a complaint in the United Kingdom High Court against social media platform YouTube’s parent company Google, alleging that the multinational company had collected children’s data without parental consent. In doing so, the claimant argues, the company has breached the UK’s Data Protection Act and the European Union’s General Data Protection Regulation (GDPR).

This is the first class action in Europe to be brought against a technology company on behalf of children, according to the claimant. The case isn’t expected to be heard until late 2021. Responding to the case at the time, a YouTube spokesperson said: ‘We don’t comment on pending litigation.’ YouTube has since confirmed to Global Insight that it has no further comment.

Globally, concerns about data privacy abound. For example, the Indian government banned TikTok – an app that’s popular with young people – in June 2020, alongside numerous other apps owned by Chinese companies. The government claims these apps pose a threat to India’s security and sovereignty.

In reaction to the ban, the companies behind the affected apps – including ByteDance, which owns TikTok – insisted they’d complied with Indian law. ByteDance did not respond to Global Insight’s own requests for comment.

In October 2020, the UK Information Commissioner’s Office (ICO) issued a £20m fine – its largest ever – against British Airways following a 2018 data breach in which the personal and financial details of 400,000 customers were compromised.

Data as a critical asset

So why the increased global interest in data privacy – and in lawsuits to protect it? For Adam Rose, a commercial and data protection partner at UK-based Mishcon de Reya, there’s a twofold answer. ‘One, people are more aware of their rights now,’ he says. ‘But, also, the more data gets collected, the more it has a tendency to upset people who feel affronted by stuff that simply wasn’t possible 20 years ago.’

The collection of personal data by companies and governments didn’t suddenly start in the past decade. However, ‘hyper-connectivity and the consolidation of mobile devices as the main gateway to the internet [have led to a situation] where knowing more about a user has become faster, easier and less expensive,’ says Albert Agustinoy, Chair of the IBA Disputes and Rights Subcommittee and a partner at Spanish firm Cuatrecasas.

For those doing business at the digital frontiers of our ever-growing online landscape, personal data has become one of the most valuable commodities around. ‘It’s one of the critical assets in our digital economy, in the same manner that coal was crucial for the industrial revolution back in the nineteenth century and petrol was vital for the economic development of the world during the twentieth century,’ says Agustinoy.

In 2019, the UK Competition and Markets Authority (CMA) reported that, in 2018, Google accounted for more than 90 per cent of all revenues earned from search advertising in the UK, at around £6bn. In the same year, Facebook accounted for almost half of all display advertising revenues in the UK, at more than £2bn.

The CMA expressed concerns that sites like Facebook, which lacked an ‘opt-out’ option, were forcing people to share considerable amounts of personal data as a condition of use. Facebook did not provide comment in response to Global Insight’s request, although a spokesperson at the time said, ‘Giving people meaningful controls over how their data is collected and used is important, which is why we have introduced industry leading tools for people to control how their data is used to inform the ads they see’.

‘Most consumers of social media platforms still believe that services provided by these platforms are free,’ says Sajai Singh, Chair of the IBA Technology Law Committee and a partner at Indian firm J Sagar Associates. However, he emphasises, ‘the services are being paid for – there is an ostensible barter exchange. Crudely put, the barter is between the consumer’s personal data and the services of the platform.’

Personal data can help companies accurately target consumers, or it can be sold to third parties; it can be bought on the dark web; if it is hacked, it can be used for extortion; and it can play a critical role in global politics. The banning of Chinese social media apps and the United States’ sanctions against Chinese telecoms giant Huawei both combine national security concerns with ongoing trade wars.

The details we share with the digital platforms the public uses daily, from social media to health services, are subject to multiple uses and, sometimes, abuses.


Companies with the largest data sets are becoming more powerful than certain states

Elisa Henry
Publications Officer, IBA Technology Law Committee


Elisa Henry is Publications Officer of the IBA Technology Law Committee and a partner at Quebec-headquartered law firm Borden Ladner Gervais. ‘Companies with the largest data sets are becoming more powerful than certain states,’ she says. The ‘shift of power’ created by data ownership and control – particularly when it’s unclear how the data will be used – ‘can create threats to democracy, which is why there needs to be regulation, but also guidance on the ethics of its processing’.

Sharpening regulators’ teeth

Historically, the biggest challenge facing national data protection agencies (DPAs) seeking to rein in tech companies has been their lack of ‘teeth’. They haven’t had the ability to impose the fines that might make major multinationals pause. Henry points to Canadian regulators, who ‘could go after a big company from time to time and make a noise,’ but who realised that ‘the impact of their efforts was very limited without enforcement powers’.

In 2019, the Privacy Commissioner of Canada, Daniel Therrien, and the Information and Privacy Commissioner of British Columbia, Michael McEvoy, reported, following an investigation, that Facebook had seriously contravened Canadian privacy laws when the personal data of over 87 million users globally – including 622,000 Canadians – was leaked and leveraged in the Cambridge Analytica scandal.

Facebook’s ‘vague terms were so elastic that they were not meaningful for privacy protection,’ said Therrien at the time. But while it acknowledged ‘a major breach of trust’ when the scandal broke, Facebook disputed the findings of the Canadian watchdog’s report and refused to implement recommendations to address those issues. Canadian regulators have lacked the enforcement muscle of their counterparts in, for example, Europe.

In June 2020, the Québec government presented Bill 64, which, if enacted, would modernise legislative provisions for the protection of personal information. Significantly, it would greatly increase the fines that could be levied against private and public sector entities that fail to comply with the province’s privacy legislation.

The best-known development is the EU’s GDPR, which came into effect in 2018. It is intended to give individuals control over their personal information within the EU and European Economic Area, to simplify the regulatory framework for businesses operating in the EU and to enable EU Member States to harmonise their respective approaches.

TikTok, an app popular with young people, was banned by the Indian government in June 2020. Wankaner, India, 2020. Shutterstock.com/ shiv.mer

The GDPR has also significantly upped the ante when it comes to fines. In early October 2020, the Hamburg Commissioner for Data Protection and Freedom of Information fined clothing store H&M €35.2m for breaching the GDPR by logging personal details to create profiles of staff in its Nuremberg service centre.

The fine handed to H&M represents the largest GDPR penalty since France’s data protection authority fined Google €50m in January 2019.

Commentators have differing views on the GDPR’s effectiveness. Erik Valgaeren, a partner at Benelux law firm Stibbe, highlights its ‘impressive sanctioning and fining system, which can now go up to two per cent of a company’s worldwide turnover and – in certain cases where “bad faith” is established – four per cent’. This brings the GDPR ‘a little bit on par with what the European Commission can do in terms of sanctioning in competition laws’.
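To illustrate the scale of the ceilings Valgaeren describes, the sketch below computes the headline GDPR maximums. Under Article 83, each tier caps fines at the higher of a fixed euro amount or a percentage of worldwide annual turnover (€10m/two per cent for the lower tier, €20m/four per cent for the upper tier). This is an illustration of the published caps only, not a prediction of any actual fine, and the example turnover figure is hypothetical.

```python
# Illustrative sketch of the GDPR's two fine tiers (Article 83):
# each tier is capped at the HIGHER of a fixed euro floor or a
# percentage of worldwide annual turnover.

def gdpr_fine_ceiling(worldwide_turnover_eur: float, upper_tier: bool = False) -> float:
    """Return the maximum possible fine in euros for a given turnover."""
    pct, fixed = (0.04, 20_000_000) if upper_tier else (0.02, 10_000_000)
    return max(pct * worldwide_turnover_eur, fixed)

# A hypothetical company with €5bn worldwide turnover:
print(gdpr_fine_ceiling(5_000_000_000))                    # lower tier: €100m
print(gdpr_fine_ceiling(5_000_000_000, upper_tier=True))   # upper tier: €200m

# For a smaller company, the fixed floor dominates:
print(gdpr_fine_ceiling(100_000_000))                      # €10m, not 2% (€2m)
```

The ‘whichever is higher’ rule is what gives the regime teeth against both global giants (the percentage bites) and smaller firms (the fixed floor bites).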


Valgaeren also notes the increasing size of GDPR-related fines. ‘There’s more sanctioning, more heat and more attention from the regulators,’ he says. ‘We are seeing European data agencies starting to use sanctions because it’s a way of naming and shaming. It’s a way of drawing a line. And I guess, and even hope, it’ll work. Typically, companies don’t like such bad press.’

But Valgaeren acknowledges the clout of major tech companies. Mishcon de Reya’s Rose says that ‘the reality’ behind the ‘big story with GDPR’ – the high level of fines possible – is that there still haven’t been that many issued so far.

In the UK, two sizeable fines that are – as of early November 2020 – currently threatened by the Information Commissioner’s Office will, he thinks, ‘come to nothing, or very little’.

Regarding the GDPR, Rose thought ‘there would be an appetite in the UK to want to issue big fines, to show Europe we’re really serious about this,’ he says. ‘In fact, I think the ICO is under-resourced.’

Bringing enforcement actions is time-consuming and expensive. ‘If you’re a multi-billion-pound company with no limit to your resources, it’s worth spending £10m on lawyers to reduce a £100m fine. That’s a total bargain.’

Differing approaches

While GDPR seeks to harmonise the approach taken by states and companies in the EU, other jurisdictions have differing laws around – and levels of – data protection. For example, the 2010s ‘marked the beginning of privacy being taken seriously in Asia,’ says Singh.

Data protection is rapidly gaining traction there, he suggests, as awareness of terms like ‘consent’ and ‘informed choice’ grows. However, India will gain a DPA only once 2019’s Personal Data Protection Bill becomes law.

Paula Barrett, who co-leads the Global Cybersecurity and Privacy Practice at Eversheds Sutherland, highlights that the ongoing Covid-19 pandemic has even ‘shone a light on the differences within Europe of interpretation of the same [GDPR] regulation – particularly when it came to testing’.

Some countries allowed checks of staff health to be seen as compliant with an employer’s duty of care, for example, while Barrett says ‘others simply said no’.

Differing approaches limit the universal applicability of data protection enforcement and practice. Nonetheless, the transnational flow of personal data makes achieving some international consensus important. However, several legal experts in this field identify some fundamental cultural differences between Europe and the US – two of the biggest actors on this stage.

Rose believes that the traumatic 20th-century consequences of ‘being on the wrong list’ for people in mainland European countries caught up in war or occupation have yielded a more deep-seated, conservative attitude towards personal data rules than in the US. ‘America has never set out those rules,’ he says. ‘Or, with the California Consumer Privacy Act, has only just started to look at this.’

Divergence ahead

The most conspicuous example of this schism is the European Court of Justice’s landmark ruling in July 2020 that the EU-US Privacy Shield – one of the most commonly used data-sharing mechanisms across the Atlantic – was invalid because it did not afford EU citizens the same data privacy protections in the US as in Europe.

‘More than 5,000 companies had registered under this regime,’ says Valgaeren. ‘Since the use of Standard Contractual Clauses has been tightened, we now have a problem in terms of instruments allowing international data transfers, in particular between the EU and the US.’

Against this backdrop, one question that interests observers is what the UK will do after Brexit. As the country seeks to forge new trade relationships following its departure from the EU, will it find itself at a crossroads as to which approach to data privacy it will adopt? Currently, the UK Data Protection Act 2018 – under which the High Court case against YouTube has partly been brought – mirrors key aspects of the GDPR to ensure continuity.

However, the Data Protection Act is designed to allow interpretation. Rose perceives ‘a risk’ that those in government most keen on regaining British sovereignty may wish to move away from stricter, GDPR-style European privacy laws. The idea would be to increase the country’s attractiveness to US tech companies like Facebook and Google, by ‘restructuring things in a way that takes advantage of American-style data protection law’.

Personally, though, Barrett doesn’t see it as ‘something where there will be radical change’. She thinks ‘it would be difficult politically – internally as well as externally’.

Further, Barrett finds it interesting that ‘data has appeared at such a high level in these important trade discussions between the UK and the EU’. She believes there’s ‘an understanding that our ability to trade is heavily intertwined with how we treat and share data’.

A relaxing of data laws, with an emphasis on self-regulation over government oversight, might even deter some companies from basing themselves in the UK. ‘Ironically, it could prejudice trade with the EU,’ says Barrett. ‘If they’re going to take a look and say “well, we can’t trust you with our data,” that’s not a good outcome for our tech industry.’

She adds that the UK’s ICO is ‘a strong regulator in this space’. When it comes to data privacy, ‘it’s at the forefront from a European standpoint’.

Future challenges – and litigation

What happens after Brexit is just one of the many issues people are wrestling with in the ever-developing landscape of data privacy regulation and, increasingly, litigation. Court cases concerning database breaches are relatively straightforward compared with lawsuits that seek to argue that the ways social media sites and internet companies collect, use and share personal data are intrinsically harmful.

‘The more data that organisations capture and the more they do with it, the more likely it is that they are going to be in breach of some provisions of the GDPR,’ says Rose. This, in turn, makes litigation more likely. However, he continues, ‘the test is if “I” can recover damages that are essentially compensated damages, where it’s hard to show loss or distress suffered’.


The more data that organisations capture and the more they do with it, the more likely it is that they are going to be in breach of some provisions of the GDPR

Adam Rose
Partner, Mishcon de Reya


How to define ‘consent’ is a thorny issue in our all-encompassing digital world. In lieu of a payment to a social media site such as YouTube, is a user’s personal data a quid pro quo for a free service? What if that user has unwittingly provided that data in some way?

‘These are really interesting questions,’ says Rose. As a lawyer, ‘who can never say something is one hundred percent set,’ he is ‘one hundred percent certain’ we will see many more cases seeking to litigate the answers.

And we are also only just beginning to grapple with the impact of social media algorithms and artificial intelligence (AI) on user privacy. Such technologies ‘read’ our website clicks and page likes, enabling service providers to shape our online experience. They can provide an insight into our preferences as personal – and as commercially valuable – as any data we share via a sign-up page.

Søren Skibsted is Immediate Past Chair of the IBA Technology Law Committee and heads Danish firm Kromann Reumert’s Technology and Outsourcing Group. He says that the advent of AI is the ‘technological development that has enabled us to make much better and more effective use of personal data for commercial purposes’. This is ‘probably one of the main value drivers of a company today, almost regardless of industry’.

For Agustinoy at Cuatrecasas, as data is such ‘a critical asset for companies,’ logically, its increased use ‘leads to a constant clash between commercial exploitation and a demanding framework as defined by GDPR’.


Data’s use leads to a constant clash between commercial exploitation and a demanding framework as defined by GDPR

Albert Agustinoy
Chair, IBA Disputes and Rights Subcommittee


But as tech companies keep testing the limits of obtaining and using people’s data, data protection legislation such as the GDPR ‘necessarily adopts an abstract approach that [repeatedly] requires interpretation and decisions that may be in a grey zone,’ says Agustinoy.

Looking to the future, one of the major challenges – in terms of legislation and litigation – will be how to reconcile new types of technology with existing regulatory frameworks. Skibsted believes we need ‘clear and simple legislation’ that doesn’t become overly caught up in differing details and interpretations.

He cautions against losing sight of the wood for the trees. ‘A place to start might be what are not key privacy considerations?’ he suggests.

In September 2020, the European Commission unveiled a new draft regulation, the Digital Operational Resilience Act (DORA), for the financial services sector. Among its package of digital-related controls, ‘there’s another regime that looks, in essence, at yet more security controls,’ says Barrett. ‘It’s reaching into direct responsibility for the vendors, so adding a broader layer.’

While the DORA is at present only a draft and apparently partially aimed at harmonising existing legislation, ‘there’s a lot you can look at and ask: do we really need that?’ says Barrett.

As new situations arise, Barrett cautions against the instinct to reach immediately for new regulations, arguing that they are not always needed.

She cites news stories around the use of facial recognition technology that argue for greater controls. ‘Actually, if you look at the existing framework, it has a lot of controls,’ says Barrett. ‘It’s more about making people aware of those controls and how they apply.’

Agustinoy says we shouldn’t be surprised to see increased litigation in this area ‘shaping the interpretation of data protection regulations’. By its nature, he says, ‘law is reactive to the reality it is called to regulate’.

He hopes that, combined with higher standards of protection adopted by social media operators in response to intensifying scrutiny, we will see ‘the progressive shaping of data protection compliance’.

Ideally, people’s data would be safeguarded without the benefits of technological innovation being stifled and without data protection regulations becoming buried in complication or in competing regional priorities and politics. As Singh sums up when discussing Asia’s progress, ‘if the past decade was about asking questions – and, more importantly, the right questions – one may hope that the next would focus on getting the right answers’.

Tom Wicker is a freelance journalist and can be contacted at tomw@tomwicker.org