The stacked agenda for data protection

In-house counsel handling privacy and data protection matters are grappling with a raft of legislative changes, court judgments and new guidelines. In-House Perspective takes stock of the privacy landscape.
The UK rings the changes
The Data (Use and Access) Act 2025 (DUAA) received Royal Assent in the UK in June and brings notable revisions to the country’s privacy framework. The DUAA introduces amendments to the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA) and the Privacy and Electronic Communications Regulations. Its provisions are expected to take effect in stages – two, six and 12 months after the date the Act entered into force – which means the onus is now on companies to ensure they’re compliant with the new requirements.
Rather than a major overhaul, many of the changes introduced by the DUAA are aimed at simplifying compliance for businesses. Nonetheless, organisations will need to review their records of processing activities, their direct marketing practices and their use of cookies to address the relevant changes. Among other things, the legislation introduces changes to the meaning of ‘research’ and ‘statistical purposes’ in the context of processing, as well as to the rights of data subjects, to automated decision-making and to transfers of personal data to third countries.
Of particular note, the DUAA introduces ‘recognised legitimate interests’ as a new lawful basis for data processing. This means certain security-related activities, such as fraud prevention, public safety and national security, can be considered legitimate interests by default, potentially without requiring a legitimate interests assessment.
One of the most notable changes in the context of innovation, meanwhile, is the relaxation of rules around the use of personal data for automated decision-making. The DUAA applies a more permissive framework in this area, subject to the adoption of safeguards. Specifically, these are that the data subject has been given information about the decisions that’ll be made; is able to make representations about those decisions; can obtain human intervention in relation to the decision; and can contest the decision. This more lenient approach to automated decision-making will be beneficial for businesses implementing AI systems that incorporate such features.
Abhijit Mukhopadhyay, an officer of the IBA Corporate Counsel Forum, explains that ‘organisations are currently grappling with the […] challenges presented by the rapid integration of AI, while ensuring compliance with privacy standards, including automated decision-making, data minimisation and transparency. Additionally, managing international data transfers post-Brexit and aligning with UK-specific regulations are critical priorities’.
‘While the DUAA makes strides in enabling innovation, particularly for AI and research, there are concerns that it may dilute individual rights, especially around transparency and consent,’ says Mukhopadhyay, who’s President (Legal) & General Counsel at the Hinduja Group. ‘The risk is that in favouring business efficiency, public trust in data usage could be undermined.’ There are also concerns that the new rules don’t address ambiguity regarding accountability for biased outcomes or explainability in regard to decisions made by AI systems.
“While the UK’s Data (Use and Access) Act 2025 makes strides in enabling innovation, there are concerns that it may dilute individual rights, especially around transparency and consent
Abhijit Mukhopadhyay
Officer, IBA Corporate Counsel Forum
James Castro-Edwards, counsel at Arnold & Porter in London, believes that the DUAA, on the whole, is ‘generally favourable’ for UK businesses, as it relaxes some rules. He highlights, however, that it aligns the fines under the Privacy and Electronic Communications Regulations – previously set at a maximum of £500,000 – with those of the UK GDPR, under which the maximum penalty is the greater of four per cent of annual turnover or £17.5m. ‘This will escalate businesses’ need to ensure that their direct marketing activities are compliant,’ says Castro-Edwards.
The UK is also awaiting a final decision by the European Commission on the renewal of its adequacy finding, which enables the continuation of the free flow of personal data from the EU to the UK post-Brexit. In making this assessment, the EU is concerned with the balance achieved by the DUAA between promoting innovation and preserving the rights of individuals.
The adequacy finding was due to expire on 27 June, but following the UK’s adoption of the DUAA, the European Commission extended the deadline for its decision in order to assess the UK’s level of data protection in the context of the new legislation. In July, the European Commission issued a draft adequacy decision, finding that the UK ensures a level of protection for personal data that’s essentially equivalent to the EU’s. This decision will need to be approved by the European Data Protection Board and the European Parliament before it’s adopted.
European convergence
Within the EU, privacy professionals point to the increasing convergence of data protection law with new legislative frameworks, particularly the bloc’s AI Act and the Data Act. ‘Both instruments substantially affect how personal data is processed, shared and protected,’ says Roland Marko, a partner at Wolf Theiss in Vienna. ‘For legal counsel, one of the major challenges now is reconciling obligations across these intersecting legal regimes, as their underlying rationales and operational logic are not always easily aligned.’
In addition, cybersecurity legislation is increasingly shaping the data protection landscape, with the EU’s revised Network and Information Systems Security Directive (NIS2) and its Digital Operational Resilience Act both expanding requirements for businesses in respect of data security standards, which will, in turn, reinforce personal data protection.
In-house counsel may welcome the guidance being issued by national data protection authorities on AI. For example, the German Federal Commissioner for Data Protection and Freedom of Information published updated guidance on technical and organisational measures for AI systems in June and the French Data Protection Authority finalised its recommendations on the development of AI systems in July. Such guidance is particularly helpful where clarity may still be lacking in terms of bloc-wide AI regulation.
The European Commission has, however, published guidelines to assist providers of general-purpose AI models in meeting the AI Act’s obligations, which took effect in early August. From that date, providers must comply with transparency and copyright obligations when placing general-purpose AI models on the EU market.
The increased focus on AI regulation has brought data protection issues into sharper focus, explains Marko. ‘AI systems often rely on large-scale personal data processing, including profiling and inference-making. This raises significant challenges around purpose limitation, transparency, fairness and data minimisation, which are core GDPR principles,’ he says. ‘Companies are responding in different ways. Larger players tend to establish multidisciplinary teams to manage risks, while smaller actors may rely more heavily on external expertise. In any case, legal teams are under pressure to ensure that AI deployments comply not just with the AI Act but also with the GDPR, which isn’t always a straightforward exercise.’
“Legal teams are under pressure to ensure that AI deployments comply not just with the EU AI Act but also with the General Data Protection Regulation, which isn’t always a straightforward exercise
Roland Marko
Partner, Wolf Theiss
‘My sense is that a lot of companies are not yet truly grappling with AI,’ says Kelly Hagedorn, a partner at Alston & Bird in London. ‘Many will have adopted Co-Pilot, a version of ChatGPT, or some other workplace assistant software to improve productivity. We see uptake of HR-related tools [at] a number of clients, and these of course come with challenges. However, the real explosion of AI development and adoption within the workplace is yet to come.’
Simplification and clarification
In May, the European Commission proposed a set of simplification measures aimed at increasing the competitiveness of EU businesses and saving them an additional €400m in administrative costs per year. One of the proposals involves simplifying the risk-based record-keeping obligations under the GDPR for small and medium-sized enterprises (SMEs), as well as for a new category of companies, small mid-caps (SMCs). The latter are defined by the Commission as companies with fewer than 750 employees and either up to €150m in turnover or up to €129m in total assets. The European Commission’s proposal, then, would mean that these organisations are only required to maintain records when the processing of personal data is deemed to be ‘high risk’, according to the definition outlined in Article 35 of the GDPR.
Other proposed amendments include taking into account the specific needs of SMEs when developing codes of conduct and when granting data protection certifications.
The Council of the European Union and the European Parliament reached a provisional agreement in June on the Commission’s proposal laying down additional procedural rules relating to GDPR enforcement. This new regulation would seek to improve cooperation between national data protection authorities, particularly with regard to enforcement in cross-border cases. The key innovation is the harmonisation of admissibility checks for cross-border complaints. In addition, affected companies or organisations will be given the right to a fair hearing in regard to the relevant procedural steps.
According to legal commentators, Europe is also seeing an increase in data subject access and other privacy-related requests, which is placing additional pressure on businesses. And adding to the considerations for professionals, there have also been several recent rulings from the Court of Justice of the European Union (CJEU) on privacy matters.
In February the CJEU, in CK v Dun & Bradstreet, clarified that data subjects have the right to receive meaningful information about automated decision-making in a concise and easily accessible form, ensuring transparency and enabling them to challenge such decisions.
Meanwhile, the CJEU’s January ruling in Bindl v European Commission represents the first time an EU court has awarded non-material damages for a violation of the bloc’s data transfer rules. In this case, the European Commission was ordered to pay €400 in damages to an individual whose data had been transferred to the US unlawfully without adequate protections.
Matthias Orthwein, a partner at SKW Schwarz in Munich, highlights that following several prominent rulings in support of claims for non-material damages after violations of GDPR requirements, there has been a significant increase in litigation for damages by data subjects. ‘The increasing number of inquiries from data subjects and claims for non-material damages due to data incidents or failures to meet the GDPR requirements are very relevant for businesses,’ he explains.
‘Especially in cases where large amounts of data from a large number of individuals are processed, the sums of potential claims for damages can quickly skyrocket due to the sheer volume,’ adds Hannah Mugler, a counsel also at SKW Schwarz. ‘Customers care more and more about finding businesses that they can trust with their personal data, which is also reflected in the increasing amount of legal requirements concerning cybersecurity. This is why compliance with data security requirements has also become a crucial task for businesses.’
What to watch
Mukhopadhyay says the key reforms he’ll be monitoring are the development of the UK’s AI regulatory framework, which is expected to take shape following the enactment of the DUAA, as well as the EU’s enforcement of the AI Act beginning in 2026. He’ll also be looking at potential new adequacy decisions or challenges from the EU as well as pending updates to the global data transfer rules following the CJEU’s ruling in Schrems II and under new US–EU frameworks.
Marko will be closely monitoring the impact of the EU’s Data Act and the AI Act, as both will have systemic effects on how personal data is handled in the bloc. The interface between these regimes and the GDPR is already prompting legal uncertainty, and it’s anticipated that practical harmonisation efforts will require significant attention in the near future, he says. Orthwein believes that going forward, the increase in regulatory and legislative developments involving data security requirements will certainly raise the bar for handling personal data. These requirements will define the way data should be handled safely and, thus, in a compliant manner.
Another significant area to watch concerns the low thresholds for damages as interpreted by the CJEU in UI v Österreichische Post, combined with the procedural tools available under the EU Collective Redress Directive, which may create a foundation for future class actions based on GDPR violations throughout Europe. Privacy experts are predicting that litigation in this area will increase over time.
‘The global data protection landscape is becoming more complex, as more countries adopt data protection legislation and appoint data protection authorities,’ explains Castro-Edwards. ‘In addition, privacy rights groups, such as NOYB [founded by privacy campaigner Max Schrems], are bringing actions on behalf of affected individuals. NOYB should not be underestimated, having been instrumental in the annulment of the US Safe Harbor in 2016 and the US Privacy Shield in 2020.’
“The global data protection landscape is becoming more complex, as more countries adopt legislation and appoint [relevant] authorities
James Castro-Edwards
Counsel, Arnold & Porter
Sophie Cameron is a freelance journalist and can be contacted at sophiecameron2@googlemail.com