Social media: AI makes it tougher to hold influencers to account

Sophie Cameron
Wednesday 19 July 2023

Social media influencers are finding themselves in the sights of legislators and regulators around the world, who are keen to curb what they perceive to be the excesses of the industry. Recent developments include the adoption in early June of a new law in France, marking the first time the role of an influencer has been legally defined in Europe, and new guidance from the UK’s Advertising Standards Authority (ASA), published in the spring.

The ASA defines an influencer as someone who’s active on a social media platform and who gives their opinions on products or services to their followers. Influencers use that reach to collaborate with brands, promote products or services and enter sponsored content partnerships, and so have the ability to affect the opinions, behaviours and purchasing decisions of their followers. The ASA sets no minimum number of followers for someone to be deemed an influencer. The regulator also notes that an influencer needn’t be human but could be an animal or, crucially in an age where artificial intelligence (AI) is a pressing concern for society, a virtually produced persona.

The ASA’s rules also include guidance around deceptive practices and false claims, offensive material, intellectual property rights and anti-competitive conduct.

France’s new law aims to regulate social media influencers and increase accountability within the industry. It’s designed to target a highly visible minority of influencers who are said to be involved in abusive practices and/or scams. Clément Monnet, counsel at Norton Rose Fulbright in Paris, welcomes the new law. ‘The fact that it actually grants a unique legal status to influencers and legally recognises the existence of this marketing trend that had so far been ignored by the legislator, even though influencer marketing now impacts the lives of a massive number of consumers on a daily basis, makes the adoption of this law a significant legislative event in France’, explains Monnet.

Specifically, the law aims to combat the rise of certain abuses and scams, such as influencers encouraging people to follow dangerous diets, undergo cosmetic surgery and become involved in excessive gambling, as well as the promotion of counterfeit goods. The law provides different penalties depending on the infringement and its severity. In the majority of cases, a violation of the rules is punishable by two years’ imprisonment and a €300,000 fine, which may be supplemented by a permanent or temporary ban from the profession. The new regulations also strengthen the powers of France’s Directorate-General for Competition, Consumer Affairs and Fraud Prevention (DGCCRF), which now has the ability to impose fines and formal notices on influencers. 

In an interview about the new law in early July, Sarah Lacoche, President of the DGCCRF, highlighted the €3.3m fine imposed on Amazon in December for alleged delays in reviewing unbalanced contractual conditions imposed on professionals selling products through its platform. A statement released by Amazon at the time said that the company disputed the findings and would appeal. Raphaël Dana, Vice-Chair of the IBA Internet Business Subcommittee and a partner at Dana Associés in Paris, surmises that Lacoche ‘indicates that in the near future, influencers that do not act fast enough concerning compliance with the new law or further to a summons from the regulator will face similar sanctions based on the speed in which they comply’. Daniela De Pasquale, Chair of the IBA User Generated Content Subcommittee and a partner at Ughi e Nunziante in Milan, adds that ‘such a possibility is also expressly provided for in Article 13 of the law’.

Olivia O’Kane, a partner and specialist media lawyer at DWF (Northern Ireland), explains that there are instances where influencers unintentionally disseminate misleading content, for example by omitting key information or exaggerating claims. Even where a claim is factually accurate, the results – of, say, using a specific product – might only be achievable in certain circumstances, and failing to explain those circumstances could make the ad misleading.

Where influencers have been found to spread misinformation, make intentional false claims, or engage in deceptive practices, the consequences can be serious. ‘These issues are challenging and the dissemination of inaccurate information that is misleading or deceptive can fall within a regulatory framework but has also been litigated in the courts as alleged breaches of consumer protection laws and oftentimes the platform as well as the influencer can be targets of litigation’, says O’Kane. ‘Legislators and regulators are exploring more efficient ways to hold influencers accountable for misleading content and to ensure that the information shared is accurate and reliable.’

The EU has adopted the Digital Markets Act and the Digital Services Act, two pieces of legislation that will change the online landscape. The latter, which will become fully applicable in February 2024, aims to modernise the EU’s Electronic Commerce Directive by addressing illegal content, transparent advertising and disinformation.

In Australia, Angela Flannery, Vice-Chair of the IBA Communications Law Committee and a partner at Quay Law Partners in Sydney, explains that while a range of generally applicable laws already apply to influencers, ‘there are some gaps’. The Australian government is considering new proposals that will impact the sector. For example, in June it released a consultation on proposed legislation to give new powers to the country’s Communications and Media Authority to regulate misinformation and disinformation.

In the UK, Brinsley Dresden, a partner at Lewis Silkin in London, explains that the main challenge is the lack of enforcement of existing laws, particularly the Consumer Protection from Unfair Trading Regulations 2008. However, changes to the law, expected later this year through the enactment of the Digital Markets, Competition and Consumers Bill, will mean that the UK’s Competition and Markets Authority will be able to impose fines directly without having to go to court.

Dresden says that new technologies could create further challenges for regulators in this space. ‘On the one hand, regulators like the ASA are using machine-learning technologies to ramp up their enforcement activity’, he says. ‘On the other hand, we are seeing AI and deep fakes being used to create fraudulent imitations of individuals online and so AI could be used to create fake imitations of influencers, with activity that is not compliant. This will create challenges for regulators and influencers.’

Image credit: sitthiphong/AdobeStock.com