Trending now
An international guide to social media regulations
Social media began its rise to popularity in the 1980s and 1990s with the adoption of message boards, email and online messaging. In the early 2000s, networking sites like MySpace came onto the scene, offering users a new and interactive way to communicate electronically. In the late 2000s, Facebook began to eclipse these platforms and quickly became one of the most utilized platforms in the world. As of 2022, 59% of the world’s population, or 4.7 billion people, were social media users. That number is projected to increase to almost 6 billion by 2027.*
Contents
Social media activity and usage globally
Current laws and regulations relevant to social media by jurisdiction
Legislative developments on the horizon
Data protection laws applicable to social media
Contributors
* Statista - "Number of social media users worldwide from 2017 to 2027"
2023 Edition
Social media use by employees, customers and competitors is inevitable for businesses in this era. As with any new technology that reaches global adoption, businesses must navigate a shifting legal and ethical landscape. This guide aims to provide an overview of some of the relevant laws, regulations and other considerations for companies operating in many regions around the globe. Jurisdictions included in this guide are Australia, France, Hong Kong, Ireland, Spain, the Netherlands, the United Arab Emirates (UAE), the United Kingdom and the United States. The European Union (EU) is also included given that EU laws will be applicable in each of the EU member states. While this guide addresses many of the key laws governing social media platforms, it does not cover every law that may impact them and should not be considered a substitute for legal advice. For example, an issue of particular importance to social media companies and businesses engaging with social media is the treatment and targeting of children or youth on or through these platforms. This is a particularly intricate area of law, with approaches that differ across jurisdictions and even within a single jurisdiction depending on the legal question. This topic is not covered in this guide in any detail, but we are happy to provide further specific advice on this topic if required.
Note: This guide has been updated as of September 2023
Platforms ranked by usage in 2022/2023 (Source): — YouTube — Facebook — WhatsApp — Instagram — Snapchat
France
An estimated 81% of the population was active on social media in 2022/2023 (Source).
Platforms ranked by usage in 2022/2023 (Source): — WhatsApp — Facebook — Instagram — WeChat — Facebook Messenger
An estimated 90% of the population was active on social media in 2022/2023 (Source).
Hong Kong
Platforms ranked by usage in 2022/2023 (Source): — Facebook — Instagram — YouTube — TikTok — Snapchat
An estimated 87% of the population was active on social media in 2022/2023 (Source).
Ireland
Platforms ranked by usage in 2022/2023 (Source): — WhatsApp — Facebook — Instagram — YouTube — Twitter
An estimated 85% of the population was active on social media in 2022/2023 (Source).
Spain
Platforms ranked by usage in 2022/2023 (Source): — Facebook — YouTube — Instagram — TikTok — LinkedIn
An estimated 85% of the population was active on social media in 2022/2023 (Source).
The Netherlands
Platforms ranked by usage in 2022/2023: — WhatsApp — Facebook — Instagram — TikTok — Facebook Messenger
The UAE was home to 10 million social media users in 2022/2023, which equates to an estimated 105% of the population (note that social media accounts may not represent unique individuals) (Source).
UAE
United Kingdom
United States
Platforms ranked by usage in 2022/2023: — WhatsApp — Facebook — Instagram — Facebook Messenger — Twitter
An estimated 91% of the population was active on social media in 2022/2023 (Source).
Platforms ranked by usage in 2022/2023 (Source): — Facebook — YouTube — Instagram — TikTok — Snapchat
An estimated 92% of the population was active on social media in 2022/2023 (Source).
Below are some of the key laws, regulations and codes across the jurisdictions that will be pertinent to social media companies, but excluding data protection laws. We have a separate dedicated section for data protection considerations.
European Union
Australia
The Online Safety Act 2021 (the Online Safety Act). The Online Safety Act, which includes a Social Media Services Code that will take effect in December 2023, is the primary legislation under which the eSafety Commissioner exercises its powers to protect consumers in Australia – both adults and children – across most online platforms and forums, where they are susceptible to abuse or exposure to harmful content. Among other things, it provides for (i) a Cyberbullying Scheme, which is a complaints system for cyberbullying material targeted at a child; (ii) an Adult Cyber Abuse Scheme, which is a complaints system for cyber-abuse material targeted at an adult; (iii) an Image-Based Abuse Scheme, which is a complaints and objections system for non-consensual sharing of intimate images; and (iv) an Online Content Scheme, which regulates illegal and restricted content wherever it is hosted. Further, it sets out the Basic Online Safety Expectations for online service providers, establishing their accountability for the safety of their service users, and requires the tech industry to develop new codes to detect and remove illegal content, which would apply to social media platforms.
This section specifically lists various EU laws that may be relevant to social media companies. EU laws are applicable to each EU member state and may be supplemented further by local national law in each member state. As such, all of the laws set out below will be applicable in each EU member state.

EU e-Commerce Directive
The e-Commerce Directive is a set of rules on commercial communications for online contracting. According to the e-Commerce Directive, online intermediaries are exempt from liability for hosting or transmitting information that may infringe legal rights under three exemptions. Mere conduit service providers, which are only passively involved in the transmission of data, are not liable provided they do not initiate the transmission, do not select the receiver of the transmission and do not select or modify the information contained in the transmission. Caching providers are not liable for the automatic and temporary storage of information that is performed for the sole purpose of making that information’s onward transmission more efficient. Hosting providers are exempt from liability provided they have no actual knowledge of the unlawful activity or information, or of facts or circumstances from which that unlawful activity is apparent. It is important to note that each of these exemptions preserves the power of the courts to grant injunctive relief. In addition, caching and hosting providers can only avoid liability if, upon notice of infringement, they act expeditiously to remove or disable access to that information. Once again, this highlights the importance of an effective notice-and-takedown procedure.

EU Unfair Commercial Practices Directive (Directive)
The Unfair Commercial Practices Directive regulates unfair business practices and applies to online intermediaries, including social media companies. The Directive provides a general prohibition on unfair commercial practices.
Some commercial practices that may be problematic on social media platforms are hidden advertising, including misleading influencer marketing; unfair standard contract terms; practices related to in-platform purchases, such as virtual items; and commercial practices put in place by third-party traders through social media platforms, including fake or misleading user reviews or endorsements. The Directive requires that all forms of commercial communications on social media platforms be clearly disclosed.

EU Platform to Business Regulation (P2B Regulation)
The P2B Regulation applies to certain social media platforms where they fall under the definition of an online platform provider by providing online intermediation services. Some of the obligations introduced by the P2B Regulation include that a platform’s terms and conditions (T&Cs) must be easily available to business users and be written in plain and intelligible language; if the platform decides to restrict, suspend or terminate its service to a business user, it must give the business user a statement of reasons; the platform’s T&Cs must describe the main parameters which affect the ranking of goods and services on the platform; and platforms must establish an internal complaint-handling system for business users.

EU Directive on Copyright in the Digital Single Market (Copyright Directive)
The Copyright Directive introduced obligations for online content-sharing platforms to prevent infringing content. This includes, for example, the use of technology to spot and prevent the uploading of infringing content. It grants publishers and news organizations the right to negotiate licenses and receive compensation when online platforms use snippets of their content. The Copyright Directive also allows for certain exceptions and limitations to copyright, such as for education, text and data mining and research.
EU Terrorist Content Online Regulation (TCO Regulation)
The TCO Regulation was passed in April 2021 to tackle the spread of unlawful content promoting or facilitating terrorist activity. The TCO Regulation provides a legal framework to ensure that hosting service providers that make content available to the public address the potential misuse of their services for the dissemination of terrorist content online.

EU Digital Single Market Strategy
The aim of the EU Digital Single Market Strategy is to maintain the EU’s position in digital markets globally under the “three pillars” approach:
- Access to online products and services for consumers and businesses
- Shaping the environment for digital networks and services to grow and thrive
- Maximizing the growth potential of the European digital economy
In this respect, the European Commission has developed the DSA package, which includes the Digital Services Act (Regulation (EU) 2022/2065) (DSA) and the Digital Markets Act (Regulation (EU) 2022/1925) (DMA).

DSA
The DSA will create additional obligations for digital service providers, including specific obligations for “very large platforms,” which are defined as platforms with over 45 million average monthly active users in the EU. The key dates for the DSA under the current timeline include:
- February 17, 2023: Platforms and search engines must publish their user numbers.
- August 25, 2023: Very large search engines and very large platforms, as designated by the European Commission, must comply with the DSA from this date.
- February 17, 2024: The DSA rules will be in force for all applicable companies, and EU member states must have established their digital services coordinators (as defined under the DSA) by this date.
The DSA regulates the following areas:
- Content liability. With respect to content liability, the DSA will essentially reproduce the liability safe haven that is currently provided for in the e-Commerce Directive. Therefore, a hosting service, if it has actual knowledge of illegal activity or content, must act expeditiously to remove it or it will be held liable.
- Reporting obligations. Very large platforms will also have specific reporting obligations to enforcement authorities in certain cases—for example, when people’s safety is at stake.
- Accountability. Like the EU GDPR, the DSA will introduce elements of accountability and financial fines. Such fines can reach 6% of the provider’s annual turnover, which is even higher than those that can be issued under the EU GDPR. This shows that the DSA is not just an invitation for platforms to include mechanisms against hate speech; it actually creates binding obligations, with potential fines at stake.
- Notification systems. Online platforms must put mechanisms in place to enable any individual or entity to notify them of the presence of illegal content. Such mechanisms must be easy to access and user friendly. The DSA also introduces the notion of “trusted flaggers” for very large platforms. The platforms will have to treat notifications from trusted flaggers as a priority based on an internal complaints-handling system that they must put in place.
- Information and transparency obligations. The DSA also increases information and transparency obligations, including, for example, the obligation for very large platforms to provide transparency over the main parameters of the decision-making algorithm that is used to offer content. This responds to concerns over freedom of speech.
DMA
The DMA targets large technology companies and implements new obligations on “gatekeepers” to ensure smaller businesses are treated fairly. The purpose of the new legislation is to encourage competition in digital markets by prohibiting certain commercial practices and requiring bigger players to adhere to positive obligations in order to promote competitiveness, such as providing fair access to application developers. Key dates for the DMA under the current timeline include:
- May 1, 2023: Potential Gatekeepers had two months to identify themselves to the European Commission if they met the DMA thresholds.
- July 3, 2023: The European Commission then had 45 working days to assess whether the companies identifying as potential Gatekeepers actually met the thresholds and, if they did, to formally designate them as Gatekeepers for the purposes of the DMA.
- September 6, 2023: The European Commission will notify Designated Gatekeepers following its assessment. Companies notified as Designated Gatekeepers have six months from the date of notification to comply with the DMA.
- March 2024: Gatekeepers are required to publish their DMA compliance reports.
On September 5, 2023, the European Commission published the following list of entities as Designated Gatekeepers for the purposes of the DMA:
Alphabet Inc.
• Google Play • Google Maps • Google Shopping • Google Search • YouTube • Android • Alphabet’s online advertising service • Google Chrome

Amazon.com Inc.
• Amazon Marketplace • Amazon Advertising

Apple Inc.
• App Store • iOS • Safari

ByteDance Ltd.
• TikTok

Meta Platforms, Inc.
• Facebook • Instagram • WhatsApp • Messenger • Facebook Marketplace • Meta Ads

Microsoft Corporation
• LinkedIn • Windows PC OS
All EU laws mentioned in the EU section

French Constitution
Freedom of speech is a constitutional right in France. It is included in articles 10 and 11 of the Declaration of the Rights of Man and of the Citizen of 1789, which is part of the French Constitution. Social media platforms have brought new challenges. As in many other countries, regulating speech on social media platforms is complicated by the tension between protecting freedom of speech and preventing hate speech and other unlawful content.
Hong Kong National Security Law (NSL)
The NSL was introduced on June 30, 2020, and prohibits a wide range of activities under the four main offenses of secession, subversion, terrorism and collusion with foreign forces in Hong Kong. In particular, Article 43 of the NSL and the implementing measures give Hong Kong police the power to order the blocking and deletion of content by message publishers, platform service providers, hosting service providers and/or network service providers (including those operating social media platforms).

Trade Descriptions Ordinance (TDO)
While social media marketing is increasingly prevalent, the TDO protects consumers from false trade descriptions in respect of goods and services, misleading omissions, aggressive commercial practices, bait advertising and bait-and-switch tactics involved in any form of marketing (including on social media).

Basic Law
In Hong Kong, freedom of speech is enshrined under Article 27 of the Basic Law, a constitutional document of Hong Kong. It stipulates: “Hong Kong residents shall have freedom of speech, of the press and of publication; freedom of association, of assembly, of procession and of demonstration; and the right and freedom to form and join trade unions, and to strike.”
All EU laws mentioned in the EU section

Irish Constitution
Each piece of Irish legislation should be read and understood in the context of Irish constitutional rights, particularly freedom of expression. Constitutional rights are the most important and protected rights in the Irish legal system. Freedom of expression is a key constitutional right in Ireland under Article 40.6.1 of the Irish Constitution and is therefore protected in the Irish legal system. The right to freedom of expression has been developed through case law, and all legislation that is passed in the country must ensure that this right is not unconstitutionally prejudiced. This is a key consideration for the Irish legislature as it continues to address the challenges of regulating social media companies. However, like all constitutional rights, freedom of expression is not absolute – it can be limited in the interests of public order and morality and must be weighed against competing constitutional rights, such as the right to privacy. While a right to privacy is not specifically referenced in the Irish Constitution, the Irish courts have placed privacy on equal footing with the fundamental rights enshrined in the Irish Constitution.

Online Safety and Media Regulation Act 2022 (OSM Act)
The OSM Act was signed into law in December 2022 and subsequently came into force on March 15, 2023. The OSM Act creates a framework for the regulation of online safety and defines “harmful online content.” The OSM Act also provides for the establishment of a new media commission in Ireland (Media Commission), known locally as Coimisiún na Meán. Under the OSM Act, the Media Commission has the power to develop binding online safety codes (Codes) and will designate which online services these Codes apply to, with the purpose of reducing the risk of harmful and illegal online content.
Through the implementation of these Codes, the Media Commission shall enforce rules on how online services or platforms, including social media platforms, should deal with harmful and illegal content. The Media Commission can require relevant online services to provide information on their compliance with any relevant Codes, and a failure to comply with such Codes may result in investigations and administrative sanctions.

Electoral Reform Act 2022 (2022 Act)
The 2022 Act was signed into law on July 25, 2022, and parts 4 and 5 were subsequently notified to the European Commission pursuant to the technical regulation information system (known as TRIS) notification process, so as to ensure the draft legislation complies with EU law and market rules. However, as of the date of publication of this guide, parts 4 and 5 have not yet taken effect. Parts 4 and 5 of the 2022 Act are relevant here because they will regulate political advertising on online platforms, including social media platforms. Online platforms will be obliged to accompany political adverts on their platforms with certain information, provide a transparency notice and conduct identity verification in respect of those who purchase political adverts. Online platforms are also obliged to provide the Electoral Commission with specific information, including whether there is misinformation on their services. The Electoral Commission will be responsible for monitoring online platforms’ compliance with this Act once it has taken effect.

Harassment, Harmful Communications and Related Offences Act 2020 (2020 Act)
The 2020 Act criminalized the act of publishing or sharing intimate images of another person without their knowledge or consent. The 2020 Act also established a new offense that makes it illegal to share, send or publish “threatening or grossly offensive communication” about a person if there is intent by the publisher to cause harm to that person.
Finally, the 2020 Act extended the definition of “harassment” under Section 10 of the Non-Fatal Offences Against the Person Act 1997, which now means that communication that is not addressed to the person involved can still be considered harassment. All online intermediaries, including social media providers, will have to refresh and update their policies on prohibited communications given the increased scope for criminal prosecution of users committing offenses under the 2020 Act.

Defamation Act 2009 (Defamation Act)
The Defamation Act provides a cause of action in respect of defamatory statements made about another person. A defamatory statement is defined as one that “tends to injure a person’s reputation in the eyes of reasonable members of society.” This applies to the publication of a defamatory statement on a social media platform.

Prohibition of Incitement to Hatred Act 1989 (1989 Act)
The 1989 Act refers to “hatred” as the discrimination of another person based on that person’s background—for example, attacking a person as a result of their sexuality, race or religion. The 1989 Act states it is an offense for a person to “publish or distribute written material, to use words, behave or display written material … to distribute show or play a recording of visual images or sounds, if the written material, words, visual images or sounds, as the case may be, are threatening, abusive or insulting and are intended or, having regard to all the circumstances, are likely to stir up hatred.” The provisions of the 1989 Act can be directly applied to online comments and hatred published on social media platforms.

Consumer Protection Act 2007 (2007 Act)
The 2007 Act regulates marketing and protects consumers in Ireland by prohibiting traders from engaging in “unfair,” “misleading” and “aggressive” commercial practices.
The Irish Competition and Consumer Protection Commission (CCPC) has noted that those who promote goods or services on behalf of a business, including the promotion of goods and services via social media platforms, may be considered a trader under the 2007 Act.

Consumer Rights Act 2022 (CRA 2022)
CRA 2022 came into force recently and introduces several enhanced statutory protections for consumers and new obligations on traders, as well as obligations on online marketplaces. Social media providers should be aware of these new obligations, as CRA 2022 applies to contracts where the consumer does not pay a monetary price but provides personal data that the trader can commercialize. CRA 2022 also grants new enforcement powers to the CCPC, with increased penalties available for the prosecution of EU-wide infringements.

Advertising Standards Authority for Ireland (ASAI) Guidance Note on Recognizability of Influencer Marketing Communications (Guidance)
The ASAI is a non-statutory self-regulatory body that aims to promote standards of marketing communications, including by way of the publication of its Code of Standards. In 2017, the ASAI published the Guidance in order to bring clarity on how the Code of Standards applies to influencers, including when certain influencer posts are considered marketing communications. If an influencer post is considered a marketing communication, it must be clearly labeled as such before users engage with the content. The Guidance provides specific examples for different scenarios, such as when free products are provided to influencers, use of affiliate links by influencers and sponsorship relationships with influencers, which are all relevant given the increased use of social media by influencers to promote products.

Copyright and Related Rights Act 2000 (2000 Act)
The 2000 Act regulates copyright and copyright infringement in Ireland.
The 2000 Act states that those who provide facilities that enable the making available to the public of copyrighted work shall also be liable for the infringement if they fail to remove the infringing material as soon as practicable following notification from the owner of the copyright. In this regard, it is important for social media providers to have a “notice and takedown” procedure that is quick and effective so as to avoid liability under the 2000 Act.
The 2000 Act has since been amended by S.I. No. 59/2012 European Union (Copyright and Related Rights) Regulations 2012 (2012 Regulations). The 2012 Regulations allow for the application of injunctions against intermediaries whose services are used by a third party to infringe a copyright or related right. The 2000 Act was further amended by S.I. No. 567/2021 European Union (Copyright and Related Rights in the Digital Single Market) Regulations 2021, which directly transposed the Copyright Directive.
S.I. No. 68/2003 European Communities (Directive 2000/31/EC) Regulations 2003 (Irish e-Commerce Regulations). The Irish e-Commerce Regulations transposed the e-Commerce Directive into Irish law.
All EU laws mentioned in the EU section

Spanish law implementing Directive 2000/31/EC on e-Commerce and Information Society Services, as now complemented by the Digital Services Act (Regulation (EU) 2022/2065) (Spanish DSA)
The Spanish DSA sets out what liability social networks are subject to regarding the content exchanged through them. Courts have consistently exempted social networks from the application of regulations applying to other media (mainly the Spanish 1966 Press Act), holding them liable for the content published through them only within the strict limits of the “safe harbor” provisions of the above-mentioned directive and, now, of the DSA. The aforementioned principle, though by and large still standing, has incrementally been eroded on several fronts, such as privacy, copyright and the Audiovisual Law.

34/2002 Spanish e-Commerce and Information Society Services Act (Information Services Act)
The Information Services Act generally establishes the transparency obligations of all information society services (generally, every page on the internet), provides for safe-harbor exceptions to liability in favor of intermediary services, and regulates cookies and spam and the requirements for electronic contracts to be effective. This is likely to be superseded in part by the DSA, now that the DSA is in force.

Spanish 1/1996 Copyright Act (Copyright Act)
The Copyright Act regulates exploitation of copyrighted subject matter, including specifically its exploitation by social networks, which are now, beyond certain thresholds, subject to payment of compensation to copyright-collecting societies.

Spanish 3/1991 Unfair Competition Act and the Spanish 34/1988 General Advertising Act (Competition and Advertising Acts)
The Competition and Advertising Acts regulate advertising and commercial communications and prohibit certain practices, such as misleading advertising or derogatory statements.
A significant amount of conflict has arisen in relation to such prohibitions in the context of social networks (such as the use of false references and personal opinions on social networks compiling opinions on hotels and restaurants), with a particular focus in the past few years on the activity of influencers (now subject to the Spanish General Audiovisual Act and to a special Code of Conduct administered by Autocontrol, the independent advertising self-regulatory organization in Spain).

Spanish 1/1982 Organic Act on Civil Protection of Honor, Image and Privacy (Organic Act)
The Organic Act regulates civil actions for infringement of image rights (similar to publicity rights), honor (defamation) and privacy (in its stricter sense, as protected by Section 8 of the European Convention on Human Rights). This may be one of the most litigated areas around social networks, the use of which has given rise to a significant amount of defamation (particularly on X).

1/2007 Spanish Consumer Protection Act (Consumer Act)
The Consumer Act regulates the contracts between social network operators and users; in particular, it mandates that contracts include certain information and forbids the inclusion of clauses deemed abusive.

Spanish Criminal Code
The Criminal Code regulates criminal offenses and, especially regarding social networks, serious privacy breaches (with a particular focus on unauthorized sharing of private sexual communications, images or recordings), criminal defamation and cybercrime in general (including cyberbullying).

Spanish 1978 Constitution
Freedom of speech is enshrined as a fundamental right of all citizens and people in general within Section 20 of the Spanish 1978 Constitution.
A wide body of case law stemming from both the Spanish Supreme Court and the Spanish Constitutional Court pertains to the extension of this right and balancing this right on the one hand and other fundamental rights (such as honor and privacy) on the other. Mostly, Spanish case law follows the lead of the doctrine laid down by the European Court of Human Rights.
All EU laws mentioned in the EU section

The Dutch Media Act (the “Media Act”)
As of July 1, 2022, influencers on social media are obliged to comply with the Media Act. The Media Act implements the European Audiovisual Media Services Directive, which came into force on November 1, 2018. Whereas influencers were previously not covered by the scope of the Media Act, they are now. Under the Act, influencers are considered an on-demand commercial media service. For influencers to additionally fall under the supervision of the Dutch Media Authority, the following criteria all need to be met:
- The influencer must be active on YouTube, Instagram and/or TikTok;
- The influencer must have 500,000 or more followers/subscribers on at least one of these platforms;
- The influencer must have posted at least 24 videos with their account with 500,000 or more followers in the past 12 months;
- The influencer earns money, receives products or services or gains other benefits from making and posting videos from their account with 500,000 or more followers;
- The making and posting of videos from the account with 500,000 or more followers benefits a company registered with the Dutch Chamber of Commerce.
If these criteria are met, the influencer falls under the supervision of the Media Authority and must report the media service to the Media Authority (i.e., register with the authority and with the Dutch self-regulatory body for advertising (Stichting Reclame Code), and report to the Netherlands Institute for the Classification of Audiovisual Media (NICAM)). The Media Act contains rules on recognizable advertising, product placement, sponsorship and protection of minors from harmful content. The Media Authority has recently announced that it will soon revisit the above-mentioned criteria.
Dutch Consumer Law: the Unfair Commercial Practices Directive as implemented in the Dutch Civil Code
Advertising that is not recognizable as such constitutes a misleading and therefore unfair commercial practice when it limits the average consumer’s ability to make an informed decision about a product or service. This also applies to all advertising on social media. The Dutch Civil Code contains a list of commercial practices that are considered misleading and therefore prohibited under all circumstances. An example of such a misleading and prohibited practice is using editorial content on social media to advertise a certain product or service without disclosing that it is advertising.

Dutch Advertising Code and the Advertising Code for Social Media and Influencer Marketing (the “Code”)
The Code applies to all companies in the Netherlands that use social media to promote their products and/or services. The Code was drawn up by the Advertising Code Foundation (SRC). The SRC promotes responsible advertising to ensure the reliability and credibility of advertising, and it administers the self-regulation system for advertising. Both the advertising industry and consumers are represented on the board of the SRC and in the Advertising Code Committee and the Board of Appeal. Anyone may submit a complaint to the Advertising Code Committee. This independent body then decides whether an advertisement is in conflict with the Dutch Advertising Code. In case of a violation of the Code, the Committee will recommend that the advertiser(s), company(ies) or influencer(s) involved discontinue such advertising. The Compliance department will thereupon check whether the advertiser has put the recommendation into effect. The Advertising Code Committee cannot impose fines, but it publishes its rulings (naming and shaming), encouraging advertisers to advertise in conformity with the Code.
Misleading and/or fake news - There are parties operating on the internet dealing in fake reviews, fake likes and fake followers. These misleading reviews are used to influence consumers on social media and other online services such as online shops and marketplaces. The Netherlands Authority for Consumers and Markets (ACM) considers it very important that consumers receive good information on the basis of which they can decide whether to buy a product or service. That is why the ACM takes action against parties offering and using misleading reviews online. In the "Guideline for Online Consumer Protection," adopted by the ACM earlier this year, the ACM describes where seduction turns into deception. The ACM has previously investigated reviews and drafted rules with which reviews must comply. The ACM is also looking at the role and responsibility of platforms in combating misleading reviews. Sustainability - Sustainability claims are high on the ACM's agenda, and the ACM actively monitors sustainability claims made by companies, both offline and online. To this end, the ACM published a guide in 2021 that contains rules of thumb and examples to help companies formulate sustainability claims. This guide was updated by the ACM in June 2023. As companies often spread information regarding sustainability via their social media channels, they should carefully verify their claims. The ACM may impose high fines, which could amount to EUR900,000 per offense or a percentage of a company's annual turnover. Currently, a major case is pending against the largest national airline, KLM, regarding its Fly Responsibly sustainability campaign. Dutch Code for Children's Rights ("Code voor Kinderrechten") - The Code for Children's Rights (the "CfCR") consists of ten principles with practical examples which designers and developers can use to safeguard the fundamental rights of children in digital services.
The CfCR helps designers and developers take account of children's rights when designing and developing apps, websites, games, smart devices and other digital technology. Although the ten principles in the CfCR are not in themselves legally enforceable rules, they are based on laws and regulations which are legally binding. The principles derive from the fundamental rights of children in the UN Convention on the Rights of the Child 1989 (UNCRC) and the EU's GDPR. The CfCR is endorsed by the Netherlands Authority for Consumers and Markets. The leading principle throughout the CfCR is to put the child's best interests first. Designers and developers should, among other things, make the best interests of the child the primary consideration when designing, ensure the legitimate processing of children's personal data and provide transparency in a way that is easily understandable and accessible to children. Dutch Constitution - Freedom of expression is a crucial constitutional right in the Netherlands, enshrined in Article 7 of the Dutch Constitution. This provision emphasizes the significance of free expression as a fundamental human right, granting individuals the liberty to freely express their thoughts and opinions through various channels, including the press, radio, television and other forms of communication. The Dutch Government is prohibited from engaging in censorship or requiring prior approval of content, ensuring the free flow of ideas and information without unnecessary interference. Nevertheless, it is important to acknowledge that, as in other democratic societies, this right is not absolute. Certain limitations are imposed by law to safeguard public interests, such as public order, health and morals. These limitations strike a balance between individual liberties and the collective welfare of society.
When addressing challenges related to social media, Dutch legislators consider the right to freedom of expression while ensuring that potential risks and harms are appropriately addressed.
Federal Law No. 15 of 1980 Concerning Press and Publications (Publishing Law) The Publishing Law governs printing and publishing licensing and activities in the UAE and applies to both traditional and digital media content, including newspapers, magazines and television broadcasts. It lays out guidelines for materials that are prohibited from publication as well as penalties for publishing companies and their employees who violate them. Due to the proliferation of new types of digital media and technology, its relevance is diminishing. It is now primarily interpreted in light of the guidelines of the Media Regulatory Office (MRO), previously known as the National Media Council. National Media Council Resolution No. 20 of 2010 (Resolution) The Resolution reaffirms that all media, including audio, visual and physical, must comply with the provisions of the Media Law found within Federal Law No. 15 of 1980, which is further supplemented by Cabinet Resolution No. (23) of 2017 on media content. Electronic Media Activity Regulation Resolution of 2018 (EMA Regulations) The EMA Regulations apply to all electronic media activities conducted within the territory of the UAE, including free zones. The EMA Regulations include a number of practical considerations that individuals and businesses engaged in electronic media activities, including social media platforms, in the UAE should take into account when applying for an electronic media license. The electronic media license must be obtained from the MRO by anyone who engages in electronic media activities for commercial purposes in the country through an account on a widely recognized social media platform. Federal Decree Law No. 34 of 2021 on Combatting Rumors and Cybercrimes (Cybercrime Law) The Cybercrime Law and other applicable laws pertaining to the protection of privacy, reputation and defamation apply when utilizing social media.
Individuals in the UAE must be aware of the provisions that outline "conduct" and "activities" that may constitute a crime. The Cybercrime Law includes fines, imprisonment, the confiscation of IT equipment and deportation as penalties. Content regulation; various sources across legal regimes (in particular, National Media Council instruments) There are national and cultural standards established by the UAE for media content, and all mass media institutions operating within the UAE must strictly adhere to them. These include a list of strongly prohibited activities and behaviors on social media (and other media platforms). These standards, from which one cannot deviate, include the following:
- Respect the regime of the UAE and its political system and symbols
- Respect the policies and direction of the UAE at both international and domestic levels
- Must not offend national unity; additionally, incitement of regional and tribal conflict that undermines social cohesion is strictly prohibited
- Respect Islamic and divine beliefs while showing respect to other religions as well
- Refrain from disclosing, without proper authority, confidential official contacts, military treaties, conventions or matters that have been concluded by the government
- Refrain from reporting distorted accounts of the deliberations of courts and other regulatory bodies
- Refrain from disclosing information related to an ongoing criminal investigation or to investigations that have been ordered to be kept confidential
- Refrain from publishing photographs, news and comments that invade the privacy of an individual or family or damage their reputation
UAE Constitution Freedom of expression is a constitutional right in the UAE as provided under Article 30 of the UAE Constitution, and it is therefore protected in its legal system. Although freedom of speech and the right of information are protected by the UAE Constitution, exercising such rights must not contradict or violate other laws.
Advertising Standards Authority (ASA) The Advertising Standards Authority (ASA) is the UK's independent regulator of advertising across all media and is responsible for monitoring advertising and marketing campaigns, as well as carrying out investigations in relation to the same. With social media forming an increasingly key part of businesses' marketing and advertising strategies, the decisions and codes of conduct prepared by the ASA are becoming more relevant by the day. UK Code of Non-broadcast Advertising and Direct & Promotional Marketing (CAP Code) The CAP Code imposes rules on the advertising industry in relation to the collection and use of personal information, and it restricts marketing communications delivered by electronic means. It also governs how promotions are run on social media, specifically requiring that promotions be run fairly. Some of the ASA's rulings on this topic include the following:
- Adverts on social media must be "obviously identifiable" as advertising. The ASA has published various rulings against social media influencers who did not clearly display whether they had been paid to endorse a particular brand or product. At a minimum, it is advised that such posts include a prominent "ad" label up front to highlight that a post has been published for marketing purposes.
- Adverts must not create a false impression or be misleading. For example, the ASA has investigated various influencers for trivializing investment in crypto assets on Instagram by implying it was "straightforward and simple" without highlighting the risk that crypto values fluctuate and are unregulated in the UK.
- Social media promotions must be run fairly. Promotion and competition posts should clearly include conditions covering participation, a closing date, the nature and number of prizes or gifts, and any eligibility or availability restrictions.
Intellectual property, copyright and trade marks While social media content provides opportunities for businesses to promote their companies and products to a wide range of customers, one of the key dangers is the risk of intellectual property (IP) infringement. When posting videos and images online, it is important to seek permission from the owner or creator. It is also important that businesses and content creators manage their IP portfolios to enable enforcement and protection against unauthorized users and, where possible, register content to maximize such protection. IP infringement on social media platforms can occur in many ways. The most common examples are as follows:
- Copyright. Tweets, posts, photographs, videos and artwork on social media platforms and social network profiles may be protected as literary, musical and artistic works under the Copyright, Designs and Patents Act 1988. Copyright is an automatic right in the UK and does not require registration. Therefore, when posting content online, it is important that permission is sought from the creator. However, "resharing" photos on Instagram and "retweeting" posts on X is permitted as long as the creator is credited.
- Trade marks. Social media presents specific challenges for brand use and protection. Due to the volume of material posted, it can be difficult to monitor. Most social media sites have terms and conditions of use that prohibit the unauthorized use of trade marks by third parties. While unregistered marks offer some level of protection, registered trade marks are significantly easier to enforce in the event of a dispute.
Consumer Rights The UK has wide-ranging consumer and other protections in place, all of which could apply to social media platforms, including the following:
- Consumer Rights Act 2015 (Consumer Act). Under the Consumer Act, consumers are afforded various statutory rights and remedies in relation to goods, digital content and services. This piece of legislation also includes an unfair terms regime impacting terms used (whether contractual terms or noncontractual notices) between businesses and consumers.
- Consumer Protection from Unfair Trading Regulations 2008 (Consumer Regulations 2008). The Consumer Regulations 2008 prohibit unfair commercial practices such as misleading consumers and aggressive behavior.
- Electronic Commerce (EC Directive) Regulations 2002, SI 2002/2013 (as amended by the Electronic Commerce (Amendment etc.) (EU Exit) Regulations 2019, SI 2019/87) and the Defamation (Operators of Websites) Regulations 2013, SI 2013/3028 (EC and Def Regulations). The EC and Def Regulations provide certain "intermediary" defenses for internet intermediaries, including those relating to hosting, caching or mere conduit.
- Communications Act 2003 (2003 Act). The 2003 Act requires video-sharing services and social media sites that allow video sharing and live-streaming audio-visual services to protect users from harmful or criminal content through measures such as terms and conditions, reporting systems, age verification and parental controls.
- Protection from Harassment Act 1997 (1997 Act). The 1997 Act prohibits harassment, stalking and related physical acts and covers online issues such as cyberbullying.
- Business Protection from Misleading Marketing Regulations 2008 (2008 Mar Regs). The 2008 Mar Regs prohibit advertising that misleads traders and govern comparative advertising (advertising that identifies a competitor or competitive product). They make engaging in misleading advertising a criminal offense and impose regulatory sanctions for certain comparative advertising.
- Code of Practice (COP) for providers of online social media platforms. This COP offers guidance to social media platform providers on appropriate actions that they should take to prevent bullying, insulting, intimidating and humiliating behaviors on their platforms. It is also particularly relevant to any sites hosting user-generated content.
Defamation Act 2013 (DA 2013) and the Defamation (Operators of Websites) Regulations 2013 (DOW Regulations 2013), SI 2013/3028 The UK has defamation laws, as well as options for social media providers to defend themselves against defamation claims arising on their platforms, subject to certain limitations. In conjunction with the eCommerce Regulations, section 5 of the DA 2013 (c 26) provides a defense for the operator of a website where a defamation action is brought in respect of a statement posted on that website, if it was not the operator who posted the statement. The defense is defeated if the claimant can show that it was not possible for them to identify the person who posted the statement, that the claimant gave the operator a notice of complaint in relation to the statement, and that the operator did not respond to the notice of complaint in accordance with the DOW Regulations 2013. Criminal Justice and Courts Act 2015 In the UK, disclosing a private sexual photograph or film without the consent of the person depicted, and with the intention of causing that individual distress, is a criminal offence. UK Constitution Freedom of speech is an important concept in the UK, and the government is therefore striving to address the challenges arising from social media use while protecting people's freedom of expression.
Federal Trade Commission Act (FTC Act) Section 5 of the FTC Act is one of the federal government's most powerful tools to protect consumers. The law prohibits unfair or deceptive acts or practices in or affecting commerce (15 U.S.C. § 45). With respect to social media companies, the agency's enforcement actions have signaled a strong interest in protecting consumers' privacy. Practices likely to be deemed unfair include failing to adequately protect consumers' personal data or providing inadequate disclosures to the public in privacy policies. Deceptive practices can include misrepresenting the extent to which consumers can control their data and failing to comply with representations made in privacy policies. Social Media Influencers Using social media influencers for marketing purposes involves a mix of legal and ethical considerations. While there is no specific federal law dedicated solely to influencer marketing, several regulations and guidelines come into play when US companies collaborate with social media influencers. The Federal Trade Commission (FTC) has issued guidelines that apply to influencer marketing. For example, if an influencer receives compensation (monetary or non-monetary) for promoting a product or service, they must disclose this relationship clearly and conspicuously. Misleading or deceptive practices, such as not disclosing a paid partnership, can have legal consequences and harm a company's reputation. The FTC requires influencers to disclose any "material connection" they have with a company or brand they promote. When collaborating with influencers, companies should have well-drafted agreements that outline the scope of work, compensation, disclosure requirements, intellectual property rights and any other relevant terms. Depending on the industry, there might be specific regulations or guidelines that influencers and companies need to follow.
For example, industries like healthcare and finance have stricter regulations regarding endorsements and claims. States and local jurisdictions may have additional regulations or requirements related to influencer marketing. Section 230 of the Communications Decency Act (Section 230) In terms of online speech, perhaps the most significant federal law governing social media companies is Section 230 of the Communications Decency Act. Section 230 provides that no "interactive computer service" shall be treated as the publisher or speaker of any information provided by another information content provider (47 U.S.C. § 230). Thus, subject to a few exceptions, it shields social media companies from liability based on third-party content (i.e., the content that their users post). The law also shields companies if they choose to remove content from their platforms (Id.). Digital Millennium Copyright Act (DMCA) Section 512 of the DMCA shields online service providers from potential legal liability for copyright infringement that occurs on their platforms. In order for the safe harbor to apply, service providers must, upon becoming aware of infringing content, act expeditiously to remove it from their site. The service provider must designate an agent to receive notifications of claimed infringement, provide the agent's contact information on its website and register the agent with the US Copyright Office (17 U.S.C. § 512(c)). US Constitution Social media's popularity in the US may be due, in part, to the country's storied commitment to free speech. While it is clear that the US Constitution protects individuals' right to free expression, whether the First Amendment protects a social media company's ability to moderate content on its site is a more novel question that is currently being tested in federal court.
This deeply held value of free speech in the US coupled with social media platforms’ desire to moderate content on their sites is sparking national conversation about how and to what extent federal and state agencies may regulate social media companies’ policies regarding users’ speech.
The proliferation of data protection laws globally has been happening at an increasing pace, particularly since the introduction of the EU's General Data Protection Regulation on 25 May 2018. Given the nature of a social media platform, data protection laws will always be a critical consideration in a social media environment. This section looks specifically at current data protection laws.
EU General Data Protection Regulation (EU GDPR) The EU GDPR regulates privacy in the EU and the European Economic Area (EEA) (the EEA comprises all the EU member states as well as Iceland, Liechtenstein and Norway) and affects social network operators by regulating the bases and purposes they can rely on for processing users' personal data, their transparency obligations and the other rights that must be granted to individuals with regard to their personal data. EU ePrivacy Directive (ePD) The ePD, which has been amended several times since being enacted, regulates the confidentiality of communications and the use of personal data for marketing communications. The ePD regulates providers of electronic communication services, which must provide secure services and inform subscribers whenever there is a risk to security. European Data Protection Board (EDPB) While not a law, the EDPB (constituted under the EU GDPR) plays an increasingly important role in EU data protection law through the issuing of guidance and, perhaps most importantly, its input into member state decisions on data protection matters, which are referred to it when there is disagreement among EU data protection supervisory authorities (SAs) on actions of controllers within the EU. Specifically, the EDPB can issue binding decisions in response to disputes between SAs; these binding decisions must then be implemented by the relevant lead SA in the member state where the issue arose. International Data Transfers outside of the EU Transfers of personal data outside the EU are only allowed where certain conditions are met. One of those is where a country has been "deemed adequate" by the European Commission. This means that the laws in that jurisdiction are deemed by the European Commission to provide the same level of protection for personal data as that given by EU member states.
If a country does not have an adequacy decision, then there are certain mechanisms that can be used to enable data transfers, the most commonly used of which are the EU standard contractual clauses (SCCs). Further detail on some of these elements is set out below:
- Countries deemed adequate by the EU - Andorra, Argentina, Canada (commercial organizations only), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, New Zealand, the Republic of Korea, Switzerland, the UK and Uruguay.
- US and EU adequacy - The Trans-Atlantic Data Privacy Framework (DPF) was agreed between the US and the European Commission in March 2022 and adopted into law by the EU in July 2023. As a result, personal data transfers from the EU to US organizations that participate in the DPF will be considered "adequate".
- EU SCCs - The European Commission published updated SCCs following the introduction of the GDPR and in light of the outcome of the Schrems II decision.* Using the EU SCCs, along with conducting the appropriate assessments and taking any mitigation measures identified in those assessments, will enable lawful transfers out of the EU to non-adequate jurisdictions.
Post Schrems II, companies are also required - where there is no adequacy decision - to undertake transfer risk assessments of both the need for the transfer and the country that the personal data is being transferred to. This is to assess whether that country's laws are "essentially equivalent" to those in the EU for the protection of personal data. If a company finds in its assessment that the third country does not provide essentially equivalent protection, it must consider supplemental measures to improve the protection of the transferred personal data. This can be a burdensome requirement at the best of times, but it may become more onerous when using social media platforms that usually operate from a global footprint, with servers around the world, and where it may be very hard to determine which personal data is controlled independently by the social media platform. This will then have knock-on consequences in relation to international transfer considerations, such as the implementation of supplemental measures.
* Schrems II was a decision by the Court of Justice of the European Union that essentially struck down the EU-US privacy shield mechanism for international transfers (a precursor to the DPF). The decision also cast doubt on the suitability of the then-standard contractual clauses that were the default mechanism for international personal data transfers where a jurisdiction was not considered “adequate” for data protection purposes by the EU.
EU data protection laws apply in France. Commission Nationale de l'Informatique et des Libertés (CNIL) The CNIL is the French SA, and it is applying increased scrutiny to social media platforms operating in France, including levying significant fines against social media companies.
Personal Data (Privacy) Ordinance (PDPO) The PDPO is the law governing the processing of personal data in Hong Kong, and it governs the collection, use or other processing of personal data on social media. This includes social media platforms' obligation to notify individuals of matters relating to the processing of their personal data, such as whether it is obligatory or voluntary for the individuals to supply the personal data to be collected, the consequences of failing to supply the personal data where it is obligatory, the purposes for which the personal data will be used, the classes of persons to whom the data may be transferred, individuals' rights to access or correct their personal data, and the contact details of the person who will handle requests to exercise such rights. These are usually disclosed to individuals through a privacy policy. Separately, social media platforms need to obtain consent from individuals to use their personal data for direct marketing purposes. The PDPO also contains provisions relating to "doxxing", which is the publishing of private or personal data on the internet with the intention of causing harm or distress. Office of the Privacy Commissioner for Personal Data (PCPD) Doxxing has been a priority area for the PCPD since October 2021, when the anti-doxxing regime came into effect (see section 64(3A) of the PDPO). The PCPD has recently taken enforcement actions under the new anti-doxxing regime and arrested individuals who allegedly disclosed the personal data of a data subject without their consent. Individuals convicted of this offense may be liable to a fine of HK$100,000 and imprisonment for two years. "Guidance on Protecting Personal Data Privacy in the Use of Social Media and Instant Messaging Apps" (Guideline).
The Guideline has been issued by the PCPD to illustrate its concerns over the personal data privacy risks that social media users face. The Guideline lists data protection risks such as:
- Loss of privacy
- Misuse of personal data
- Fake accounts and identities
The Guideline provides practical advice to social media users to mitigate these risks. In addition, it provides a detailed step-by-step guide directing readers to change their privacy settings on various social media platforms such as Facebook, Instagram and X.
EU data protection laws apply in Ireland. Data Protection Acts from 1988 to 2018 (together the Irish DP Acts) The Irish DP Acts work alongside the EU GDPR to govern data protection in Ireland. Data Protection Act 2018 (2018 Act) The 2018 Act establishes the statutory powers, duties and functions of the Data Protection Commission (DPC). The DPC is the Irish SA responsible for monitoring the application of data protection legislation in Ireland.
Section 31 of the 2018 Act gives effect to the EU GDPR's digital age of consent in Ireland. Under the 2018 Act, service providers like social media platforms are not permitted to process the personal data of a data subject under the age of 16 years on the basis of that child's consent alone. It is the responsibility of the service provider to obtain the consent of the child's parents prior to processing such data.
S.I. No. 336/2011 (Irish e-Privacy Regulations) The Irish e-Privacy Regulations transposed the ePD. DPC and the EDPB While the DPC is the SA in Ireland, the interaction between the DPC and the EDPB is critical to keep in mind. As noted above, the EDPB is essentially the arbiter of disputes between EU SAs where there is disagreement on a decision or sanction to be imposed on a company for data protection non-compliance. Given that a number of social media companies have their EU headquarters in Ireland, there are regular interactions between the DPC and the EDPB which can have impacts on social media companies based in Ireland. DPC's Regulatory Strategy for 2022-2027 (2022-2027 Strategy) One of the DPC's main priorities in its 2022-2027 Strategy is the protection of children and other vulnerable groups. The EU GDPR notes that children merit special protection with regard to their personal data. Children and other vulnerable groups share the common risk factor of frequent dependency on others to advocate on their behalf. The DPC notes that there is confusion among these groups around data sharing and, particularly, around consent. This has resulted in children and vulnerable adults enduring prolonged exposure to adverse situations. Children Front and Centre: Fundamentals for a Child-Oriented Approach to Data Processing 2021 (Fundamentals) (Ireland's age-appropriate design code) This guidance outlines principles and recommendations for companies processing children's data in Ireland. The guidance applies to all organisations providing services directed at, intended for, or likely to be accessed by children, regardless of whether they are online or offline services. The Fundamentals outline 14 core principles that serve as baseline expectations for organisations processing children's data.
The Fundamentals also provide examples of data protection by design and default measures, including default privacy settings, user choice, limitations on data sharing, avoidance of manipulative techniques, and the provision of parental dashboards where appropriate.
EU data protection laws apply in Spain. Spanish Organic Act 3/2018 on Data Protection (Data Act) The Data Act regulates privacy in Spain and particularly affects social network operators, regulating the bases and purposes they can rely on for processing users' personal data, their transparency obligations and other types of guarantees social networks must comply with. Spanish Data Protection Agency and case law While the EU GDPR and the Data Act are the primary sources of data protection law in Spain, the Spanish Data Protection Agency and the body of decisions it issues also play a significant role, particularly given that the Spanish Data Protection Agency is one of Europe's most active and strict SAs.
EU data protection laws apply in the Netherlands. Dutch Data Protection Authority (DDPA) The DDPA is the SA for data protection law in the Netherlands. Two key themes in the DDPA's current work have particular relevance for social media:
Dark patterns Social media platforms can make use of so-called dark patterns. These are interfaces and user experiences that cause users to make unintended and potentially harmful decisions regarding the processing of their personal data. With regard to dark patterns, the principles of fair processing, transparency, data minimization and accountability come into play in particular. Consent and data-protection-by-design and by-default requirements also play an important role. The latter can help providers avoid dark patterns already present at the design stage. Targeting Besides using dark patterns, platforms can offer targeting services as part of their business model. The criteria for targeting individuals can be developed based on personal data that users have (actively) provided or shared with social media platforms. In addition, personal data may be observed or derived and collected by the provider or by third parties to gain further insights and support the targeting services. The combination of data collected from different sources and the potentially sensitive nature of the personal data processed may pose risks to the fundamental rights and freedoms of individuals. The legal basis on which the provider/targeter may rely may be consent of the data subject or legitimate interests. The provider/targeter must meet certain specific conditions to show that it can rely on them as a legitimate legal basis for targeting.
Federal Decree-Law No. 45 of 2021 on the Protection of Personal Data (PDPL) This law came into effect on January 2, 2022. The PDPL, which is the main governing piece of legislation, applies to all onshore personal data processing. It offers a legal framework to safeguard the confidentiality and security of personal information, with an emphasis on the rights and responsibilities of all parties involved. The PDPL creates specific obligations that must be met to enable compliant data processing. These include:
Personal data must be processed in line with a legal basis and data processing principles such as data minimization, transparency and anonymization. Article 10 of the PDPL also mandates that companies (whether controller or processor) appoint a data protection officer where the processing uses new technologies or carries a risk to the confidentiality and privacy of data subjects, where the processing entails systematic and extensive evaluation of sensitive personal data, or where the processing of sensitive data is performed on a large scale. As under most data protection laws globally, valid consent is essential. Article 6 of the PDPL provides the following necessary conditions for obtaining valid consent from data subjects for the processing of their data:
• The data controller must be able to prove the data subject’s consent if consent is relied on as a lawful basis for the processing of their personal data.
• The consent can be obtained electronically or in writing but must be obtained in a clear, simple, unambiguous and accessible manner.
• The method for obtaining consent must include information on how the data subject may withdraw their consent, and the withdrawal procedure must be simple.
International transfers The PDPL permits the transfer of personal data to countries deemed by the UAE Data Office to provide a sufficient degree of protection. These countries must have special legislation for the protection of personal data or have ratified bilateral or multilateral agreements for the protection of personal data. For countries not authorized by the UAE Data Office as having an appropriate degree of protection, the PDPL offers a variety of transfer options for personal data. As the UAE Data Office is not yet established, there is as yet no visibility over which jurisdictions will be deemed “adequate” by that office.
Data Protection Act 2018 (UK DPA) and UK GDPR The UK GDPR is the EU GDPR as transposed into UK law by Section 3 of the European Union (Withdrawal) Act 2018. The UK DPA and UK GDPR regulate privacy in the UK and affect social network operators by regulating the bases and purposes they can use for processing users’ personal data, their transparency obligations and other rights that must be granted to individuals with regard to their personal data. At the time of publication of this guide, the UK GDPR and EU GDPR remain substantially similar, with the UK DPA providing additional detail where permitted under the UK GDPR, and with minimal changes to the text of the EU GDPR save for replacing references to European bodies and institutions with their UK equivalents (e.g. referring to the UK Information Commissioner (IC) rather than a supervisory authority). Privacy and Electronic Communications Regulations (PECR) The PECR sit alongside the UK GDPR and UK DPA and give individuals specific rights in relation to direct marketing communications. International Transfers out of the UK Following both Brexit and the Schrems II case (see the EU data protection section for more information), the IC has prepared the UK International Data Transfer Agreement (IDTA), the UK equivalent of the EU SCCs, which is used for transfers of personal data out of the UK. The IC has also prepared a UK Addendum which can be used alongside the EU SCCs to adapt them for a UK international transfer, enabling companies to avoid needing to use both the EU SCCs and the IDTA (where appropriate). As noted in the EU data protection section, transfer risk assessments are also required when considering transfers of personal data out of the UK. For transfers from the UK to the US, the UK Secretary of State for Science, Innovation and Technology issued regulations on 21 September 2023 essentially creating the UK-US Data Bridge.
The UK-US Data Bridge is essentially an extension of the DPF (see the EU section for further information), enabling transfers from the UK to the US for those US entities that have self-certified under the DPF and this new DPF extension. The UK-US Data Bridge will become effective from 12 October 2023. It is important to remember that the DPF and the UK extension do not grant adequacy to all US entities; rather, they cover only those US entities that are eligible to, and have, self-certified under the DPF regimes. Data Protection law and Competition law overlap The intersection between data protection law and competition or antitrust law is also becoming a relevant consideration, particularly in light of the EU’s DMA and DSA. While both of these are EU laws, they may have indirect (if not direct) effect on companies within the UK, and where they do not, the UK Competition Act 1998 will, particularly if actions taken by social media providers in relation to the handling of personal data are considered an abuse of a dominant position and as such anticompetitive. This will become more relevant in the metaverse, where competitors may need to share information in order for their users to move freely about the metaverse, but it may also be a live issue now in terms of access to information held, or services provided, by social media companies, which may be different or on different terms when provided within their own group versus to non-group companies. Children’s data processing For data protection purposes in the UK, a child is someone under the age of 13; however, the age at which an individual can generally enter into a legally binding contract in the UK is 18.
The impact of these points will differ for the use of social media: there could be questions as to whether someone under the age of 18 can legitimately agree to the terms of service of a social media provider, while the processing of the personal data of platform users who are under 18 but at least 13 may still be lawful under the UK GDPR. Age-Appropriate Design Code (the “Children’s Code”) To assist organisations processing the personal data of children, the IC issued the Children’s Code in September 2021. The Children’s Code requires that designers of apps or social media platforms make children’s privacy a primary consideration. It sets out 15 standards of age-appropriate design reflecting a risk-based approach. The focus is on providing default settings that ensure children have the best possible access to online services while minimizing data collection and use by default. Many social media platforms have implemented changes to their child privacy and safety measures as a result of the Children’s Code.
Federal Privacy Law There is currently no comprehensive, cross-sector federal data privacy law in the US. State Privacy Laws Individual US states have enacted their own consumer privacy laws. As of the publication of this guide, the following twelve state-level laws have been passed:
- CCPA, as amended by the California Privacy Rights Act
- Colorado Privacy Act
- Connecticut Data Privacy Act
- Delaware Personal Data Privacy Act
- Indiana Consumer Data Protection Act
- Iowa Consumer Data Protection Act
- Montana Consumer Data Protection Act
- Oregon Consumer Privacy Act
- Tennessee Information Protection Act
- Texas Data Privacy and Security Act
- Utah Consumer Privacy Act
- Virginia Consumer Data Protection Act
Several of these laws are currently effective, while the rest will become effective at various times over the next few years, with the latest set to take effect in 2026. These laws are similar to the EU GDPR in that they set forth a comprehensive set of requirements related to the protection and security of consumers’ personal data as well as provide consumers with specific rights. Consumer rights generally include the right to access, right to correct, right to delete, right to portability, right to opt out of sale and right to nondiscrimination. In addition, the majority of the state laws require detailed consumer privacy notices, data processing agreements and, under specified circumstances, data protection impact assessments. International Transfers Unlike the EU GDPR or UK GDPR, current US state consumer privacy laws do not place restrictions on the transfer of personal data outside the US. Children’s Online Privacy Protection Act (COPPA) COPPA was enacted to protect children’s safety and privacy on the internet. The law requires operators of websites and online services directed to children, or any operator that has actual knowledge it is collecting personal information from a child, to, among other things, (1) include a privacy policy on its website explaining its data practices related to children’s data; (2) obtain verifiable parental consent for the collection, use or disclosure of children’s personal data; and (3) upon a parent’s request, disclose information about the types of personal information collected from the child (15 U.S.C. § 6502).
As social media continues to evolve with the rapid emergence of new social media platforms and increasingly novel communication mechanisms, social media regulation also continues to evolve to seek to modernize and strengthen the legal and regulatory framework of such social media companies. We anticipate the following trends being key areas of development and focus with respect to social media in the short to medium term:
• Growing use of social media channels for e-commerce purposes
• Increased popularity in the development and sale of non-fungible tokens (NFTs)
• Brands leveraging video content more strategically
• Further growth in “paid for” advertising (including the use of social media influencers)
• Growth and increased user numbers for social audio platforms
• Increased use of virtual reality (VR) and augmented reality (AR) products/offerings
• Increased use of social media for customer service
Based on the foregoing, the public, industry groups and civil society generally are increasingly demanding the development and modernization of social media regulation. There is now a noticeable global movement to regulate social media companies, and there have been numerous legislative and policy developments designed to equip regulators with the tools required to address the scale of the issues associated with social media. We consider the following items to be key areas for future legislative developments:
Key areas for future legislative developments
Targeted regulation of social media companies
We expect to see regulators and legislators trying to strike a balance among social responsibilities, the potential risks of online harm and safeguarding individuals’ freedom of speech.
IP infringement
Increased actions are expected relating to alleged infringement of IP rights via social media platforms in new and more advanced forms.
Competition law and data
Given the scale and significance of some online platforms, an increased focus around competition law and data is likely.
Challenges arising from developments in telecom and broadcasting laws
We expect to see challenges arising from potential changes in telecom and broadcasting laws, particularly changes that look to level the playing field between traditional telecom providers and nontraditional providers of communication services. The EU has already amended the European Electronic Communications Code to account for so-called over-the-top communication services (such as messaging apps provided by social media and other companies), and we anticipate the UK doing the same. However, given some of the critical differences between traditional telecom providers (such as the need for address and location information in order to price, itemize and bill calls) and nontraditional communication service providers, we understand that the UK is looking to better understand these key differences so that it can propose legislative changes that account for them.
Artificial Intelligence and Generative AI on Social Media
Increased use and development of the metaverse
Many businesses, including social media outlets, are continuing to improve their use of VR, AR and artificial intelligence (AI) to enhance interaction with customers by offering them new and immersive 3D experiences online. The creation of a parallel universe (the metaverse) requires all fields of law to be adapted to this new reality (exemplified by the use of prefixes such as “e-,” “virtual,” “meta-,” “smart,” “cyber” or “.0”). These fields include IP, privacy, virtual goods and rights (NFTs), “smart” or “fat” contracts, cyber currencies, taxes, ethics, e-marketing and ad tech, environmental, and distribution. By way of example:
IP. How do IP rights owners protect and enforce their IP rights in the metaverse? How do content creators deal with the metaverse? How do IP rights owners deal with the licensing or transferring of their rights? How will copyright be enforced if users’ identities are not easily discoverable?
Privacy. How will companies manage the sharing of personal data in the metaverse and set up the accountability and privacy obligations required to protect its use? How will they deal with applicable governing laws? How will they deal with data transfers and determine when a transfer is occurring? How will they deal with data breaches and accidental exposures?
NFTs/crypto/smart contracts/fat contracts. Will NFTs comply with the IP rights of third parties? How should an NFT as a license be dealt with? How should the conversion of an artwork into an NFT be dealt with? Which rules apply to the creation and payment of cryptocurrencies? What impact do antibribery laws or money laundering regulations have on NFTs?
Taxes. Is the transfer of virtual currency/assets in the metaverse subject to tax requirements? Are NFTs understood to receive similar treatment as cryptocurrencies regarding tax?
Ethics. Who will be held accountable for transparency and the ethical use of the data being collected? How will companies deal with their brand ethics regarding, for example, a person’s age in the metaverse? How will companies handle the use of biometric data and the risk of cybersecurity attacks or breaches?
Ad tech. Which governing laws apply to product placement in the metaverse? Which legal obligations in terms of rights of publicity and IP apply? How should ad fraud or a brand’s safety be dealt with?
Environmental. Which environmental regulations will apply to the metaverse? In relation to brand image, how can a company face the environmental challenges of the metaverse? What will be the green impact of the metaverse?
Distribution/exclusive supply. Which goods/assets are involved? Are they virtual or tangible? Has the metaverse been covered in the exclusivity clause?
Key areas for future legislative developments, by geography
Regulation Proposal on Child Sexual Abuse Material (CSAM) This proposed regulation would impose a range of obligations to remove CSAM, including conducting risk assessments and mitigation, issuing detection and removal orders, reporting, and using detection technology. The regulation targets hosting services, interpersonal communication services (i.e. messaging services), app stores and internet access services. AI The EU AI regulation is a landmark piece of proposed legislation that is expected to be adopted in 2023 and introduces a tiered regulation of AI systems where certain use cases are prohibited and others are subject to substantial compliance obligations. Prohibited practices include “subliminal techniques” beyond a person’s consciousness in order to materially distort a person’s behavior or where an AI system exploits the vulnerabilities of a specific group of persons due to personal characteristics. “High risk” AI covers a wide range of sectors such as law enforcement and biometrics, and in such instances, AI providers must adhere to burdensome technical and transparency requirements and conformity assessments. Furthermore, the AI Liability Directive will introduce a new standard of tortious liability and disclosure obligations for AI providers and will harmonize rules across the EU. The metaverse The EU has taken an active stance in the digital spaces—the AR and VR sectors—and has launched initiatives to prepare for this nascent industry’s maturity. European Commissioner Thierry Breton has stated, “This new virtual environment must embed European values from the outset.... 
Private metaverses should develop based on interoperable standards and no single private player should hold the key to the public square or set its terms and conditions.” While the DSA and DMA will provide regulators with new tools to police digital spaces, the European Commission is expected to publish a Metaverse Regulation this year covering issues such as network infrastructure taxes, digital rules following the DSA and DMA, and safety and interoperability measures. Data protection The European Commission’s Work Programme 2023 includes initiatives concerning digital enforcement and enhanced data use. Proposals to amend consumer protection cooperation rules, make updates to harmonize the approach by data protection authorities enforcing the EU GDPR and set up a “common European mobility data space” are to be examined. In addition, it has been acknowledged for some time now that the ePD is outdated. In 2017, the European Commission proposed text for a new e-Privacy Regulation that, when in force, will repeal the ePD. The e-Privacy Regulation is still in draft, having gone through numerous rounds of negotiation. Even after 6 years it is still unclear what the e-Privacy Regulation will provide or when it will come into law, but areas of focus over the past 6 years have been on cookies and other tracking technologies, including in new technologies such as instant messaging apps, Voice over Internet Protocol platforms and the Internet of Things. These will inevitably be relevant to social media so a watching brief on this will be important. Another key consideration within the e-Privacy Regulation is the potential for the enforcement to include the one-stop-shop mechanism as set out in the EU GDPR for regulatory purposes. It is also highly likely that the level of fines for non-compliance with the e-Privacy Regulation will mirror those set out in the EU GDPR.
EU Developments will be equally applicable in France. French Online influencer law The French Parliament adopted on June 1, 2023, a draft bill that seeks to regulate influencers’ activities online after years of growing concern over the impact of influencer content on social media, particularly with respect to certain products (health, financial) and advertising campaigns that resulted in an increasing number of fraud claims. The purpose of this law will be to:
• Define digital “influencer” and “influencer agent”
• Provide a contractual framework for the relationship between influencer and influencer agent, setting out certain rights and obligations between them; those agreements will have to be governed by French law when the contracting influencer targets French subscribers
• Strengthen the role of online platforms in dealing with unlawful content (in particular by implementing a reporting tool); online platform operators will also be required to publish an annual report on content moderation activities
• Prohibit or restrict digital influencing activities in relation to certain products and services (e.g. plastic surgery, cryptocurrencies)
Social media specific laws There are no social media-specific laws and regulations in Hong Kong. However, certain enforcement activities by the Hong Kong Government may impact this area in the future. Hong Kong National Security Law Although the NSL is in force, there has been limited enforcement or use of some of the powers it gives to Hong Kong law enforcement agencies, such as the ability to remove online content or obtain user data without a judicial warrant. To date, law enforcement agencies have only requested that users of social media accounts remove information in violation of the NSL; we are not aware of any request to social media platforms to remove content or hand over user data. We anticipate this will change, meaning social media platforms will begin receiving such requests directly, rather than users being asked to make changes. Potential changes or updates to the PDPO The PCPD is working closely with the Hong Kong Government on a review of the PDPO with the intention of proposing significant amendments. These include establishing a mandatory mechanism for data breach notification (there is currently only non-binding best practice guidance on this in Hong Kong), requiring the formulation of a data retention policy, empowering the PCPD to impose administrative fines and introducing direct regulation of data processors. As social media platforms collect and process significant amounts of personal data, these amendments will have an impact on them, potentially increasing compliance costs and imposing more stringent requirements.
EU Developments will be equally applicable in Ireland. General Scheme of the Digital Services Bill 2023 (2023 Bill) On March 20, 2023, the 2023 Bill was released by the Department of Enterprise, Trade and Employment, but as of the date of publication of this guide, it has yet to be enacted. The 2023 Bill will support, at a national level, the DSA (see the EU general laws section). The 2023 Bill addresses the designation of the Media Commission as the digital services coordinator for Ireland and will give effect to other miscellaneous matters, including the establishment of a liability regime for providers of online intermediary services and the harmonization of court orders to take down illegal content from online services, including social media platforms. Online harm and misleading content An emerging focus in Ireland has been on regulating harmful and misleading content online, moving away from an era of self-regulation for large online platforms, such as social media platforms. The consequence of this focus is that there may be either more regulatory enforcement of existing laws or new laws or guidance issued to tackle these issues. Some of these developments are set out in this section.
Online Behaviour: Influencer Marketing Report (the Influencer Report) The need to regulate misleading online content can be seen in the CCPC’s recent Influencer Report, which investigated online consumer behavior and influencer marketing. In its recommendations, the CCPC states that social media platforms should assume greater responsibility for informing and educating their users on misleading advertising, and further recommends that such platforms support users by labeling online content and facilitating the reporting of misleading advertising. Criminal Justice (Incitement to Violence or Hatred and Hate Offences) Bill 2022 (Hate Crime Bill) The Hate Crime Bill would add a hate crime element to many common crimes and, if enacted, would change the definition of “incitement to hatred” (essentially hate speech), criminalizing any intentional or reckless communication or behavior likely to incite violence or hatred against a person or persons because they are associated with a protected characteristic. The Hate Crime Bill would further make it an offense to prepare or possess material (such as posters or leaflets) likely to incite violence or hatred, even if the material had not yet been shared or made public. Communications (Retention of Data) (Amendment) Act 2022 reforms While this law has bridged a gap to allow for the continued retention of and access to certain data for criminal justice and national security purposes, the Minister for Justice has signaled an intention to further reform and clarify the law on data retention in Ireland. A new general scheme of a bill is expected to be published that will consolidate existing legislation on the retention of data. Artificial Intelligence In July 2021, the Irish Government published a national AI paper titled “AI – Here for Good: National Artificial Intelligence Strategy for Ireland”, which outlines a number of programs to incentivize collaboration between the public sector and businesses.
The Irish Government has welcomed the findings from bodies such as the Expert Group on Future Skills Needs, which found in its report “AI Skills: A Preliminary Assessment of the Skills Needed for the Deployment, Management and Regulation of Artificial Intelligence” that everyone, regardless of their area of employment, will require some knowledge and understanding of AI. Blockchain The Department of Finance published a paper on virtual currencies and blockchain technology outlining its views on the effects of virtual currencies on consumers and companies in areas such as data protection. Additionally, the Irish tax authority issued a manual titled “Taxation of Cryptocurrency Transactions” explaining the tax treatment of crypto assets. Virtual asset service providers will need to consider their obligations following the transposition of the fifth anti-money laundering directive into Irish law by way of the Criminal Justice (Money Laundering and Terrorist Financing) (Amendment) Act 2021.
EU Developments will be equally applicable in Spain. Anonymity Anonymity is a general concern for the enforcement of legal actions and has frustrated many civil and administrative actions, with the authorities unable to follow through due to the difficulty of identifying defendants or prosecuted individuals or even, with the advent of blockchain, of seizing digital assets. This concern has so far been addressed only in legal doctrine (where the prevailing belief is that a change in the internet’s architecture will ultimately solve the problem), and nothing has yet been done from a lawmaking standpoint. Fake news Legislators have made reducing fake news one of their priorities, and we have seen public announcements of initiatives to tackle the problem (including the possibility of setting up a Ministry of Truth, very much in Orwellian fashion). However, these announcements have been met with severe criticism, mainly from constitutional scholars who highlight the troublesome nature of such initiatives from the standpoint of freedom of expression rules. Nothing has to date been presented to the legislative bodies for enactment.
EU Developments will be equally applicable in the Netherlands. Social media specific laws There are no clear indications whether new national laws and regulations specific to social media will be introduced in the (near) future.
Social media networks in the UAE are effectively regulated in accordance with international norms and in consideration of the country’s socioeconomic needs. Media cities The UAE not only monitors current media regulations but also employs a variety of methods to promote their evolution. The UAE Government established media cities to encourage local, regional and international entrepreneurs to establish offices producing major media content in all forms. Media and Film regulation changes It is widely believed that the legal framework surrounding the media in the UAE will be revised at some point to address the media in general and digital media in particular, notwithstanding the existing 2018 Electronic Media Activity Regulation Resolution. There are no clear indications as to when this will occur, although we believe it might be in the near future. The website censorship committee established by the MRO with representatives from the Ministry of the Interior, the Telecoms and Digital Government Regulatory Authority and the Signals Intelligence Agency will also be important to monitor (In brief: media law and regulation in United Arab Emirates – Lexology).
Online Safety Bill (OSB) We expect to see regulators and legislators trying to strike a balance among social responsibilities, the potential risks of online harm and safeguarding individuals’ freedom of speech. In the UK, this is materializing via the highly contentious OSB, which, at the time of publication of this guide, has just completed its parliamentary review process, with the text of the OSB now agreed. The OSB is therefore awaiting Royal Assent before becoming law; the date for Royal Assent is not yet known. The OSB has been the subject of much public focus and scrutiny. It will impose obligations on platforms to identify and remove illegal content, particularly material relating to terrorism and child sexual abuse, as well as impose additional duties of care on platforms that are likely to attract child users. A critical issue surrounding the protection of minors (particularly from online sexual abuse and/or grooming) is encrypted private messaging (whether sent via social media messaging channels or other messaging apps). There is concern that the requirements in the OSB risk undermining the privacy and security of end-to-end encryption for private messaging for all users, not just in relation to children. Whether the final text of the OSB has done enough to allay these fears, or to create enough room or time for the development of solutions that do not impact privacy, remains to be seen. Fines will be introduced for non-compliance, the maximum being the greater of £18 million or 10% of worldwide revenue for the previous full accounting period. Digital Markets, Competition and Consumer Bill (Bill) The headline-grabbing change in the Bill is a wide-ranging new competition regime for digital markets that will put the Competition and Markets Authority (CMA) at the heart of the UK Government’s policy to regulate large digital platforms.
This is a step-change in the UK regulatory landscape that provides for a proactive and interventionist regulatory regime—similar to the regulation of financial institutions and utilities—for some of the major Big Tech companies. The Bill also proposes substantial reforms to UK competition law more generally that impact all sectors, not just digital markets. The CMA’s enforcement powers to intervene in anticompetitive conduct are expanded, and proposed reforms to the UK’s merger control regime introduce new ways in which the CMA can intervene in M&A deals. Importantly, the Bill provides the CMA with an explicit ability to intervene in conduct that takes place outside the UK and to review M&A deals that have a very limited impact on the UK economy. Finally, the CMA gains a whole new suite of tools to enforce consumer law, including the ability to impose fines of up to 10% of global turnover on companies that break the law. The Bill thus substantially strengthens the role and powers of the CMA for the future. The timing of the adoption of the Bill will largely depend on Parliament; however, according to an official from the Digital Markets Unit, the Bill may not receive Royal Assent until spring 2024. Advertising Among the key findings of the House of Commons’ Digital, Culture, Media & Sport Committee’s report on influencer culture is that the rapid growth in social media influencing has outstripped advertising regulations, meaning that updates to the ASA’s remit, further reform and stronger enforcement powers are urgently required. The committee is currently considering giving the ASA statutory powers to enforce the CAP Code in order to improve compliance. Changes to UK data protection laws: the draft Data Protection and Digital Information (No.2) Bill (Data Reform Bill) The Data Reform Bill has been tabled by the UK Government and is currently going through House of Commons review (having had two readings in Parliament already).
The Data Reform Bill aims to clarify, amend and in some cases relax current UK data protection obligations for businesses. Of particular importance to social media platforms, the Data Reform Bill proposes to raise potential fines for infringements of the PECR (currently capped at £500,000) to bring them in line with the UK GDPR (£17.5 million or 4% of global turnover, whichever is higher), which will affect how social media companies deploy cookies to track users (covering both on- and off-platform tracking). The changes seek to reduce barriers to responsible innovation in a digital economy and are therefore more pro-business in character, but they apply only in relation to the UK. Some of these pro-business proposals are being met with concern outside the UK, in particular in the EU, where some privacy activists are calling for a re-examination of the UK adequacy determination if the Data Reform Bill is passed as is. If this happens, it would have consequences for, among other issues, data transfers between the UK and the EU. Further regulation of AI technology The UK Government has published a National AI Strategy with a long-term view of establishing a UK framework for the regulation of AI, and changes to legislation affecting the rollout of AI technology and solutions are anticipated. Increased use and development of the metaverse We have already seen changes being implemented in the IP sphere to prepare for the advent of the metaverse and to ensure that trade marks are appropriately protected there; for example, it is now possible to register a trade mark in the following classes: online non-downloadable virtual goods and NFTs (class 42) and financial services, including digital tokens (class 36).
In the US, social media legislative trends signal an increasing focus on online speech, youth, and a continued desire to protect consumers’ privacy. Section 230 reform Over the past couple of years, multiple bills have been proposed in both Congress and state legislatures seeking to regulate social media companies in general and, in particular, such companies’ content moderation policies. At the federal level, most conversations focus on Section 230 reform, with some lawmakers calling for it to be amended or repealed due to “censorship” concerns, while others are concerned it does not do enough to ensure the removal of harmful content. In a pair of opinions decided in May 2023, the US Supreme Court declined to hold technology platforms liable for aiding terrorism under the Justice Against Sponsors of Terrorism Act because of content posted by their users, but avoided analyzing Section 230’s protections for social media companies. (Twitter, Inc. v. Taamneh et al., 598 U.S. ____ (2023); Gonzalez v. Google LLC, 598 U.S. ____ (2023).) Social Media Access by Minors There is a growing trend among states to regulate minors’ access to social media platforms. (E.g., Tex. Bus. & Com. Code §§ 509.001- 501.152; 2023 Ark. ALS 689; ORC Ann. §§ 1347.01-1347.99; 2023 Bill Text LA H.B. 61; 2023 Bill Text LA S.B. 162; Utah Code Ann. Title 13, Ch. 63 §§ 101-501.) These laws require social media companies (as defined by the law) to obtain parental consent before allowing minors (typically under 18 or 16 years old) to create an account. Many also impose age-verification requirements for all users on the platform. The majority of these laws will take effect in 2024, and certain of them contain a private right of action in addition to regulatory enforcement.
Age Appropriate Design Codes California’s Age Appropriate Design Code, which takes effect July 1, 2024, applies to any covered business offering online services, products, or features likely to be accessed by children under 18 years old. (2022 Cal ALS 320.) Design obligations include, but are not limited to, configuring default privacy settings to a “high level” of privacy and providing an “obvious signal” to the child if the online service allows parents to track the child’s online activity. Florida also recently passed a similar law. (Fla. Stat. §§ 501.0001-501.1735.) State User Content regulation At the state level, two laws—Florida’s SB 7072 and Texas’ HB 20—have been passed that aim to ban social media companies from regulating users’ content. In May 2022, the US Court of Appeals for the 11th Circuit upheld the injunction placed on Florida’s law, holding that social media companies are private actors whose rights the First Amendment protects. (NetChoice, LLC v. AG, Fla., 34 F.4th 1196 (11th Cir. May 23, 2022).) In September 2022, the Fifth Circuit reached the opposite conclusion, rejecting the idea that the First Amendment always protects companies’ decisions regarding content moderation. (NetChoice, LLC v. Paxton, 49 F.4th 439 (5th Cir. Sept. 16, 2022).) Both cases have been appealed to the Supreme Court, which has yet to decide whether it will resolve the circuit split. Consumer privacy developments In addition to online speech regulation, federal and state legislatures have demonstrated a keen interest in regulating social media companies’ processing of consumers’ personal data. As noted in the US data protection section, there are currently 12 states with consumer privacy laws, with more state laws anticipated (if not already in draft).
Although the passage of a federal data privacy law does not appear imminent, businesses should stay aware of congressional and agency developments in this area, as it remains a key area of interest for regulators and consumers at both the federal and state levels. Federal data protection law At the federal level, some progress was made during the summer of 2022 when the proposed American Data Privacy and Protection Act advanced out of the Committee on Energy and Commerce, but it failed to advance to the House or Senate floor. It remains to be seen whether any further meaningful action will be taken on the bill in the near future. In the meantime, the FTC has signaled that it may issue consumer data privacy regulations, but as with the federal bill, it could be years before any such regulations are promulgated.
FRANCE
Minors under the age of 18 are offered additional protection and safeguards from a data protection perspective under the GDPR, which introduced specific provisions for minors. In particular, these provide for the requirement of appropriate information, the strengthening of minors’ right to be forgotten, and an ability to consent, under certain conditions, to the processing of their data (alone beyond the age of 15, or with their parents before that age). They also call for particular vigilance with regard to the profiling of minors. In June 2021, the CNIL published a series of recommendations (below) to strengthen minors’ protection online, in particular on social media, where concerns over cyber-harassment have increased over the years.
1. Regulate the ability of minors to act online
2. Encourage minors to exercise their rights
3. Accompany parents in digital education
4. Seek parental consent for minors under the age of 15
5. Promote parental control tools that respect the privacy and interests of children
6. Reinforce the information and rights of minors through design
7. Verify the age of the child and the parents’ consent in a manner that respects the child’s privacy
8. Provide specific guarantees to protect the interests of the child
HONG KONG
The PCPD recognizes that children (aged under 18) are often identified as a vulnerable group with special privacy protection requirements, especially in terms of online activities. To address this issue, the PCPD issued the “Collection and Use of Personal Data through the Internet – Points to Note for Data Users Targeting at Children” guideline (the “Children Protection Guideline”). The Children Protection Guideline sets out several recommendations for businesses. For example, businesses are recommended to avoid, rather than merely limit, the collection of personal data relating to children, and to encourage children to involve their teachers or parents in the children’s online activities with the business.
IRELAND
In Ireland, a child is defined as a person under the age of 18 by the Children Act 2001. The main concerns around children’s use of social media in Ireland center on bullying and the receipt of inappropriate photos and videos. A recent study conducted by the National Advisory Council for Online Safety found that 18% of children aged 9-17 said they had seen sexual messages on the internet in the past 12 months, and 8% said they had received sexual messages. Another recent report by Cybersafe Kids, an Irish charity that promotes children’s safety online, found that 28% of children say they have experienced online bullying. Ireland has sought to tackle the issue of online bullying with the introduction of the Harassment, Harmful Communications and Related Offences Act 2020. The 2020 Act criminalizes behaviors such as distributing or threatening to distribute intimate images without consent; recording, distributing or publishing intimate images without consent; and distributing, publishing or sending threatening or grossly offensive communications. In addition to the 2020 Act, the Online Safety and Media Regulation Act has recently been passed into law, although it has not yet been commenced. The new Act will establish the new Media Commission.
One of the main responsibilities of the Media Commission will be to formulate Online Safety Codes, with the stated objective of minimizing the availability of “harmful online content.” The Act provides the Media Commission with broad investigative and enforcement powers with regard to social media providers, including the power to:
- appoint officers to investigate suspected non-compliance
- impose administrative financial sanctions of up to €20 million or 10% of turnover in respect of non-compliance
- issue notices to end non-compliance
- seek the prosecution of senior management of designated online services for failure to comply with a notice to end non-compliance
- seek to block access to certain online services
- issue content limitation notices to designated online services in respect of individual pieces of harmful online content
It is hoped that these Online Safety Codes and the enhanced powers of the Media Commission will lead to greater accountability of social media providers going forward.
SPAIN
Though the legal age of majority in Spain is set at 18 years old, children aged 14 and over are considered mature enough to consent to the processing of their data and, generally, to accept user terms and conditions. This is not a contentious issue in the law, but its reflection in the practice of social networks has indeed been contentious. The Spanish Data Protection Agency has been adamant that social networks cannot simply rely on users affirming they are of a certain age, and has required that they put in place measures to verify, if not in every case then at least to some degree, the age of their users. A particular focus of these efforts was the now-defunct Tuenti social network, which was aimed at younger audiences and had to agree on a protocol with the Spanish Data Protection Agency to verify the age of its users. It would not be surprising if this issue resurfaced with the social networks now growing among the youngest users, such as TikTok. Catfishing does not present a major concern in Spain (though some cases have indeed surfaced), but cyberbullying does. Cyberbullies often use social networks and messaging apps as their tools of choice for harassing their victims, and the authorities, including the Spanish Data Protection Agency, have made it a particular focus of their activity in recent years.
THE NETHERLANDS
Under Article 1:233 of the Civil Code (BW), a minor is a person who has not yet reached the age of 18. Where Article 6(1)(a) GDPR applies in connection with the direct provision of information society services to a child, the processing of the child’s personal data is lawful if the child is at least 16 years old. Where the child is under 16, such processing is lawful only if and to the extent that the person having parental responsibility for the child gives or authorizes consent. Consequently, it follows from Article 5 of the Dutch law implementing the GDPR that, where Article 8 of the regulation does not apply, the consent of the data subject’s legal representative is required instead of that of the data subject if the data subject has not yet reached the age of 16. In this regard, member states may establish by law a lower age, provided it is not less than 13. Moreover, according to Article 4.1, first paragraph, of the Media Act, audiovisual media offerings that could harm the physical, mental or moral development of persons younger than 16 must be made available by the institution responsible for the content in such a way that persons younger than 16 will normally not hear or see them. Furthermore, it follows from paragraph 4 of the same article that personal data of minors collected or otherwise generated by media institutions pursuant to this article may not be processed for commercial purposes. The Dutch Data Protection Authority (DDPA) fined TikTok for violating the privacy of minor children. Users of the TikTok app were shown information about installing and using the app only in English. By offering the privacy notice in English only, TikTok failed to properly explain how it collects and processes personal data.
The DDPA ruled that the requirement of understandability means, at a minimum, that when the controller addresses data subjects who speak a different language, it must provide a translation in the language of those data subjects, and that this is particularly true when addressing (young) children, so that they can easily understand the information. The DDPA took into account the fact that a relatively large group of Dutch people have a good command of the English language, but decided that this did not alter its conclusion, especially as TikTok is used by many people under the age of 16. The DDPA considered that a good command of English among data subjects in that age group is not self-evident and that TikTok should not have assumed that providing only the English version of the privacy policy to children is in accordance with Article 12 GDPR. The DDPA therefore found a breach of Article 12(1) of the GDPR, which requires the controller to take appropriate measures to provide information to the data subject in a concise, transparent, understandable and easily accessible form and in clear and plain language, in particular for information specifically addressed to a child.
UNITED ARAB EMIRATES
Pursuant to Article 1 of Federal Law No. 3 of 2016 on Child Rights, a child is any person born alive who is under the age of 18. A child’s minimum age for being granted access to social media is an important factor to consider. The UAE does not have legislation that governs minimum age requirements for social media: while there are guidelines and regulations applicable to content (most visibly in relation to the censorship of films and television and, to a lesser extent, magazines), there are thus far no provisions that set a legal minimum age for access to online content.
UNITED KINGDOM
In the UK, a minor is defined in Section 1 of the Family Law Reform Act 1969 as a person under the age of 18. However, different definitions of a minor or child exist in different laws, so the meaning will often depend on the context. In the context of contracting with children, under English law there are restrictions on the ability of “minors” (those under the age of 18) to enter into contractual relationships with businesses (save for certain exemptions, including contracts for necessaries). If an exemption does not apply, the contract is voidable at the minor’s option. In the context of data protection, the UK GDPR specifies that children require specific protection with regard to their personal data, in particular when services are offered to children and/or when personality or user profiles are created or children’s data is used for marketing. Article 8 of the UK GDPR sets the age at which children can consent to the processing of their personal data in the context of an Information Society Service (ISS) at 13 years old. Most social media services in the UK require their users to be 13 years old or over. In addition, the Age-Appropriate Design Code (Code) came into force on September 2, 2021. The Code requires that designers of apps or social media platforms make children’s privacy a primary consideration. The Code sets out 15 standards of age-appropriate design reflecting a risk-based approach. The focus is on providing default settings which ensure that children have the best possible access to online services while minimizing data collection and use by default. Many social media platforms have implemented changes to their child privacy and safety measures as a result of the Code. Against the backdrop of the UK Online Safety Bill progressing through Parliament, there have been ever-increasing calls for more stringent regulation of social media providers.
A significant development in the UK regarding children’s activity on social media has been the dialogue surrounding the suicide of schoolgirl Molly Russell. In October 2022, a UK coroner wrote to social media firms and the government calling for action following the inquest into Molly Russell’s death. It was reported that Molly ended her life in November 2017 after viewing suicide and self-harm content online. Coroner Andrew Walker issued six recommendations, including separating platforms for adults and children and reviewing the algorithms used by sites.
UNITED STATES
Laws related to children’s online activity (including their use of social media) at both the state and federal levels primarily focus on protecting children’s privacy. At the federal level, COPPA defines a child as an individual under the age of 13. As stated above, the law requires “operators” to obtain parental consent prior to collecting children’s personal data. Similarly, each of the comprehensive state privacy laws regulates the processing of children’s data. For example, in Colorado, Connecticut and Virginia (which define a child as COPPA does), children’s data is considered “sensitive data,” for which companies must obtain opt-in consent to collect or process. California’s Age-Appropriate Design Code Act, signed into law in September 2022, is an online safety law designed to further protect children’s privacy in the state. The law was inspired by the UK Age-Appropriate Design Code and is applicable to businesses that “provide an online service, product, or feature likely to be accessed by children.” The law defines children as consumers under the age of 18 and requires companies to consider the best interests of children when designing, developing and providing their online product, service or feature. Specific requirements include conducting data protection impact assessments prior to releasing new services, products or features to the public and configuring all default privacy settings to offer a high level of privacy.
Vincent Denoyelle Partner, Paris vincentdenoyelle@eversheds-sutherland.com T: +33 6 48 60 26 64
Edouard Burlet Senior Associate, Paris edouardburlet@eversheds-sutherland.com T: +33 1 55 73 40 00
Rhys McWhirter Head of Technology (Asia), Hong Kong rhysmcwhirter@eversheds-sutherland.com T: +852 2186 4969
Jeff Tian Associate, Technology, Hong Kong jefftian@eversheds-sutherland.com T: +852 2186 4997
Jamie Leung Associate, Technology, Hong Kong jamieleung@eversheds-sutherland.com T: +852 2186 4987
Marie McGinley Partner, International Head of Technology and Head of IP, Technology & DP, Dublin mariemcginley@eversheds-sutherland.ie T: +353 1 6441 457
Aisling O’Hare Of Counsel, Dublin aislingohare@eversheds-sutherland.ie T: +44 28 95262070
Ellie Cater Senior Associate, Dublin elliecater@eversheds-sutherland.ie T: +353 1 6644280
Daniel Necz Associate, Dublin danielnecz@eversheds-sutherland.ie T: +353 1 6644330
Rosalyn English Solicitor, Dublin rosalynenglish@eversheds-sutherland.ie T: +353 1 664 4972
Vicente Arias Partner and Head of Technology, Media & Entertainment, Madrid varias@eversheds-sutherland.es T: +34 91 429 43 33
Lucía Jurado Associate, Madrid ljurado@eversheds-sutherland.es T: +34 683 349 060
Olaf van Haperen European TMT Sector Head and Partner, Technology & Data Protection, Rotterdam olafvanhaperen@eversheds-sutherland.com T: +31 617 456 299
Ilham Ezzamouri Associate, Rotterdam ilhamezzamouri@eversheds-sutherland.com T: +31 6 38764682
Nathalie Djojokasiran Associate, Rotterdam nathaliedjojokasiran@eversheds-sutherland.com T: +31 10 2488 024
Nasser Ali Khasawneh Partner and Head of the TMT Sector, Dubai nasseralikhasawneh@eversheds-sutherland.com T: +971 4 389 7000
Christine Khoury Principal Associate, Dubai christinekhoury@eversheds-sutherland.com T: +971 4 389 7064
Matthew Gough Partner and Head of the UK Consumer Law Team, Cardiff matthewgough@eversheds-sutherland.com T: +44 292 047 7943
Paula Barrett Partner and Co-Lead, Global Cybersecurity and Data Privacy, London paulabarrett@eversheds-sutherland.com T: +44 207 919 4634
Eve England Partner, Cardiff eveengland@eversheds-sutherland.com T: +44 29 2047 7770
David Wilkinson Partner and Head of UK IP, London davidwilkinson@eversheds-sutherland.com T: +44 20 7919 0775
Penelope Jarvis Legal Director (South African qualified), London PenelopeJarvis@eversheds-sutherland.com T: +44 207 919 4684
Michael Bahar Partner, Co-Lead of Global Cybersecurity and Data Privacy, Washington DC michaelbahar@eversheds-sutherland.com T: +1 202 383 0882
Brandi Taylor Partner, San Francisco branditaylor@eversheds-sutherland.com T: +1 415 528 2862
Tanvi Shah Associate, San Diego tanvishah@eversheds-sutherland.com T: +1 858 252 4983
Melissa Fox Counsel, Atlanta melissafox@eversheds-sutherland.com T: +1 404 853 8109
Janell Johnson Associate, Washington DC janelljohnson@eversheds-sutherland.com T: +1 202 383 0327
In the European Union (“EU”), a wide range of new laws are being considered under the Digital Single Market Strategy, with the aim of maintaining the EU’s position in digital markets globally under the “three pillars” approach:
1. Access to online products and services for consumers and businesses
2. Shaping the environment for digital networks and services to grow and thrive
3. Maximising the growth potential of the European digital economy
In this respect, the European Commission has developed the Digital Services Act package, which includes the Digital Services Act (DSA) and the Digital Markets Act. The DSA will create additional obligations for digital service providers, including specific obligations for “very large platforms,” which are defined as platforms with over 45 million users.
The DSA will provide regulation in the following areas:
Content liability With respect to content liability, the DSA will essentially reproduce the liability safe haven that is currently provided for in the e-Commerce Directive. Therefore, a hosting service, if it has actual knowledge of illegal activity or content, must act expeditiously to remove it, or it will be held liable.
Reporting obligations Very large platforms will also have specific reporting obligations to enforcement authorities in certain cases – for example, when people’s safety is at stake.
Accountability Like the GDPR, the DSA will introduce elements of accountability and financial fines. Such fines can reach 6% of the provider’s annual turnover, which is even higher than those that can be issued under the GDPR. This shows that the DSA is not just an invitation for platforms to include mechanisms against hate speech; it actually creates binding obligations with potential fines at stake.
Notification systems Online platforms must put mechanisms in place to enable any individual or entity to notify them of the presence of illegal content. Such mechanisms must be easy to access and user-friendly. The DSA also introduces the notion of “trusted flaggers” for very large platforms. The platforms will have to treat notifications from trusted flaggers as a priority based on an internal complaints handling system that they must put in place.
Information and transparency obligations The DSA also increases information and transparency obligations, including, for example, the obligation for very large platforms to provide transparency over the main parameters of the decision-making algorithm that is used to offer content. This is in response to the aforementioned concern over freedom of speech.
Additional measures are also being considered to address the issues of harmful content and other areas that will impact social media companies such as AI, the Metaverse, competition law and data protection, as follows:
CSAM The Regulation Proposal on Child Sexual Abuse Material (“CSAM”) will impose a range of obligations to remove child sexual abuse material, including conducting risk assessments and mitigation, detection and removal orders, reporting obligations and the use of detection technology. The Regulation is targeted at hosting services, interpersonal communication services (i.e., messaging services), app stores and internet access services.
Artificial Intelligence The AI Regulation is a landmark piece of proposed legislation that is expected to be adopted in 2023 and introduces a tiered regulation of artificial intelligence systems, under which certain use cases are prohibited and others are subject to substantial compliance obligations. Prohibited practices include the use of “subliminal techniques” beyond a person’s consciousness in order to materially distort a person’s behaviour, or AI systems that exploit the vulnerabilities of a specific group of persons due to personal characteristics. “High risk” AI covers a wide range of sectors, such as law enforcement and biometrics, and in such instances AI providers must adhere to burdensome technical and transparency requirements and conformity assessments. Furthermore, the AI Liability Directive will introduce a new standard of tortious liability and disclosure obligations for AI providers and harmonise rules across the EU. Metaverse The EU has taken an active stance in the digital spaces, augmented reality and virtual reality sectors and has launched initiatives to prepare for this nascent industry’s maturity. Commissioner Thierry Breton has stated: “This new virtual environment must embed European values from the outset... Private metaverses should develop based on interoperable standards and no single private player should hold the key to the public square or set its terms and conditions.” While the Digital Services Act (“DSA”) and Digital Markets Act (“DMA”) will provide regulators with new tools to police digital spaces, there is an expectation that the Commission will publish a Metaverse Regulation this year covering issues such as network infrastructure taxes, digital rules following the DSA and DMA, and safety and interoperability measures. Competition The EU Digital Markets Act, which is targeted at large technology companies, will implement new obligations on “gatekeepers” to ensure smaller businesses are treated fairly.
The purpose of the new legislation is to encourage competition in digital markets by prohibiting certain commercial practices and requiring bigger players to adhere to positive obligations that promote competitiveness, such as providing fair access to application developers. Data Protection The Trans-Atlantic Data Privacy Framework (“TADPF”), which will allow EU-to-US data transfers, is set to replace the Privacy Shield Framework following the Schrems II decision. On 7 October 2022, the US government signed an Executive Order adopting the TADPF; the EU Commission must now draft an adequacy decision and complete its adoption procedure. The EU Commission’s Work Programme 2023 includes initiatives concerning digital enforcement and enhanced data use: proposals are listed to amend consumer protection cooperation rules, to harmonize the approach taken by data protection authorities in enforcing the EU General Data Protection Regulation, and to examine the set-up of a “common European mobility data space.”
There are no social media-specific laws and regulations in Hong Kong. However, certain enforcement activities by the Hong Kong government may impact this area in the future:
From a data protection perspective, in the past few months the Office of the Privacy Commissioner for Personal Data (the PCPD) has actively taken enforcement action under the new anti-doxxing regime, which came into effect in October 2021. The arrested individuals/defendants allegedly disclosed personal data of a data subject without his or her consent in contravention of section 64(3A) of the PDPO and may be liable to a fine of HK$100,000 and imprisonment for two years. We believe this anti-doxxing enforcement will continue to be a priority of the PCPD for the next few years. From an NSL perspective, it is expected that the Hong Kong government will undertake further enforcement to implement the law. Due to the lack of precedents, it is hard to predict the trend of such enforcement activities. However, given that the NSL allows Hong Kong law enforcement agencies to remove online content or obtain user data without a judicial warrant, enforcement activities may have an impact on the social media sector.
An emerging focus in Ireland has been on regulating harmful and misleading content online, moving away from an era of self-regulation for large online platforms, such as social media platforms.
The need to regulate misleading online content can be seen in the Competition and Consumer Protection Commission’s (CCPC) recent Online Behaviour: Influencer Marketing Report, which investigated online consumer behaviour and influencer marketing. In its recommendations, the CCPC outlines that social media platforms should assume greater responsibility for informing and educating their users on misleading advertising, and further recommends that such platforms should support users in labelling online content and facilitate the reporting of misleading advertising. As noted above, the regulation of harmful online content is also in keeping with efforts at an EU level, in particular the Digital Services Act, which entered into force in November 2022 and which will impose obligations on online platforms with respect to illegal content posted by users of their services. The Irish Government supports the measures proposed under the Digital Services Act and has noted that the newly formed Media Commission will be the primary regulator for the Digital Services Act in Ireland. One of the motivations for regulating harmful online content stems from the need to protect children online, which is another focal point of Irish policy. The Irish Data Protection Commissioner (DPC), as the lead supervisory authority for multiple social media platforms in Ireland, has a unique role in overseeing those platforms’ compliance with the GDPR. Its commitment to protecting children and other vulnerable groups is one of its strategic priorities, as set out in its Regulatory Strategy for 2022-2027. While the Communications (Retention of Data) (Amendment) Act 2022 has bridged a gap to allow for the continued retention of, and access to, certain data for criminal justice and national security purposes, the Minister for Justice has signaled an intention to further reform and clarify the law on data retention in Ireland.
It is expected that a new general scheme of a bill will be published to consolidate existing legislation on the retention of data. In July 2021, the Irish Government published a National Artificial Intelligence paper entitled “AI - Here for Good: National Artificial Intelligence Strategy for Ireland,” outlining a number of programmes to incentivise collaboration between the public sector and business. The Irish government has welcomed the findings of bodies such as the Expert Group on Future Skills Needs, which found in its report “AI Skills: A Preliminary Assessment of the Skills Needed for the Deployment, Management and Regulation of Artificial Intelligence” that everyone, regardless of their area of employment, will require some knowledge and understanding of AI. The Department of Finance published a paper on virtual currencies and blockchain technology outlining its views on the effect of virtual currencies on consumers and companies in areas such as data protection. Additionally, the Irish tax authority issued a manual entitled “Taxation of Cryptocurrency Transactions” to explain the taxation of crypto-assets. Virtual Asset Service Providers (VASPs) will need to consider their obligations following the transposition of the Fifth Anti-Money Laundering Directive into Irish law by way of the Criminal Justice (Money Laundering and Terrorist Financing) (Amendment) Act 2021.
In Spain, three areas of growing concern are (i) anonymity, (ii) privacy, and (iii) fake news.
Anonymity has frustrated many civil and administrative actions, with the authorities unable to follow through because of the difficulty of identifying defendants or prosecuted individuals or even, with the advent of blockchain, of seizing digital assets. To date, however, this concern has been addressed only in legal doctrine (where the prevailing belief is that only a change in the internet’s architecture will finally solve the problem), and nothing has yet been done from a lawmaking standpoint. Regarding privacy, both the Spanish Data Protection Agency and the criminal courts have gradually curtailed the processing of users’ data by social networks and other internet service providers, as will be explained below in more detail. As for fake news, legislators have in recent years made curbing it one of their apparent priorities, and we have seen public announcements of initiatives to tackle the problem (including the possibility of setting up a Ministry of Truth, very much in Orwell’s fashion). These announcements, however, have been met with severe criticism, mainly from constitutional scholars pointing to the troublesome nature of such initiatives from the standpoint of freedom of expression. To date, nothing has been presented to the legislative bodies for enactment.
The regulation of social media networks in the Netherlands is in line with European norms and standards.
The Netherlands recently adapted its Media Act to implement the European Audiovisual Media Services Directive, and has also recently implemented the directive on the modernization of consumer protection rules. There are no clear indications that new national laws and regulations specific to social media will be introduced in the (near) future.
The diverse array of social media networks in the UAE is regulated to some degree in accordance with international norms and with the country’s socioeconomic needs in mind.
The UAE not only monitors the current media regulations but also employs a variety of methods to promote their evolution. The government has established media cities to encourage local, regional and international entrepreneurs to set up offices producing art in all its forms, and has taken a keen interest in film promotion. Despite the 2018 Electronic Media Activity Regulation Resolution, it is widely believed that the legal framework surrounding media in the UAE will be revised at some point, in particular to address digital media. There are no clear indications as to when this will occur, although we believe it might be in the near future. The website censorship committee established by the MRO, with representatives from the Ministry of the Interior, the Telecoms and Digital Government Regulatory Authority and the Signals Intelligence Agency, will also be an important area to monitor. The rise of social media has already prompted a plethora of regulations aimed at governing this area and preventing the spread of misinformation.
In the coming years, it will be important to monitor increased regulatory scrutiny and enforcement of such laws as social media becomes ever more integrated into daily life.
The following legal developments will need to be taken into account by social media businesses:
Online Safety Bill: As noted above, we expect to see regulators and legislators trying to strike a balance between social responsibilities, the potential risks of online harm and safeguarding individuals’ freedom of speech. In the UK, this is materializing via the highly contentious Online Safety Bill, which is the subject of much public focus and politically driven scrutiny. The Online Safety Bill, as currently drafted, will require platforms to identify and remove illegal content, particularly material relating to terrorism and child sexual abuse, and will impose additional duties of care on platforms likely to attract child users. Fines will be introduced for non-compliance.

Advertising: Among the key findings of the House of Commons’ Digital, Culture, Media & Sport Committee’s report on influencer culture is that the rapid growth of social media influencing has outstripped advertising regulations, meaning that reform of the ASA regime and stronger enforcement powers are urgently required. The committee is currently considering giving the ASA statutory powers to enforce the CAP Code to improve compliance.

Changes to UK data protection laws: The UK Government recently published a draft Data Protection and Digital Information Bill (Data Reform Bill), with the stated aim of building on the UK’s current regime and clarifying, and in some cases relaxing, current UK data protection obligations for businesses. The bill has, however, been paused, and the UK Government recently announced plans to overhaul the UK GDPR more significantly. It is difficult to determine the scale of change compared to the UK GDPR (at present, the UK GDPR mirrors its EU counterpart), but of particular importance to social media platforms is that the Data Reform Bill proposes to raise potential fines for infringements of PECR (currently capped at £500,000) to bring them into line with the UK GDPR (£17.5 million or 4% of global turnover, whichever is higher). We expect that, although the Data Reform Bill may be tabled for now, this change will be implemented in other data privacy legislation.

Further regulation of Artificial Intelligence (AI) technology: The UK Government has published a National AI Strategy with a long-term view of establishing a UK framework for the regulation of AI, and changes to legislation affecting the roll-out of AI technology and solutions are anticipated.

Increased use and development of the “metaverse”: We have already seen changes in the intellectual property sphere to prepare for the advent of the metaverse and to ensure that trade marks are appropriately protected within it; for example, it is now possible to register a trade mark in the following classes: (i) online non-downloadable virtual goods and NFTs (class 42), and (ii) financial services, including digital tokens (class 36).
In the US, social media legislative trends signal an increasing focus on online speech and a continued desire to protect consumers’ privacy. As private companies, social media platforms are free to create policies around what user content they will and will not permit to be posted. Many would likely argue that such companies have a societal duty to do so - for example, to stop the spread of disinformation or to foster an inclusive environment by prohibiting hate speech.
Over the past couple of years, multiple bills have been proposed in both Congress and state legislatures seeking to regulate social media companies in general and such companies’ content moderation policies in particular. At the federal level, most conversations focus on Section 230 reform, with some lawmakers calling for it to be amended or repealed over “censorship” concerns while others worry it does not do enough to ensure the removal of harmful content. At the state level, two laws have been passed aimed at barring social media companies from moderating users’ content; both have since been enjoined by the courts. In addition to online speech regulation, federal and state legislatures have demonstrated a keen interest in regulating social media companies’ processing of consumers’ personal data. In 2018, California became the first state to enact a comprehensive consumer privacy law with its passage of the California Consumer Privacy Act (CCPA). Since then, four more states have passed similar laws. At the federal level, some progress was made in the summer of 2022 when the proposed American Data Privacy and Protection Act advanced out of committee to be set for a House vote, but Congress recessed before the vote occurred. It remains to be seen whether any further action will be taken on the bill in the near future. In the meantime, the FTC has signaled it may issue consumer data privacy regulations but, as with the federal bill, it will likely be years before any such regulations are promulgated, if at all.
Although the passage of federal data privacy law does not appear to be imminent, businesses should stay aware of congressional and agency developments in this area as it remains a key area of interest for regulators and consumers.