TPR: The Evolution of Struggle and Resistance in Digital Networks Under EU Legislation
Nuel Abuchi Obiekwe.
‘The Net interprets censorship as damage and routes around it’ - John Gilmore, 1993
‘Within networked governance arrangements, power interprets regulatory resistance as damage and routes around it’ - Julie Cohen, 2020
1. Introduction:
Reflection on the Quotes:
Gilmore’s 1993 quote underscores the decentralized nature of the Internet and its conception as a separate space: cyberspace, or the open net. Historically, states controlled the mouthpieces to which citizens listened.[1] The internet, by contrast, served as a medium to amplify free speech and cross-cultural understanding. Beneath this ‘open net’ conception, however, lay concerns about moral values, child safety, and terrorism, which formed a strong basis for calls to regulate the internet. Against those calls stood the argument that the Internet constitutes a separate environment, distinct from real space, with unique qualities that make it difficult to regulate.
The Internet is a network of interconnected computers where information, in the form of data packets, travels through multiple pathways.[2] Gilmore suggests that censorship, such as trying to block information or restrict access, is perceived as “damage” by the system. The architecture of the internet is such that it adapts or finds ways around restrictions, making it difficult to fully control or censor information flows. The quote emphasizes the internet’s ability to empower individuals by ensuring that information can flow even in restrictive environments: citizens with technical expertise can often bypass state-imposed filters using proxy networks and VPNs, demonstrating the “routing around” nature of the net.
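Gilmore’s metaphor can be made concrete with a small sketch. The Python fragment below is purely illustrative (the network, node names, and function are invented for this essay): it models a mesh of routers and shows that removing, or “censoring,” one link merely pushes traffic onto another path.

```python
# Illustrative sketch of Gilmore's point: in a mesh of routers, removing
# (censoring) one link does not stop delivery as long as another path exists.
from collections import deque

def find_route(links, source, dest):
    """Breadth-first search for any path from source to dest."""
    queue, seen = deque([[source]]), {source}
    while queue:
        path = queue.popleft()
        if path[-1] == dest:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# A small mesh: every node lists its directly connected neighbours.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

print(find_route(links, "A", "D"))   # ['A', 'B', 'D']

# "Censor" the B->D link: the network treats it as damage...
links["B"].remove("D")
print(find_route(links, "A", "D"))   # ...and routes around it: ['A', 'C', 'D']
```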
Cohen’s quote highlights surveillance capitalism in internet regulation, shifting the focus from technological resistance to power dynamics in networked governance. Read against the backdrop of self-regulation, it captures a new logic of accumulation by tech giants.[3] Large tech corporations wield enormous regulatory power over users and smaller entities in today’s digital landscape. Rather than being helpless in the face of regulation, these corporations co-opt and adapt regulatory efforts to preserve their market dominance.[4]
Cohen’s statement reflects the growing ability of corporate and state powers to evade regulatory restrictions, just as the internet once evaded censorship. Cohen highlights how regulatory frameworks that seek to impose limits on power often trigger sophisticated strategies to subvert or adapt to those rules, thus preserving the existing power dynamics.[5] Entities with significant power, such as large tech corporations, view any form of regulatory resistance not as a legitimate check but as a threat. This view creates a defensive posture, where the primary goal is to neutralize resistance rather than engage with it.[6]
The phrase “routes around it” suggests a strategic approach in which tech giants seek alternative means to achieve their objectives, often at the expense of regulatory compliance. This tactic tends to undermine the legitimacy of governance structures and erode public trust, as these firms prioritize profit and control over ethical considerations.
Both views are crucial for understanding the tensions between transnational private and public regulation in digital networks. Gilmore focuses on the technological evasion of censorship, while Cohen’s observation applies to how corporations and governments circumvent legal and regulatory obstacles. These two dynamics come into play in the context of Transnational Private Regulation, particularly as the EU seeks to implement rules through the Digital Markets Act and Digital Services Act.
2. Transnational Private Regulation
TPR refers to the mechanisms by which private actors, NGOs, tech giants, international or intergovernmental organizations, and other technical standard setters set rules that govern the digital space, often with minimal or indirect involvement from states.[7] These bodies exercise independent regulatory powers, devolved through international law, national legislation, or contract, to prescribe standards or systems of behavior on internet issues with minimal supervision from states, as in the case of ICANN, whose mission is to coordinate the allocation and assignment of domain names.[8]
This regulation plays a crucial role in shaping the operational behaviors of internet platforms and services across national borders. The global nature of the internet often makes traditional national regulation insufficient. TPR ensures the digital ecosystem remains secure and competitive and respects users’ rights, especially when addressing issues spanning multiple jurisdictions.
2.1 Understanding the Role of Key EU Legislation in the Regulatory Regime
The EU has adopted several key regulatory frameworks to govern online activities. While state-driven, these frameworks mandate private regulatory mechanisms. Though not strictly private in the traditional sense, they create a framework in which private entities must follow rules that shape global Internet governance through self-regulatory mechanisms or industry-wide compliance.[9]
- Digital Services Act:
The DSA, which entered into force in November 2022, represents a significant step towards regulating online platforms, especially large social media companies, marketplaces, and other service providers that act as intermediaries between users and digital services. The Act introduces a set of rules for content moderation, transparency, and user safety, aiming to create a safer digital environment. One of the key features of the DSA is the imposition of greater responsibilities on platform providers to ensure that third-party content does not spread harmful misinformation, hate speech, or unlawful activities.[10]
The DSA enforces a private regulatory model by mandating platforms to implement and self-regulate through content moderation policies. They must create clear, transparent processes for reporting harmful content, carry out risk assessments, and take appropriate actions, such as removing illegal content or preventing its spread. Moreover, for very large online platforms (VLOPs), the DSA requires more stringent oversight and collaboration with external auditors to ensure compliance. In this way, private companies must uphold public objectives, such as user safety and fairness, and cooperate with other private actors, e.g., trusted flaggers and fact-checkers, to enforce these rules.
- Digital Markets Act:
The DMA complements the DSA, competition law, and the GDPR. It stretches somewhat further than the Platform-to-Business (P2B) Regulation of 2019, which focuses on introducing transparency into the relationship between platforms and their business users, and it complements the GDPR by addressing data-sharing practices and user consent.[11] The DMA was designed to remedy the limitations inherent in enforcing Articles 101 and 102 TFEU, including the abuse-of-dominance prohibition, by focusing on anti-competitive behaviors in the digital space, particularly among large dominant platforms or “gatekeepers,” defined based on turnover, market capitalization, and active users.[12] While the DMA targets anti-competitive practices by gatekeepers, the DSA focuses on the responsibilities of digital platforms to protect users and prevent harm. EU competition law, more broadly, addresses anti-competitive practices and monopolistic behaviors across all sectors.
The DMA specifically seeks to prevent large digital platforms from abusing their market power to the detriment of smaller competitors and consumers.[13] It ensures that the market remains contestable and fair. The gatekeepers use their superior bargaining position to appropriate the efforts of business users, either directly by exploiting them or indirectly by excluding them from the market, especially when they compete in the same business and services. In other words, DMA aims to make room for innovation by smaller players and let such players reap the benefits from their innovative efforts.[14]
The DMA describes core platform services as including online intermediation services, search engines, social networks, video-sharing platforms, messaging services, operating systems, web browsers, virtual assistants, cloud computing services, and some types of online advertising services, which may be connected to one of the other core platform services.[15]
For the DMA to apply, three conditions must be met: (i) the undertaking must have a significant impact on the internal market (it must meet the quantitative thresholds or be designated a gatekeeper by assessment of the Commission); (ii) it must provide a core platform service that serves as an important gateway for business and end users[16]; and (iii) it must enjoy a durably entrenched position, or be likely to do so in the foreseeable future.[17] Hence, the DMA limits itself to regulating core platform services.
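To make the quantitative limb concrete, the sketch below (a simplification, not legal advice) encodes the presumption thresholds of Article 3 DMA: roughly EUR 7.5 billion in annual EEA turnover or EUR 75 billion in market capitalization, 45 million monthly active end users in the Union, and 10,000 yearly active business users. The dataclass and field names are our own illustration, and the three-year entrenchment test is collapsed into a single check.

```python
# Illustrative sketch of the DMA's quantitative gatekeeper presumption
# (Article 3); the data structure and names are our own simplification.
from dataclasses import dataclass

@dataclass
class Undertaking:
    eea_turnover_eur: float          # annual EEA turnover, each of last 3 years
    market_cap_eur: float            # average market capitalisation, last year
    monthly_eu_end_users: int        # active end users in the Union
    yearly_eu_business_users: int    # active business users in the Union
    member_states_served: int        # Member States where the service is offered

def presumed_gatekeeper(u: Undertaking) -> bool:
    # (i) significant impact on the internal market
    significant_impact = (
        (u.eea_turnover_eur >= 7.5e9 or u.market_cap_eur >= 75e9)
        and u.member_states_served >= 3
    )
    # (ii) important gateway for business and end users
    important_gateway = (
        u.monthly_eu_end_users >= 45_000_000
        and u.yearly_eu_business_users >= 10_000
    )
    # (iii) entrenchment is presumed when (ii) held in each of the last
    # three financial years (simplified to a single check here).
    return significant_impact and important_gateway

example = Undertaking(80e9, 2.0e12, 300_000_000, 500_000, 27)
print(presumed_gatekeeper(example))  # True
```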
- E-Commerce Directive (2000/31/EC):
The E-Commerce Directive is one of the first EU legislative frameworks for internet governance, primarily dealing with the liability of intermediaries such as internet service providers, hosting services, and platforms regarding user-generated content. The directive sets out a framework under which intermediaries are not held liable for content uploaded by users unless they have knowledge of illegal activities and fail to act.[18]
Through private regulatory mechanisms, platforms are required to self-regulate their interactions with third-party content, ensuring compliance with the law. This encourages platforms to adopt internal content moderation systems, risk management practices, and transparency mechanisms to prevent illegal activities like fraud or piracy from occurring on their platforms.
- EU Copyright DSM Directive (2019/790):
The EU Copyright Directive mandates that platforms hosting user-generated content, such as YouTube or Facebook, take proactive steps to prevent the sharing of copyrighted material without consent.[19] This shifts some responsibility from copyright holders to the platforms themselves, requiring them to implement automated systems to detect infringing content.[20]
In practice, this encourages platforms to set up internal, private systems for content monitoring and copyright compliance, thus reinforcing the role of private regulation. Platforms must enter into licensing agreements with copyright holders, implement content filtering systems, and create transparent procedures for addressing copyright disputes. While these measures are designed to protect rights holders and respect users’ freedom of expression, they also emphasize the importance of private actors (platforms and copyright holders) in regulating content flow, creating a form of self-regulation within a broader public policy framework.
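To illustrate the kind of automated detection Article 17 encourages, the sketch below implements a toy upload filter. It is a deliberate simplification: production systems rely on perceptual fingerprints that survive re-encoding, whereas this sketch matches exact cryptographic hashes, and the class and names are invented for this essay.

```python
# Minimal sketch of an upload filter of the kind Article 17 encourages.
# Exact SHA-256 matching is a deliberate simplification of real
# fingerprint-based systems.
import hashlib

class UploadFilter:
    def __init__(self):
        self.reference_db = {}  # fingerprint -> rights-holder claim

    def register_work(self, content: bytes, rights_holder: str):
        """Rights holders deposit reference copies of their works."""
        self.reference_db[hashlib.sha256(content).hexdigest()] = rights_holder

    def check_upload(self, content: bytes) -> str:
        """Screen an upload against the reference database."""
        claim = self.reference_db.get(hashlib.sha256(content).hexdigest())
        if claim:
            return f"blocked: matches work claimed by {claim}"
        return "allowed"

f = UploadFilter()
f.register_work(b"<master recording bytes>", "Example Records")
print(f.check_upload(b"<master recording bytes>"))  # blocked
print(f.check_upload(b"<original home video>"))     # allowed
```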
- General Data Protection Regulation (GDPR): Protecting Privacy in the Digital Age:
The GDPR, implemented in 2018, is a landmark piece of legislation that governs the processing of personal data within the EU. It establishes the rights of individuals regarding their data, including the rights to access, erase, and object to data processing. The GDPR requires private entities, whether data controllers or processors, to implement comprehensive privacy measures to protect users’ data, including transparency obligations, data breach notifications, and privacy by design.
Private regulation under the GDPR occurs when businesses adopt mechanisms for data governance and compliance. They must ensure that third-party service providers, such as cloud hosts or marketing agencies, also comply with GDPR rules, introducing a form of TPR. Moreover, the regulation promotes the use of industry self-regulation by encouraging organizations to adopt codes of conduct and certifications as means of demonstrating compliance. In this way, private actors are responsible for managing compliance with data protection principles and ensuring that third-party partners uphold these standards.
As the internet continues to evolve and cross borders, TPR will remain a vital mechanism for achieving effective governance. These EU regulations underscore the growing importance of private actors in regulating their operations, creating a balance between market freedom and the protection of public interests in the digital age.
3.1 Meta’s Content Moderation and Apple’s Self-Regulation:
The European Union (EU) increasingly regulates the digital landscape as it strives to balance innovation, user protection, privacy, and competition within the digital economy. Two major players, Meta and Apple, are subject to this evolving regulatory framework, particularly in areas related to content moderation and self-regulation. Meta, as a dominant social media platform, faces challenges around moderating user-generated content, while Apple, with its tightly controlled ecosystem, enforces stringent self-regulation in its App Store.
3.2 Digital Services Act and Meta’s Content Moderation:
Meta, the parent company of Facebook, Instagram, and WhatsApp, regulates user content through its Terms of Service and Community Standards: standard contractual documents with underlying values and internal guidance for moderators. Instagram’s and Facebook’s Terms of Service prohibit content that violates their terms or Community Standards. Such content may be unlawful, misleading, discriminatory, or fraudulent, or may infringe on the rights of others, such as intellectual property rights.[21]
Meta moderates platform content through ex-ante and ex-post methods. Ex-ante moderation uses automated systems to block or flag illegal content before or at the point of publication, while ex-post moderation is applied manually in the day-to-day review of flagged or unlawful content that violates the Community Standards.[22] Content may also be removed or blocked where necessary to prevent abuse of the service or to avoid legal consequences for Meta.
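The ex-ante/ex-post split can be pictured as a small pipeline. The sketch below is hypothetical: the classifier, thresholds, and review queue are invented for illustration, since Meta’s actual systems are proprietary.

```python
# Schematic sketch of the ex-ante / ex-post split described above.
# Classifier, thresholds, and queue are invented for illustration.

REVIEW_QUEUE = []  # items awaiting ex-post human review

def classifier_score(post: str) -> float:
    """Stand-in for an ML model returning a probability of violation."""
    banned = ("banned-term",)
    return 0.99 if any(term in post for term in banned) else 0.10

def moderate_ex_ante(post: str) -> str:
    """Automated screening applied before or at publication time."""
    score = classifier_score(post)
    if score > 0.95:
        return "blocked"             # high confidence: block outright
    if score > 0.60:
        REVIEW_QUEUE.append(post)    # uncertain: route to human review
        return "published-flagged"
    return "published"

def moderate_ex_post(post: str, reviewer_decision: str) -> str:
    """Manual review of flagged or reported content against the standards."""
    return "removed" if reviewer_decision == "violates" else "restored"

print(moderate_ex_ante("a harmless holiday photo"))        # published
print(moderate_ex_ante("a post with banned-term in it"))   # blocked
```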
The DSA imposes no general obligation on digital services or social media platforms to monitor user content or proactively investigate users’ activities.[23] This relieves Meta of any duty to deploy invasive, blanket surveillance practices that could infringe on privacy and other fundamental rights.
The DSA also mandates that platforms adopt effective notice-and-takedown mechanisms.[24] Meta must provide an easily accessible system for users to report illegal content, take such reports seriously, and act on them in a timely manner by removing or disabling access to the content unless there is lawful justification for retaining it. The provision complements Article 11 of the EU Charter of Fundamental Rights in protecting the right to freedom of expression and information. It is particularly significant for Meta, given its role in hosting political content, news, and user-generated media that can sometimes spread harmful or illegal material.
The DSA further mandates that platforms give users the right to appeal content moderation decisions.[25] This enhances Meta’s transparency and affords users access to a redress mechanism.
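Taken together, the notice-and-takedown and appeal obligations describe a simple workflow. The sketch below models it under our own hypothetical names and states: a notice comes in, the platform decides and records a statement of reasons, and the affected user may appeal.

```python
# Minimal sketch of a DSA-style notice-and-action flow with appeal.
# States and field names are our own; the steps (notice, timely decision,
# statement of reasons, internal appeal) track the obligations above.
from dataclasses import dataclass

@dataclass
class Notice:
    content_id: str
    reporter: str
    reason: str
    status: str = "received"
    statement_of_reasons: str = ""

def act_on_notice(n: Notice, is_illegal: bool, justification: str) -> Notice:
    # Decide removal vs. retention and record the statement of reasons,
    # which must be communicated to the affected users.
    n.status = "removed" if is_illegal else "retained"
    n.statement_of_reasons = justification
    return n

def handle_appeal(n: Notice, decision_upheld: bool) -> Notice:
    # Internal complaint handling: either confirm or reverse the decision.
    if not decision_upheld:
        n.status = "restored" if n.status == "removed" else "removed"
    return n

n = Notice("post/123", "user@example.org", "alleged hate speech")
n = act_on_notice(n, is_illegal=True, justification="violates policy X / law Y")
n = handle_appeal(n, decision_upheld=False)
print(n.status)  # restored
```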
Platforms are also required to conduct regular risk assessments relating to illegal content and misinformation.[26] Under this provision, Meta is obliged to assess potential risks to public safety, users’ rights, and public health that may arise from content shared on its platforms. It must also implement mitigation measures to address these risks,[27] such as enhancing its content moderation systems and cooperating with third-party fact-checkers and trusted flaggers.[28]
The DSA requires greater transparency for large platforms like Meta. This includes disclosing algorithms and providing users with tools to understand how their data is used.[29] Additionally, Meta is expected to improve data portability, enabling users to transfer their data to other platforms easily.[30] These changes aim to foster competition and user control.
The DSA mandates transparency reporting.[31] Meta must regularly publish detailed reports on its content moderation practices, including the volume of content removed, the types of illegal content detected, and the mechanisms used to detect harmful content. This provision is intended to promote Meta’s accountability to platform users and to ensure sound content moderation practices.
The DSA also mandates platforms to implement stricter default privacy settings to safeguard minors from harmful or inappropriate content.[32]
In addition, under the EU Recommendation on tackling illegal content, internet companies must deploy proactive and effective content recognition technologies to detect terrorist content, swiftly remove or disable access to it, and prevent it from reappearing once removed.[33] In that regard, Meta operates one of the primary examples of an industrial-scale content moderation system, using algorithms ex-ante to flag and block content.
3.3 Case Study Illustration:
Meta’s Handling of Misinformation in the COVID-19 Pandemic
During the pandemic, Meta faced significant challenges in moderating misinformation relating to vaccines, public health, and government policies. It partnered with organizations certified by the International Fact-Checking Network to identify and mitigate viral falsehoods. False content, including the viral conspiracy theories about 5G technology spreading COVID-19, was downgraded in visibility across Facebook and Instagram.[34] Meta also collaborated with local organizations to identify and act against misinformation inciting unrest in regions at risk of violence. During the vaccine rollout phases in particular, fake news linking vaccines to fatal side effects was rapidly addressed to prevent public panic in several regions. Pages and accounts that repeatedly shared misinformation faced restrictions such as reduced reach, inability to monetize, or removal.[35]
4.0 Apple’s Self-Regulation:
The DMA plays a crucial role in Apple’s self-regulatory practices, particularly because Apple is classified as a gatekeeper platform under the DMA. As a gatekeeper, Apple has significant market power over the distribution of apps through the App Store, and it controls the conditions under which third-party developers can access iOS users.
4.1 Digital Markets Act and Apple’s Self-Regulation
The DMA introduces rules for fairness in digital markets, prohibiting gatekeepers from engaging in self-preferencing practices.[36] In effect, Apple cannot prioritize its apps or services over those of third-party developers. This provision directly impacts Apple’s self-regulation, particularly in how it manages the visibility and accessibility of third-party apps.[37]
The DMA also imposes obligations on gatekeepers to allow users more control over their devices and platforms.[38] Hence, third-party apps may claim access to Apple-specific features like Siri, NFC (used in Apple Pay), and other system integrations.[39] Competitors like Spotify and Google Maps should be able to compete on a more equal footing as default apps against Apple Music and Apple Maps, respectively.[40] These provisions challenge Apple’s strict control over its ecosystem and payment models.
4.2 Case Study Illustration:
Case AT.40437 Apple - App Store Practices (music streaming)
The European Commission investigated Apple’s handling of its App Store rules, particularly the anti-steering provisions. These provisions barred developers of music streaming apps, such as Spotify, from informing users about cheaper subscription options outside the App Store or including external purchase links, while mandating that developers use Apple’s in-app payment system for subscriptions and digital goods, imposing a commission of 15%-30% on developers and thereby foreclosing innovation.[41] The Commission found that the practice amounted to an abuse of Apple’s dominant position in the market for music streaming apps under Article 102(a) of the Treaty on the Functioning of the European Union (TFEU) and Article 54 of the European Economic Area (EEA) Agreement.[42]
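A toy calculation shows the economics behind the dispute, using assumed figures (a EUR 9.99 monthly subscription and the headline 30% commission; neither number is taken from the case file):

```python
# Toy illustration (assumed figures) of the commission at issue:
# a subscription sold via Apple's in-app payment system at the headline
# 30% rate versus sold directly by the developer.
price = 9.99
commission = 0.30              # headline rate; 15% applies to some tiers
via_iap = price * (1 - commission)
direct = price                 # before the developer's own processing costs

print(f"developer receives via IAP: EUR {via_iap:.2f} per subscriber/month")
print(f"developer receives direct:  EUR {direct:.2f} per subscriber/month")
# Anti-steering rules barred apps from even telling users that the
# cheaper direct option existed -- the conduct at issue in the case.
```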
Although the decision did not turn directly on the DMA, the case reflects the growing tension between Apple’s self-regulatory framework and the EU’s evolving digital competition laws. It highlighted the issue of self-preferencing, in tension with Article 5 of the DMA, and the restriction of fair competition in the digital marketplace.
4.3 The E-Commerce Directive and Content Moderation
By the combined effect of Article 12 (mere conduit), Article 13 (caching), and Article 14 (hosting), platforms are generally not liable for content that users upload unless they are actively involved in its creation or distribution (a shield often referred to as the safe harbor). Article 15 further shields platform service providers from general monitoring obligations.[43] These provisions allow Meta to moderate content without incurring liability, as long as it follows notice-and-action procedures.
4.4 The EU Copyright Directive and App Store Regulation
The Directive, particularly Article 17, requires platforms that host user-generated content, such as Meta’s platforms or Apple’s App Store, to prevent the uploading of copyrighted material without authorization.[44] For Meta, this means implementing effective content filtering systems to prevent the spread of pirated content. Apple, in turn, must ensure that apps in its ecosystem comply with copyright laws and is responsible for ensuring that third-party developers do not violate intellectual property rights.[45]
5. Assessing the Truth Value:
Both Meta and Apple’s cases represent challenges policymakers face in regulating private actors who operate transnationally.[46] The frameworks represent the EU’s effort to balance innovation with the need for accountability and fairness, but their success in realizing these objectives seems mixed.
The DSA created a framework that requires Meta to be more transparent and accountable in its content moderation efforts, mandating platforms to adopt stringent content moderation policies, transparency in algorithmic decisions, and risk assessments for societal harm.[47] Meta has argued that the compliance burdens may stifle innovation and make it harder for smaller companies to compete, indirectly positioning itself as a protector of competition and innovation.[48]
Meta is also mandated to introduce greater transparency about its algorithmic systems, including releasing 22 system cards that explain how content is ranked on platforms like Facebook and Instagram.[49] Meta cites intellectual property and user privacy concerns in resisting the requirement to provide regulators with detailed access to its algorithms and data.
By focusing on the metaverse, Meta tends to shift attention away from regulatory challenges tied to its current platforms. The metaverse remains largely unregulated, allowing Meta to pursue growth in less scrutinized areas. Meta actively challenges the scope and implementation of DSA rules in courts and lobbies EU policymakers to dilute or delay provisions that could impact its advertising model.[50]
Meta’s moderation systems have struggled to fully comply with the DSA’s requirement to mitigate systemic risks. Critics point out that harmful content, including hate speech and misinformation, remains prevalent on the platform despite the DSA’s obligations to address these issues.[51]
The DMA targets Apple primarily due to its gatekeeper role in the iOS ecosystem. The Act has markedly impacted the operation of Apple products by preventing anti-competitive behavior, promoting fair competition, and opening up markets to ensure that smaller players are not unfairly excluded.[52] The DMA requires Apple to allow sideloading: users in the EU can install apps from outside the App Store and set third-party browsers and navigation apps as defaults, while Safari can be completely uninstalled.[53] Apple initially resisted this by emphasizing user security and privacy concerns, arguing that sideloading would expose users to malware and reduce overall security.[54]
Apple also opposes provisions that require opening its payment systems to third parties, as required by Article 7. It defends its policies by arguing that integrated payment systems ensure a seamless user experience and safeguard user data.[55]
Apple leans heavily on its reputation for strong privacy protections as a counterargument to regulatory demands. For example, it positions features like App Tracking Transparency as consumer-focused despite accusations that they disadvantage competitors like Meta in advertising.[56] Apple actively lobbies EU regulators to shape the interpretation and enforcement of the DMA in its favor, emphasizing security and privacy as areas requiring tailored exceptions.[57] Apple has made limited concessions to comply with the DMA, such as exploring sideloading in Europe. However, it implements these changes minimally and restrictively, signaling a reluctant adaptation rather than a full embrace.
6. Conclusion:
The two case studies show that the strategies of Meta and Apple highlight the complexities of regulating Big Tech. Both companies aim to protect their dominant positions while aligning only superficially with compliance expectations. Their resistance underlines the ongoing tension between enforcing fair competition and addressing innovation, privacy, and security concerns in the digital age.
Bibliography:
[1] Palfrey, J. (2010). Four phases of internet regulation. Social Research: An International Quarterly, 77(3), 981-996.
[2] Lessig, L. (1999). Code and Other Laws of Cyberspace. New York, NY: Basic Books.
[3] Zuboff, S. (2019, January). Surveillance capitalism and the challenge of collective action. In New Labor Forum (Vol. 28, No. 1, pp. 10-29). Los Angeles, CA: SAGE Publications.
[4] Lessig, supra.
[5] Cohen, J. (2012). "Transnational Governance: The Emergence of a Global Regulatory Regime."
[6] Foucault, M. (1980). "Power/Knowledge: Selected Interviews and Other Writings, 1972-1977."
[7] Cafaggi, F. (2011). New foundations of transnational private regulation. Journal of Law and Society, 38(1). https://doi.org/10.1111/j.1467-6478.2011.00533.x
[8] Mahler, T. (2019). "Chapter 2: A global 'private' regime governing the Domain Name System (DNS)". In Generic Top-Level Domains: A Study of Transnational Private Regulation. Cheltenham, UK: Edward Elgar Publishing. Retrieved Dec 14, 2024, from https://doi-org.ezproxy.uio.no/10.4337/9781786435149.00012
[9] Bradford, A. (2020). The Brussels Effect: How the European Union Rules the World. Oxford University Press.
[10] European Commission. (2022). Digital Services Act Package. Retrieved from https://ec.europa.eu/digital-services-act
[11] van den Boom, J. (2022). What does the Digital Markets Act harmonize? – exploring interactions between the DMA and national competition laws. European Competition Journal, 19(1), 57-85. https://doi.org/10.1080/17441056.2022.2156728
[12] van den Boom, ibid.
[13] Bostoen, F. (2023). Understanding the Digital Markets Act. The Antitrust Bulletin.
[14] Bostoen, F., ibid.
[15] van den Boom, J. (2022), supra.
[16] Article 2(2) DMA.
[17] Article 3(1) DMA.
[18] European Parliament and Council. (2000). Directive 2000/31/EC on electronic commerce. Official Journal of the European Communities. Retrieved from https://eur-lex.europa.eu
[19] Article 17 Copyright DSM Directive.
[20] Rosati, E. (2020). The legal nature of Article 17 of the Copyright DSM Directive. Journal of Intellectual Property Law & Practice, 15(11), 874-878.
[21] Rudohradská, S. (2024). Abuse of a dominant position on the digital markets – case Meta vs. Bundeskartellamt. EU and Comparative Law Issues and Challenges Series (ECLIC), 8, 473-490.
[22] Bergstrøm, O. I. (2023). Meta's search for legitimate and accountable content regulation (Master's thesis).
[23] Article 7 DSA.
[24] Articles 14 and 15 DSA.
[25] Article 19 DSA.
[26] Article 22 DSA.
[27] Article 34 DSA.
[28] Bergstrøm, O. I., supra.
[29] Article 24 DSA.
[30] Article 5(2) DSA.
[31] Article 13 DSA.
[32] Article 28 DSA.
[33] Article 13, EU Commission Recommendation 2018/334.
[34] Meta. (2024). Misinformation. Transparency Center. Retrieved December 2, 2024, from https://transparency.meta.com
[35] Meta. (2024), ibid.
[36] Article 6(1)(a) DMA.
[37] Rudohradská, S. (2024), supra.
[38] Articles 6 and 7 DMA.
[39] Article 5(1)(b) DMA.
[40] Article 5(1)(d) DMA.
[41] Hovenkamp, H. (2021). Antitrust and Platform Markets. Harvard University Press.
[42] OECD. (2014). Note on the Commission's recent enforcement of EU antitrust rules in the pharmaceutical sector (DAF/COMP/WD(2014)62). Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org
[43] Riis, T., & Schwemer, S. F. (2019). Leaving the European Safe Harbor, Sailing toward Algorithmic Content Regulation. Journal of Internet Law, 22(7), 1-21.
[44] European IP Helpdesk. (2019). Copyright in the digital single market: Key changes and implications. Intellectual Property Helpdesk. Retrieved December 2, 2024, from https://intellectual-property-helpdesk.ec.europa.eu
[45] European Commission. (2023). Regulation (EU) 2022/1925 on contestable and fair markets in the digital sector (Digital Markets Act). Retrieved December 2, 2024, from https://ec.europa.eu
[46] Voon, T., & Mitchell, A. D. (Eds.). (2021). The Regulation of Digital Markets. Cambridge University Press.
[47] Veale, M. (2022). Digital Services Act: A framework to regulate large platforms. European Law Journal, 28(4), 321-342.
[48] Veale, M., ibid.
[49] Meta. (2024, January 22). Offering people more choice on how they can use our services in the EU. Retrieved from https://about.fb.com
[50] Voon, T., supra.
[51] Lomas, N. (2023, September 1). Meta accused of failing to comply with EU Digital Services Act. TechCrunch. Retrieved from https://techcrunch.com
[52] Hovenkamp, H. (2021), supra.
[53] Dedrick, J., & Kraemer, K. L. (2020). Digital platforms and global value chains. Global Strategy Journal, 10(3), 421-440.
[54] Lobel, O. (2018). Digital Platforms and Global Law. Oxford University Press.
[55] Veale, M., supra.
[56] Hovenkamp, H., supra.
[57] EU Commission. (2019). Competition Policy for the Digital Era. Publications Office of the European Union.