Abstract
Why has the policy idea of tasking digital platforms with the procedural enforcement of national laws proven so widely appealing? This article employs an instrumental case study of Germany’s NetzDG and develops a power-integrated multiple streams approach to examine the power dynamics between nation-states and big tech platforms in the development of digital policy. Drawing on 26 elite interviews, document analysis, and process tracing, the research uncovers a tug-of-war over sovereignty in the digital sphere. This struggle is characterized by the state’s objective of reclaiming power, continuous contention over key resources, and the critical role of policy entrepreneurs, all of which profoundly shape contemporary technology regulation.
What is prohibited offline is also not allowed online and will be punished.
—Heiko Maas, 2015
Germany’s Network Enforcement Act1 ushered in a new era of internet regulation, characterized by an increased assertion of state power over digital platforms. Introduced, passed, and enacted within an unusually rapid timespan of nine months in 2017, this law stands as likely the first in the world to specifically target social media.2 NetzDG pioneered a new regulatory framework that compelled social media platforms to actively enforce German laws against illegal speech. It set stringent obligations for swift content removal, mandated transparency reporting, and imposed severe fines for noncompliance—establishing a regulatory approach both bold and unprecedented. Described as “the first of its kind worldwide,”3 NetzDG represented a departure from the previously dominant laissez-faire approach to platform regulation. In the European Union (EU) and the United States, limited liability frameworks had focused primarily on individual content decisions, offering broad protections and granting internet services immunity from liability for user-generated content. In contrast, NetzDG required platforms to implement specific systems for content review and removal, compelling them to proactively manage moderation procedures to comply with German law rather than merely making ad-hoc content decisions. This approach not only shifted the legal compliance burden onto the platforms but also significantly enhanced state oversight, steering platform regulation in a new direction.4 NetzDG set a precedent for “new school speech regulations,”5 mandating that social media platforms establish effective procedures to promptly enforce national speech laws.
Following NetzDG’s lead, several democratic jurisdictions have since adopted new, dedicated laws that specifically target internet services, including social media, and create obligations for these platforms to establish procedures for swiftly removing content illegal under national laws. Examples include France’s Loi Avia in 2020,6 Austria’s KoPl-G in 2021, Australia’s Online Safety Act in 2021, the United Kingdom’s Online Safety Act in 2023, and, importantly, the EU’s Digital Services Act (DSA), which came into effect in Germany in May 2024, effectively repealing NetzDG.7 Reflecting on NetzDG’s impact, Hemmert-Halswick argues that it was evident that “NetzDG has already done pioneering work and paved the way for similar laws on content regulation.”8
Although NetzDG has been praised by some as a pioneering law, concerns about its impact both within Germany and internationally persisted. Its effectiveness in content moderation has been questioned by researchers like Heldt,9 Schmitz and Berndt,10 and particularly Liesching et al.,11 who found that platforms removed only a marginal amount of content under NetzDG, concluding that the law had “no immediate, significant regulatory effect.”12 Even before its adoption, the requirement for social media companies to enforce complex legal standards raised concerns not only about a potential chilling effect on speech but also about the capacity and appropriateness of private actors performing public regulatory functions.13 Perhaps most alarmingly, autocratic states, including Russia, have begun using NetzDG as a blueprint for their own legislation, leveraging it for censorship under the guise of compliance with national speech laws.14
Given the criticisms and concerns about its impact, the swift enactment of NetzDG in Germany, coupled with continued international interest in adopting regulatory frameworks for the procedural enforcement of national laws by online platforms, presents a paradox. Does NetzDG represent a universally appealing “idea whose time has come”?15
This research examines NetzDG as an instrumental case study to analyze how the idea of this particular law emerged successfully at this point in time. Integrating Kingdon’s multiple streams approach (MSA)—a well-established framework for analyzing the role of ideas in the context of policy change across the problem, policy, and political streams16—with theories of regulatory power, the research proposes an original, power-integrated MSA. This approach is used to analyze how power negotiations between regulatory actors drive complex regulatory processes and the adoption of ideas. The study reveals a complex tug-of-war between the state and social media companies, highlighting how contests over power and resources shape regulatory policy. The analysis not only sheds light on the specific case of NetzDG but also provides insights applicable to understanding regulatory challenges in digital governance globally.
Power in the Regulation of Digital Platforms
Platform Regulation in Germany
The early days of the internet were marked by a belief in a “naturally independent”17 cyberspace, where innovation and creativity could thrive without the constraints of traditional regulation. This perception was rooted in the idea that the internet transcends national borders and regulatory jurisdictions, making it inherently ungovernable and resistant to centralized control.18 Scholars and policymakers were quick to challenge “utopian visions of space beyond the state.”19 Nonetheless, a deep-seated preference for minimal regulation persisted well into the early 2000s, favoring limited government intervention in the digital sphere.20 With the growing influence of the internet and the rise of user-generated content beginning in the mid-2000s, governments worldwide eventually passed legal frameworks addressing key areas such as privacy, copyright, and e-commerce. While introducing fundamental legal boundaries, these early regulatory efforts were designed to protect and foster the burgeoning digital landscape. For example, the United States with Section 230 of the Communications Decency Act of 1996 and the EU with the e-Commerce Directive of 2000 (ECD) embraced limited liability provisions for internet services. Specifically, in Germany, these provisions are further defined under the Telemedia Act of 2007 (TMG). Under these laws, internet services were generally not held liable for user-generated content unless they failed to remove illegal content after receiving a specific notice about such content. This reactive approach emphasized content-based decisions without enforcing systematic procedures for content moderation and compliance.
In more recent years, there has been a trend toward increasing centralization of the internet, with a few dominant big tech platforms emerging as central infrastructures of social, economic, and political life, both online and offline.21 However, a growing body of research makes evident that these digital platforms are not mere neutral conduits of information but actively contribute to a range of online harms such as disinformation, privacy violations, cyberbullying, hate speech, and the erosion of trust in institutions.22 High-profile scandals involving these platforms, such as the Cambridge Analytica affair and Russian interference in US elections, triggered a critical reevaluation of existing regulatory frameworks, particularly with respect to social media platforms.23 Against the backdrop of mounting evidence about online harms on digital platforms, there is now a growing recognition that more stringent regulatory measures and direct government intervention are imperative, as highlighted by Stockmann.24 Concurrently, since 2016 an increasing number of nation-states have adopted specifically platform-aimed regulatory measures.25
The Network Enforcement Act
Enacted in 2017, following an unusually rapid adoption process, NetzDG was the first dedicated law from a democratic state to specifically address social media companies, compelling them to enforce German law by removing illegal speech from their networks. Although the notice-and-takedown approach specified in the TMG remained intact for internet services more broadly, NetzDG went beyond it, legally requiring social media platforms to establish systems for handling notices about illegal content. Critically, this included procedures to ensure that potentially illegal content was promptly reviewed and, if necessary, removed or blocked. The law set strict timeframes for content review, mandating the removal of manifestly illegal content within 24 hours and of all other illegal content within 7 days. Failure to comply could result in fines of up to €5 million for individuals and up to €50 million for companies. Overall, this procedural approach sought to ensure that platforms proactively managed content moderation processes to comply with German law.
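To make the procedural logic concrete, the following minimal sketch models the review deadlines and fine ceilings described above; the function and constant names are illustrative assumptions, not drawn from the law or any official implementation:

```python
from datetime import datetime, timedelta

# Review windows and fine ceilings as specified in NetzDG
MANIFESTLY_ILLEGAL_WINDOW = timedelta(hours=24)  # manifestly illegal content
OTHER_ILLEGAL_WINDOW = timedelta(days=7)         # all other illegal content
MAX_FINE_INDIVIDUAL_EUR = 5_000_000              # responsible individuals
MAX_FINE_COMPANY_EUR = 50_000_000                # companies

def removal_deadline(notice_received: datetime, manifestly_illegal: bool) -> datetime:
    """Return the latest permissible removal time for a notified item."""
    window = MANIFESTLY_ILLEGAL_WINDOW if manifestly_illegal else OTHER_ILLEGAL_WINDOW
    return notice_received + window

# Example: a notice about manifestly illegal content received at noon
# must be acted on by noon the following day.
print(removal_deadline(datetime(2017, 10, 1, 12, 0), manifestly_illegal=True))
# 2017-10-02 12:00:00
```

The sketch captures the shift NetzDG made from individual content judgments to standing procedural obligations: what matters for compliance is not a single removal decision but whether a platform’s complaint-handling system reliably meets these deadlines.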
The Act put forward a narrow definition of illegal content, limited to specific offenses outlined in the German Criminal Code (StGB): according to NetzDG, illegal content comprises all content that falls under the criminal offenses listed in § 1 (3) NetzDG. Although this approach effectively covered “only relatively vile statements,”26 the law refrained from defining broader and more ambiguous phenomena such as hate speech or disinformation (Table 1).
Table 1. Criminal offenses of the German Criminal Code (StGB) constituting illegal content under § 1 (3) NetzDG

Paragraph | Content Description
§ 86 StGB | Dissemination of propaganda materials of terrorist and unconstitutional organizations
§ 86a StGB | Use of symbols of terrorist and unconstitutional organizations
§ 89a StGB | Preparation of a serious violent act endangering the state
§ 91 StGB | Instructions for committing a serious violent act endangering the state
§ 100a StGB | Treasonous forgery
§ 111 StGB | Public incitement to commit crimes
§ 126 StGB | Disturbing the public peace by threatening to commit crimes
§ 129 StGB | Formation of criminal organizations
§ 129a StGB | Formation of terrorist organizations
§ 129b StGB | Criminal and terrorist organizations abroad; support for such organizations
§ 130 StGB | Incitement of the people
§ 131 StGB | Representation of violence
§ 140 StGB | Rewarding and approving of crimes
§ 166 StGB | Defamation of religions, religious, and ideological associations
§ 184b StGB | Distribution, acquisition, and possession of child sexual abuse material
§ 185 StGB | Insult
§ 186 StGB | Defamation
§ 187 StGB | Slander
§ 189 StGB | Defamation of the memory of the deceased
§ 201a StGB | Violation of the intimate personal sphere through image recordings
§ 241 StGB | Threatening
§ 269 StGB | Falsification of data with evidential value
Source: German Criminal Code, Strafgesetzbuch (StGB). Translations provided by the author(s).
As an early example of platform-aimed content regulation, NetzDG sparked significant interest and inspired a broad spectrum of research. Examining the literature on the regulatory policymaking processes underpinning NetzDG, three categories of inquiry can be identified: journalistic, legal, and political analysis. The first category includes journalistic analyses that explore the implementation of the Act, such as investigations into opaque content deletion practices and the dubious working conditions of German content moderation teams.27 The second category comprises legal analyses. Scholars have examined the Act’s legal implications, assessed its compatibility with EU frameworks, and contextualized its development within the broader history of speech and technology regulations.28 The third category focuses on empirical analysis of the political processes underpinning NetzDG. Scholars specializing in regulatory politics have pointed out that NetzDG is deeply embedded in German political institutions, challenging the view of it as a disruptive legislative innovation. Notable studies include Gorwa’s29 exploratory analysis of domestic and EU politics and He’s30 discursive analysis of the policy discourse around NetzDG, both supporting this perspective.
The existing literature offers valuable insights into the regulatory policymaking processes of NetzDG, highlighting various factors across societal, legal, and political dimensions. However, much of this scholarship examines these factors in isolation. As a result, a comprehensive analysis of the underlying regulatory processes that drove the adoption of NetzDG as a successful policy idea is still lacking.
Kingdon’s MSA
This analysis draws from Kingdon’s MSA to effectively bridge the critical gap identified in the study of NetzDG.31 Widely acclaimed in agenda setting and policy change research, MSA introduces a conceptual framework to dissect policy processes through three interconnected streams: problems (public and political perception of issues), policies (available solutions including legislative proposals), and politics (policymakers’ receptivity and political climate). Kingdon utilizes MSA as an explanatory lens to understand why a specific idea is adopted among numerous potential problems and solutions.32 MSA posits that for a policy idea to become adopted, multiple conditions must align across the streams, creating a transient “policy window”33 for policy change. Kingdon challenges the notion that an idea can be so compelling that it simply “sweeps over our politics and our society, pushing aside everything that might stand in its path.”34
When applied to the study of NetzDG, MSA offers a valuable theoretical framework to bridge the existing gaps in research for three key reasons. First, its tripartite analysis across streams provides a comprehensive heuristic, enabling a nuanced understanding of NetzDG’s development. Second, by treating these streams as interrelated, MSA effectively reveals the dynamic interplay of ideas, actors, and institutions across the streams. Third, MSA’s emphasis on the dynamic nature of regulatory processes shifts the focus from merely cataloging events to providing in-depth explanations, transitioning from describing “what” happened to explaining the “how” and “why” behind specific ideas.35
Adhering to Kingdon’s original framework, this case study conceptualizes the three streams as follows:
Problem stream: Problems are policy issues that are deemed to require attention. Kingdon acknowledges that problems are ambiguous, and there are no objective indicators to determine which problems will receive attention and emerge on the public agenda. Problems are often brought to the forefront of attention by “focusing events,”36 critical moments, or crises that attract widespread attention.
Policy stream: Policies are feasible solutions to problems. They are actively proposed by policy actors. According to Kingdon, policies take time to develop. They whirl around in what he coins a “policy primeval soup,”37 evolving as they are proposed and modified.
Politics stream: The politics stream considers policymakers’ receptivity to a proposed policy solution to a problem. Policymakers’ motives are informed by personal beliefs and political ambitions but are also embedded in wider political contexts responding to shifts in national mood, electoral events, or coalition politics.
Kingdon contends that commonly, the convergence of streams is promoted by “policy entrepreneurs,”38 who are actors that promote specific ideas, try to build support for them, and push them on the public agenda. Although Kingdon’s MSA developed from the study of US politics, the framework has been prolifically applied, including in the German context.39
Conceptions of Regulatory Power
The study of platform regulation unfolds within a complex tapestry of power between private corporations and public entities. Platforms have emerged as influential political actors globally, actively shaping policy directions and pursuing corporate aims. They engage with global political and business elites and leverage their substantial financial clout for lobbying purposes.40 Some scholars have likened digital platforms to nation-states, arguing that they have attained a level of sovereignty that can rival or at least complement state power.41
Recognizing the inherent role of power in regulatory processes, this case study examines the power dynamics between governments and platforms in shaping regulatory policy, specifically focusing on the adoption of NetzDG as a policy idea. To achieve this, the research draws on insights from political science and public administration literature, advocating for a nuanced understanding of power that considers it both relational and resource-driven.
At the core of the conception of regulatory power put forward in this research is Dahl’s influential definition: “A has power over B to the extent that he can get B to do something that B would not otherwise do.”42 At a basic level, fundamental theories of regulation encompass such relational understandings of power, where a regulator A seeks to impact the behavior of a regulatee B. Extending Dahl’s conception to the context of regulation, Mann posits that “infrastructural power”43 is the “institutional capacity of a central state, despotic or not, to (…) logistically implement decisions”44 vis-à-vis a public.45 This perspective positions the state’s capacity to implement policy—specifically, the regulator’s ability to influence a regulatee—as a critical component of regulatory power.
Lindvall and Teorell46 extend ideas of power in regulation further and focus on the formative relationship between policies, their outcomes, and the role of resources in policy enforcement. They posit that state capacity is “a form of power that is exercised by using specific resources to enhance the effectiveness of specific policy instruments.”47 This perspective is complemented by Black,48 who introduces the broader concept of regulatory capacity, applicable to any regulatory actor, as the use of resources toward regulation. This scholarship posits that regulatory power stems from the state’s capacity to utilize resources for regulation (Figure 1).
Synthesizing these theories in the context of platform regulation, this case study puts forward a relational and resource-based conception of power: Regulatory power is the capacity of states to utilize resources to achieve the desired outcomes by influencing platform behavior. In the context of NetzDG and drawing from the author’s previous research on platform regulation in the United Kingdom,49 this case study identifies four key resources: information, authority, organized expertise, and capital (Table 2).
Table 2. Key regulatory resources

Resource | Definition
Information | Access to up-to-date information relevant for decision-making and monitoring compliance
Authority | The state’s legal power and legitimacy within its legal system to issue orders
Organized expertise | Professionalized staff with relevant training, education, and experience; such staff cannot be built instantaneously but must accumulate over time
Capital | Financial resources that enable access to other relevant regulatory resources, including organized expertise
Note: Conceptualization of state capacity by the author(s), drawing from Mann, Lindvall and Teorell, and Black; the synthesis of interview data collected for this research; and prior research conducted by the author(s) on platform regulation in the United Kingdom. Lindvall, Johannes, and Jan Teorell. State Capacity as Power: A Conceptual Framework. STANCE Working Paper Series, May 2016; Black, Julia. “Enrolling Actors in Regulatory Systems: Examples from UK Financial Services Regulation.” Public Law (Spring 2003): 63–91; Neudert, Lisa-Maria. “Regulatory Capacity Capture: The United Kingdom’s Online Safety Regime.” Internet Policy Review 12, no. 4 (December 2023).
Power-Integrated MSA
This case study presents a novel approach by integrating Kingdon’s MSA with a relational, resource-based view of regulatory power: the power-integrated MSA. Incorporating insights on regulatory power into the MSA, this approach not only illuminates the multiple streams but also highlights how power dynamics play a crucial role in their convergence, as well as in the adoption of ideas. This integrated approach advances the study of platform regulation in three critical ways:
State-platform power dynamics: The approach unveils the complex interplay between the state and platforms, offering analytical insights into how and why their relationship shapes the adoption of ideas.
Strategic resource utilization: In mapping the utilization of resources, frequently directed by policy entrepreneurs, this approach reveals how states and platforms strategically influence regulatory policy.
Actionable strategies for boosting policy impact: By highlighting resource configurations and pinpointing areas of diminished resources, this method provides targeted recommendations to amplify regulatory efficacy.
The empirical case analysis addresses two research questions. First, in what ways did convergence of problem recognition, policy proposals, and political contexts promote the adoption of NetzDG as a policy idea? Second, how did the relationship of the state and platforms shape Germany’s approach to platform regulation, and what does this reveal about the dynamics of regulatory power in digital regulation?
Methods
This research utilizes the power-integrated MSA in an instrumental case study of NetzDG by employing interpretative methods. Through the instrumental approach, the study aims to achieve a nuanced understanding of the German case and gain deeper insights into the phenomenon of regulations imposing procedural obligations for the enforcement of national laws by platforms.50
Germany is selected as an instrumental case study for several compelling reasons. First, Germany was the first democracy to implement standalone, platform-aimed regulation. Following the enactment of NetzDG, countries such as Australia, Austria, France, and the United Kingdom, along with the EU, have adopted dedicated regulations targeting internet platforms, charging them with enforcing national speech laws. Second, Germany wields significant influence over European politics and within the various institutions of the EU, potentially shaping the trajectory of platform regulation beyond NetzDG.51 Third, Germany’s role as a reformer of digital policy, evidenced by impactful early regulations on data protection, provides a relevant context for studying policy ideas aimed at regulating platforms.52
The case study employs an interpretive approach incorporating elite interviews, document analysis, and process tracing.53 This approach proves particularly effective in providing contextual understandings of complex policy processes, revealing underlying motivations, relationships, and explanations constructed by involved actors.54 This combination of methods has previously been employed in research on emergent platform regulation, validating its applicability in this field.55
As a key component, 26 qualitative elite interviews were conducted between January and April 2023. Table 3 displays the target groups selected for interview recruitment. A full list of subjects is available in Appendix A. Subjects were selected based on their expertise, influence, or authority within the field of platform regulation. The recruitment strategy focused on elites with relevant insights into NetzDG and the wider German digital policy landscape. LinkedIn, as well as the team sections of relevant organizations’ websites, were used to identify relevant participants. Additionally, this case study employed snowball sampling techniques.
Table 3. Interview target groups

Target group | Description | Number of participants
Government | Senior civil servants, government officials, regulators, and legislators in government departments and on committees, including the German Bundestag, the Bundestag Digital Agenda Committee, the German Federal Chancellery’s Digital Policy Office, and the Ministry of Justice. Also, personnel at European institutions, if there was a direct connection to NetzDG and platform regulation in Germany | 9
Academia | Researchers at universities and academic research institutions that have conducted research on NetzDG or topics related to it, including platform regulation, German digital policy, and German regulatory politics | 8
CSO | Experts in civil society organizations (CSO), including nonprofit organizations, think tanks, and civil rights organizations | 5
Law | Lawyers, legal experts, and judges at law firms, legal counsels, and German courts who have led prominent cases or published research, case law, or legal evaluations on NetzDG, media regulation, and audio-visual media regulation in Germany | 3
Despite attempts to contact current and former platform personnel, no interviews were secured with individuals from these groups. However, two subjects had current or previous affiliations with the German digital lobby association Bitkom, which represents over 2,000 member companies in the digital economy, including major firms like Google, Facebook, and TikTok.56 One subject had previously moderated an official multiepisode Meta podcast called “The Facebook Briefing.”57 To address the absence of platform perspectives in interviews, official statements issued by Facebook, Bitkom, and eco—another key digital lobby association in Germany—as well as media statements made by spokespersons for Facebook and Google were included in the analysis.58
Interview data were collected through Zoom interviews, which were audio- and video-recorded with participants’ consent. Interviews lasted between 30 and 90 minutes and followed a semi-structured design combining open-ended questions with more targeted probing questions. Examples of interview questions are available in Appendix B. Transcription of the interviews was performed manually or with the assistance of the transcription software Trint. Interviews were held in English and German.59 Following each interview, a synthesis memo was created to highlight key quotes and relevant topics.
During the data collection period, NetzDG was in active use. Although no official provisions were in place at the time of fieldwork, there was widespread speculation about the DSA replacing NetzDG.60 At the same time, Germany was already preparing for the implementation of the DSA.61 This timing enabled subjects to evaluate NetzDG retrospectively, considering the context of new regulations aimed at digital platforms and how NetzDG has evolved in practice. However, this approach also introduced significant limitations, particularly the potential distortion of subjects’ recollections of events because of effects such as hindsight bias, recency effects, or selective memory.
To complement the interview-based research, this case study employs document analysis, focusing on documents that span the period from early policy discussions on online hate speech to the adoption of the NetzDG (2015–17). To select relevant documents, targeted searches were conducted on government websites, archives of policy documents were examined, and online databases were explored. Additionally, archives of press releases and blog posts of major platforms, including Google, Meta, and Twitter, were queried, but no relevant documents were identified. Keywords such as NetzDG, Netzwerkdurchsetzungsgesetz*, Rechtsdurchsetzung*, sozial* Netz*, sozial* Medi*, social medi*, Plattform*, Plattformregulierung*, Facebook-Gesetz*, hate speech, and Hassrede* were strategically combined using Boolean search operators to construct search strings. The archive included the law; official draft proposals for the law made by the Federal Ministry of Justice,62 the Federal Government, and political factions; official statements issued in response to the proposed law; official evaluations; and protocols of plenary hearings. A list of key documents included in the archive is available in Appendix C.
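To illustrate how such search strings can be assembled, the following minimal sketch combines the listed keywords with Boolean operators; the exact query syntax varied by database, and the function below is a hypothetical, generic illustration rather than the study’s actual tooling:

```python
# German- and English-language keywords used in the document search;
# trailing asterisks are truncation wildcards matching stemmed variants.
keywords = [
    "NetzDG", "Netzwerkdurchsetzungsgesetz*", "Rechtsdurchsetzung*",
    "sozial* Netz*", "sozial* Medi*", "social medi*", "Plattform*",
    "Plattformregulierung*", "Facebook-Gesetz*", "hate speech", "Hassrede*",
]

def build_query(terms: list[str], operator: str = "OR") -> str:
    """Join search terms into one Boolean search string, quoting phrases."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return f" {operator} ".join(quoted)

# A broad query matching documents that contain any of the keywords
print(build_query(keywords))
# NetzDG OR Netzwerkdurchsetzungsgesetz* OR ... OR "hate speech" OR Hassrede*
```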
To analyze the data, a combination of thematic analysis and process tracing was used to identify and interpret underlying patterns and gain a comprehensive understanding of their relevance to the research questions. The analysis systematically categorized key phrases and concepts, progressing from descriptive codes to overarching themes.
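As a simplified illustration of this progression from descriptive codes to overarching themes, consider the sketch below; the codes and themes shown are hypothetical examples loosely echoing the findings reported later, not the study’s actual codebook:

```python
# Hypothetical codebook mapping descriptive codes (assigned to interview
# passages) to overarching themes; illustrative only.
codebook = {
    "platform withholds data": "information asymmetry",
    "officials feel disrespected": "contested state authority",
    "too few German-speaking moderators": "expertise and capital deficits",
    "task force commitments unmet": "failed voluntary cooperation",
}

def themes_for(codes: list[str]) -> list[str]:
    """Aggregate the overarching themes touched by a coded transcript."""
    return sorted({codebook[c] for c in codes if c in codebook})

print(themes_for(["platform withholds data", "too few German-speaking moderators"]))
# ['expertise and capital deficits', 'information asymmetry']
```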
Case Analysis and Findings
This case analysis of Germany’s NetzDG integrates the traditional MSA with an analysis of regulatory power to study emergent platform regulation. The power-integrated MSA enables rich contextual understandings of how and why a specific policy idea succeeds. Specifically, it examines how regulatory power—embodied in the control and allocation of resources such as information, authority, organized expertise, and capital—is not only exercised but also fundamentally shaped by the strategic utilization of those resources. By exploring how these resources influence the problem, politics, and policy streams, the study reveals the intricate power dynamics that underpin policy adoption in digital governance.
Problem: Failing Regulatory Power
Official documents reveal that NetzDG was introduced to address significant enforcement problems under the TMG, which mandates that internet services, including social media platforms, remove illegal content from their networks. The Bundestag’s official justification for NetzDG recognized “significant problems in the enforcement of existing law,”63 particularly concerning laws related to criminal illegal speech such as the incitement of the people, insult, defamation, and disturbance of public peace by threatening to commit crimes—all of which are criminal offenses under the German Criminal Code (StGB). According to this justification, NetzDG aimed to achieve an “improvement in the enforcement of law,”64 introducing new compliance rules to compel the effective and prompt removal of illegal content, an objective that was not being met under the TMG.
According to numerous subjects, these enforcement problems first became widely evident during the 2015 refugee crisis, when Germany welcomed hundreds of thousands of Syrian refugees under Chancellor Angela Merkel.65 This period witnessed a rampant spread of illegal hate speech, racial extremism, and disinformation on social media not only in Germany66 but also across Europe.67 It exposed platforms’ insufficient efforts to adhere to the TMG’s content removal requirements. Dr. Stefan Dreyer, Senior Researcher for Media Law at the Leibniz Institute, recalled, “The refugee crisis marked the very first time that there was a political discourse about the responsibility of digital intermediaries … and their legal liability.”68
The refugee crisis and subsequent “hate speech crisis”69 acted as a backdrop for a series of focusing events that propelled issues around online illegal content onto the public agenda.70 The widespread circulation of a so-called “Merkel selfie” with a Syrian migrant in defamatory online campaigns, the viral spread of online fake maps alleging widespread refugee crime, and the harassment of female politicians and politicians of Turkish descent, including Aydan Özoğuz and Cem Özdemir, added to the mounting concerns.71 Concurrently, extremist groups, notably the far-right Alternative für Deutschland (AfD) party and the anti-Islam Patriotic Europeans against the Islamization of the West (Pegida), gained substantial momentum on social media.72 In Germany, these trends were met with particular sensitivity, sparking concerns about a resurgence of racist and xenophobic sentiments reminiscent of the Third Reich. Increasingly, social media appeared to many as a gathering place “for neo-Nazis and an amplifier of the kinds of speech that are illegal under German law.”73 Reports about interference and computational propaganda in the 2016 US election and the UK Brexit referendum similarly fueled concerns over illegal content and its impact.74
Amid growing public and political awareness of online hate speech issues, Heiko Maas, then Federal Minister of Justice and known for his strong stance against right-wing extremism, emerged as an early proponent of government intervention and “policy entrepreneur.”75 Maas was instrumental in framing the hate speech crisis as primarily a problem of insufficient enforcement of German law on the internet, which laid the groundwork for subsequent legislative action. His 2015 open letter to Facebook, triggered by reports of racial hatred online, marked a critical moment. In it, he declared that the internet should not be a “lawless space”76 and emphasized the platform’s obligation under the TMG to remove illegal content. In a later interview, Maas coined the phrase, “What is prohibited offline is also not permitted online.”77 This concept became the foundational idea behind several of his platform-aimed initiatives, including NetzDG. He persistently promoted this principle through numerous interviews and public appearances, making it synonymous with NetzDG.
Maas’s framing of the policy problem as an enforcement issue resonated widely. In 2015, Chancellor Angela Merkel expressed her support for Maas and stressed in an interview that both the state and the platforms, and specifically Facebook, had to undertake urgent action to counter hate online.78 In 2016, Federal President Joachim Gauck expressed support for political and civil measures to “more effectively tackle internet hate speech.”79 In 2017, in his inaugural interview as the new Federal President, Frank-Walter Steinmeier voiced concern over the brutalization of online discourse, highlighting the need for reform.80 Thus, the spread of illegal content facilitated by social media platforms was a problem at the forefront of those most influential in German politics.
The problem also captured broader public interest. Surveys indicated that a majority of Germans supported more stringent controls on online content.81 Media coverage extensively highlighted platforms’ failures to enforce protections against illegal speech. A notable example was “The Hate Machine”82 by influential journalist Jan Fleischhauer in Der Spiegel. The article criticized social media companies’ brazen failure to comply with the TMG by not removing illegal content, as well as the government’s lack of enforcement of these laws. Described as a “historical document,”83 the article highlighted the growing demand for improved enforcement.84
Against this backdrop, Maas initially championed voluntary, collaborative efforts between the state and social media platforms. In 2015, Maas initiated a collaborative task force against illegal hate speech online that included Facebook, Google, Twitter, and various civil society organizations. Building on Maas’s central idea of enforcing laws more effectively online, as part of this initiative, companies made voluntary commitments to review complaints about illegal content and, if necessary, remove it within 24 hours. Platforms initially appeared to embrace the task force. In an official statement issued by the task force members, platforms acknowledged that “the review of complaints poses challenges for companies”85 and made various commitments to improve. Spokespersons from Facebook and Google publicly committed to hiring sufficient German-speaking staff to manage complaints.86 Yet, in 2016, official evaluations raised doubts about the task force’s effectiveness and platform compliance.87 In a letter to Facebook, Maas contended that the company’s efforts were insufficient compared to what had been agreed upon in the task force, criticizing them as “too little, too slow, and often misdirected in removing the wrong type of content.”88 Concurrently, public pressure over hate speech and disinformation intensified amid the ongoing refugee crisis, the 2016 US presidential election, and the Brexit referendum. Maas felt betrayed by the companies’ failure to make substantial progress in combating hate speech and instructed legislative lawyers at the Ministry of Justice to initiate work on a hard law approach, aimed at finally ensuring the enforcement of laws online.89 The first draft, introduced on March 14, 2017, coincided with the publication of the final evaluation report about the task force.90 This timing was noted by subjects as unusually swift, emphasizing the perceived momentum and urgency behind regulatory actions in this domain.91
In summary, issues around illegal speech online were brought into focus by the 2015 refugee crisis and the subsequent online hate speech crisis, but it was Maas who adeptly characterized the situation as a problem of enforcement. This framing emphasized that the primary policy problem was not the absence of laws but the German government’s failure to effectively enforce existing regulations vis-à-vis platforms.
Resources in the Problem Stream
Exploring the systemic reasons behind the enforcement problems through the lens of regulatory power revealed four critical resource deficiencies. Interviewees highlighted a lack of critical information concerning platform technologies and practices, particularly regarding the mechanisms driving the dissemination of illegal content and how platforms manage such material.92 This shortfall significantly limited the government’s capacity to accurately assess the scope and causes of illegal content online, resulting in oversimplified problem definitions. Dr. Daniel Holznagel, then a Legal Officer at the Federal Ministry of Justice involved in drafting NetzDG, explained that policymakers tackled issues of illegal content “from scratch,”93 prompting a focus on the obvious issues around content on platforms’ networks. Complex, underlying technological phenomena such as algorithms and platform design were still “in their infancy,”94 resulting in these factors being overlooked. Lubos Kuklis, currently serving on the EU’s DSA enforcement team, subsequently highlighted a related challenge. He argued that under the existing frameworks and voluntary obligations at the time, platforms typically shared limited or incomplete data and information, often citing privacy, legal, or logistical concerns. This practice, Kuklis explained, resulted in what he termed “partial monitoring.”95
Moreover, the government faced authority limitations. Although TMG set out obligations for the removal of illegal content, it also allowed for generous safe harbors. Only the most severe content disputes advanced to judicial reviews, which often led to drawn-out processes concluding with minimal fines that fell short of effectively compelling platform compliance.96 A notable example is the lawsuit involving the Merkel selfie, where Anas Modamani, the Syrian refugee depicted in the photo, sued Facebook for defamation and false accusation. The court ruled against Modamani, stating that Facebook was not liable for third-party content.97 Subjects also pointed to jurisdictional challenges and the transnational nature of firms as important factors that complicated enforcement efforts.98 Therefore, limitations in authority left the German government ill-equipped to enforce laws on speech, indicating a deficiency in regulatory power.
Furthermore, participants described a deficit in organized expertise and capital for the enforcement of speech laws online, both at the platform and state levels.99 The ECD and TMG had already recognized that the enormous volume of online content necessitated novel systems for oversight of illegal content at large scales. This understanding led to the adoption of notice-and-takedown approaches, mandating platforms to remove illegal content upon receiving a relevant notice. Although there were no specific legal mandates for the use of organized expertise or capital, subjects argued that there was an implicit expectation that platforms would maintain a robust content moderation workforce within Germany and make necessary investments.100 Yet, in light of the pervasive spread of illegal content online, it became clear that the prevalent approach had proven insufficient. When experts began exploring possible interventions, government involvement in existing notice-and-takedown systems was quickly dismissed for resource-related reasons. Dr. Thorsten Thiel, a professor of political science, noted that the German government had been significantly streamlined over the past decades, resulting in a shortage of available staff and making a government-operated solution impractical.101 Likewise, the prospect of increased judiciary involvement in content decisions under the TMG was deemed impractical, given the vast scale of content, which would likely overwhelm judicial systems. As for a technical solution, the government lacked the expertise to develop such measures, and implementing them could also raise concerns about surveillance.102 Consequently, the necessity to rely on the organized expertise and capital of platforms was seen as “without alternative.”103
Politics: Challenged Regulatory Power
Examining the political dynamics that shaped NetzDG’s adoption, the analysis revealed a political environment marked by acute concerns over online hate speech and social media platforms, electoral issues, and long-term strategic ambitions. The reception of NetzDG varied across Germany’s political landscape. Proposed by the SPD’s Heiko Maas during an election year and swiftly brought before the Bundestag, NetzDG was backed by the GroKo, the governing majority coalition of CDU/CSU and SPD, which emphasized the urgency of prompt regulation. In the final reading of the law in the Bundestag, Maas described NetzDG as a “prerequisite for the exercise of freedom of expression,”104 while suggesting that additional measures would be necessary in the future. Similarly, in the same session, Nadine Schön from the CDU/CSU acknowledged that NetzDG might not be the “ultimate solution,”105 but argued that it was important to “get a legal regulation in place now”106 in light of acute concerns relating to hate speech and platform negligence. Numerous subjects noted that this tolerance of gaps and flaws was unusual in German politics, emphasizing the strong political momentum and perceived urgency.107 Politically, the coalition openly positioned NetzDG as an imperfect part of a broader, future regulatory framework, but emphasized the urgency of promptly passing it in light of the ongoing hate speech crisis. Furthermore, as the law garnered significant public attention, it presented both the GroKo and the rising political figure Maas with a valuable opportunity to showcase a political achievement just before the federal election in the fall of 2017.
Receptions among the opposition parties ranged from reserved to outright rejection. The Greens/Bündnis 90 (Greens) voiced concerns over potential speech restrictions. Renate Künast,108 a high-profile member of the Bundestag for the Greens who had chaired an expert hearing on NetzDG, emerged as a vocal campaigner, arguing that under NetzDG “the incentive to delete is greater than the incentive to uphold law and freedom of expression.”109 However, in an official motion to the Bundestag, the Greens ultimately supported the law’s objective to swiftly regulate platforms, calling for new legally binding rules around notice-and-takedown approaches, enhanced transparency, and effective sanction mechanisms.110 The Linke, the smallest parliamentary group in the Bundestag, opposed the Act more vehemently.111 The party expressed strong concerns that the law would impose difficult legal decisions on private platforms and therefore rejected it.
The public reception of NetzDG was notably divided and sparked widespread debate. Although surveys showed support for stricter social media regulation and the media landscape was broadly critical of social media platforms, numerous civil society organizations and industry lobby groups voiced serious concerns.112 Bitkom and eco, representing the interests of major tech companies, including Facebook, Google, and Twitter, publicly commented on the draft law, expressing concerns about its potential to stifle free speech and apprehensions regarding the privatization of speech law enforcement.113 Facebook’s Head of Public Policy, Eva-Maria Kirschsieper, contended that the law would burden private companies with judicial responsibilities, leading the company to reject the law.114 Arnd Haller, Head of Google’s Legal Department, expressed concern that the law would prioritize swift removal over careful review processes.115 In contrast, supporters of NetzDG dismissed such concerns, pointing to Germany’s robust jurisprudence in handling illegal speech offline and emphasizing public demand for more rigorous platform regulation.116
Discussing the broader political significance of NetzDG, subjects noted that the law was perceived not only as a measure to combat illegal speech but also as a means to address right-wing extremist and neo-Nazi sentiments, which carry particular significance in Germany because of its Nazi past. Notably, NetzDG was frequently described as a “law against the AfD,”117 a party known for its hateful rhetoric on hugely popular social media channels.118
Another crucial factor in support of NetzDG was Germany’s aspiration to lead in digital policy. Policymakers viewed the law as an opportunity for Germany to be at the forefront of digital policy, building on its longstanding influence in areas like data protection and intermediary liability.119 Furthermore, NetzDG aligned with European digital sovereignty strategies, promoting a stronger EU tech sector through the enforcement of European laws.120 As Torben Klausa, a legal scholar who has published on NetzDG, put it, “NetzDG may not have been the best law, but it was the first one.”121
Although first proposed only in March 2017, NetzDG received its final reading in the Bundestag—strategically scheduled, as several interviewees noted—immediately following a high-profile vote on same-sex marriage and just before the 2017 summer recess. This timing aimed to minimize media focus on NetzDG in case of its failure and to ensure its passage before the fall federal election, the last viable opportunity for the law’s enactment. Ultimately, NetzDG was approved with support from the GroKo, against the votes of the AfD and the Linke, and with an abstention from the Greens/Bündnis 90.122
Resources in the Politics Stream
Tracing the role of regulatory resources and power in the politics stream revealed empirical explanations as to why policymakers in Germany were widely receptive to regulating social media platforms. With regard to information, the analysis indicated that the limitations in access to information were largely attributed to social media platforms withholding data.123 This lack of engagement first became evident against the backdrop of the 2015 hate speech crisis, spotlighting the absence of a substantive relationship between government authorities and platform officials. The analysis highlighted that platform representation in Germany was often scant or nonexistent, with even the largest companies typically managing their European affairs through small public affairs teams stationed in Ireland or the United States.124 Under the direction of Maas, the German government—particularly through the Federal Ministry of Justice and the Digital Agenda Committee125—initiated efforts to bridge this gap. They extended invitations to platforms for testimonies, organized visits to their offices, and established a task force on illegal hate speech. However, interviewees consistently reported that these efforts proved ineffectual, highlighting the frustration of government officials with platforms’ lack of meaningful engagement.126 Dr. Martin Husovec described a fundamental information asymmetry between social media platforms and other actors, arguing that the lack of insight into platforms’ handling of illegal content made it difficult to “pin down the problems”127 and therefore “impossible to regulate.”128 Several participants suggested that platforms may have strategically withheld critical information to challenge the government’s capacity to regulate, as regulation could potentially threaten their business model or otherwise interfere with platform activities.129 One subject questioned whether platform personnel in Germany even had sufficient access to information or whether they were deliberately kept in the dark by US leadership.130
Between 2015 and 2017, the lack of government authority over platforms became starkly evident as interactions devolved into frustrating encounters, highlighting an intensifying struggle for power. For example, Alexander Sängerlaub, an expert on NetzDG, shared an anecdote about a time when Gerd Billen, then Secretary of State at the Federal Ministry of Justice, felt that he did not receive proper respect during an official visit to the German Facebook offices. Billen reported the incident to Maas, which reportedly motivated Maas to abandon voluntary efforts and instead take a strong stance with a dedicated, hard law approach directly targeted at social media companies—rather than reforming existing regulations. Sängerlaub described a “policy process sparked by anger.”131 Similarly, Dr. Jens Zimmermann, Member of the German Bundestag and Chair of the Digital Agenda Committee, recalled an encounter with a high-ranking staff member from a platform’s government affairs team who boldly stated, “German laws simply don’t align with our business model,”132 a statement that he found audacious and inappropriate. Dr. Zimmermann explained that the incident demonstrated the platforms’ at times overt disregard for state authority and their preference for internal policies over German law. Dr. Torben Klausa emphasized that policymakers’ significant frustration with platforms’ failure to acknowledge government authority was a key factor in the adoption of NetzDG. He viewed it not just as a regulatory solution but also as a symbolic wake-up call directly targeted at social media companies:
NetzDG was fuelled by an acute political dissatisfaction of feeling that they [policymakers] were not taken seriously by the big tech concerns. They wanted to make a point.133
In summary, frustrations over platform behavior vis-à-vis German officials, laws, and institutions ignited what a subject described as “big, big regulatory energy.”134
Finally, interview analysis revealed that it was the acute frustration about platforms’ resistance to employing sufficient organized expertise and capital dedicated to countering illegal speech that brought the relationship to a boiling point. Under the task force, major platforms, including Facebook and Google, had made voluntary commitments to implement effective processes and hire sufficient staff to handle complaints. However, platforms continued to hold back information about their content moderation practices, and numerous investigations had raised doubts about the adequacy of their measures.135 Dr. Zimmermann pinpointed a key moment, which he described as a turning point propelling policymakers’ decision to swiftly enact Maas’s proposal for regulation:
The emergence of the NetzDG can be tied to a specific event. Back then, we had invited representatives from major platforms to the [Digital Agenda] Committee. When I asked the Facebook representative how many moderators they had in Germany, the answer was that it couldn’t be disclosed as it was a company trade secret. It was obvious that there couldn’t have been many, and they simply didn’t want to embarrass themselves. This led to great outrage at the time and to the decision, okay, enough is enough, we need to legislate here and quickly.136
This highlighted that platforms apparently not only neglected their, albeit voluntary, commitments to organized expertise and capital but also prioritized internal, commercial interests over German law. An anonymous interviewee captured this tendency by noting, “platforms are pretty skilled at circumventing by design.”137
Policy: Reasserting Regulatory Power
In the exploration of the policy stream, from a variety of feasible options, NetzDG emerged as the favored policy to curb the spread of illegal content on social media, with policy entrepreneur Maas leading its advocacy. At its core, NetzDG is rooted in the idea, coined by Maas, that what is illegal offline should also be illegal online. Primarily, NetzDG sought to effectively enhance the enforcement of laws on criminal speech in the online sphere. The legal principles underpinning the regulation of speech in Germany are embedded in the German Basic Law and deeply ingrained in the country’s legal and cultural history. Established in the wake of World War II to counter the totalitarian media control138 exercised during Nazi rule, the Basic Law permits the restriction of fundamental rights, including limitations to freedom of expression, which is guaranteed under Article 5.139 In Germany, speech can be constrained by general law “if its purpose outweighs that of freedom of expression.”140 This approach is specifically designed to thwart political extremism and the resurgence of National Socialism. Rather than introducing entirely new speech regulations for the online domain—for example, by legally redefining phenomena like digital fake news or hate speech, an approach that was considered but ultimately dismissed—NetzDG drew from well-established laws, entrenching fundamental policies that had long circulated in Germany’s policy primeval soup.
Analysis of interviews revealed strong support for the idea of bolstering the enforcement of existing laws online, which contributed significantly to the widespread acceptance of NetzDG. Subjects considered enhancing the enforcement of well-established laws on criminal speech “embarrassingly obvious”141 and “hard to object to,”142 recognizing not just the widespread acceptance of these laws but also the unrealized obligations of platforms under existing legal frameworks.143 Yet the appeal of NetzDG extended beyond this pragmatic reasoning, resonating with principles fundamental to German democracy. Subjects stressed that amid a surge of highly politicized, xenophobic illegal hate speech online, concerns about the resurgence of political extremism and nationalist sentiments in Germany had intensified. Many politicians believed the situation required urgent regulatory intervention. In this context, enhancing the enforcement of historical safeguard laws seemed not only appropriate but necessary.144
Various approaches to enhancing enforcement were discussed. Initially, Maas championed the voluntary approach of the task force; yet, when evaluations raised doubts about its effectiveness, demands for legally binding obligations on platforms grew louder.145 Reforming existing legal frameworks, such as the TMG that regulates internet services, was widely dismissed. Instead, a targeted intervention aimed specifically at social media companies was deemed necessary to take a strong stance vis-à-vis big tech firms. Moreover, a dedicated law was deemed imperative to “send a strong political signal”146 in response to escalating public and political demands to address online hate speech and disinformation. As for a nondomestic approach, Dr. Daniel Holznagel, then Legal Officer at the Federal Ministry of Justice, disclosed that German policymakers saw little prospect of action from the European Commission, which under Jean-Claude Juncker favored soft law approaches designed to benefit digital markets in Europe.147 With other alternatives exhausted, Maas introduced NetzDG as a dedicated, hard law approach that addressed prior concerns. NetzDG proposed binding legal obligations for platforms, empowering the state with increased enforcement and oversight tools. Although it was widely known that NetzDG had gaps and would likely require future revision, it met demands for urgent intervention to compel platform compliance after other alternatives had failed or were deemed impractical.148
Resources in the Policy Stream
The research interviews highlighted that the government’s effort to reclaim resources considered critical for effective policy enforcement played a significant role in shaping NetzDG’s design. On the information level, NetzDG mandated extensive reporting and documentation responsibilities, obliging platforms to document content decisions. Interview subjects believed that such transparency measures could provide critical “knowledge and empiricism”149 and “inject more light”150 into opaque platform processes. A key objective of enhanced information access was to uncover systematic compliance issues. Beyond this, some subjects, including Alexandra Geese, a current MEP active on digital issues, viewed transparency measures as a “steppingstone”151 toward future, evidence-based interventions.
With regard to authority, NetzDG empowered the state with new enforcement tools, establishing specific timeframes and procedures for content review and removal, along with transparency requirements. NetzDG shifted the focus away from the legality of individual pieces of content toward platforms’ overall procedures for handling content, emphasizing procedural compliance and systemic oversight. A key feature of this legislative framework was the imposition of substantial fines for noncompliance. Dr. Stefan Dreyer, Senior Researcher for Media Law at the Leibniz Institute, highlighted why these provisions marked a significant shift from previous regulations:
Where we think on a large scale about content moderation, we recognize the Basic Law relevance of the decision-making systems. This means we move away from focusing on individual content and towards the procedures. This was already a first realization in Germany for NetzDG.152
Thus, the emphasis on procedural obligations and compliance signified a transition toward a more comprehensive approach to regulating platforms, enhancing government authority.
In relation to organized expertise and capital, NetzDG placed the onus of enforcing German law on platforms, with the state, specifically the Federal Ministry of Justice as the regulatory agency, overseeing compliance. Moving beyond the general notice-and-takedown stipulations of earlier regulations, NetzDG set forth detailed obligations for the application of organized expertise, including requirements for personnel based in Germany, user complaint mechanisms, and procedures for content review, removal, and transparency. Interviewees emphasized that acknowledgment of the substantial resource demands for skilled personnel, technology, and finances was central to the decision to entrust platforms with enforcement. Technology journalist Alexander Fanta argued:
If we were to employ 30,000 moderators, it would certainly not be sufficient. . . . Weighing that, considering it, sleeping on it, it’s extremely difficult to design a good process. Therefore, I would argue that a solution must be inherently structurally embedded within platforms.153
An evaluation of NetzDG calculated that the cost of implementing the law over its first three years amounted to €20.4 million, less than the initially estimated €29 million, owing to fewer instances of content removal under NetzDG than anticipated. Of this, €2.2 million was attributed to administrative expenses incurred by the Federal Ministry of Justice, which was tasked with monitoring compliance; the remaining costs were borne by the platforms.154 At the time of the three-year evaluation, no fines had been imposed.155 In cases of noncompliance, platforms could be liable for significant fines payable to the German government, funds that would flow into the federal budget.
A Policy Window for NetzDG
The adoption of Germany’s NetzDG represents a critical juncture in online speech regulation, marked by the convergence of multiple streams that catalyzed a fundamental policy change. By integrating Kingdon’s streams with concepts of regulatory power in the power-integrated MSA, this analysis demonstrates that the struggle to establish regulatory power over platforms was deeply intertwined with the adoption of the law. NetzDG emerged as a viable policy idea to tackle the problem of insufficient enforcement of German law online by platforms, a problem enabled by the state’s deficient regulatory power over these platforms. Politically, platforms were held accountable for this enforcement gap, fueling demand for regulatory intervention, with Heiko Maas acting as a policy entrepreneur who actively orchestrated a policy window. From this analysis employing the power-integrated MSA, three decisive factors are distilled that facilitated the opening of a policy window across the streams for the adoption of NetzDG:
First, Maas’s articulation that “what is prohibited offline is also not allowed online”156 emerged as the central policy idea of NetzDG, facilitating its success by enabling the state to reclaim power in the digital sphere through the entrenchment of already widely accepted laws. This idea gained traction amid platforms’ persistent noncompliance with the TMG during the hate speech crisis and their general disregard for German law. The promise to empower the state with new enforcement tools to reinforce well-established speech laws online was appealing for its simplicity and its potential to address policymakers’ frustrations over their limited influence on platforms. Rather than introducing new concepts, NetzDG strengthened preexisting policy ideas whirling around in Germany’s primeval policy soup, representing an incremental yet powerful policy change. Moreover, the speech laws entrenched through NetzDG not only enjoy wide acceptance but are also deeply ingrained in Germany’s Basic Law, satisfying urgent objectives to enhance the state’s influence over digital platforms and marking a concerted effort to reclaim power.
Second, contentious power struggles between the state and platforms underpinned NetzDG, with platforms actively seeking to stifle regulation by withholding access to critical regulatory resources. In this dynamic, platforms emerged as political actors pursuing corporate interests and political agendas while resisting governance models that threatened their operational autonomy. Against the backdrop of escalating tensions between policymakers and platforms and the failure of voluntary measures, German policymakers sought to assert their power over platforms through NetzDG, an interventionist approach grounded in hard law. The law required platforms to enforce German law on their services and to report back to the government on their content moderation procedures. In pursuing this approach, policymakers aimed not only to ensure compliance but also to reclaim strategic regulatory resources, wresting control from the platforms. Although the decision to compel platforms to enforce German law generated controversy, government resource constraints necessitated the involvement of platform resources in overseeing vast amounts of content.
Third, former Justice Minister Heiko Maas emerged as a policy entrepreneur for NetzDG, playing a pivotal role in orchestrating the convergence of the multiple streams. More than just offering a policy idea to address the problem of online hate speech, he rooted NetzDG in broader aspirations to enhance state power over the digital sphere and, in particular, over transnational big tech platforms. Driven by both personal concern over online hate and political ambition, Maas was instrumental in framing the policy problem of illegal online content as an enforcement problem primarily associated with a lack of regulatory power over large US social media platforms, particularly Facebook—a framing that strongly resonated with the public and German politicians. Maas persistently campaigned for NetzDG through numerous interviews and public appearances, coining the idea “what is prohibited offline is also not allowed online,”157 which became synonymous with NetzDG. Politically, Maas garnered support from the GroKo coalition, positioning NetzDG as a key policy priority of the governing coalition in an election year. Furthermore, Maas was pivotal in shaping policymakers’ view that the platforms’ failure to comply with German law, or at least engage in voluntary collaboration, necessitated a stringent legislative approach.
A German Export?
This instrumental case study set out to spotlight the regulatory policy processes that underpin the development of Germany’s NetzDG, with a particular emphasis on the power dynamics between the German state and big tech platforms. The case analysis revealed that NetzDG was fundamentally driven by ambitions to establish regulatory power in the digital sphere. More than just offering a solution to curbing the spread of illegal speech on digital networks, NetzDG enabled the German state to reclaim regulatory power over big tech platforms—a regulatory objective that Maas leveraged as a policy entrepreneur. Across the problem, policy, and politics streams, an ambivalent relationship between the state and platforms emerged. On the one hand, escalating antagonism triggered the adoption of NetzDG as a dedicated, hard law approach. On the other hand, the German government relied on platforms to utilize their proprietary resources in the implementation of the law.
Reflecting on why NetzDG-like regulations, which mandate procedural obligations for platforms to enforce national laws, have emerged in leading democracies—such as Australia, the United Kingdom, and, most notably, the European Union—this section discusses whether NetzDG has evolved into a German export success in platform regulation.
It might be tempting to assume a direct influence whereby Germany, as a G7 state, EU leader, and one of the world’s largest economies, simply induced international regulation. Without question, NetzDG, as the first specifically platform-aimed regulation to emerge from a democracy, was a trailblazer, demonstrating the feasibility of platform regulation and providing a compelling model for doing so. Yet, the power-integrated MSA brings to the forefront a nuanced explanation for the emergence of NetzDG’s idea for content regulation, which could be applicable beyond the German context. In Germany, NetzDG introduced a successful idea for reclaiming critical resources and reasserting power over big tech platforms, while tackling pressing issues surrounding illegal speech online.
Table 4 comparatively summarizes NetzDG’s and the DSA’s approaches to regulating illegal content online. Whereas the DSA pursues a “broader scope of application”158 than NetzDG with regard to the services and issues addressed, the two acts share crucial parallels.159 On a surface level, both emphasize the procedural enforcement of national laws by platforms, transparency obligations, and detailed procedures for compliance and complaint handling, alongside substantial fines. However, the DSA diverges from NetzDG by not imposing strict content removal timelines. Furthermore, it introduces innovations such as audits and risk assessments for platforms’ content regulation systems.
Table 4. Comparison of the NetzDG and DSA Approaches to the Regulation of Illegal Content Online
Features | Network Enforcement Act (NetzDG) | Digital Services Act (DSA) |
Legislative Framework | ||
Approach | Interventionist with regulated self-regulation | Interventionist with co-regulatory elements; multilevel governance |
Implementation | German law enacted in 2017. Updated in 2021 and 2022 | EU law enacted in 2022 |
Scope | Online platforms with ≥2 million users in Germany; excludes messaging, professional, gaming, and shopping platforms (§ 1 (1, 2) NetzDG) | Wide range of online services, including cloud, messaging, social networking, search, and marketplaces (Art. 2 DSA). Enhanced regulation of VLOPs and VLOSEsᵃ with ≥45 million users in the EU (Art. 33 DSA) |
Oversight and enforcement | BMJV (§ 4 NetzDG) | Multilayered oversight framework consisting of National Digital Service Coordinators (Art. 49–51), the European Commission for VLOPs and VLOSEs (Art. 56, 65–78), and the European Board for Digital Services (Art. 58, 60–63) |
Fines | Substantial fines. Up to €5 million for individuals, up to €50 million for companies (§ 4 NetzDG). | Substantial fines. Up to 6% of global turnover for VLOPs and VLOSEs (Art. 74 DSA). Member states lay down rules for other services (Art. 52). |
Content Moderation | ||
Content subject to removal | Manifestly illegal content and certain types of illegal content as defined in the criminal code (StGB) of national-level law (§ 1 (3) NetzDG) | Illegal content as defined in EU- and national-level laws (Art. 3 DSA). Content specified in services’ terms and conditions (Art. 14 DSA). |
Content reporting | Users can report content for review (§ 3 NetzDG). | Users and trusted flaggers can report content for review, whereby reports from trusted flaggers are prioritized (Art. 16, 22 DSA). |
Complaint-handling process | Internal system for users to contest content decisions (§ 3 (2) NetzDG)ᵇ | Internal system for users to contest content decisions (Art. 20 DSA) |
Data access | Limited access for researchers (§ 5a NetzDG) | Limited access to data from VLOPs and VLOSEs for vetted researchers (Art. 40 DSA) |
Out-of-court dispute resolution process | Encourages out-of-court dispute resolution. Requirements for mediation body and processes (§ 3b NetzDG) | Encourages out-of-court dispute resolution. Requirements for mediation body and processes (Art. 21 DSA) |
Timeframes for action | Removal or blocking within 24 hours for manifestly illegal content, 7 days for illegal content upon receiving user complaint (§ 3 (2) NetzDG) | Prompt removal or disabling upon receiving user complaint, but no specified timeframe (Art. 16 (6) DSA) |
Transparency reporting | Comprehensive reporting obligations on content moderation processes (§ 2 NetzDG) | Comprehensive reporting obligations on content moderation processes (Art. 15, 24 DSA). Additional requirements for VLOPs and VLOSEs (Art. 42 DSA) |
Selection of New Content-Related Obligations under DSA | ||
Audits | — | Mandatory independent audits for VLOPs and VLOSEs, including for content moderation systems (Art. 37 DSA) |
Crisis response mechanism | — | Crisis response mechanisms for serious threats to public health and security for VLOPs and VLOSEs. The EC can require enhanced content moderation and other measures (Art. 36 DSA). |
Risk assessment | — | Mandatory risk assessment and mitigation for VLOPs and VLOSEs, including content moderation systems (Art. 34 DSA) |
Jaursch, Julian. “New EU Rules for Digital Services: Why Germany Needs Strong Platform Oversight Structures.” Berlin, Germany: Stiftung Neue Verantwortung, May 17, 2022. https://www.stiftung-nv.de/de/publication/dsa-why-germany-needs-strong-platform-oversight-structures; Kahl, Jonas. “14 wichtige Neuregelungen des Digital Services Act (DSA) im Vergleich zum NetzDG.” Spirit Legal, November 28, 2022. https://www.spiritlegal.com/de/aktuelles/details/14-wichtige-neuregelungen-des-digital-services-act.html; Kahl, Jonas, and Simon Liepert. “Digital Services Act: Was sich gegenüber dem NetzDG ändert.” c’t Magazin, December 9, 2022. https://www.heise.de/hintergrund/Digital-Services-Act-Was-sich-gegenueber-dem-NetzDG-aendert-7367625.html.
ᵃVery large online platforms and very large online search engines.
ᵇIn 2023, a German court (OLG) ruled that the NetzDG complaint-handling process is noncompliant with EU law. Dachwitz, “NetzDG-Reform ungültig.”
Note: § 3b to 3f NetzDG, § 4a NetzDG, § 5a NetzDG were introduced after the initial enactment of NetzDG.
Source: Review of NetzDG: Bundesministeriums der Justiz und für Verbraucherschutz. Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz – NetzDG), BGBl. I S. 3352 § (2017). http://www.bgbl.de/xaver/bgbl/start.xav?startbk=Bundesanzeiger_BGBl&jumpTo=bgbl117s3352.pdf and the DSA: European Union. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) (Text with EEA relevance), 277 OJ L § (2022). http://data.europa.eu/eli/reg/2022/2065/oj/eng, compiled by the author(s); see also Kahl, Jonas. “14 wichtige Neuregelungen des Digital Services Act (DSA) im Vergleich zum NetzDG.” Spirit Legal, November 28, 2022. https://www.spiritlegal.com/de/aktuelles/details/14-wichtige-neuregelungen-des-digital-services-act.html.
The observed parallels might suggest a direct influence of German ideas on platform regulation in Brussels. However, the mere presence of apparent commonalities does not suffice to establish a causal relationship. Moreover, an in-depth comparative legal assessment of regulatory practice would be required to determine whether the two acts are more similar than divergent. In any case, to comprehend why a particular policy idea succeeds, a nuanced analysis of the multidimensional factors shaping regulatory processes is essential for uncovering the underpinnings of such policies.
To achieve this in the German case, this research adopted NetzDG as an instrumental case study of contemporary platform regulation. In offering an in-depth examination of empirical explanations for regulatory processes within the German case, the analysis identified persistent issues that may underpin platform regulation across cases. It highlights the central role of power dynamics between nation-states and influential, transnational tech corporations in shaping emerging technology regulations. The German experience illustrates the challenges of regulating illegal online speech that arise from gaps in information, authority, and organized expertise and capital—issues directly linked to the dynamics between platforms and the state. Platforms resisted German regulatory efforts, withholding crucial resources. NetzDG offered an appealing policy idea to reclaim governmental power by mandating transparency reports, outlining content management procedures, and strengthening enforcement mechanisms—strategies that are also encoded in the DSA.
Although the analysis does not delve deeply into policy transfer or learning to conclusively demonstrate the impact of German regulation on international policies, integrating power dynamics into the MSA underscores the significance of state-platform power relations in the German context and potentially beyond. The transnational nature of platforms suggests that they may have employed similar tactics across different regulatory landscapes. A fruitful avenue for future research would be to apply the power-integrated MSA in other jurisdictions to explore whether disputes over regulatory power have prompted NetzDG-related regulations or whether alternative explanations exist.
In summary, attributing the global emergence of procedural obligations for the removal of illegal content to a simple diffusion of a German idea overlooks the complex power and resource struggles between nation-states and platforms. Rather than assuming that NetzDG is a German export success, the case study underscores that platform regulation is underpinned by a complex interplay of ideas, resources, and regulatory power.
Conclusion
This case study conducted an empirical analysis of the regulatory policy processes and underlying power dynamics in the interactions between a national government and big tech platforms that culminated in the adoption of Germany’s NetzDG. By intertwining Kingdon’s MSA with theories of regulatory power, it developed a power-integrated MSA to analytically dissect digital platform regulation. Through an interpretative analysis of an instrumental case study of NetzDG, combining 26 elite interviews, document analysis, and process tracing, the article examined the regulatory journey toward the law’s adoption, with a specific emphasis on the role of regulatory power in shaping emergent platform regulation.
The power-integrated MSA intertwined two core inquiries, providing complex explanations for how and why NetzDG emerged as a successful policy idea in Germany. It assessed the convergence of problem recognition, policy proposals, and political opportunities. Additionally, it explored the impact of the relationship between the German state and US big tech platforms, particularly focusing on the dynamics of regulatory power among these actors within the context of contemporary digital regulation.
From this case analysis, three central explanations emerged as pivotal in the adoption of NetzDG in Germany, while also providing insights into broader approaches to emerging platform regulation. First, the core idea of aligning offline and online illegality gained wide acceptance by reinforcing existing German law and allowing policymakers to reclaim power over the digital sphere. Second, the relationship between the German state and big tech platforms was marked by contention, with platforms strategically withholding critical resources that NetzDG subsequently mandated they provide for regulatory purposes. Third, Justice Minister Heiko Maas played an instrumental role as a policy entrepreneur, championing NetzDG as a politically viable policy idea to address both the issue of illegal speech online and the state’s diminished control over platforms.
Although procedural obligations for the removal of illegal content by platforms have become widely adopted, it is crucial to look beyond the simple view of NetzDG as merely a German policy export. The case analysis reveals that platform regulation is influenced by a complex interplay of ideas, resources, and regulatory power. Moving forward, the power-integrated MSA offers a robust framework for analyzing contemporary technology regulation, spotlighting the power dynamics between nation-states, on the one hand, and powerful big tech actors, on the other. It is particularly well suited for examining regulations concerning platform-centric emerging technologies, such as social media, generative artificial intelligence, or virtual metaverses.
Appendices
Appendix A: List of Interviews Carried out by the Researcher between January 4, 2023, and April 14, 2023.
ID | Name | Type | Organization | Title | Previous or Current Appointments If Relevant | Date |
1 | Marc Liesching | Academia | Ludwig Maximilian University of Munich | Professor for Media Law | | January 4, 2023 |
2 | Keno Potthast | Academia | Leibniz Institute for Media Hans-Bredow Institute | Junior Researcher | | January 10, 2023 |
3 | Owen Bennett | Government | Ofcom (UK Office of Communications) | International Online Safety Principal | Previous: Senior Public Policy Manager at Mozilla (2018–22) | January 18, 2023 |
4 | Dr. Jonas Kahl | Law | Spirit Legal | Lawyer | | January 4, 2023 |
5 | Dr. Julian Jaursch | CSO | Stiftung Neue Verantwortung | Project Lead | | January 4, 2023 |
6 | Alexander Fanta | CSO | Netzpolitik | Journalist (Brussels correspondent) | | January 16, 2023 |
7 | Lubos Kuklis | Government | | | Previous: Chair at European Platform Regulatory Authorities (2020–22) | January 17, 2023 |
8 | Rachel Griffin | Academia | School of Law at Sciences Po | PhD Candidate | | January 24, 2023 |
9 | Dr. Jens Zimmermann | Government | Bundestag; Digital Agenda Committee; Social Democratic Party (SPD) | Bundestag Member; SPD Spokesperson for Digital Policy | | January 31, 2023 |
10 | Dr. Amelie Heldt | Government | Federal Chancellery | Officer for Digital Policy | Previous: PhD Candidate, Leibniz Institute for Media Hans-Bredow Institute (2017–21) | February 8, 2023 |
11 | | Government | | | | February 1, 2023 |
12 | | Law | | | | February 2, 2023 |
13 | | CSO | Futur1 | Director | Previous: Project Lead Misinformation, Stiftung Neue Verantwortung (2016–21) | February 20, 2023 |
14 | Hendrik Wieduwilt | Other | Self-employed | Consultant & Communication Strategist | Previous: Podcast Host at Facebook (2020–21) | February 20, 2023 |
15 | Alexandra Geese | Government | European Parliament | MEP, Shadow Rapporteur DSA | | February 20, 2023 |
16 | Dr. Stefan Dreyer | Academia | Leibniz Institute for Media Hans-Bredow Institute | Senior Researcher | | February 20, 2023 |
17 | | CSO | | | | February 6, 2023 |
18 | Torben Klausa | Academia | University of Bielefeld | PhD Candidate | Previous: Assistant at Bundestag Digital Agenda Committee (2016–17); Officer for Public Policy at Bitkom (2017–18) | February 6, 2023 |
19 | Dr. Anna-Katharina Meßmer | CSO | Stiftung Neue Verantwortung | Project Lead | Previous: Communications at Social Democratic Party (2006–9) | February 7, 2023 |
20 | Dr. Thorsten Thiel | Academia | University of Bielefeld | Professor | Previous: Weizenbaum Institute | February 6, 2023 |
21 | Dr. Daniel Holznagel | Law | Court of Berlin | Judge | Previous: Legal Officer on NetzDG, Ministry of Justice (2017–21) | February 14, 2023 |
22 | | Government | | | | February 16, 2023 |
23 | | Government | | | | February 23, 2023 |
24 | Dr. Martin Husovec | Academia | London School of Economics | Assistant Professor | | February 16, 2023 |
25 | Dr. Matthias Kettemann | Academia | University of Innsbruck | Professor | | March 20, 2023 |
26 | | Government | | | | April 14, 2023 |
Note: The table includes participants who requested anonymity; their identifying details are omitted.
Appendix B: Interview Questions
Questions about the subject
1. What is your name?
2. What pronouns do you use?
3. What is your current job title?
4. Which organization(s) are you affiliated with?
5. How long have you been in this role?
6. When did you first start working on issues related to technology policy and platform regulation?
7. Can you walk me through the main stages of your education and career?
Examples of icebreaker questions
8. Heiko Maas famously said about NetzDG, “What is prohibited offline is also not permitted online.” Hasn’t that always been the case?
9. The internet has long been described as a lawless Wild West. Do you think that was an accurate description in the context of Germany?
10. Germany is sometimes described as the “king of regulation” in the European Union. As one of the first democracies to implement regulations aimed directly at platforms, Germany took a pioneering role with NetzDG. How do you perceive this characterization, especially with regard to technology policy?
11. Angela Merkel reportedly scolded Mark Zuckerberg at a UN banquet in 2015, suggesting that social media platforms have a moral obligation to combat hate speech. Do you agree with this perspective?
Examples of open-ended and probing questions
12. Reflecting on the global impact of NetzDG, can you identify any trends or shifts in international policymaking inspired by NetzDG? What do you think prompts similarities and differences between different approaches?
13. NetzDG has evolved over time and has been amended on numerous occasions. What prompted these updates? Can you identify any patterns or trends? Why were these updates necessary and were they necessary at all in your opinion?
14. Looking back on the operation of NetzDG, did it achieve its intended objectives? Were there any unintended consequences or benefits?
15. Evaluations of NetzDG reveal that only a marginal amount of content was deleted on the grounds of NetzDG compliance. Why do you believe this is the case? Was there not as significant a hate speech crisis as initially thought? What implications does this have for the BMJV?
16. NetzDG has been in place for several years now. Reflecting on the time of its introduction, what were the major concerns? What were the biggest hopes and goals, and have they been realized? Why did concerns about over-blocking not materialize as anticipated?
17. Before NetzDG, a number of more collaborative approaches were attempted. What can you tell me about that period? What was the relationship like between different stakeholder groups? What complicated these relationships? Do you think the relationship between platforms and government prompted NetzDG?
18. Was there a pivotal moment or event that led to the adoption of NetzDG that stands out to you? How does this moment align with the medium- and long-term technology policy landscape?
19. When NetzDG was first proposed, platforms were celebrated as beacons of democracy and connectivity. However, in Germany, platforms, especially Facebook, came under criticism for their role in disseminating hate speech. Can you elaborate on this context? How did the public and political perception of platforms evolve over time?
20. How does NetzDG fit into the tradition of limited liability regimes in Europe? Was NetzDG a break from the status quo or an extension of prevalent strategies?
21. NetzDG introduced a hard law approach. Why was this considered necessary? What aspects of the collaborative or voluntary efforts were falling short?
22. NetzDG was implemented relatively swiftly. How did the BMJV and platforms prepare for this new regulatory regime? Were stakeholders able to comply within the given timeframe? What new roles and responsibilities were created?
Appendix C: Key Documents Included in the Document Analysis
Bundesministeriums der Justiz und für Verbraucherschutz. Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz—NetzDG), BGBl. I S. 3352 § (2017). http://www.bgbl.de/xaver/bgbl/start.xav?startbk=Bundesanzeiger_BGBl&jumpTo=bgbl117s3352.pdf.
Bundesministeriums der Justiz und für Verbraucherschutz. Referentenentwurf des Bundesministeriums der Justiz und für Verbraucherschutz (2017). https://www.bundesgerichtshof.de/SharedDocs/Downloads/DE/Bibliothek/Gesetzesmaterialien/18_wp/NetzDG/refe.pdf;jsessionid=03466AA0A0BDC2129514E11565808C44.internet982?__blob=publicationFile&v=1.
Bundesministeriums der Justiz und für Verbraucherschutz. Stellungnahme des Bundesrates, 315/17 § (2017). https://www.bundesgerichtshof.de/SharedDocs/Downloads/DE/Bibliothek/Gesetzesmaterialien/18_wp/NetzDG/refe.pdf;jsessionid=03466AA0A0BDC2129514E11565808C44.internet982?__blob=publicationFile&v=1.
Bundesrat. Empfehlungen der Ausschüsse, 315/1/17 § (2017). https://www.bundesgerichtshof.de/SharedDocs/Downloads/DE/Bibliothek/Gesetzesmaterialien/18_wp/NetzDG/refe.pdf;jsessionid=03466AA0A0BDC2129514E11565808C44.internet982?__blob=publicationFile&v=1.
Bundesrat. Gesetzesbeschluss des Deutschen Bundestages, 536/17 § (2017). https://www.bundesrat.de/SharedDocs/drucksachen/2017/0501-0600/536-17.pdf?__blob=publicationFile&v=1.
Bundesregierung. “Bericht Der Bundesregierung Zur Evaluierung Des Netzwerkdurchsetzungsgesetzes,” September 10, 2020. https://dserver.bundestag.de/btd/19/226/1922610.pdf.
Bundesregierung. Gesetz zur Durchführung der Verordnung (EU) 2021/784 des Europäischen Parlaments und des Rates vom 29. April 2021 zur Bekämpfung der Verbreitung terroristischer Online-Inhalte und zur Änderung weiterer Gesetze, BGBl. I S. 1182, 1184 § (2022). http://www.bgbl.de/xaver/bgbl/start.xav?startbk=Bundesanzeiger_BGBl&jumpTo=bgbl122s1182.pdf.
Bundesregierung. Gesetzentwurf der Bundesregierung (2017). https://www.bmj.de/SharedDocs/Gesetzgebungsverfahren/DE/2017_NetzDG.html.
Bundesregierung. Gesetzesentwurf der Bundesregierung, 18/12727 § (2017). https://dserver.bundestag.de/btd/18/127/1812727.pdf.
Deutscher Bundestag. Antrag der Abgeordneten Dr. Konstantin von Notz, Renate Künast, Tabea Rößner, Dieter Janecek, Luise Amtsberg, Katja Keul, Monika Lazar, Irene Mihalic, Özcan Mutlu, Ulle Schauws, Hans-Christian Ströbele und der Fraktion BÜNDNIS 90/DIE GRÜNEN, 18/11856 § (2017). https://dserver.bundestag.de/btd/18/118/1811856.pdf.
Deutscher Bundestag. Gesetzentwurf der Fraktionen der CDU/CSU und SPD, 18/12356 § (2017). https://dserver.bundestag.de/btd/18/130/1813013.pdf.
eco Verband der Internetwirtschaft e.V. “Stellungnahme Zum Referentenentwurf Eines Gesetzes Zur Verbesserung Der Rechtsdurchsetzung in Sozialen Netzwerken (Netzwerkdurchsetzungsgesetz),” March 30, 2017. https://www.bundesgerichtshof.de/SharedDocs/Downloads/DE/Bibliothek/Gesetzesmaterialien/18_wp/NetzDG/stellung_eco_refe.pdf?__blob=publicationFile.
Facebook Germany GmbH. “Stellungnahme Zum Entwurf Des Netzwerkdurchsetzungsgesetzes BR-Drucksache 315/17 ; BT-Drucksache 18/12356,” May 30, 2017. https://cdn.netzpolitik.org/wp-upload/2017/05/Facebook_Stellungnahme_zum_Entwurf_des_NetzDG.pdf.
“Plenarprotokoll 18/244.” Berlin, Germany, June 30, 2017. https://dserver.bundestag.de/btp/18/18244.pdf.
Weber, Maria-Terese. “Bitkom Stellungnahme Entwurf Eines Gesetzes Zur Verbesserung Der Rechtsdurchsetzung in Sozialen Netzwerken Der Fraktionen von CDU/CSU Und SPD (BT DS 18/12356),” June 19, 2017. https://www.bundestag.de/resource/blob/510906/e8e218a40bd46373a96839c32b966faa/rohleder_bitkom-data.pdf.
Notes
NetzDG; German: Netzwerkdurchsetzungsgesetz.
Amélie Heldt, “Reading between the Lines and the Numbers: An Analysis of the First NetzDG Reports,” SSRN Scholarly Paper. Rochester, NY, August 29, 2018, https://papers.ssrn.com/abstract=3413677; Maximilian Hemmert-Halswick, “Lessons Learned from the First Years with the NetzDG,” 415–32. Nomos Verlagsgesellschaft mbH & Co. KG, 2021, https://doi.org/10.5771/9783748929789-415; Patrick Zurth, “The German NetzDG as Role Model or Cautionary Tale? Implications for the Debate on Social Media Liability,” Fordham Intellectual Property, Media & Entertainment Law Journal 31, no. 4 (2020): 1084.
Hemmert-Halswick, “Lessons Learned from the First Years with the NetzDG,” 416; Zurth, “The German NetzDG as Role Model or Cautionary Tale?” 1048.
Amélie Heldt, “Let’s Meet Halfway: Sharing New Responsibilities in a Digital Age,” Journal of Information Policy 9 (December 1, 2019): 336–69, https://doi.org/10.5325/jinfopoli.9.2019.0336.
Torben Klausa, “Graduating from ‘New-School’—Germany’s Procedural Approach to Regulating Online Discourse,” Information, Communication & Society 26, no. 1 (January 2, 2023): 54–69, https://doi.org/10.1080/1369118X.2021.2020321; Rachel Griffin, “New School Speech Regulation as a Regulatory Strategy against Hate Speech on Social Media: The Case of Germany’s NetzDG,” Telecommunications Policy 46, no. 9 (October 1, 2022): 102411, https://doi.org/10.1016/j.telpol.2022.102411; see also Jack M. Balkin, “Old-School/New-School Speech Regulation,” Harvard Law Review 127, no. 8 (2014): 2296–342, who argues that the widespread emergence of digital networks as “infrastructure of free speech” prompts the emergence of “new-school speech regulation.” These regulations aim directly at digital networks and assign them regulatory roles pertaining to speech regulations, reflecting a shift in how free speech is regulated in digital environments.
Although key provisions of the law were struck down by the French Constitutional Council in 2020.
Bundesregierung, Gesetz zur Durchführung der Verordnung (EU) 2022/2065 des Europäischen Parlaments und des Rates vom 19. Oktober 2022 über einen Binnenmarkt für digitale Dienste und zur Änderung der Richtlinie 2000/31/EG sowie zur Durchführung der Verordnung (EU) 2019/1150 des Europäischen Parlaments und des Rates vom 20. Juni 2019 zur Förderung von Fairness und Transparenz für gewerbliche Nutzer von Online-Vermittlungsdiensten und zur Änderung weiterer Gesetze—Bundesgesetzblatt, BGBl. 2024 Nr. 149 § (2022), https://www.recht.bund.de/bgbl/1/2024/149/VO.html.
Hemmert-Halswick, “Lessons Learned from the First Years with the NetzDG,” 430.
Heldt, “Reading between the Lines and the Numbers.”
Sandra Schmitz, and Christian M. Berndt, “The German Act on Improving Law Enforcement on Social Networks (NetzDG): A Blunt Sword?” SSRN Scholarly Paper (Rochester, NY, December 14, 2018), https://doi.org/10.2139/ssrn.3306964.
Marc Liesching, Chantal Funke, Alexander Hermann, Christin Kneschke, and Carolin Michnik. Das NetzDG in der praktischen Anwendung: eine Teilevaluation des Netzwerkdurchsetzungsgesetzes. Schriftenreihe Medienrecht & Medientheorie, Band 3 (Berlin: Carl Grossmann Verlag, 2021).
Ibid., 359.
Griffin, “New School Speech Regulation as a Regulatory Strategy against Hate Speech on Social Media”; David Kaye, “Mandate of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression,” 2017, https://www.westminsterforumprojects.co.uk/.
Jacob Mchangama, and Joelle Fiss, “Analysis: The Digital Berlin Wall: How Germany (Accidentally) Created a Prototype for Global Online Censorship,” (Copenhagen: Justitia, November 5, 2019), https://justitia-int.org/the-digital-berlin-wall-how-germany-created-a-prototype-for-global-online-censorship/; Reporters Without Borders, “Russian Bill Is Copy-and-Paste of Germany’s Hate Speech Law,” Reporters without Borders: For Freedom of Information (blog), July 2017. https://rsf.org/en/news/russian-bill-copy-and-paste-germanys-hate-speech-law; Svea Windwehr, and Jillian C. York, “Turkey’s New Internet Law Is the Worst Version of Germany’s NetzDG Yet,” Electronic Frontier Foundation, July 30, 2020, https://www.eff.org/deeplinks/2020/07/turkeys-new-internet-law-worst-version-germanys-netzdg-yet; Janosch Delcker, “Germany’s Balancing Act: Fighting Online Hate While Protecting Free Speech,” POLITICO, 2020, https://www.politico.eu/article/germany-hate-speech-internet-netzdg-controversial-legislation/.
John Kingdon, Agendas, Alternatives, and Public Policies, Update Edition, with an Epilogue on Health Care. 2nd ed. (Boston: Pearson, 1995), 5.
Richard Hoefer, “The Multiple Streams Framework: Understanding and Applying the Problems, Policies, and Politics Approach,” Journal of Policy Practice and Research 3, no. 1 (March 1, 2022): 1–5, https://doi.org/10.1007/s42972-022-00049-2.
John Perry Barlow, “A Declaration of the Independence of Cyberspace,” Duke Law & Technology Review 18, no. 1 (August 11, 2019): 5–7, 5.
Ronald Deibert, Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace (Cambridge, MA: MIT Press, 2010); Jonathan Zittrain, Access Denied: The Practice and Policy of Global Internet Filtering (Oxford Internet Institute, 2007), https://pdfs.semanticscholar.org/7f84/e31e6dfa7c7b9e479f9f7f157fca32834139.pdf.
Hannah Bloch-Wehba, “Global Platform Governance: Private Power in the Shadow of the State,” SMU Law Review 72, no. 1 (2019): 27–80, 36.
Elettra Bietti, “A Genealogy of Digital Platform Regulation,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, June 3, 2021), https://doi.org/10.2139/ssrn.3859487; Bloch-Wehba, “Global Platform Governance.”
Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (London: Profile Books, 2019).
Philip N. Howard, Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives (New Haven and London: Yale University Press, 2020).
Samantha Bradshaw, Lisa-Maria Neudert, and Philip N. Howard, “Government Responses to Malicious Use of Social Media,” Working Paper. The Malicious Use of Social Media (Riga: NATO StratCom Centre of Excellence, November 2018), https://www.stratcomcoe.org/government-responses-malicious-use-social-media; Ulrike Klinger, Daniel Kreiss, and Bruce Mutsvairo, Platforms, Power, and Politics: An Introduction to Political Communication in the Digital Age. 1st ed. (Medford: Polity, 2023).
Daniela Stockmann, “Tech Companies and the Public Interest: The Role of the State in Governing Social Media Platforms,” Information, Communication & Society 26, no. 1 (January 2, 2023): 1–15, https://doi.org/10.1080/1369118X.2022.2032796.
Kamya Yadav, Ulaş Erdoğdu, Samikshya Siwakoti, Jacob N. Shapiro, and Alicia Wanless, “Countries Have More than 100 Laws on the Books to Combat Misinformation. How Well Do They Work?” Bulletin of the Atomic Scientists 77, no. 3 (May 4, 2021): 124–8, https://doi.org/10.1080/00963402.2021.1912111.
Hemmert-Halswick, “Lessons Learned from the First Years with the NetzDG,” 419.
Jan Fleischhauer, “Facebook: Die Hass-Maschine,” Der Spiegel, October 5, 2016, sec. Politik, https://www.spiegel.de/spiegel/facebook-die-hass-maschine-a-1115052.html; Till Krause, and Hannes Grassegger, “Inside Facebook: So arbeitet das Arvato-Löschteam,” Süddeutsche.de, December 15, 2016, https://www.sueddeutsche.de/digital/exklusive-sz-magazin-recherche-inside-facebook-1.3297138.
Schmitz and Berndt, “The German Act on Improving Law Enforcement on Social Networks (NetzDG)”; Heidy Tworek, and Paddy Leerssen, “An Analysis of Germany’s NetzDG Law” (Amsterdam, Netherlands: Transatlantic Working Group, 2019); Zurth, “The German NetzDG as Role Model or Cautionary Tale?”
Robert Gorwa, “Elections, Institutions, and the Regulatory Politics of Platform Governance: The Case of the German NetzDG,” Telecommunications Policy, Norm entrepreneurship in Internet Governance, 45, no. 6 (July 1, 2021): 102145, https://doi.org/10.1016/j.telpol.2021.102145.
Danya He, “Governing Hate Content Online: How the Rechtsstaat Shaped the Policy Discourse on the NetzDG in Germany,” International Journal of Communication 14 (June 29, 2020): 23.
Related concepts for the study of policy change, including the advocacy coalition framework and punctuated equilibrium, were considered less suited for this analysis. The former typically focuses on coalitions of people with shared beliefs, whereas NetzDG was characterized by a more direct government response to pressing online speech issues. The latter emphasizes nonlinear policy shifts following long periods of stability, whereas NetzDG arose from more continuous policy frameworks regulating speech on the internet.
Paul Cairney, Understanding Public Policy: Theories and Issues. Vol. 2 (London: Bloomsbury Publishing, 2019), 189.
Kingdon, Agendas, Alternatives, and Public Policies, Update Edition, with an Epilogue on Health Care, 1.
Ibid.
Cairney, Understanding Public Policy, 189.
Kingdon, Agendas, Alternatives, and Public Policies, Update Edition, with an Epilogue on Health Care.
Ibid., 5.
Ibid.
Michael D. Jones, Holly L. Peterson, Jonathan J. Pierce, Nicole Herweg, Amiel Bernal, Holly Lamberta Raney, and Nikolaos Zahariadis. “A River Runs Through It: A Multiple Streams Meta-Review,” Policy Studies Journal 44, no. 1 (2016): 13–36, https://doi.org/10.1111/psj.12115.
Pawel Popiel, “The Tech Lobby: Tracing the Contours of New Media Elite Lobbying Power,” Communication Culture & Critique 11, no. 4 (2018): 566–85. https://doi.org/10.1093/ccc/tcy027; Pawel Popiel, “Digital Platforms as Policy Actors,” in Digital Platform Regulation: Global Perspectives on Internet Governance, ed. Terry Flew and Fiona R. Martin. Palgrave Global Media Policy and Business (Cham: Springer International Publishing, 2022), 131–50, https://doi.org/10.1007/978-3-030-95220-4_7; Tarleton Gillespie, “The Politics of ‘Platforms’,” New Media & Society 12, no. 3 (May 1, 2010): 347–64, https://doi.org/10.1177/1461444809342738.
José van Dijck, Thomas Poell, and Martijn de Waal, The Platform Society (Oxford: Oxford University Press, 2018); Martin Moore, and Damian Tambini, Digital Dominance: The Power of Google, Amazon, Facebook, and Apple (Oxford: Oxford University Press, 2018).
Robert A. Dahl, “The Concept of Power,” Behavioral Science 2, no. 3 (1957): 201–15, 202–3, https://doi.org/10.1002/bs.3830020303.
According to Mann, the concept of “infrastructural power” contrasts with what he refers to as “despotic power,” which is the power of an elite class to impose its will over society through the control of their own elite support system. Conversely, “infrastructural power” is the state’s ability to exercise power through society by using institutions and various resources, such as the legal system, the welfare system, and public health, to effectively implement policies. John Lucas, “The Tension between Despotic and Infrastructural Power: The Military and the Political Class in Nigeria, 1985–1993,” Studies in Comparative International Development 33, no. 3 (September 1, 1998): 90–113, https://doi.org/10.1007/BF02687493; Michael Mann, “The Autonomous Power of the State: Its Origins, Mechanisms and Results,” European Journal of Sociology/Archives Européennes de Sociologie 25, no. 2 (1984): 185–213, https://doi.org/10.1017/S0003975600004239.
Mann, “The Autonomous Power of the State,” 59.
Johannes Lindvall, and Jan Teorell, “State Capacity as Power: A Conceptual Framework,” STANCE Working Paper Series, Department of Political Science, Lund University, Lund, Sweden, May 2016.
Ibid.
Ibid., 16.
Julia Black, “Enrolling Actors in Regulatory Systems: Examples from UK Financial Services Regulation,” Public Law 2003, no. Spring Issue (2003): 63–91.
Lisa-Maria Neudert, “Regulatory Capacity Capture: The United Kingdom’s Online Safety Regime,” Internet Policy Review 12, no. 4 (December 1, 2023), https://policyreview.info/articles/analysis/regulatory-capacity-capture-united-kingdoms-online-safety-regime.
Katharine Dommett, and Junyan Zhu, “The Barriers to Regulating the Online World: Insights from UK Debates on Online Political Advertising,” Policy & Internet 14, no. 4 (2022): 16, https://doi.org/10.1002/poi3.299; Albert Mills, Gabrielle Durepos, and Elden Wiebe, “Instrumental Case Study,” in Encyclopedia of Case Study Research, ed. Albert J. Mills, Gabrielle Durepos and Elden Wiebe (Thousand Oaks, CA: SAGE, 2010), 473–5, https://doi.org/10.4135/9781412957397.
Magnus G. Schoeller, Leadership in the Eurozone: The Role of Germany and EU Institutions. Palgrave Studies in European Union Politics (Cham: Springer International Publishing, 2019), https://doi.org/10.1007/978-3-030-12704-6.
Julia Pohle, “Digital Sovereignty. A New Key Concept of Digital Policy in Germany and Europe” (Munich and Berlin: Konrad-Adenauer-Stiftung, 2020), https://www.econstor.eu/handle/10419/228713.
Dvora Yanow, Conducting Interpretive Policy Analysis (Thousand Oaks, CA: SAGE Publications, 2000); David Collier, “Understanding Process Tracing,” PS: Political Science & Politics 44, no. 4 (October 2011): 823–30, https://doi.org/10.1017/S1049096511001429.
Mark Bevir, and Roderick Arthur William Rhodes, Interpreting British Governance (Abingdon-on-Thames: Routledge, 2003); Hendrik Wagenaar ed. Meaning in Action: Interpretation and Dialogue in Policy Analysis (Amonk, NY: M.E. Sharpe, 2011).
Gorwa, “Elections, Institutions, and the Regulatory Politics of Platform Governance”; Rotem Medzini, “Enhanced Self-Regulation: The Case of Facebook’s Content Governance,” New Media & Society, 24, no. 10 (2021), https://doi.org/10.1177/1461444821989352.
Interviews 12, 18.
Interview 14.
See section on document analysis below and Appendix C for more information. Note: No official statements on the proposed draft Act issued by Google, Twitter, or its subsidiaries were identified. Similarly, no media commentary made by spokespersons for Twitter was identified.
Translations from German to English by the author.
Stefan Krempl, “Plattform-Regulierung: Deutsches Digitale-Dienste-Gesetz soll NetzDG ersetzen,” heise online, April 23, 2023, https://www.heise.de/news/Plattform-Regulierung-Deutsches-Digitale-Dienste-Gesetz-soll-NetzDG-ersetzen-8982385.html.
European Commission, “The Digital Services Act Package,” Shaping Europe’s Digital Future, May 5, 2023, https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.
BMJV; German: Bundesministerium der Justiz.
Bundesregierung, Gesetzesentwurf der Bundesregierung, 1; Deutscher Bundestag, Gesetzentwurf der Fraktionen der CDU/CSU und SPD, 18/12356 § (2017), 1, https://dserver.bundestag.de/btd/18/130/1813013.pdf.
Bundesregierung, Gesetzesentwurf der Bundesregierung, 1; Deutscher Bundestag, Gesetzentwurf der Fraktionen der CDU/CSU und SPD, 1.
Interviews 9, 16, 19.
Lisa-Maria Neudert, Philip Howard, and Bence Kollanyi, “Sourcing and Automation of Political News and Information During Three European Elections,” Social Media + Society 5, no. 3 (July 1, 2019), https://doi.org/10.1177/2056305119863147; Roopika Risam, “Now You See Them: Self-Representation and the Refugee Selfie,” Popular Communication 16, no. 1 (January 2, 2018): 58–71, https://doi.org/10.1080/15405702.2017.1413191; Jonas Andersson Schwarz, “Platform Logic: An Interdisciplinary Approach to the Platform-Based Economy,” Policy & Internet 9, no. 4 (2017): 374–94, https://doi.org/10.1002/poi3.159.
Caterina Froio and Bharath Ganesh, “The Transnationalisation of Far Right Discourse on Twitter,” European Societies 21, no. 4 (August 8, 2019): 513–39, https://doi.org/10.1080/14616696.2018.1494295; Michael Hameleers, “Putting Our Own People First: The Content and Effects of Online Right-Wing Populist Discourse Surrounding the European Refugee Crisis,” Mass Communication and Society 22, no. 6 (November 2, 2019): 804–26, https://doi.org/10.1080/15205436.2019.1655768; Neudert et al., “Sourcing and Automation of Political News and Information During Three European Elections.”
Interview 16.
Interview 21.
See interview 19 with Dr. Anna-Katharina Meßmer, who in 2013 launched the Twitter campaign #aufschrei against sexualized hate speech on the internet, which received a Grimme Online Award; see also Gorwa, “Elections, Institutions, and the Regulatory Politics of Platform Governance.”
Interviews 19, 21.
Sebastian Stier, Lisa Posch, Arnim Bleier, and Markus Strohmaier, “When Populists Become Popular: Comparing Facebook Use by the Right-Wing Movement Pegida and German Political Parties,” Information, Communication & Society 20, no. 9 (September 2, 2017): 1365–88, https://doi.org/10.1080/1369118X.2017.1328519; Sebastian Stier, Nora Kirkizh, Caterina Froio, and Ralph Schroeder, “Populist Attitudes and Selective Exposure to Online News: A Cross-Country Analysis Combining Web Tracking and Surveys,” The International Journal of Press/Politics 25, no. 3 (July 1, 2020): 426–46, https://doi.org/10.1177/1940161220907018.
David Kaye, Speech Police: The Global Struggle to Govern the Internet (New York: Columbia Global Reports, 2019), 66.
Interviews 3, 10, 21.
Kingdon, Agendas, Alternatives, and Public Policies, Update Edition, with an Epilogue on Health Care.
Joachim Huber, “Antwort auf Brief von Heiko Maas: Facebook: ‘Kein Ort für Rassismus’,” Der Tagesspiegel Online, 2015, https://www.tagesspiegel.de/gesellschaft/medien/facebook-kein-ort-fur-rassismus-8517787.html.
Maas interviewed by Krauss, “Null Toleranz bei Hassparolen.” Translated by the author. The original quote is, “Was offline verboten ist, ist auch online nicht erlaubt.”
Angela Merkel, Interview mit Bundeskanzlerin Angela Merkel: „Grundrecht auf Asyl kennt keine Obergrenze“, September 11, 2015, https://rp-online.de/politik/deutschland/angela-merkel-das-grundrecht-auf-asyl-kennt-keine-obergrenze_aid-9533771.
Joachim Gauck, “Joachim Gauck über die Angst der Deutschen und Hass im Internet,” interview by Florian Gathmann, Florian Harms, and Roland Nelles, November 7, 2016, https://www.spiegel.de/politik/deutschland/joachim-gauck-wer-hasst-wird-nicht-die-mehrheit-erringen-a-1119752.html.
Frank-Walter Steinmeier, Interview mit der Funke-Mediengruppe, Funke Mediengruppe, 2017, https://www.bundespraesident.de/SharedDocs/Reden/DE/Frank-Walter-Steinmeier/Interviews/2017/170415-Interview-Funke-Mediengruppe.html.
Bernhard Rohleder, “Von Der Ente Zu Fake News” (Berlin: Bitkom Research, 2017), https://www.bitkom.org/sites/default/files/file/import/Bitkom-Charts-PK-Fake-News-02-02-2017.pdf; Stephan Weichert, and Leif Kramp, “Zentrale Ergebnisse forsa-Befragungen 2016 bis 2018 zur Wahrnehmung von Hassrede” (Nordrhein-Westfalen, Germany: Landesanstalt für Medien NRW, 2018), https://www.medienanstalt-nrw.de/zum-nachlesen/forschung/abgeschlossene-projekte/forsa-befragung-zur-wahrnehmung-von-hassrede.html.
Fleischhauer, “Facebook.” Translated by the author.
Interview 21.
Interviews 21, 22.
Task Force Umgang mit rechtswidrigen Hassbotschaften im Internet, “Gemeinsam Gegen Hassbotschaften: Von Der Task Force „Umgang Mit Rechtswidrigen Hassbotschaften Im Internet“ Vorgeschlagene Wege Zur Bekämpfung von Hassinhalten Im Netz,” Ergebnispapier, December 15, 2015, 2, https://fragdenstaat.de/anfrage/bmjv-task-force/.
Krempl, “Plattform-Regulierung.”
jugendschutz.net, “Löschung Rechtswidriger Hassbeiträge Bei Facebook, YouTube Und Twitter. Ergebnisse Des Monitorings von Beschwerdemechanismen Jugendaffiner Dienste,” 2016, https://www.bmfsfj.de/resource/blob/111514/f5c7e2a1e1087fc21f91235208be03ad/20160926-testergebnisse-jugendschutz-net-hassbotschaften-data.pdf.
Heiko Maas, “Brief Vom Justizminister Heiko Maas an Eva-Maria Kirschsieper Und Richard Allan,” July 15, 2016, https://fragdenstaat.de/anfrage/brief-von-maas-an-facebook/53091/anhang/BMJVIFG-Bescheidvom27.07.2016_geschwaerzt.pdf.
Kaye, Speech Police, 68.
Gorwa, “Elections, Institutions, and the Regulatory Politics of Platform Governance,” 7.
Interviews 4, 6, 9.
Interviews 2, 13, 18, 19.
Interview 20.
Interview 20.
Interview 7.
Interview 4.
Melissa Eddy, “How a Refugee’s Selfie With Merkel Led to a Facebook Lawsuit,” The New York Times, February 6, 2017, https://www.nytimes.com/2017/02/06/business/syria-refugee-anas-modamani-germany-facebook.html.
Interviews 2, 3.
Interviews 4, 20.
Interviews 9, 21.
Interview 20.
Interview 6.
Interview 1.
“Plenarprotokoll 18/244,” Berlin, June 30, 2017, 25116, https://dserver.bundestag.de/btp/18/18244.pdf.
Ibid., 25121.
Ibid., 25122.
Interviews 1, 4, 6, 19, 20.
Künast later became involved in various legal actions against social media networks. In 2019, she sued Facebook, seeking to force the company to disclose the identities of users who had posted abusive and potentially defamatory comments about her. The case eventually reached the German Federal Constitutional Court (Bundesverfassungsgericht).
“Plenarprotokoll 18/244,” 25123.
Deutscher Bundestag, Antrag der Abgeordneten Dr. Konstantin von Notz, Renate Künast, Tabea Rößner, Dieter Janecek, Luise Amtsberg, Katja Keul, Monika Lazar, Irene Mihalic, Özcan Mutlu, Ulle Schauws, Hans-Christian Ströbele und der Fraktion BÜNDNIS 90/DIE GRÜNEN, 18/11856 § (2017), https://dserver.bundestag.de/btd/18/118/1811856.pdf.
Deutscher Bundestag, Kleine Anfrage der Abgeordneten Heike Hänsel, Sevim Dağdelen, Inge Höger, Andrej Hunko, Niema Movassat, Norbert Müller (Potsdam), Dr. Petra Sitte, Kersten Steinke, Alexander Ulrich, Katrin Werner und der Fraktion DIE LINKE., 18/11986 § (2017), https://dserver.bundestag.de/btd/18/119/1811986.pdf.
Article 19, “Germany: The Act to Improve Enforcement of the Law in Social Networks,” Legal Analysis (London: Article 19, 2017), https://www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-German-NetzDG-Act.pdf; Reporter ohne Grenzen, “Meldung,” June 19, 2017, https://www.reporter-ohne-grenzen.de/pressemitteilungen/meldung/netzdg-grundlegend-neuer-ansatz-noetig.
Maria-Terese Weber and Bitkom, “Bitkom Stellungnahme Zum Regierungsentwurf Eines Netzwerkdurchsetzungsgesetzes,” April 20, 2017, https://www.bitkom.org/sites/main/files/file/import/FirstSpirit-149275573214220170420-Bitkom-Stellungnahme-zum-Regierungsentwurf-NetzwerkDG.pdf; eco Verband der Internetwirtschaft e.V., “Stellungnahme Zum Referentenentwurf Eines Gesetzes Zur Verbesserung Der Rechtsdurchsetzung in Sozialen Netzwerken (Netzwerkdurchsetzungsgesetz),” March 30, 2017, https://www.bundesgerichtshof.de/SharedDocs/Downloads/DE/Bibliothek/Gesetzesmaterialien/18_wp/NetzDG/stellung_eco_refe.pdf?__blob=publicationFile.
Eva-Maria Kirschsieper, “Eva-Maria Kirschsieper zum NetzDG,” presented at Im Dialog – Hate Speech und Co./Postfaktische Welt, SPD Bundestagsfraktion, Berlin, May 18, 2017, https://www.wwwagner.tv/?p=36152.
Dietmar Neuerer, “NetzDG: Drei Viertel der gemeldeten Youtube-Inhalte bleiben im Netz,” Handelsblatt, July 27, 2018, https://www.handelsblatt.com/politik/deutschland/netzdg-drei-viertel-der-gemeldeten-youtube-inhalte-bleiben-im-netz/22852184.html.
Rohleder, “Von Der Ente Zu Fake News”; Weichert and Kramp, “Zentrale Ergebnisse forsa-Befragungen 2016 bis 2018 zur Wahrnehmung von Hassrede.”
Frank Pergande, “NetzDG: Ein Gesetz gegen die AfD?” FAZ.NET, January 8, 2018, https://www.faz.net/aktuell/politik/inland/netzdg-ein-gesetz-gegen-die-afd-15378459.html; Interviews 1, 12.
Andreas Jungherr, Ralph Schroeder, and Sebastian Stier, “Digital Media and the Surge of Political Outsiders: Explaining the Success of Political Challengers in the United States, Germany, and China,” Social Media + Society 5, no. 3 (2019), https://doi.org/10.1177/2056305119875439.
Interview 14.
European Parliament, “Shaping the Digital Transformation: EU Strategy Explained,” April 22, 2021, https://www.europarl.europa.eu/news/en/headlines/society/20210414STO02010/shaping-the-digital-transformation-eu-strategy-explained; Ursula von der Leyen, A Union That Strives for More: My Agenda for Europe: Political Guidelines for the Next European Commission 2019–2024 (Brussels, Belgium: Publications Office of the European Union, 2019), 13.
Interview 19.
“Plenarprotokoll 18/244.”
On platforms not sharing information with policymakers or civil society, see also Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (New Haven, CT: Yale University Press, 2018).
Interviews 3, 9.
German: Ausschuss Digitale Agenda.
Interviews 9, 13.
Interview 23.
Interview 23.
Interviews 7, 13, 19.
Interview 13.
Interview 13.
Interview 23.
Interview 18.
Interview 6.
Interview 21.
Interview 9.
Interview 17.
German: Gleichschaltung.
Article 5 of the German Basic Law guarantees freedom of expression, but this freedom is subject to limitations by general laws, including the German Criminal Code (StGB). §130 StGB criminalizes incitement to hatred and Holocaust denial; denying the Holocaust in Germany would therefore prompt a criminal investigation under §130 StGB.
Bradley A. Appleman, “Hate Speech: A Comparison of the Approaches Taken by the United States and Germany,” Wisconsin International Law Journal 14, no. 2 (1995): 422–39, 450.
Interview 9.
Interview 1.
In this regard, NetzDG resonates with Kingdon’s idea of “value acceptability,” which posits that policy ideas conforming to existing values are more likely to succeed.
Interview 19.
Interviews 1, 9, 14.
Interview 18.
Interview 21; European Commission, A Digital Single Market Strategy for Europe, COM(2015) 192 final § (2015), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52015DC0192.
Interviews 6, 20.
Interview 18.
Interview 23.
Interview 15.
Interview 16.
Interview 6.
Bundesregierung, “Bericht Der Bundesregierung Zur Evaluierung Des Netzwerkdurchsetzungsgesetzes,” September 10, 2020, 29, https://dserver.bundestag.de/btd/19/226/1922610.pdf. Note: Official statistics on staff numbers were not available.
Bundesregierung, 64.
Martin Krauss, “Null Toleranz bei Hassparolen,” Jüdische Allgemeine, September 22, 2015, https://www.juedische-allgemeine.de/politik/null-toleranz-bei-hassparolen/.
Ibid.
Linda Schleif and Matthias Kettemann, “Komplementär Oder Konkurrierend: NetzDG Und DSA (Platform-Governance Im Superwahljahr 2021),” Media Research Blog, Leibniz-Institut für Medienforschung | Hans-Bredow-Institut (HBI), 2021.
Julian Jaursch, “New EU Rules for Digital Services: Why Germany Needs Strong Platform Oversight Structures” (Berlin: Stiftung Neue Verantwortung, May 17, 2022), https://www.stiftung-nv.de/de/publication/dsa-why-germany-needs-strong-platform-oversight-structures.