
Google to combat illegal content with smart regulation

A smart regulatory framework is essential to enabling an appropriate approach to illegal content.

Google has written before about how it is working to support smart regulation, and one area of increasing attention is regulation to combat illegal content.

As online platforms have become increasingly popular, there’s been a rich debate about the best legal framework for combating illegal content in a way that respects other social values, like free expression, diversity and innovation. A number of laws now address the issue, including Section 230 of the Communications Decency Act in the United States and the European Union’s e-Commerce Directive.

Google invests millions of dollars in technology and people to combat illegal content in an effective and fair way. It’s a complex task, and, just as in offline contexts, it’s not a problem that can be totally solved. Rather, it’s a problem that must be managed, and Google is constantly refining its practices.

In addressing illegal content, Google is also conscious of the importance of protecting legal speech. Context often matters when determining whether content is illegal. Consider a video of military conflict: in one context, the footage might be documentary evidence of atrocities in areas that are difficult and dangerous for journalists to access; in another, the same footage could be promotional material for an illegal organization. Even a highly trained reviewer could have a hard time telling the difference, and Google needs to get those decisions right across many different languages and cultures, and across the vast scale of audio, video, text, and images uploaded online. Google will make it easy to submit takedown notices; at the same time, it will also create checks and balances against misuse of removal processes. And Google will look to the work of international agencies and to principles from leading groups like the Global Network Initiative.

A smart regulatory framework is essential to enabling an appropriate approach to illegal content. Google wants to share four key principles that inform its practices and that, it suggests, make for an effective regulatory framework:

Shared responsibility: Tackling illegal content is a societal challenge, one in which companies, governments, civil society, and users all have a role to play. Whether a company is alleging copyright infringement, an individual is claiming defamation, or a government is seeking removal of terrorist content, it’s essential to give the online platform clear notice about the specific piece of content; the platform then has a responsibility to take appropriate action on that content. In some cases, content may not be clearly illegal, either because the facts are uncertain or because the legal outcome depends on a difficult balancing act; in turn, courts have an essential role to play in fact-finding and reaching legal conclusions on which platforms can rely.

Rule of law and legal clarity: It’s important to clearly define what platforms can do to fulfill their legal responsibilities, including removal obligations. An online platform that voluntarily takes additional steps to address illegal content should not be penalized for doing so. (This is sometimes called “Good Samaritan” protection.)

Flexibility to accommodate new technology: Laws should accommodate relevant differences between platforms, and, given the fast-evolving nature of the sector, they should be written to address the underlying issue rather than focusing on existing technologies or mandating specific technological fixes.

Fairness and transparency: Laws should support companies’ ability to publish transparency reports about content removals, and provide people with notice and an ability to appeal removal of content. They should also recognize that fairness is a flexible and context-dependent notion—for example, improperly blocking newsworthy content or political expression could cause more harm than mistakenly blocking other types of content.

With these principles in mind, Google supports refinement of notice-and-takedown regimes, but also has significant concerns about laws that would mandate proactive monitoring or filtering of content, impose overly rigid timelines for content removal, or otherwise impose harsh penalties even on platforms acting in good faith. Laws like these create a risk that platforms won’t take a balanced approach to content removals and will instead err on the side of “better safe than sorry”: blocking content at upload or adopting a “take down first, ask questions later (or never)” approach. Google regularly receives overly broad removal requests, and analyses of cease-and-desist and takedown letters have found that many seek to remove potentially legitimate or protected speech.

There’s ample room for debate and nuance on these topics, and Google says it will continue to seek collaboration among governments, industry, and civil society on this front. Over time, an ecosystem of tools and institutions has evolved to address the issue, including the Global Internet Forum to Counter Terrorism and the Internet Watch Foundation, which has taken down child sexual abuse material for more than two decades. Continuing to develop initiatives like these and other multistakeholder efforts remains critical, and Google looks forward to advancing those discussions.

(Source: Deccan Chronicle)