
This Week in Technology + Press Freedom: Oct. 20, 2019


Here’s what the staff of the Technology and Press Freedom Project at the Reporters Committee for Freedom of the Press is tracking in what’s been a big week for online content moderation and regulation issues.

A Fifth Estate alongside the other power structures of society

Facebook founder and chief executive Mark Zuckerberg gave a speech this week at Georgetown University where he “went on the offense” against critics of the company’s decision not to moderate or fact-check politicians’ speech or advertisements. Zuckerberg defended that policy on the grounds that even false speech by politicians is newsworthy.

The address covered a lot of ground, but one comment by Zuckerberg stands out with respect to press rights. Zuckerberg said that personal expression “at scale” represents a new kind of “Fifth Estate” to stand alongside “other power structures.” In context, Zuckerberg seems to be saying that the Fifth Estate operates like the Fourth Estate — the press — as an independent check on the formal “power structures” of government. It’s an interesting framing, given that the “Fourth Estate” in the U.S. is often invoked to describe the press’s independence from, and role as a check on, the three branches of government.

Also notable is how the policy stands in contrast to the decisions of some cable programmers, who have refused to air a Trump campaign advertisement charging that former Vice President Biden promised Ukraine $1 billion to fire a prosecutor investigating a gas company with ties to Biden’s son. Broadcasters aren’t permitted to make such determinations under the Federal Communications Commission’s “no censorship” rules, but they are shielded from liability for the content of candidate ads.

— Gabe Rottman

Quick Hits

More Section 230 action on the Hill: The House Committee on Energy and Commerce held a hearing on Wednesday to discuss Section 230 of the Communications Decency Act — a law that some blame for the proliferation of objectionable content online, but that many credit with making the modern internet possible. During the hearing, members questioned witnesses from Reddit, Google, academia, and advocacy organizations. Section 230 has garnered a lot of attention recently, with some saying that the law provides too much immunity to web platforms hosting content, and others arguing that it actually incentivizes companies to moderate content, which was its original intent. At its core, Section 230 provides that, with some exceptions, hosts of content produced by third parties, like Facebook or Twitter, cannot be held legally liable for that content under most state or local laws, or in a federal civil suit. While lawmakers and panelists at the hearing rejected the notion of revoking Section 230’s protections entirely, some representatives may want to amend the law in a way that would create more liability for platforms. Open internet groups have warned, however, that any alteration to the law could be problematic, as it could lead to a rise in lawsuits against websites over third-party content and could chill protected speech online.

U.S. and U.K. reach agreement under CLOUD Act: The U.S. and United Kingdom earlier this month signed an agreement, now made available by the U.K., that would govern how law enforcement in either country can access electronic data held on companies’ servers in the other jurisdiction. The agreement, the first of its kind, creates a designated authority in each country that would screen the other country’s orders for access to electronic data. It also obligates the U.K. to adopt procedures to limit the retention and dissemination of information incidentally acquired about people in the U.S. There’s an interesting use limitation in Article 8, which gives the U.K. veto power over the introduction of any evidence collected through the agreement in U.S. death penalty cases and gives the U.S. a veto in cases raising free speech issues (Official Secrets Act, hmm?). Professors Jen Daskal and Peter Swire have a relatively optimistic take here. The Electronic Frontier Foundation expressed concern based on public statements about the agreement here.

Comprehensive federal privacy bill developments: This week, a bill introduced earlier this year by Rep. Suzan DelBene (D-Wash.) received a boost when the New Democrat Coalition, a group of centrist Democrats, announced its intention to endorse the measure. The bill would permit consumers to opt out of data collection and obligate internet companies to disclose how and why they collect user information. The DelBene bill can be contrasted with other proposals, including a recently updated one from Sen. Ron Wyden (D-Ore.). Unlike the DelBene bill, which would designate the Federal Trade Commission as the sole enforcer of privacy violations, Wyden’s Mind Your Own Business Act would allow states to enter the arena with their own privacy laws and give a right of action to civil society watchdogs.

Drone ordinance challenge: The National Press Photographers Association, the Texas Press Association, and others have filed a lawsuit in federal court challenging a Texas law governing the use of drones. The law currently makes it a crime for journalists to use drones for newsgathering purposes if the drone captures images of a person or of private property, regardless of where the drone is located. The coalition is challenging the law on First and Fourteenth Amendment grounds, arguing that the law trenches on newsgathering and chills a protected form of expression — photography. Other state and local drone laws have been challenged in the past. For instance, in September 2017, a federal judge struck down a Newton, Massachusetts, drone ordinance because the regulation was preempted by federal law.

Detecting real ‘fake news’: Researchers are warning that developments in artificial intelligence technology have made realistic fabricated news articles easier to produce and, at the same time, more difficult to detect. Unlike false reports generated and circulated by human writers, the software can reportedly produce news articles by drawing on millions of bits of data. The possibility that the software could be used to generate misinformation has garnered the attention of the U.S. government, which in August unveiled a project to defend against “a wide range of automated disinformation attacks” through what the Defense Advanced Research Projects Agency calls “semantic forensics.” Notably, news outlets have increasingly relied on this type of technology to generate “real” computer-generated news on, for instance, baseball games, earthquakes, and the financial markets.

Gif of the Week: This week’s gif, from the animated sitcom “Rick and Morty,” was inspired by both the drone and robot journalism stories.


Like what you’ve read? Sign up to get This Week in Technology + Press Freedom delivered straight to your inbox!

The Technology and Press Freedom Project at the Reporters Committee for Freedom of the Press uses integrated advocacy — combining the law, policy analysis, and public education — to defend and promote press rights on issues at the intersection of technology and press freedom, such as reporter-source confidentiality protections, electronic surveillance law and policy, and content regulation online and in other media. TPFP is directed by Reporters Committee attorney Gabe Rottman. He works with Stanton Foundation National Security/Free Press Fellow Linda Moon and legal fellows Jordan Murov-Goodman and Lyndsey Wajert.
