
Earlier this year, executives at YouTube began mulling, once again, the problem of online speech. On grounds of freedom of expression and ideological neutrality, the platform has long allowed users to upload videos endorsing noxious ideas, from conspiracy theories to neo-Nazism. “There are no sacred cows,” Susan Wojcicki, the C.E.O. of YouTube, has said. Wojcicki had two competing goals: she wanted to avoid accusations of ideological bias while also affirming her company’s values. In the course of the spring, YouTube drafted a new policy that would ban videos trafficking in historical “denialism” (of the Holocaust, 9/11, Sandy Hook) and “supremacist” views (lauding the “white race,” arguing that men were intellectually superior to women). YouTube planned to roll out its new policy as early as June. In May, meanwhile, it started preparing for Pride Month, turning its red logo rainbow-colored and promoting popular L.G.B.T.Q. creators.

On May 30th, Carlos Maza, a media critic at Vox, upended these efforts. In a Twitter thread that quickly went viral, Maza argued that the company’s publicity campaign belied its lax enforcement of the content and harassment policies it had already put in place. Maza posted a video supercut of bigoted insults that he’d received from Steven Crowder, a conservative comedian with nearly four million YouTube followers; the insults focussed on Maza’s ethnicity and sexual orientation. When Crowder mentioned Maza in a video, his fans piled on; last year, Maza’s cell phone was bombarded with hundreds of texts from different numbers which read “debate steven crowder.” Maza said that he’d reported the behavior to YouTube’s content moderators numerous times, and that they had done nothing. On Twitter and his YouTube channel, Crowder insisted that, in labelling Maza a “lispy queer” and a “token Vox gay-atheist sprite,” he had been trying to be funny. Maza’s supporters, meanwhile, shared screenshots of ads that had run before Crowder’s videos, suggesting that, because YouTube offers popular video producers a cut of ad revenue, the company had implicitly condoned Crowder’s messages.

A week later, YouTube tweeted that Crowder hadn’t violated its community guidelines in any of the videos that Maza highlighted. The next day, it announced its new policy, which included a warning that the company would no longer share ad revenue with YouTubers who repeatedly brushed up against its rules. Then it announced that Crowder would be cut off from the platform’s ad dollars. Maza said that he wanted Crowder’s channel removed completely; conservatives, including the Republican senator Ted Cruz, complained about censorship. YouTube employees, siding with Maza, began denouncing their bosses on Twitter and in the press. “It’s a classic move from a comms playbook,” Micah Schaffer, a technology adviser who wrote YouTube’s first community guidelines, told me. “Like, ‘Hey, can we move up that launch to change the news cycle?’ Instead, it made it worse. It combined into a Voltron of bad news.” (A YouTube spokesperson said that the launch date was not in response to any individual event.)

Former colleagues deluged Schaffer, who had left the company in 2009, with bewildered e-mails and texts. (A typical subject line: “WTF is Going on at YouTube?”) Sitting in a dentist’s office, he started typing a response on his phone, trying to lay out what he thought had gone wrong at the company.

Schaffer told me that hate speech had been a problem on YouTube since its earliest days. Dealing with it used to be fairly straightforward. YouTube was founded, in 2005, by Chad Hurley, Steve Chen, and Jawed Karim, who met while working at PayPal. At first, the site was moderated largely by its co-founders; in 2006, they hired a single, part-time moderator. The company removed videos often, rarely encountering pushback. In the intervening thirteen years, a lot has changed. “YouTube has the scale of the entire Internet,” Sundar Pichai, the C.E.O. of Google, which owns YouTube, told Axios last month. The site now attracts a monthly audience of two billion people and employs thousands of moderators. Every minute, its users upload five hundred hours of new video. The technical, social, and political challenges of moderating such a system are profound. They raise fundamental questions not just about YouTube’s business but about what social-media platforms have become and what they should be.
