YouTube on Wednesday published a lengthy explanation of its harassment and hate speech policies, following a controversy over its refusal to remove videos by a popular right-wing host who has made homophobic and racist comments targeting another video maker.
“As an open platform, we sometimes host opinions and views that many, ourselves included, may find offensive,” wrote Chris Dale, YouTube’s head of communications, in a blog post. “These could include edgy stand-up comedy routines, a chart-topping song, or a charged political rant — and more. Short moments from these videos spliced together paint a troubling picture. But, individually, they don’t always cross the line.”
Dale went on to explain that for a video to violate YouTube’s harassment policies, its purpose must be to incite harassment against someone, threaten or humiliate someone, or reveal personal information about someone. To violate the site’s hate speech guidelines, a video’s main purpose must be to “incite hatred toward or promote supremacism over a protected group,” or aim to incite violence, he said.
“To be clear, using racial, homophobic, or sexist epithets on their own would not necessarily violate either of these policies. For example, as noted above, lewd or offensive language is often used in songs and comedic routines,” he wrote. “It’s when the primary purpose of the video is hate or harassment. And when videos violate these policies, we remove them.”
Dale’s statement continued, alluding to recent criticism of YouTube’s handling of the issue:
Not everyone will agree with the calls we make — some will say we haven’t done enough; others will say we’ve gone too far. And, sometimes, a decision to leave an offensive video on the site will look like us defending people who have used their platforms and audiences to bully, demean, marginalize or ignore others. If we were to take all potentially offensive content down, we’d be losing valuable speech — speech that allows people everywhere to raise their voices, tell their stories, question those in power, and participate in the critical cultural and political conversations of our day.
Dale said that YouTube will examine its harassment policies in the coming months with the aim of updating them.
Dale’s blog post comes after several days of mostly anonymous communication from YouTube about the situation involving Carlos Maza, a Vox reporter who produces a video series for the outlet, and Steven Crowder, a conservative commentator with nearly 4 million subscribers who has frequently used homophobic and racist slurs toward Maza in his videos.
Crowder’s derogatory statements about Maza came to widespread attention over the past week after Maza posted a compilation of Crowder’s comments to Twitter and called out YouTube for not doing more to penalize Crowder.
Since I started working at Vox, Steven Crowder has been making video after video "debunking" Strikethrough. Every single video has included repeated, overt attacks on my sexual orientation and ethnicity. Here's a sample: pic.twitter.com/UReCcQ2Elj
— Carlos Maza (@gaywonk) May 31, 2019
YouTube at first declined to take action against Crowder, leading to widespread condemnation of the site and reigniting a longstanding debate about the responsibility of social media companies to regulate behavior on their platforms. The site’s response to the controversy has been mixed, but it has essentially maintained that Crowder’s language, while offensive and hurtful, does not violate its policies.
However, after further review of Crowder’s videos, YouTube decided to demonetize his channel, saying that “a pattern of egregious actions has harmed the broader community” and violated the YouTube Partner Program policies.
In a separate move on Wednesday, YouTube announced an update to its hate speech policies, in which the company said it would ban videos with white supremacy and neo-Nazi viewpoints.