YouTube has continued to muck up its own attempts to regulate hate speech on its platform with two recent developments.
First, it said today that users can make hateful, even homophobic statements targeting a single individual as long as the statements are part of a larger argument. Second, the company’s recent efforts to ban extremists have included demonetizing videos of journalists who make their money covering racism and white supremacists.
These developments come after yesterday’s decision by YouTube to allow right-wing vlogger Steven Crowder to remain on the platform despite videos in which he mocks gay Latino journalist Carlos Maza as a “lispy sprite,” “an angry little queer” and the “gay Latino from Vox.” Crowder also said that Maza eats a lot of d*cks and imitated Maza as having a high-pitched voice.
In a now-viral tweet posted on May 30, 2019, Maza brought attention to the fact that Crowder’s videos inspire his fans to flood Maza’s phone and social media inbox with hateful, threatening messages.
Related: How to debunk the religious right’s transgender scare tactics
Afterwards, YouTube promised to look into it. But after finding no instances of Crowder actually directing his viewers to harass Maza, YouTube said that it would allow Crowder to continue to operate with impunity, though it would continue to investigate his channel for other possible issues.
Later on, YouTube finally said it would demonetize Crowder’s channel for “continued egregious actions that have harmed the broader community,” adding, “To be reinstated, he will need to address all of the issues with his channel.”
The company’s response left some critics feeling like it had only addressed this particular issue because Maza’s criticisms of the platform went viral, not due to an actual change in its inconsistent enforcement practices.
After continued criticism of the decision, YouTube’s global head of communications and public affairs, Chris Dale, posted a statement explaining the company’s decision-making process, an explanation that only caused further upset:
There are two key policies at play here: harassment and hate speech. For harassment, we look at whether the purpose of the video is to incite harassment, threaten or humiliate an individual; or whether personal information is revealed. We consider the entire video: For example, is it a two-minute video dedicated to going after an individual? A 30-minute video of political speech where different individuals are called out a handful of times? Is it focused on a public or private figure? For hate speech, we look at whether the primary purpose of the video is to incite hatred toward or promote supremacism over a protected group; or whether it seeks to incite violence. To be clear, using racial, homophobic, or sexist epithets on their own would not necessarily violate either of these policies. For example, as noted above, lewd or offensive language is often used in songs and comedic routines. It’s when the primary purpose of the video is hate or harassment. And when videos violate these policies, we remove them.
Not everyone will agree with the calls we make — some will say we haven’t done enough; others will say we’ve gone too far. And, sometimes, a decision to leave an offensive video on the site will look like us defending people who have used their platforms and audiences to bully, demean, marginalize or ignore others. If we were to take all potentially offensive content down, we’d be losing valuable speech — speech that allows people everywhere to raise their voices, tell their stories, question those in power, and participate in the critical cultural and political conversations of our day.
In response, Maza told Vox.com:
“It’s a batsh*t policy, and YouTube knows it. A policy that says that all you need to do to get away with hate speech on the platform is to mix it with something else is an instruction manual to monsters who want to figure out a way to target people based on identity.”
“Anyone who’s experienced bullying knows that harassment is always coupled with other criticisms. [For queer people and other marginalized users wanting to create content on YouTube], the price of entry is that you have to accept that the people who respond to you are allowed to use hate speech to target your identity as part of their criticism. That’s a miserable policy. It’s not actually an anti-harassment policy. It’s a loophole.”
Yesterday, YouTube also announced an initiative to remove content promoting group superiority or “discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”
The New York Times explained the policy further:
“Channels that post some hateful content, but that do not violate YouTube’s rules with the majority of their videos, may receive strikes under YouTube’s three-strike enforcement system, but would not be immediately banned.
The company also said that channels that “repeatedly brush up against our hate speech policies,” but don’t violate them outright, would be removed from YouTube’s advertising program, which allows channel owners to share in the advertising revenue their videos generate.”
However, YouTube’s new efforts have apparently caused the video-sharing platform to demonetize the videos of content creators who make their livelihoods exposing racism, white supremacy and other hateful biases.
This includes Ford Fischer, a video journalist and editor-in-chief of the independent news outlet News to Share whose footage has been featured in Oscar- and Emmy-winning films about racial justice, the KKK and the alt-right. Fischer’s channel has been demonetized, forcing him to ask his social media followers for financial donations.
YouTube’s recent missteps come after years of people criticizing the company for a proliferation of far-right content that radicalizes viewers into racist, misogynist and queerphobic enclaves. Last year, YouTube also came under fire for releasing a Pride video praising its LGBTQ content creators while at the same time displaying anti-LGBTQ advertisements and demonetizing the videos of LGBTQ vloggers.
In short, YouTube’s inability to coherently and consistently apply its standards merely reveals a serious issue that the company has failed to substantially address since its inception.
And now that the issue is beginning to gain more widespread attention, even a group of supportive employees at Google (YouTube’s parent company) has said in a tweet, “Despite YouTube capitalizing on Pride as a marketing campaign, it’s clear they have no issue making policy decisions that harm LGBTQ people like @gaywonk [Carlos Maza]. We have #NoPrideInYT.”
BuzzFeed News reports, “Two sources familiar with the dialogue inside Google said employees are currently circulating a petition demanding that management remove pride branding from its public social media accounts in the wake of the uproar over Crowder’s videos.”