But nearly two weeks after those bans, platforms are still flooded with clips of Tate making derogatory comments about women – underscoring what some media experts describe as a dangerous system whose algorithms can be manipulated to radicalize young men into adopting harmful views about women and the LGBTQ community. And as Tate’s case shows, banning controversial figures can actually make the problem worse.

Tate, a former kickboxer, rose to fame after appearing on the UK reality show Big Brother in 2016. He was removed from the show after a video emerged of him attacking a woman with a belt. Tate said the incident was consensual.

Recently, Tate went viral through soundbites shared on platforms like TikTok. These clips feature Tate, often shirtless and wearing sunglasses, making offensive comments about women. In one notable example, Tate says that if a woman dates a man, she “belongs” to him. In another clip, he suggests that women in relationships who have their own social media accounts are cheating.

In a video posted to Vimeo on August 23, Tate responded to the bans by saying he has been “unfairly maligned” and that his comments were taken out of context. Tate did not respond to a CBC News request for comment.
From harmless memes to outright misogyny
Content like Tate’s often begins in a way that seems relatively harmless, but slowly becomes more malicious, says Joanna Schroeder, a writer whose work focuses on gender and media representation. For example, she says, young boys often visit sites like YouTube looking for videos related to Minecraft, a hugely popular video game. But YouTube’s algorithm often guesses their age and gender – and Schroeder says it can then push them toward harmful content.

WATCH | Algorithms and their agenda:
How algorithms target young men
Joanna Schroeder, a writer focusing on gender and media, explains why social media algorithms target young men and how this can affect what they see online.

“There are people who want to target that demographic and start showing them increasingly racist content.”

Schroeder says Tate’s appeal is due, in part, to how his views are framed. The idea that what he’s saying is an “unpopular opinion that no one else will say out loud” can suggest to a young person that he has value, she says. And since “edgy” content is often presented as something a younger demographic should consider normal – or even find funny – it slowly normalizes problematic views.

An example of this is the Pepe the Frog meme, which started as a harmless cartoon frog and turned into a symbol of hate. It began as an apolitical meme popular on sites like Myspace and 4chan in the 2000s. But as its popularity grew, it was appropriated by the alt-right movement.

Pepe the Frog started out as an apolitical meme, but was later adopted by the alt-right movement. (Wikipedia)

Schroeder says Pepe came to represent “anti-gay” and “anti-women” sentiments. Teenagers may initially perceive such memes as jokes, she says, but over time they can affect how and what young people think.

Clips like Tate’s are a common way people are radicalized, says Ellen Chloë Bateman, a documentary and podcast producer who has researched online radicalization among young men and the incel subculture. Violence against women is normalized, she says, embedded in the psyche of young men through images and memes, in what she calls “a culture of intense competition and solitary behavior.”

Schroeder says this can often be seen on TikTok. Videos clipped from creators like Tate often share the screen with footage from games like Minecraft or Call of Duty in order to keep teens engaged. This screenshot shows a TikTok video by controversial creator Sneako, paired with Minecraft gameplay.
Creators try to catch the attention of young men and teenage boys by pairing their clips with video games. (@sneako.talks/TikTok)

At this point, Schroeder says, some social media algorithms notice the user’s high levels of engagement – and then start serving them more “overtly racist” content. “Algorithms push content that is often extreme. Extreme views, hateful views get a lot of traction on places like YouTube … and TikTok,” says Schroeder.
Enter the “manosphere”
The parts of the internet where these memes – and often more overtly racist or misogynist content – circulate are what Bateman calls the “manosphere.” She describes it as a place where “men’s rights activists, male separatists, nihilists, sexual predators and trolls – often sharing membership with neo-Nazi and alt-right groups – congregate.”

WATCH | The “manosphere”: Where incels, trolls and neo-Nazis meet:
What is the “manosphere”?
Documentary and podcast producer Ellen Chloë Bateman analyzes what is known as the “manosphere,” an area of the internet where extremist groups often gather and target young men.

“What they all have in common is an extreme anti-feminist worldview,” says Bateman. Alt-right groups often use this space to target young and impressionable men, she says.
Where do social media bans come in?
Social media companies say they are actively working to remove this kind of content, as studies have found that online hate speech is associated with increases in physical violence and hate crimes. In Tate’s case, TikTok, Facebook and Instagram removed his content.

A TikTok spokesperson said that “misogyny is a hateful ideology that is not tolerated on TikTok” and that the company continues to investigate other accounts and videos that violate its policies. The spokesperson also said TikTok is looking at ways to “strengthen enforcement” against this type of harmful content. This includes a partnership with UN Women and other non-governmental organizations working to end violence against women and girls, which aims to create a new in-app hub to educate users about gender-based violence.

Bateman says such partnerships are necessary to make social media spaces safer and more educational, especially for young people.

Twitter has also taken action against controversial creators, issuing temporary bans to figures such as Jordan Peterson, Matt Walsh and Steven Crowder. (Each was later allowed to return to the app.)

But Schroeder says bans can sometimes be counterproductive. In Tate’s case, the bans may, in a way, have actually helped him. “Banning just draws attention to it,” she said. “It’s given him a really big microphone.”
Turning to other platforms
Bateman agrees, noting that these creators often move to other apps, like Reddit, Gab, Telegram and Discord, to publish their content. She says some of these platforms are harder to monitor because of their closed group structures or registration requirements, which makes their content more difficult to study and track. One website about the incel subculture, which promotes misogyny and violence, has more than 17,000 users, she discovered.

“It’s such a complex online world. It’s fluid … it’s moving. It’s spreading and these groups are basically interconnected in one big cesspool of hate.”