The reason that the Russian attack on the 2016 election worked so effectively is that the internet abhors moderation. From massive waves of false information spread virally on Facebook to the dark web corners where QAnon was born, the hands-off ideology allowed fascism and the alt-right to, in their own words, meme a president into office. It’s not the only reason that Donald Trump sits in the Oval Office, but it is a massive factor that cannot be overstated. If we want 2020 and politics beyond to be different, we have to begin the process of cleaning the radiation off the wasteland of the internet.
The question is: how do we do that? With Kamala Harris out of the race, the Democrats seem to have lost the only candidate who wished to tackle online hate speech and fake news as a major issue, perhaps because Harris drew so much of it herself that an entire sub-culture grew just to debunk it. That means that whatever happens will have to come from pressure that the people put on the next leader, whoever she may be.
But it’s not an easy thing. Even as someone who never misses an episode of Ian Danskin’s Alt-Right Playbook, I can tell you that as hard as it is to understand the alt-right, knowing how to combat them is even more difficult. I spent some time mulling it over and talking with experts, and here is what I can propose for the coming year.
First, if you are dealing with an individual you actually know, such as a family member or close friend, you might be able to deprogram them after a lot of time and effort. “Deprogram” might seem like a loaded word, but as Danskin points out in “How to Radicalize a Normie” and the Endnote video supplement, many aspects of the alt-right, and QAnon especially, are essentially cult-like and will require dedication to combat. Much like how many people do not want to admit a substance use problem in the house, the desire to play down the alt-right radicalization of a loved one as merely a difference of opinion is strong.
Most of us are not licensed counselors, and the idea of kidnapping someone to deprogram them went out of vogue in the 1970s. As a person who wants to help, it’s important to forge a relationship with the affected party by connecting with them through other things, like mutual interests.

It also helps to understand what you’re up against. In-person hate group participation dropped dramatically in 2013 as white nationalists and other reactionaries realized that a decentralized, leaderless movement housed on message boards and other spaces would keep them from falling prey to the usual state controls that ended previous groups. One year later, Gamergate tested that theory with results that pleased hateful bigots, and the approach has since become the standard. In-person groups still exist, and there is a lot of overlap, but it’s this approach that led to 8chan becoming a place directly linked to multiple murders.
Opposition to the things your affected loved one says will only reinforce their problems. As a general guide, it’s best to steer them away from those topics of conversation and onto things that belong to the two of you instead of the alt-right. If they insist on engaging, ask questions calmly and non-judgmentally without trying to debunk. The alt-right deals in memetic content and in-jokes as an isolation mechanism; forcing people to articulate why a pizzeria is secretly a child sex ring run by Hillary Clinton often makes them realize how stupid it sounds. If you’re very patient and very lucky, you might eventually be able to nudge them into counseling and get your loved one back.
That’s unfortunately the only way to fight this as individuals, but what about as a society? That’s even harder, and it will require a lot of activism.
The primary obstacle right now is will. Companies like Facebook simply feel no need to tackle the issue, and 8chan’s current owner is mostly immune to financial repercussions (8chan has never made any money) and legal ones (owner Jim Watkins currently lives in the Philippines, so good luck serving a subpoena). On top of that is America’s fervent dedication to an absolutist interpretation of free speech, even though American fascists and hate groups have consistently reframed any argument about what they say as an attack on their right to say it. The average American is for free speech, with no desire to look at the fine print of the matter.
Because of that, I wouldn’t put too much energy into trying to jail Mark Zuckerberg for failing to fight hate speech or fake news. Even if you take on people much further down the food chain, such as Mark Meechan (Count Dankula), who taught his dog to give a Nazi salute when asked “Do you wanna gas the Jews?”, you often end up starting a giant backlash that empowers people who use free speech arguments to protect white nationalism. As a Gamergate target myself, I find it frustrating that there seems to be little to no legal repercussion for online hate mobs, but prosecution is often simply more trouble than it’s worth, even when laws are broken.
Nonetheless, these companies can be influenced by the free market to change, and that is probably the best path forward. Twitter, for example, has taken a much harder line against white supremacist content after years of being bashed for allowing it to thrive. Reddit, too, has cleaned up its act considerably, recognizing that the constant bad press for being home to so many racists was hurting its image; it is now far more willing to ban boards or quarantine sub-reddits. It’s definitely a step in the right direction, small as it may seem to people who still get attacked through the sites.
8chan is currently down because, even though it’s hard to hit Watkins personally for his work, the people who have to host the site increasingly want nothing to do with it. Stormfront, the traditional home of neo-Nazis online, has likewise found it much harder to find a host for its brand of rancid mayonnaise. All of that comes from pressure on the people who control where the platforms are hosted. These are conservative companies that generally don’t want any trouble, and they have every right to tell a client “no thank you” without raising the specter of official censorship.
Companies like PayPal, Patreon, and Venmo are also susceptible to pressure. Groups like the Proud Boys and people like Milo Yiannopoulos have systematically found themselves ousted from platforms that enable them to raise money for hateful causes. As their reach declines, so does their influence. Fighting hate by going after their servers and payment sources has been a proven tactic over the last couple of years. It should continue in earnest.
Legislation has a place as well, if only as a way to put public pressure on companies. It’s too early to tell whether grilling by members of Congress like Alexandria Ocasio-Cortez will have much effect on Facebook, but it’s a start. One law I would personally like to see would mandate a minimum number of human moderators per so many users, and would also require that they be regionally distributed sensibly. Non-English speakers should not be the primary moderators of American content, for instance.
It’s important to understand that companies like Facebook are perfectly capable of fixing a good chunk of the problem. They simply don’t want to. There is this idea that it can all be solved with a better algorithm that automatically shuts down hate speech. Algorithms have their place; Facebook launched a great one after the Christchurch shooting to take down that content. That said, there is no substitute for human oversight, as any woman who reacted to a rape threat with “men are trash” and ended up with a suspended account will tell you. I landed in Facebook jail for a week over sharing this with a text description for the blind. We need people, not cost-saving bots.
There’s a misconception that we can simply improve the technology to deal with the alt-right and hate speech. While monitoring bots can get better, they only know what we teach them, and they do not contextualize adequately. We can build a machine that beats nearly anyone at chess, but we’ll probably never build one that can beat anyone at Dungeons & Dragons. There must be an increased human presence on the ground with adequate incentive to fight the problem. A corps of workers in this country with the human capacity to understand the problem and the ability to fight it will change the argument in ways that mindless machines built to replace human thought never will.
We have to care if anything is going to be done, because left to their own devices companies will just do whatever is profitable no matter the risk to everyone else. Make no mistake, the rise of the alt-right and the new fascist movement in America is big, powerful, and mostly operating unchecked right now. These spaces have already bred multiple killers. And they continue to empower far-right interests by inundating our public consciousness with hate and falsehoods. It’s not censorship to demand that lies be treated like lies, and it’s not the death of free speech if Nazis aren’t allowed to have a 100,000-member Facebook group.
But it will only stop if we start to agree that there is a problem and demand that people who can make a difference do so. The loved ones in your life who have been led astray by toxic online communities deserve and need your compassion to be free of them. The corporations that make millions off these highly-engaged groups and the radical right politicians who benefit from fake news and hate speech do not. They must be pressured into doing something about it with every tool we have at our disposal as a democracy and a free market. If a large enough shift occurs, it will become the new, less-hateful normal.
And that is poor ground to grow the next fascist leader in.