If there’s one thing I learned growing up Jewish, it was that history repeats itself. We pass on stories of survival, persecution and discrimination, vowing to never let it happen again, but we’re always alert and aware that it can – and probably will. It certainly seems as if those warnings are once again coming to fruition. All around the world, we’re seeing the return of nationalist and authoritarian governments that use fear and hate to build support and gain power. Anti-Semitism is once again on the rise, coupled with Islamophobia and all-around xenophobia.
Social technology researchers are increasingly arguing that Facebook and social media sites are complicit in all of this. These global platforms, which were supposed to connect us and foster a more friendly culture, have instead linked extremists to one another and fostered a culture of trolling, hate and abuse.
There is a parallel in history. In many ways, the rise of social media mirrors the political effects that came with the proliferation of radio. For Adolf Hitler, radio was a powerful tool for amplifying prejudice and hate in Germany and beyond, and for mobilizing populations to engage in horrific acts of violence. (We saw something very similar during the Rwandan genocide, as well.) Communication scholars like Marshall McLuhan warned of the tribal effect of radio, which amplified emotions and had a powerful impact on audiences.
Radio, and later television, were heavily regulated, an acknowledgement that with great power comes great responsibility. In Canada, as in many other countries, hate speech laws were adopted to limit how radio and other media could be used. Today, social and digital media hold a similar power, but it is worth noting what makes them different and why we should be particularly concerned, especially when it comes to regulation.
On the one hand, social media is very much like radio, in that it amplifies emotions quite effectively. Whether outrage or comedy, these emotions can rapidly connect, motivate and mobilize people. But on the other hand, social media is unique in the way it amplifies insular cultures. The echo chambers and filter bubbles that organize digital information foster communities based on conformity rather than difference. This makes it easy for a small group of like-minded people to have the same access and impact as a large group, creating a more level playing field when it comes to the battle of ideas.
There is little doubt that we are in the midst of an information war, where memes are weapons and competing groups employ guerrilla tactics to win support at the expense of perceived enemies. (For those new to the online lexicon, a “meme” is a kind of cultural virus – it could be a joke, an image, a video or any kind of viral media that is designed to infect minds and encourage people to share it with their friends and followers.)
In reality, the ability to “go viral” – to have one’s content reach the largest audience – is a process of trial and error. People keep putting out memes in the hope that they will resonate with an audience and be passed around. Most successful viral media are just remixes or iterations of previously successful memes, with a slight modification. Repetition is a proven propaganda technique – we gravitate toward what is familiar.
And that’s one reason why anti-Semitic memes are so popular and effective. They’re literally building upon centuries of propaganda, extending a narrative that has already been tragically effective at dehumanizing people. Blaming the Jews is a powerful trope, in part because it has been used so frequently in the past.
The other important distinction between radio and social media is the difference between open and closed systems. Radio is open – when broadcast, anyone can tune in and hear it. Social media, by contrast, tends to be a black box.
It is largely inscrutable, making it easy for groups to organize and operate in secret, and then jump out into the open when it is advantageous to do so. When researchers attempt to quantify the extent to which extremists are using social media to organize and spread anti-Semitism, they easily find examples, but the truth is that they’re not in a position to fully understand the scope and scale of the issue.
We find ourselves living in a black box society precisely because we’ve allowed companies like Facebook to evade regulation and shirk the responsibilities that broadcasters and publishers face. If we do not address this gap in regulation (and responsibility) immediately, the problem of extremism and anti-Semitism will get worse – and fast. The good news is that there is a growing chorus of elected officials and researchers who are finally recognizing that Facebook, in particular, and social media in general, require regulation and oversight.
But the larger question remains: who is able and willing to provide such oversight?
One of the reasons governments have been hesitant to step into this role is that they may not have the ability to do so in the first place. After all, social media companies, like the information technology industry in general, tend to evolve rapidly, and this makes it difficult for traditional authorities to understand the situation and context.
This was evident when Facebook CEO Mark Zuckerberg testified before the U.S. Congress and fielded some rather foolish questions from senators who did not seem to understand how Facebook makes money (answer: advertising), or comprehend the difference between email and messaging. (Few also seem to understand that Facebook does not sell the information it collects, but rather sells access to that information. This is an important distinction to make, especially in the context of regulation.)
Of course, companies like Facebook would probably prefer some form of self-regulation. The problem is that they do not currently have the trust of either the public or elected officials to credibly propose that.
So what’s the answer? A hybrid solution might see Facebook establish its own form of government. Facebook, as it’s currently constituted, is effectively a dictatorship – majority owned and controlled by Zuckerberg. Yet with over 2.2 billion monthly active users, Facebook is actually the largest “society” on the planet. What if a solution to its current crisis, a solution to the use of the platform by extremists, was for Facebook to establish a parliament composed of, and elected by, its users?
Facebook claims it could never employ enough moderators to effectively police the platform, and therefore it depends upon users to report content that violates its rules. Every Facebook user is expected to act as a kind of “content deputy.” Why not go one step further and ask users to become citizens, to take on greater responsibility and participate in the broader governance of Facebook as a whole? Instead of a black box dictatorship, Facebook could become an open society committed to weeding out violent extremists and authoritarians.
We can demand that Facebook not be a bastion of hate, but it’s not clear that the company is in a position to do anything – at least not without our help. Perhaps a better demand is that Facebook be regulated, that it be subject to oversight and, rather than government regulation, maybe the solution is for Facebook to be accountable to its users. If we truly want to stop the rapid rise of anti-Semitism on social media, we need to democratize the online landscape, starting with Facebook.