Partway through the American election campaign, I realized I was living an illusion. I hadn’t seen a single piece of pro-Trump content on social media, despite his surging popularity. Since I wasn’t seeing this content in my feeds, I was pretty certain pro-Trump people weren’t seeing any of the Clinton content my network was sharing, either.
This is no mere quirk of software. Now that there’s a President-Elect Trump, this lack of a wider, shared perspective online has fueled a raging debate. It may have influenced the American election; some are going so far as to say it threatens democracy itself.
People have been talking about the “social media echo chamber” for years. A June 2016 research paper showed that filter bubbles not only exist, they tend to pull their members toward more extreme viewpoints through a confirmation-bias loop (the more often you see something, the more true you think it is).
This groupthink isn’t all self-imposed. Technology allows us to ignore what we don’t want to see, but in many cases it’s now doing the ignoring for us. Facebook CEO Mark Zuckerberg has strongly denied that the platform’s personalization algorithms (you click on or share content about a particular topic, and the platform serves you up more of the same) had anything to do with shaping or polarizing public opinion ahead of the U.S. election.
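The feedback loop described above is easy to see in miniature. Here is a deliberately toy sketch of it, not Facebook’s actual algorithm: a feed ranker that boosts whatever topics the user has already engaged with, so one-sided engagement compounds into a one-sided feed. All the names and data here are hypothetical.

```python
# Toy model of a personalization feedback loop (illustrative only).
from collections import Counter

def rank_feed(posts, engagement):
    """Order posts by how often the user has engaged with each post's topic."""
    return sorted(posts, key=lambda p: engagement[p["topic"]], reverse=True)

engagement = Counter()  # topic -> number of clicks/shares so far
posts = [
    {"id": 1, "topic": "candidate_a"},
    {"id": 2, "topic": "candidate_b"},
    {"id": 3, "topic": "candidate_a"},
]

# Simulate a few sessions: this user only ever engages with candidate_a
# stories, so those stories climb the ranking and the gap widens each round.
for _ in range(3):
    for post in rank_feed(posts, engagement):
        if post["topic"] == "candidate_a":
            engagement[post["topic"]] += 1

print([p["topic"] for p in rank_feed(posts, engagement)])
# After three rounds, candidate_b content has sunk to the bottom of the feed.
```

The point of the sketch is that no one programmed the filter bubble explicitly; it emerges from an engagement-maximizing ranking plus a user’s own selective clicks.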
For now, we’ll set aside the irony of claiming that a site on which advertisers spend billions to influence buyers has, in fact, no influence.
After November 8th, many people discovered, to their shock, that there were two completely opposed Americas. Sixty-two percent of U.S. adults get news from social media, and they’re being algorithmically separated into communities of interest, largely without access to the moderating effect of other opinions.
This does not make for civil discourse; it makes for civil war: opposing factions that don’t know about, understand, or care for differing perspectives. It doesn’t bode well for national unity or peaceful co-existence between winners and losers, and it’s further complicated and entrenched by the epidemic of fake news on Facebook in particular. A recent Buzzfeed analysis found that the most popular fake election-related stories received more shares and overall engagement than stories from reputable outlets like The Washington Post and The New York Times.
When I was first enchanted by social media over a decade ago, the promise was the democratization of opinion: the ability to be heard without a broadcast license, the opportunity for governments, companies and communities to connect directly, without interference. Little did I think we would end up more isolated than ever, thanks to software features initially intended to give us more of what we liked, but which have instead driven us into blind alleys where we have no exposure to differing viewpoints.
Mark Zuckerberg cannot credibly claim that Facebook is simply a channel for sharing, that it doesn’t influence. The social network isn’t a modern-day equivalent of Canada Post; it is the biggest and most influential media company the world has ever seen. Over fifty years ago, Marshall McLuhan presciently observed that the medium and the message cannot be separated. They are inextricably intertwined.
Facebook, Twitter and other platforms have a civic duty to understand how deeply they influence what we know; they must change their algorithms to provide all of us with a more balanced view of the world (whether we like it or not), and they must address the very harmful proliferation of propaganda and fake news. We cannot allow the innovators of Silicon Valley to hide in their own filter bubbles and ignore what has just happened. There’s too much at stake.