Brain Wads


Polarization via Algorithm: Not All Platforms Created Equal

One thing I look forward to every day is Nicholas Thompson’s “Most Interesting Thing In Tech” daily vlog. It’s usually a quick 2-4 minute observation on some bit of technology news that could have wide implications. Thompson is the Editor-in-Chief of Wired Magazine and I love a lot of what he has to share and say in the tech space on social.

One of his latest posts covers a new study on how social media can push us into filter bubbles. The researchers followed 200,000 people, tracking how they engaged with online news and how it shaped their beliefs. You can check out the full video here or watch below. It’s just a little over three minutes and breaks down the research study quickly and succinctly.


The Washington Post op-ed the researchers wrote, and what Thompson cites in the video, is found here. While their high-level findings are in the opinion section of a national paper, the full body of research is going to be published soon - and I can’t wait to read it.

From the op-ed, this paragraph sums up their observations well. Nicholas Thompson reiterates some of these points in his video.

Why would Facebook lead conservatives to read more polarized news sites and Reddit to more politically moderate ones? The answer may lie in three ways their algorithms differ — namely, how they consider social networks, topical interests and engagement history. First, on Facebook you can only be friends with people who agree to be friends with you, too. This minimizes the chances of seeing content from people with more diverse, opinion-challenging viewpoints. Second, Reddit users express interest in content by joining topic-based communities, the majority of which are focused on nonpartisan topics such as entertainment, hobbies and sports. Finally, while both Reddit and Facebook closely track what content is the most popular, they differ in how they use that data. Reddit prioritizes content based on what users vote to be the most interesting or informative, but Facebook gives priority to what has garnered the most engagement, which can span from positive affirmation to angry disagreement. This can lead to the most intensely passionate, most partisan Facebook users drowning out moderate voices.

Is this research biased against conservatives?

Not at all. Thompson even notes that. Some of what could be affecting the data points are the natural browsing habits of different people. Reddit’s audience tends to lean liberal, and the platform may organically attract more liberal users than Facebook does.

Also, if you follow @FacebooksTop10 daily updates, many of the top 10 most shared posts on the platform are from conservative voices. Yes, Facebook has taken down some videos and fact-checks more than some may be comfortable with. Despite that, conservative figureheads have an objectively strong share of voice in terms of raw metrics.

What the research examines is the actual functionality of Facebook versus Reddit. It doesn’t point fingers at either side being more responsible than the other. It focuses on what drives content amplification under Facebook’s algorithm versus Reddit’s.

Long story short, Reddit’s recommendation engine (votes on lists) is much more neutrally driven, while Facebook’s engagement signals are emotionally driven. Negative emotions are stronger and provoke more reactions among readers. As the Wall Street Journal reported, Facebook profits from outrage. Purely based on how its algorithm works compared to other platforms, Facebook is a stronger catalyst for polarization than any other network we have.
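To make the contrast concrete, here’s a toy sketch in Python. This is not either platform’s real algorithm, and the posts, scores, and reaction counts are all made up for illustration; it only shows how the same two posts can rank in opposite orders depending on whether you sort by net votes (where downvotes push content down) or by total engagement (where angry reactions count just as much as likes).

```python
# Toy illustration only: two ranking rules applied to the same posts.
# All data here is invented for the example.

posts = [
    {"title": "Calm policy explainer", "upvotes": 120, "downvotes": 20,
     "reactions": {"like": 90, "angry": 5, "comments": 10}},
    {"title": "Outrage bait", "upvotes": 60, "downvotes": 55,
     "reactions": {"like": 30, "angry": 400, "comments": 250}},
]

def vote_score(post):
    # Reddit-style: net votes, so disagreement pushes content down.
    return post["upvotes"] - post["downvotes"]

def engagement_score(post):
    # Facebook-style: raw engagement, so every reaction -- including
    # anger -- pushes content up.
    return sum(post["reactions"].values())

by_votes = sorted(posts, key=vote_score, reverse=True)
by_engagement = sorted(posts, key=engagement_score, reverse=True)

print(by_votes[0]["title"])       # the explainer wins on net votes
print(by_engagement[0]["title"])  # the outrage bait wins on engagement
```

The point of the sketch is the design choice, not the numbers: when anger counts toward the ranking signal, the most provocative content floats to the top even when most readers dislike it.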

So…what now?

Honestly, I just thought this was an interesting study, and I’m looking forward to reading the full report: how the behaviors of its 200,000 participants changed over time, and how different networks affected the way people consume information. As a marketer and a technology enthusiast, how content spreads interests me. As a human, understanding how that works is also a way I can try to protect how I engage (and how my kids will ultimately engage) online.

But the biggest takeaway I have is to be diligent about where you find your news. If Facebook or Twitter is your primary source of news, you’re ultimately less informed and more partisan. Pew Research confirmed that in a recent study. Relying only on social media for news exposes you to less of what’s going on and creates a partisan bubble around you.

It’s no good.

So let’s be safe out there. Use social to keep up with friends. Share funny things. Share this blog post and get me traffic :)

But be careful how you engage. Seek out new ideas and perspectives. And seek out new places to find them that aren’t all algorithmically driven.

The highlights of the video and WaPo article focus on conservative internet habits, but I think it speaks to a larger technical issue we need to address. The platforms have a responsibility to fix things, but we also have to take responsibility ourselves. Be honest about how we’re staying informed - or think we’re staying informed.

Stay safe out there y’all!