The views and opinions expressed in this article are those of the author alone and do not necessarily represent the position of ZU Media or APU.

Social media platforms should create more intelligent algorithms to minimize the spread of misinformation

By Nathan K. Foster

 

Every Sunday, my Instagram is flooded with dozens of posts of the San Francisco 49ers from both team and fan accounts. It begins around 9 a.m. with the pregame hype, followed by quarterly updates, game highlights, the end score, and finally, all the reaction posts late at night. My social media feed gets so cluttered that I don’t even try to scroll to the bottom anymore. It wasn’t always this way, though.

Just two years ago, I only followed my friends and family on social media. I didn’t follow any brands, organizations, celebrities or athletes. Then one day I decided to give my favorite football team a follow, and it all went downhill from there. Following the team led me to follow the franchise quarterback, then the star tight end, the second-best receiver, the third-string linebacker and eventually the kicker. That’s when I knew I was in too deep. I unfollowed most of them and got a little bit of my life back, but why had I even gotten into this mess in the first place? 

The answer to that question is also a big part of the reason why misinformation is so rampant in America today: algorithms. Social media algorithms are a way of sorting posts in a user’s feed based on relevancy instead of publish time, according to Sprout Social. That sounds simple, right?

Wrong.

Facebook, Twitter, Instagram and other social media giants have invested heavily in constantly improving their algorithms to give users the best experience. These algorithms are unbelievably complex, and just when you feel like you’ve finally grasped how an algorithm works, it’ll change. Algorithms are always evolving.

However, the basic premise remains the same. Social media companies are trying to put the most relevant content at the front of your feed. They do this so you’ll spend more time on their platform. If you like the content you see, you’ll be inclined to keep scrolling, and they’ll make more money from the ads you scroll past. While this concept makes sense in theory, it doesn’t always work well in real life. 
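To make the premise concrete, here is a toy sketch in Python of the difference between a chronological feed and a relevance-ranked one. Everything in it, including the posts, the interest scores and the weighting, is invented for illustration; no platform publishes its real ranking formula, and the real ones weigh far more signals than this.

    # Toy illustration of chronological vs. relevance-ranked feeds.
    # The posts and scoring weights are invented for this example.

    posts = [
        {"author": "49ers",    "topic": "football", "likes": 5400, "hours_old": 8},
        {"author": "mom",      "topic": "family",   "likes": 12,   "hours_old": 1},
        {"author": "AlterNet", "topic": "politics", "likes": 900,  "hours_old": 3},
    ]

    # Pretend the platform has already inferred how interested this user is in each topic.
    user_interest = {"football": 0.9, "politics": 0.6, "family": 0.3}

    def relevance(post):
        # Higher engagement and higher inferred interest push a post up; age pushes it down.
        return user_interest[post["topic"]] * post["likes"] / (1 + post["hours_old"])

    chronological_feed = sorted(posts, key=lambda p: p["hours_old"])
    relevance_feed = sorted(posts, key=relevance, reverse=True)

    print([p["author"] for p in chronological_feed])  # newest first: mom, AlterNet, 49ers
    print([p["author"] for p in relevance_feed])      # "most relevant" first: 49ers, AlterNet, mom

The point is not the math; it is that the order of your feed is decided by a scoring function you never see.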

For example, the fake news epidemic is compounded by flaws in these algorithms. While fake news is nothing new, the rate at which it spreads over social media is unprecedented, causing it to become one of the most pressing global issues today.

Let me tell you about my friend John Smith (not his real name). While he’s a great guy, Smith has some pretty erroneous views on important topics. He’s certain that Donald Trump won reelection and that rampant voter fraud is the only reason the race was even close. Smith also thinks that China intentionally spread the coronavirus across the world. He also thinks people are making way too big of a deal about COVID-19 and that it will all disappear soon after the election. Our shared sense of which facts are true has been clouded, and that is a massive problem.

Of course, we’re all entitled to our own opinions; however, opinions never trump facts. Let’s look at the facts. Biden won the election, and Trump has finally allowed his administration to begin the transition of power. The coronavirus is real. While it is true that the virus originated in China and that Chinese officials hid important information about COVID-19 from the world in January, there is very little evidence to suggest that the Chinese government intentionally spread it. Hundreds of thousands of people have died from it in the U.S. alone. The virus did not disappear after the election. In fact, it’s gotten worse.

Should I blame my friend for holding his opinions? No. Smith is not alone in his beliefs. Millions of Americans think just like him, and social media is to blame.

When you sign up for Facebook or any social platform, it will ask you what topics you’re interested in. Based on which ones you choose, the social media platform’s algorithm will suggest a number of pages and accounts for you to follow. You’ll follow the ones you like, and immediately their content will be pushed to the top of your feed. But it doesn’t end there.

The platform will note which posts you like, which profiles you follow and which hashtags you search, and it will save that information for its algorithm. In turn, the algorithm will find similar accounts and content and insert them into your feed. Then it will do it again and again. Remember, these algorithms are always learning, always growing.
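As a rough sketch of that feedback loop, consider the toy Python below. The signal names, topics and the one-line notion of “similarity” are my own invention for illustration, not any platform’s actual code; the point is only how quickly engagement with one topic narrows what gets recommended next.

    # Hypothetical sketch of the recommendation feedback loop described above.
    # Every name and number here is invented for illustration.

    from collections import Counter

    def update_profile(profile, interaction):
        # Each like, follow or hashtag search nudges the profile toward that topic.
        profile[interaction["topic"]] += 1
        return profile

    def recommend(profile, catalog, n=3):
        # Suggest accounts whose topic matches what the user already engages with,
        # which is exactly how an echo chamber tightens over time.
        top_topic = profile.most_common(1)[0][0]
        return [acct["name"] for acct in catalog if acct["topic"] == top_topic][:n]

    catalog = [
        {"name": "CNN",      "topic": "left_politics"},
        {"name": "Politico", "topic": "left_politics"},
        {"name": "AlterNet", "topic": "left_politics"},
        {"name": "49ers",    "topic": "football"},
    ]

    profile = Counter()
    # The loop: interact, update the profile, get shown more of the same, repeat.
    for interaction in [{"topic": "left_politics"}, {"topic": "left_politics"}]:
        profile = update_profile(profile, interaction)
        print(recommend(profile, catalog))  # ['CNN', 'Politico', 'AlterNet'] both times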

Case in point, let’s look at how easily an average person could start seeing more biased and false stories. My friend Gwen Baker (also not her real name) considers herself a strong liberal. She loves watching CNN, and she recently began following their social media accounts. Liking CNN posts on Facebook informed the algorithm that Baker must be interested in more stories like these, and the algorithm started pumping in more left-leaning stories. First came a story from Politico, followed by one from The New Yorker, then Vox, The Daily Beast, AJ+, AlterNet, and eventually, The Onion. Without realizing it, Baker was caught in an echo chamber, only seeing stories from more and more left-leaning outlets that reinforced her own opinions.

Did you recognize all of the outlets? The first three are mainstream, then they get farther left and less trustworthy. The Onion isn’t even a real news site. It’s satire. While The Onion openly admits that its stories are fake, sources like AlterNet claim their stories are real and accurate, which is far from true. Most people don’t vet their sources. That’s the catch. It can be challenging to tell which sources are legit and which are fake. Many fake news outlets have a massive following, amassing hundreds of thousands of followers because they know how to work the social media algorithms. 

Some outlets, such as America’s Last Line of Defense, will even admit that their news is fake, but many people don’t realize it. They just see these posts with radical claims on social media and believe them, even though they don’t do any research to check whether the stories are factual or not. For example, just last month, Baker told me that Tom Hanks was a pedophile. Instead of just replying, “That sucks. I guess I’ll never watch Toy Story again. He’s canceled,” I asked, “What’s your source?” The answer did not shock me. Baker couldn’t remember what the outlet was, but she remembered that she saw it on Snapchat. I quickly Googled “Is Tom Hanks a pedophile” and was presented with a dozen legitimate sources debunking this rumor and connecting it to a larger conspiracy theory. Baker admitted she was wrong and said she would be more cautious when it came to believing news stories on Snapchat.

This is the sad reality that we’re living in right now. We don’t know which outlets to trust anymore because social media algorithms are constantly bombarding us with fake news from bad sources. This needs to change. Facebook, Twitter and other social media platforms have started to address parts of this issue, but until they can create more intelligent algorithms that filter out false information, people will continue to be misinformed.