The past several months have been hard to describe.
In myriad ways, they’ve been exciting, or at least interesting, in that, for better or for worse, there is always breaking news, and it affects everyone.
On the flip side, though, it’s also been incredibly boring as we stay confined to our houses as much as possible.
An oh-so-easy salve for that boredom: whiling away hours and hours on Facebook.
During the pandemic, my feed has been more active than ever before.
A lot of it comes from folks who suddenly went from full calendars to nada and turned to commenting, liking and sharing to fill their days.
In addition to the ongoing lack of media literacy, there seems to be a dearth of social media literacy these days as well. Though those weaknesses regularly work hand in hand, they don’t always. The latter is arguably harder to get a handle on and, combined with the networks’ insidious business models, more detrimental as well.
Not only can misinformation spread like wildfire, especially with the networks’ unwillingness to fact check the conspiracy theories that they’re more than happy to platform, but these sites are also set up in a way that pages can essentially game the algorithm.
Even though Facebook, for example, claims that it catches 90% of hate speech before users flag it, there are posts that may not meet that definition to the letter but could certainly do their part to nudge users in that direction.
One such post I saw recently was a “history lesson” claiming that since lawn jockey statues were used to help escaped slaves access the Underground Railroad, their dehumanizing depictions of Black people couldn’t possibly be racist. Never mind that, as a white person, it’s not the poster’s place to declare this portrayal fine; clicking on his profile led down a rabbit hole of general misogyny, racism and xenophobia.
And therein lies the rub. When you interact with a post that’s not overtly hateful, the algorithm sees you engaging with, and tacitly approving, the account, and it will start to feed you more and more content from those sources. With a click or share of an “I love Canada” post, for example, you could be taking your first steps on the road to increased anti-immigrant content.
Facebook’s internal reporting has found that more than 60% of extremist group ‘joins’ stem from the platform’s own recommendations.
Thus, if you are considering liking or sharing something from a source you’re not familiar with, take a few moments to vet its most recent posts. If those aren’t things you want your friends and family to associate with you, it might not be advisable to share the one kernel you agree with either.
Dan Falloon is the sports editor for Pique Newsmagazine and is writing about various topics for The Chief during the pandemic.