BookWatch: How to battle the bots on Facebook, Twitter and other social media and think for yourself

The late U.S. senator, ambassador and academic Daniel P. Moynihan once noted that "Everyone is entitled to his own opinion, but not his own facts." To suggest the world has changed since his day is a massive understatement. These days, anyone with a smartphone can share their opinions and feelings with just about everyone. And as "likes" and shares accumulate, a façade of truth is built. Over time, the distinction between fact and opinion blurs.

Presenting opinions as facts is only one side of the coin; the other is dismissing facts as opinions. Everything becomes relative, and popularity soon reigns over accuracy. Likes, shares and retweets are the ammunition in a war for attention, one that can be waged over anything from the relatively trivial (the "best" player in a sport) to the absolutely critical (public health). Nuance is lost, and as the stakes rise, new weapons are brought to the battle.

Consider the use of bots to boost the likes and shares of many posts. Think I'm a conspiracy theorist? A recent investigation of open-source materials by the BBC's Benjamin Strick revealed a Chinese network of automated accounts that magnified the visibility of certain tweets and Facebook posts. Another report from researchers at Carnegie Mellon University found that between 45% and 60% of Twitter activity about post-pandemic reopening plans was generated by bots. Further, more than 80% of the 50 most influential COVID-19 retweeters were bots. The ultimate objective of these efforts is to distort the perception of reality as presented by social media. And as the U.S. heads into a presidential election, how people perceive reality is only gaining in importance.

Distorted reality

Let's be honest — social media was a distorted version of reality long before active measures to persuade were introduced. Life online is simply not reflective of daily life in the real world. When was the last time you tweeted or posted something negative that happened to you? Exactly. Most personal posts are positive. Now, imagine a teenager — a digital native — who does not fully understand or appreciate the fact that social media presents a biased perspective. Suppose he fails a test and also breaks up with his girlfriend, yet when he goes online all he sees are happy pictures and positive posts. It has been said that comparison is the thief of joy. Might his lack of perspective lead to or exacerbate feelings of loneliness, anxiety or depression?

Further, because social media lets us curate our content and choose who to follow, we create our own filter bubbles.  Most people fill their feeds with those with whom they are likely to agree. Despite the enormous diversity of thinking online, it tends to be segmented and cross-fertilization of ideas appears to be less common than one might hope. Instead of forcing critical thinking by presenting conflicting data that demands some form of reconsideration, our social media feeds tend to validate, confirm and reinforce our ideas.

As more and more attention is focused upon social media and the ramifications of its widespread use, it’s increasingly important that we address the root sources of these problems rather than the surface symptoms. Yes, there is a confusion of facts and opinions. There is a distorted and biased perspective presented. And yes, social media supports and encourages increasingly polarized views. For many, the right approach to address these issues is to regulate social media companies.  Policymakers could force companies to fact-check content or eliminate the meters tracking shares, likes and retweets.  

But might such approaches only be addressing symptoms? Might the underlying cause of these conditions be that we have, almost universally, stopped thinking about the sources of our information or the context of our decisions? Have we been habituated to mindlessly believe whatever enters our frame of view? Is it conceivable that this makes us more vulnerable to disinformation campaigns run to manage public opinion? 

There is an overwhelming volume of posts, tweets, and shares. The algorithms that choose what we see gain inordinate power over our thinking. And because humans suffer from a representative bias, we tend to extrapolate and believe the big picture is similar to the little slice we’ve been shown. We lose track of the fact that our focus is being managed by an algorithm motivated by clicks and keeping us engaged, not necessarily informed.  And as we cede control over the filtering process, we stop thinking about how the picture presented may be biased.  We’re lost in the dark and we’re letting an algorithm shine the spotlight. 

How might we, as individuals, combat these forces? 

First, seek to connect with, follow and pay attention to those with whom we are likely to disagree. Regularly watch Fox? Follow CNN. Concerned about climate change? Pay attention to fossil fuel companies.

A second strategy is to supplement the social media you consume with non-social media. Read newspapers, magazines, and other curated, fact-checked sources of information. And whenever possible, read media in physical form, as the act of flipping through pages forces you to scan and absorb unrelated information that may improve the breadth of your thinking. It will help you connect dots.

Third, try to change your default to the belief that everything you read is false. Doing so makes trust an active choice rather than a passive one, and forces us to mindfully develop beliefs rather than blindly adopt them.

Real life is complex, and social media adds an additional layer of complexity. While we can wait for corporate self-regulation, policymaker-imposed mandates, or legislated regulation, there is one sure-fire way to immunize ourselves against the biases of social media: Take a step back, understand the sources of our information, and think for ourselves.

Vikram Mansharamani is a lecturer at Harvard University and author of Think for Yourself: Restoring Common Sense in an Age of Experts and Artificial Intelligence (HBR Press, 2020).
