I started this post nearly two years ago, but gradually stopped working on it because criticizing social media is like beating a dead horse. I recently picked it back up because why not? The title is a riff on Nicholas Carr's article "Is Google Making Us Stupid?", where he criticizes the internet's effect on our cognition.
It's claimed that Western society is more divided than it has been in a long time, and arguably the advent and proliferation of social media is a primary cause, or at least played a significant role. I've already addressed how social media turns you into an asshole, and while there's some overlap with this post, I think it's important to understand that social media is capable of turning an ordinarily intelligent person into an idiot.
From an evolutionary perspective, the human brain did not evolve to comprehend thousands of other humans talking all at once, re-sharing memes and pictures of their personal lives, much less the fire hose of (mostly irrelevant) information masquerading as "news". At best, we evolved to operate in small social groups where we have at most 150 stable social relationships. In today's world, we can have far more superficial "interactions" than that number simply from scrolling a typical social media feed.
Humans did, however, evolve to be members of tribal groups, and to this day it's an important element of what makes us human. Identifying with and belonging to social groups has been incredibly important not just for our survival as a species; cooperation with others is what built our modern world. In the 21st century, the tribes we identify with are increasingly no longer physical. Social groups that once existed in "meatspace" are now exclusively digital in nature.
While cooperating in groups has been one of our greatest strengths, it's also a point of weakness. Take political parties, gaming culture, and sports fans, for example. We're not just more likely to dismiss views and opinions from those we deem outside our tribe, we're also prone to being outright aggressive toward them. Once we identify another individual as belonging to a different group than our own, we're more capable of ostracization and dehumanization than we would be otherwise.
If our beliefs are deeply rooted in emotion as opposed to reason, we are unlikely to be willing to even entertain the idea of understanding the other side. Oftentimes trying to understand where the other side is coming from (or even trying to understand the mind of someone who committed a horrific act) can get you labeled as really being a member of that group and ostracized yourself. Same goes for in-group criticism. You can't reason your way out of a position that you didn't reason yourself into.
The most extreme people are often the loudest, and what the algorithms deem "important" is not necessarily reflective of the general population. As a result, instead of amplifying a user's intelligence, social media amplifies their stupidity. To get traction on a post, all one needs to do is say something inflammatory about an out-group to an in-group audience. We're more likely to accept information from our own group as fact and take it at face value rather than research the claim ourselves.
We tend to 'un-follow' or 'un-friend' those who aren't in a group we identify with. At the very least, we don't engage with content outside our purview. Every action and inaction is treated as a data point, and this behavior gets fed into the algorithms, which then show you things you're more likely to engage with based on past data: often extreme, loud, or outright stupid mindless content that asks little of the user beyond viewing or engaging. But you can rest assured that the new content you're fed will more than likely align with your past behavior and your existing views. Echo chambers can be self-imposed, or they can be created without our even knowing.
There's also the view that all arguments are equally valid or deserve equal consideration. A while back I listened to a podcast where Neil deGrasse Tyson was interviewed about his new book Starry Messenger: Cosmic Perspectives on Civilization. One thing he pointed out was that 'social media has become an outlet where all information, whether it's scientific or nonsense, is perceived and treated equally'. If we see a post by an expert in the field followed by a rant or "hot take" from a conspiracy theorist, we're more likely to weigh the two as roughly equally valid instead of treating the absurd claims of some random jackass with no credentials as nonsense. One reason is that both posts are identical in visual presentation (i.e., a text post, link card, image, or video with a name and profile image) and in method of delivery (a post within a feed on your smartphone). This is like reading an outrageous headline on a magazine at the checkout stand and treating it as equally valid as a conclusion drawn from data in a peer-reviewed academic science journal.
Social media encourages bad thinking and promotes bad mental hygiene. By "mental hygiene", I'm not just referring to misinformation and disinformation; I'm also talking about the seemingly irrelevant, unimportant stories we somehow think really are important because lots of people are reacting to them. Yes, social media is easily capable of making an otherwise smart person stupid.
All this isn't to say that I'm incapable of being divisive, falling into echo chambers, and being a complete idiot, because I'm absolutely guilty of all of it. Deleting all of my social media accounts hasn't changed that, but it has drastically reduced the extent and frequency. Seeing as I'm human, I'm also prone to logical fallacies without being aware that I'm committing them. And I know very well, as you probably do, that most people are radically different in person than they are online.
The best we can do is make an effort to understand the underlying mechanisms that compete for our attention. It's also important to understand our own minds and recognize that we're capable of cognitive errors. Learning about dark patterns and logical fallacies can reduce the number of cognitive mistakes we make in the future.
Thanks for reading. Feel free to send comments, questions, or recommendations to email@example.com.