The role of technology in the ‘social outbreak’
Our society is burning, and the fibers that hold it together are snapping. Just look at what is happening in Colombia, with a month of seemingly endless protests; at the radicalization of pro- and anti-mask attitudes in the United States; at the escalation of the conflict between Palestinians and Israelis; and at the polarization around virtually any issue, to understand that not since the age of barbarism have we been so separated, so divided, so blind, so convinced that 'only we are right'.
And while it would be absurd to claim that this situation is the fault of technology and social networks – after all, our problems are economic, social, political, ethnic and religious – the truth is that the democratization of technology, and the fact that today anyone can write, edit, record and produce text, images and video in a professional format and share it almost unfiltered with the entire planet (added to the naivety and ignorance of most users), is indeed a catalyst for the heated state of mind in which we live. For this "social outbreak".
For years we thought that the more information there was, the better our society would be. That the more people had access to the Internet, the more progress there would be. That the more people had a voice, the more perspectives we could listen to and evaluate, the more empathy we would have, and the more consensus and bridges we could build.
But that was not the case.
The combination of the spread of cell phones with cameras, ever-faster Internet connections and the proliferation of social networks is today the gasoline that ignites and fuels all kinds of conflicts; in this "zaperoco" (this uproar), the role of technology is undeniable.
There are four aspects of this role that I think are worth analyzing and understanding.
1 – Down the rabbit hole – the role of algorithms
Addicted to shares, likes and retweets, and designed with the sole aim of capturing our attention, keeping us hooked on their services and never letting us go, social network algorithms quickly learned that the more controversial a post is, the more interactions it gets; that the more radical its content, the more "relevant" it becomes; and that the longer a user stays on the platform, the deeper down the "rabbit hole" they go, consuming ever "heavier" content. And so the algorithms gave priority to exactly that type of content.
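To make the mechanism concrete, here is a minimal, entirely hypothetical sketch of engagement-based ranking. It is not the code of any real platform; the names, weights and numbers are made up. It only illustrates the point above: when the objective rewards predicted reactions, provocative content floats to the top.

```python
# Minimal, hypothetical sketch of engagement-based feed ranking.
# Not any platform's real code; weights and predictions are invented
# to show how optimizing for reactions favors provocative posts.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    predicted_likes: float       # model's estimate of likes/shares
    predicted_comments: float    # heated threads generate many comments
    predicted_watch_time: float  # seconds the user is expected to stay


def engagement_score(post: Post) -> float:
    # Every term rewards reaction; nothing asks whether the content
    # is true, healthy or fair.
    return (1.0 * post.predicted_likes
            + 2.0 * post.predicted_comments
            + 0.5 * post.predicted_watch_time)


def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed simply shows the highest-scoring posts first.
    return sorted(posts, key=engagement_score, reverse=True)


feed = rank_feed([
    Post("Calm, nuanced explainer", 20, 5, 40),
    Post("Outrage bait: 'THEY are lying to you!'", 80, 60, 120),
])
print([p.text for p in feed])  # the outrage post comes first
```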
This is why the Flat Earth movement is larger than ever, why anti-vaccine groups emerge and spread all over the world, and why evangelicals fervently support a corrupt megalomaniac who violates the Ten Commandments on a daily basis. This is why Christians have forgotten the phrase "help your neighbor" and campaign to violate the basic rights of all kinds of "non-traditional" people.
And all of this despite the fact that the foundations of these movements are so illogical that they would be funny if their effects were not so serious.
Last year I listened to a podcast by The New York Times that analyzed how QAnon used the same methods ISIS used to radicalize thousands of people in the Middle East, this time to brainwash and radicalize millions of Americans. I recommend it.
And behind the algorithms came the digital strategists, influencers, agencies, brands, politicians and anyone else who wants content to go viral: disguise it as "official," "professional" and "serious," but make it controversial and engaging. Talk about "us" and "them." Strike the chord that gets it adopted immediately and enthusiastically by the target audience. And ask them to share it, so that people find out, so that "everyone knows."
2 – Echo chambers and polarization
Yesterday afternoon I was on Digital America talking about algorithms, artificial intelligence and social networks (if you want, you can watch the conversation here), and I ended up talking with Clarybell about echo chambers, one of the perverse effects of social media and a topic I wrote about five years ago.
Today, even more than then, the problem is real, because over the years we have "refined" our social networks: we have added more and more contacts who think like us, and we have curated and trained the algorithms with a profile that shows us ever more content we agree with and ever less content we clash with.
Over that same time, we have muted and removed from our social networks the friends and family who post things we don't like and who share content we disagree with. We have locked ourselves in an almost impenetrable bubble in which we all think alike and into which nothing we disagree with can enter.
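A toy simulation can show how quickly that bubble closes. The sketch below is an assumption-laden illustration, not a model of any real recommender: a user engages mostly with posts matching their bias, the "algorithm" shows more of whatever got engagement, and within a few rounds the feed is dominated by a single viewpoint.

```python
# Toy simulation (not any platform's real recommender) of how
# engagement-driven personalization narrows a feed into an echo chamber.
import random

random.seed(42)

TOPICS = ["left", "right", "sports", "science", "memes"]


def simulate_feed(user_bias: str, rounds: int = 20, feed_size: int = 10) -> list[float]:
    """Return, per round, the share of the feed matching the user's bias."""
    weights = {topic: 1.0 for topic in TOPICS}
    share_history = []
    for _ in range(rounds):
        topics = list(weights)
        feed = random.choices(topics, weights=[weights[t] for t in topics], k=feed_size)
        for topic in feed:
            # The user engages ~90% of the time with agreeable content,
            # ~10% with everything else.
            engaged = (topic == user_bias and random.random() < 0.9) or random.random() < 0.1
            if engaged:
                weights[topic] += 1.0  # the algorithm "learns": show more of this
        share_history.append(feed.count(user_bias) / feed_size)
    return share_history


history = simulate_feed("left")
print(f"share of feed matching the user's bias: {history[0]:.0%} -> {history[-1]:.0%}")
```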
The result is a firm conviction that "all of Colombia agrees," that "every citizen agrees" with issue X or Y, or that "there are more of us good guys," without ever really questioning who the good guys and the bad guys are. Our position is the position of the majority. Our opinion is the truth and is public opinion. And if things don't turn out as we expect, it must be because of cheating (as, for example, Trump supporters in the United States believe about Joe Biden's victory: "The election was stolen! Let's storm the Congress!").
The result is an extremism, ever more hostile, toward those who think differently: they are now "the enemy," they are "indoctrinated," or surely they are "getting paid."
The result is a perpetual torrent of content, which we almost never produce ourselves but "forward many times over," which floods family chats, walls and social media profiles, and which so clouds our minds that we convince ourselves it is the one and only truth, unable to listen to different positions.
3 – Information overload = less ability to understand
We receive so much information – or rather, it bombards us so relentlessly – that we lose the ability to comprehend it. Today it is estimated that we receive more than 90 gigabytes of information in a single day, 15 times more than what we handled at the end of the 1980s. That exponential growth has not been accompanied by any improvement, not even a partial one, in the physical capacity of our brains.
This is why we not only read less but understand less, and why we always understand what we want to understand rather than what the author is actually saying. This is why, despite having a sea of data at our fingertips, we are drowning in a lack of real information.
As Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, puts it:
“We trade depth for breadth, meditation for motivation, creating important individual and collective ramifications”
This, combined with the growing "professionalism" of those who create content, has led us to believe and share things that are 100% false but look right at first glance. It has turned us into the useful fools through whom fake news spreads.
4 – Lies and half-truths
A few years ago I wrote about the concept of the "little brother," similar to the Big Brother George Orwell described in his classic (and increasingly relevant) book 1984, but with a twist: it is no longer the government that spies on us and sees everything we do. We have all built this panopticon together, a model of permanent surveillance in which there is at least one camera and one microphone on every block we pass, and everything we do is recorded by someone. A world in which our past is recorded forever and can resurface at any moment.
And yes, we lost our privacy. But that is not the worst of it.
The worst part is that today we use those photos and videos to tell our own version, always distorted, of a fact or a story. The worst part is that we record only fragments of an event and share them that way, partially and with bias, regardless of the context in which it happened or the totality of what occurred. We take what we want, whatever helps support our narrative, and we cut out the parts that point in the opposite direction of (or simply fail to support) our way of thinking or our discourse.
We could call them selective facts, and they are almost more dangerous than outright lies.
I don't see a solution to this problem. The technology and the manipulation are improving dramatically, while our ability to understand what is happening and how it is being used against us stagnates, which makes it more dangerous every day. Let me give you an example:
The danger of DeepFakes
You have probably seen these Tom Cruise videos by now. If not, go watch them; I'll wait.
The person appearing in those videos is not Tom Cruise; they are the product of a technology called DeepFake, in which AI algorithms are used to manipulate video so that a specific person appears to be in it.
What will happen when it is not Tom Cruise starring in the video but a leader or a politician, saying or doing something that contradicts what they actually think? Will we be able to tell truth from lie? Or will we march out like sheep to follow instructions, to defend the indefensible?
And what will happen when that same person is caught by a real camera doing something they shouldn't, and their defense claims it is surely a deepfake? Will we have the cognitive and moral capacity to condemn the act and make them face the consequences? Or will we convince ourselves that it is all a lie because "everyone who knows them knows they would never do that"?
Today we can no longer believe what we read; will we be able to believe what we see?
I conclude with a call: let's educate ourselves about how technology is used to divide us, radicalize us and exploit us. Let's understand technological developments and their benefits, and face their risks with responsibility and knowledge. Let's take back control of our privacy, our security and our thoughts. And let's understand that extremism leads nowhere good. At least not for most people.