The Evolution of Social Media Is a Story of Parasitic Virality
How the original intent of connectivity has started to kill America’s — and the world’s — common sense
There is a story you’ve never heard. The one that began on the other end of your phone’s connection; the one that large companies hope you never understand. The one about the app you jokingly declare might be listening in on your conversations.
We’re experiencing a technological revolution in every part of our lives. Ideas born in dorm rooms and apartments have exploded through seemingly endless funding rounds before entering public markets.
Companies like Facebook, Google, Amazon, Snapchat, Microsoft, Apple, and Twitter have catapulted themselves into otherworldly valuations. At least, in comparison with reality.
Profiting in a capitalistic society has never been a bad thing — especially when it’s done under fair practice and meets regulatory standards. The social media and internet world, however, is neither fair nor regulated. In some cases, tech CEOs have even begun to develop a narrative around “self-regulating.” The story of the profit motive at these companies has only begun to meet the public eye, whether through congressional testimony, bot viewership, advertising scams, or new shadowy means of calling oneself an entrepreneur.
But what has truly occurred in the last decade? How did this story unravel to this point, and, as The Social Dilemma began to scratch the surface of, why is it unethical?
Data transmission: filling the gaps.
Data speed might sound like a trivial point. It’s faster, but why does that even matter?
Data transmission is closely paired with data accessibility. Over the last decade or so, the world has been almost completely covered with, at minimum, 3G capability. The prominence of having the internet — and social media platforms like Facebook — in your hand has created byproducts that were never intended with the original vision. Namely, the introduction of broadcasting information, and misinformation, at speeds that were unheard of before 2007.
Think about a time in your own life when you didn’t know you had to swipe down to refresh your feed instantly. If that habit you’ve been conditioned into doesn’t scare you, the rest of the building blocks will.
Easy to control from the shadows.
Where there is an easy profit motive, there is natural exploitation. This is doubly, triply, and infinitely the case when you can distance the exploiter from the exploited.
Sophie Zhang is a former data scientist at Facebook. Unlike upper management, Mark Zuckerberg included, Zhang wasn’t blind to the bad actors that Facebook was filled with.
“In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions.” — Sophie Zhang
Her memo goes on to document countless failures of Facebook’s team to capture the abuses within its network: decisions that “affected national presidents,” campaigns that used “thousands of inauthentic assets to boost… [and] mislead the Honduran people,” and the ignored case of Azerbaijan’s blatant use of inauthentic entities to create a hivemind narrative to brainwash its people.
As fast as Zhang worked, it was, in the end, futile. Facebook’s management and leadership were never interested in pursuing these abuses. In their own words, they had larger projects to pursue. When Zhang wouldn’t let up on her moral obligations to the world, she was asked to leave the company.
In those three years, Zhang’s experience doesn’t document a Facebook that is merely lazy. It reveals, much as The Social Dilemma did, a portrait of a company that can profit off bad actors while simultaneously claiming it didn’t know those actors were abusing the platform.
Congressman David McKinley put it fairly simply to Zuckerberg in testimony in April 2018:
“Your platform is still being used to circumvent the law, and allow people to buy highly addictive drugs without a prescription. With all due respect, Facebook is actually enabling an illegal activity and, in so doing, you are hurting people. Would you agree with that statement?”
To which Zuckerberg gave the same static, lazy reply he always gives:
“I think that there are a number of areas of content that we need to do a better job policing on our service.”
According to Zhang, two years later, and three years into her job, Facebook’s policy wasn’t about doing a “better job policing.” It was simply to ignore the areas that don’t generate enough traffic or revenue. This is tied to another, larger issue at play.
You’re enslaved to the shareholder.
Facebook isn’t Mark Zuckerberg anymore. Facebook is 29.3% Mark Zuckerberg and 70.7% everybody else. And there are some big holders within that “everybody else.” Facebook needs to make money. Set aside the fact that governments are twisting the platform from a social media engine into a propaganda machine; the shareholders demand that Facebook make money regardless.
And they demand that Facebook accelerate that growth.
This feeds into the previous point with Zhang.
Facebook’s connection with the end-user isn’t there to make the user happy or render the world a better place. Not when users are seen through the metric of “average revenue” depending on their geographical relevance. The reason countries like Azerbaijan, Honduras, or Brazil aren’t looked at with interest is that average revenue per user in the “Rest of World” segment is anywhere from a few pennies to around two dollars. The same goes for Asia-Pacific — all while the average revenue per user in the United States and Canada runs as high as $35.58.
You don’t need to listen to Mark Zuckerberg’s strange policy updates to understand what is at play on a larger scale. The numbers, the wording, and the details of what is said for the shareholders reveal the clearer picture, and that is true for any of the behemoths in the business of connecting people. Google, Amazon, Microsoft, Apple: they are all equally guilty of immoral business practices designed to garner the maximum dollar available for the shareholder. No amount of artificial intelligence tooling will stem the problem of corrupt governments using Facebook as a propaganda machine, or curb mass misinformation aimed at the American public. Facebook’s intent isn’t to stop these actors; it’s to let them keep operating on the platform where you can’t notice them, while their payments keep coming in.
Why else would it take Zuckerberg almost two decades to conclude that Holocaust denial is a bad thing? And why wouldn’t he include all the other genocides as hate speech?
Virality is the only thing that catches your attention; you’ll always need the next big thing.
Losing feels worse than winning feels good. Part of Vin Scully’s genius was his ability to understand that driving element within us.
While that element might be born of good notions, it can often end up in a nasty place.
Venues like Facebook or YouTube weren’t built to be virality machines. But the content within those platforms that draws the eyeballs is viral. Whether you’re drinking misinformation straight, on the rocks, or with a meal filled with truthiness, there is no denying that the internet has become the hub for all things entertainment.
Those venues were designed to be illuminators of information. However, truth does not sell. Look over any segment of YouTube or Facebook and you’ll find that stoking division and hatred sells far better than calls for unity.
Short sentences, bursts of anger, quick snippets, and declarations. You’ll see the following in the titles:
- “Burn it down” “Virality” “Flips out”
- “Scathing takedown” “Speechless”
- “Crush” “Paralyzed”
- “Destroys transgenderism” “Crushes atheism question”
- “Rips pro-choice student” “Plot to destroy America”
Whether it’s from a traditional media outlet or it’s from a newly minted YouTuber with a million subscribers, the outlook is the same. What ends up selling is a manipulated form of the information that is fed to the same demographic of people who will consistently believe in it.
Ben Shapiro was faced with these words from Andrew Neil, a journalist at the BBC:
“You say in your new book that America’s largest struggle at the moment is ‘the struggle for our national soul. We are so angry at each other right now.’ And I think that’s true, I just returned from the United States. But aren’t you a part of that problem with the way you go about your discourse — not the solution?”
Neil goes on:
“For example, you describe Mr. Obama’s State of the Union Address in 2012 as ‘fascist mentality in action’.”
Shapiro had no clear answer for why his characterizations were as they were. Even when he admitted that the wording was “bad and wrong”, Neil responded with, “Plenty of things are bad and wrong, but it doesn’t make them fascist.”
The point of it all is exactly that. To top the last phrasing, the last piece of content, or even the last piece of information, the next needs to get crazier. It needs to be louder; it needs to be more divisive.
As long as that profit motive exists within the realm of social media, divisiveness will only grow. It won’t be Zuckerberg’s ban on Holocaust denial that saves us all, because the world might not have another two decades for Zuckerberg to “look at data” before deciding how to regulate the information broadcast within his venue.
Truth doesn’t sell well. In truth, it’s never sold well. Social media has accelerated that process by creating a platform in which propaganda and virality have the most optimal path to the end-user.
You no longer need a publisher, a broadcasting company, or approval from a regulatory agency to give you a green light. If you want your content pushed on others by force, you can simply do what Ben Shapiro did and have a network of 14 large Facebook pages exclusively promote your content. When you begin to look at distribution this way, you can begin to see how Facebook has handed countries like Iran, Turkey, Azerbaijan, Honduras, and Brazil the keys to distribute content as they wish.
When Turkey and Azerbaijan use those venues as propaganda machines to generate hyper-nationalistic sentiment unified around committing genocide, Facebook still won’t care. Today, right now, as I write this, Armenia stands as a small country in the Caucasus, defending itself against a potential genocide and ethnic cleansing in Artsakh, or Nagorno-Karabakh. Towns within Artsakh are being destroyed by cluster bombs. Civilians are being killed. Human rights have been blatantly violated. When countries like Azerbaijan bar foreign journalists, control all the media, and leave their bot traffic unchecked, we shouldn’t wonder how actors such as Adolf Hitler were born.
One day, there might not be an interconnection problem — likely because there won’t be people left on the planet to connect. We won’t have an issue with social dilemmas or ethics or morality — likely because the bad actors will have won, and mass disinformation will have thrown the world into permanent non-existence.
If we can’t understand that the truth has value in society above and beyond the profit motive and virality, we’re doomed and our expiration date is slowly approaching.