I promise that this is the last time I’ll be talking about the Facebook Papers for a bit. I finally got around to reading all 50 of the articles that were published back on October 25, and while I am sure we will revisit some of the revelations from those papers throughout the coming months, this will be my last word on them for a while.
I also promise that this week’s newsletter is shorter than the last two, which were two of the longer pieces I’ve written all year. I have a book to write, and it has definitely not received adequate attention the last couple of weeks. :-)
Okay let’s get into it.
The Facebook Problem
What is Facebook’s problem? Goodness, where do we begin, right? :-) Look. I am hard on Facebook, but it’s not like I’m alone, and I would say there are great reasons Facebook bears the brunt of global criticism regarding the perceived ills of social media. Here are a few reasons I think Facebook justly receives a disproportionate amount of attention:
Facebook makes more money than any other social media platform (not including Google/Alphabet). They made around $29 billion in revenue last quarter, whereas Twitter didn’t even make $2 billion.
Facebook/Meta owns Instagram and WhatsApp, two of the other most significant players in social media.
Facebook has more users than any other social media platform in the world, even before you count Instagram and WhatsApp.
Facebook has been used for ill around the world in much more clear, significant ways than Twitter, Reddit, Pinterest, Snapchat, or whatever other social media platforms you can name.1
So, really, what is Facebook’s problem? If we were to boil it all down, sifting through all of the revelations of the Facebook Papers and Cambridge Analytica and the other controversies the company has endured over the years, what is at the root of all of their problems? What is the core problem, the common thread that runs through them all? Here it is:
Facebook is too big to control its own platform.
It’s as if Facebook created a specimen in a lab, intending to use it for the good of mankind, but the specimen escaped thanks to shoddy safety protocols, and now the company is scrambling to track it down and contain the havoc it’s wreaking around the world.
Facebook’s earliest company motto, as you may know, was, “Move fast and break things.” That is such a great motto for a scrappy internet startup in Silicon Valley. Unfortunately, it isn’t a great motto for the most prevalent communications medium around the globe. And while Facebook, to its credit, no longer uses the motto, it is built on a foundation laid by that philosophy. You cannot undo the influence of the philosophy that drove the nascent years of the company.
This graphic is a perfect example of the problem:
During this speaker’s one-hour talk, Facebook will effectively make over 615,000 content moderation decisions, which are, in a sense, First Amendment decisions.2 The tweeter notes that the Supreme Court has decided 246 First Amendment cases in its entire history.
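Just to put the tweet’s figures in perspective, here’s a back-of-envelope calculation (a rough sketch using only the two numbers quoted above; the extrapolations are mine, not Facebook’s official statistics):

```python
# Back-of-envelope scale comparison using the figures from the tweet above.
decisions_per_hour = 615_000   # moderation decisions during a one-hour talk
scotus_cases_ever = 246        # First Amendment cases the Supreme Court has ever decided

per_second = decisions_per_hour / 3600       # roughly 171 decisions every second
per_day = decisions_per_hour * 24            # 14,760,000 decisions in a day
ratio = decisions_per_hour // scotus_cases_ever  # 2,500x the Court's entire docket

print(f"~{per_second:.0f} decisions per second")
print(f"~{per_day:,} decisions per day")
print(f"One hour of Facebook moderation = {ratio:,}x the Supreme Court's "
      f"entire First Amendment history")
```

In other words, Facebook handles the Supreme Court’s entire First Amendment caseload about 2,500 times over, every single hour.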
Of course, a First Amendment case before the Supreme Court and Facebook deciding to pull down a video of a dog being shot or a woman being raped do not carry the same gravitas, but they are a lot more similar than we might think, and many Americans treat social media content moderation as a First Amendment issue.
People, especially political conservatives, regularly complain about social media platforms not respecting First Amendment rights and censoring content.3 Many clients I have consulted on social media strategy over the years believe Facebook rejects their ads because it hates Christians when, in reality, the ad simply had too much text in the image.
When I look at the numbers above, I realize a couple of things: 1) Facebook’s problem is not just Facebook’s problem, and 2) Facebook cannot be expected to solve this problem. Like, I look at those numbers and I feel bad for Facebook, which is a lot coming from me.
How can anyone expect Facebook, even with the power of AI, to moderate its platform effectively at this scale?! You look at these numbers and you start to realize just how unwieldy and impossible it must be to actually manage this platform. It’s amazing Facebook hasn’t had more PR and content crises than it has, frankly.
Facebook is too big to control its own platform. I don’t know what the answer is. Some people say it’s government regulation—Facebook itself even wants the government to help. I think that some regulation would be wise, perhaps an FCC-like agency focused on the internet, but I also think regulation can get out of hand in a hurry.
The reality is…
No One Can Solve the Problem
As I have read and written about social media over the years, I have often thought about how we might solve some of the most prevalent problems with social media (content moderation, bullying, etc.) while keeping the good of social media (connecting with friends, watching funny videos, etc.).
As you can imagine, I haven’t come up with any solutions. But the conclusion I’ve come to is this: Maybe no one can solve the social media problem. I think of Neil Postman when I get to this place, and of what he wrote about the evening news in Amusing Ourselves to Death: “It has not yet been demonstrated whether a culture can survive if it takes the measure of the world in twenty-two minutes.”
Likewise, I think to myself, “It has not yet been demonstrated whether a culture can survive if it takes the measure of the world in a brief scroll of the news feed.” I don’t know that there is a way to do social media in a scalable, sustainable, healthy way that benefits users as much as it benefits the billion-dollar companies that profit off of them.
I don’t think government regulation solves the content moderation problem.
I don’t think Facebook firing Mark Zuckerberg solves the Facebook problem.4
I don’t think shutting down Facebook solves the social media problem.
We cannot un-ring the social media bell. I think the best we can hope for is a future iteration of the internet that makes it impossible for predatory platforms like Facebook to survive and puts the power in the hands of the users. Frankly, Web3 is the best chance we have at a real reformation of the social internet, but we don’t have time to get into that here, and I’m not as optimistic about the future of Web3 as some others are.
We Must Learn to Live With It…for Now
I’ve cited David Foster Wallace and his fish plenty here before, but it’s worth citing again. Wallace, the American novelist, once said:
There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes “What the hell is water?”
We are the fish. The social internet is the water. Our lives are so enmeshed with the social internet that when an older, wiser fish swims by and asks if the water is clean and safe to inhabit, we don’t even realize we’re swimming in water. Just as fish cannot escape water and live on dry land, the social internet is so woven into our modern world that existing outside it is virtually impossible. So if, like fish, we can’t live outside the water, what do we do?
Honestly, the best we can do is recognize that the water in which we swim is toxic. The water is very much not fine. Our job is to do what we can to clean up the water and not add to its toxicity.
No one can solve the Facebook problem, but maybe we can take some steps to make it less of a problem for ourselves and others.
1. I’m not going to list all the ways Facebook has been used for ill here. I’m sure you’re aware of them. If you’re not, you can google it.
2. Really, though, they’re making more decisions than that, because pulling down 615,000 pieces of content means many more pieces of content were actually evaluated.
3. And, since we’re on the First Amendment rights bit, let’s remind each other that Facebook is a private company and can censor a lot more speech than you would expect.
4. This would also never happen, because no one, not even Facebook’s board, can fire Mark Zuckerberg: he owns the majority of the voting shares. He would have to step down.