Facebook Isn't the Problem; Facebook Is the Software the Problem Currently Uses
We need to differentiate cause from effect
Big, Important, Facebook Stories are, at this point, basically their own genre of journalism. The articles are a lot of fun: They involve moral grandstanding, secret documents, and unironic sharing on Facebook. But probably my favorite element is the unalloyed hell that the Facebook “f” gets put through by various graphics departments. In article after article, the “f” gets a treatment that a medieval executioner would consider harsh — in the last week alone, the “f” has been melted, drowned, and crushed to bits. Look:
I want in on this action. As you see, I went with the “f” being burned at the stake for this article, but I considered other options that signal “ominous”, “trouble”, and “bad”. Here are a few:
This is okay; I like that it’s nice ‘n spooky for Halloween. It definitely conveys that Facebook is in trouble (in fact: completely dead). But does it say “Facebook is evil”? I don’t think it does. Which is why I made this:
Getting warmer. See the horns? Horns, like the devil. Because the devil is evil. And so is Facebook. Horns, devil, evil, Facebook…get it? I worry that it might still be too subtle. And so I put this together:
Too far? Okay, yes. But I really want a graphic that communicates how evil Facebook is! I’m afraid that the point might not be coming across!!!
Facebook is completely friendless (and “friendless” counts as wordplay if you’re scoring at home, because friends are a Facebook thing). They got themselves on Democrats’ shit list after the 2016 election and on Republicans’ shit list in 2020. They’re a giant corporation, so the far left hates them, and they’re full of nerdy Bay Area types, so the far right hates them, too. NOBODY likes them. They’re the kid who gets picked on by the entire class, and when the teacher intervenes, it’s only to give the little loser a wedgie and kick him into the mud.
I agree that Facebook sucks. I’m never having a worse time than when I’m on Facebook. My feed is nothing but comedians promoting their shows and cousins of ex-coworkers having arguments about ivermectin. The Facebook “Memories” feature would, in my opinion, be more accurately called “Remember That Time You Got Divorced?” I’m honestly stunned that a company can build a trillion dollars in market cap with software that makes it easier for people you went to high school with to find you. If I had been investing in the mid-2000s, I would have gone all-in on software that does the opposite.
And yet, I’m just about the closest thing to a defender that Facebook has. After last week’s Senate testimony, anger towards Facebook is at an all-time high; many people would like to see Mark Zuckerberg get the punishment being meted out to the poor Facebook “f”. I understand the anger, but I feel that a lot of the criticism of Facebook misses the point. I think the problem is much larger than Facebook and much more difficult to fix.
Many people called last week’s Senate testimony from former Facebook product manager Frances Haugen Facebook’s “big tobacco moment”. The comparison was everywhere — look:
Okay, I’m being a bit of a dick. There were some similarities, most notably that internal research painted a different picture than what the company was saying publicly. And I think there’s a lesson here: Never do internal research. If you research the effects of your product, then you’re vulnerable to the “they knew!” narrative. I promise you that if I Might Be Wrong ever starts causing mass birth defects, I will not have lifted a finger to learn about it.
Another similarity with big tobacco is that most people basically knew everything that came out in Senate testimony. I find it hard to believe that anyone who follows this issue didn’t already know that Facebook is built to maximize engagement, and that engagement often prioritizes inflammatory and outrage-inducing content. We’ve already had the “ex-Facebook whistleblower” narrative several times: There was Chamath Palihapitiya in 2017, and Sandy Parakilas that same year, and Roger McNamee in 2019 — there’s a whole Axios list of these people. It honestly has to be considered part of the Facebook career path at this point: Get in, work hard, wait for your stock options to vest, and then have your moment (and possible book deal) as a truth-telling whistleblower. After the stock options vest. AFTER.
My problem with the current media narrative is that it treats Facebook as a uniquely bad actor. If Facebook were singularly awful, then things would get substantially better if they would just shape up. But I don't think that’s the case. I think this narrative is an “OJ searching for the real killers” situation, in that it’s a misdirection drawing attention away from the actual source of harm.
Haugen’s testimony added to the already-large body of evidence showing that inflammatory content does well on social media. At this point, the market signal could not possibly be more clear: People want content that's provocative and in-group affirming. And they also want sex — when it comes to hard abs and big, jiggling butts, America says “keep ‘em comin’!” Which highlights the non-revelatory nature of these revelations; if you didn’t already know that people want news that confirms their biases and images that engorge their privates, then may I humbly suggest that you haven’t been keeping up with the media trends of the past 200,000 years.
People are mad at Facebook for essentially giving in to market pressures. And, again, I understand the anger; Facebook’s actions have a real “something’s gonna fuck people up so it might as well be us” vibe. But the crime here is basically giving people what they want. Which makes me think that it’s probably not possible to build a social media landscape that functions in a substantially different way.
Consider the suggestions Haugen makes in her testimony. Her first recommendation is for Facebook to do less cultivation of users’ News Feeds; content should pop up chronologically, like it did in the old days. Her second suggestion is that Facebook should require users to actually read an article before they share it, which would slow down sharing by just a bit. Of course, “read” surely means “click on”, because you can’t make someone read something — that’s in the Constitution! (I think. I haven’t read it.)
These seem like reasonable suggestions. But are they game-changers? Definitely not. The News Feed recommendation is particularly funny because we already know what this looks like. Haugen’s proposed change would essentially roll back an alteration that Facebook made in 2018; that change is the source of much of the criticism Facebook is receiving right now. So, the News Feed would look a lot like it did in 2016. And 2016 was the year that misinformation on Facebook helped elect Trump; it’s the year that people started demanding that Facebook take more control over the News Feed. There’s an incoherence here — do we want more News Feed cultivation, or less? Some of Facebook’s critics can’t pick a side.
A good deal of Facebook criticism boils down to “promote the stuff I like and get rid of the stuff I don’t like.” That clearly-unworkable heuristic was well-captured by a passage from a recent New York Times op-ed by Brookings fellow Dr. Kate Klonick. Klonick argues that Facebook should tweak its algorithm to favor “good things” (her words). You probably already see the Jupiter-sized flaw in that logic, but: 1) The Times didn’t, and 2) I encourage you to enjoy the ride anyway. Klonick writes:
Facebook is perfectly capable of measuring “user experience” besides the narrow concept of “engagement,” and it is time those measurements were weighted more heavily in company decision-making. That doesn’t mean just weighing harmful effects on users; it could also mean looking at and measuring the good [emphasis hers] things Facebook offers — how likely you are to attend a protest [emphasis mine] or give to a charitable cause you hear about on Facebook.
So…attending a protest is “good”, is it? How could she possibly not anticipate the counter-argument to that? If I were coaching a high school debate team, and a kid made that argument, I would write “must expunge from team before state finals” in my notes.
Obviously, not all protests are necessarily good. To pick an example completely at random: Some might consider a gathering in which people storm the Capitol to overturn a democratic election to be “bad”. And thus, Klonick’s “promote the good things” ethic would obviously fail to nudge people away from what she terms “salacious or extreme views”.
The most aggressive play to change Facebook’s behavior would be to alter or even repeal Section 230 of the Communications Decency Act. This is the cudgel Senators sometimes wave around to bully tech companies into doing what they want. It’s mostly a bluff; they can’t repeal Section 230, because doing so would put any site built around user content out of business. There have been a few instances of companies losing Section 230 protections, and the result wasn’t better-behaving companies; the result was that those companies shut down.
Minor tweaks to Section 230 might be possible, but they won’t solve most of the problems that are getting people’s undies in a bunch. You could maybe make it harder for sex criminals and drug dealers to use the site, but no change to 230 will solve the “my aunt thinks Obama was born in Kenya” problem or the “my daughter saw an Instagram thirst trap, and now she’s depressed that she doesn’t have the proportions of a butternut squash” problem. For all the grandstanding and fist-pounding and empty talk about “getting tough” on big tech, I haven’t heard any suggestions that I consider both significant and workable.
There’s another problem with heavy regulation: It might be the best thing to ever happen to Facebook and other tech giants. If you make social media companies heavily mediate their platforms — if they need four lawyers and six compliance officers for every user — you might create a situation in which behemoths can pay those costs but plucky startups can’t. It’s notable that Facebook itself is calling for regulatory updates. If the “get tough on Facebook” crowd gets its way, it might make Facebook impossible to dislodge.
It’s possible that Facebook could take Haugen’s suggestions, become 10 percent less of a trash pile, and do just fine. It’s also possible that they could become 10 percent less of a trash pile and be displaced by a company that delivers maximum trash. Because — and this is the key point — we want the trash. Social media companies succeed by building sites that we want to use. We all get on our high horse and act like we don’t want the sex and celebrity gossip and bias-affirming articles, but we obviously fucking do. I won’t claim to be above the trash; I actually think the trash can be kind of great. Sometimes, I have an opinion, and I seek out articles from writers that I know will agree with me. Other times, I’ll go down a Conan/Norm Macdonald/Coen Brothers rabbit hole on YouTube and it’s great and fuck you. Occasionally, I’ll be on Instagram, and I’ll click on a thirst trap, because…because g’dDAMN! I’m done feeling shame about liking the trash — more trash, please!
Social media sites are designed to give us what we want. It’s not Facebook — it’s Facebook and Twitter and TikTok and YouTube and Instagram and any successful social media site you can name. Growing up with these sites might cause people to better understand what they are, as evidenced by the fact that people over 65 are almost four times more likely than people 18-29 to share fake news.1 Social media might not be corrupting us so much as reflecting the extent to which we’ve already been corrupted.
The question here isn’t whether Facebook sucks. Of course Facebook sucks. And of course Mark Zuckerberg sucks; no man who posts a video of himself hydro-foiling while holding an American flag could possibly not suck. In fact, the flag should have literally said “I suck”, because that’s figuratively what it said. Here, for the unimaginative:
The question is whether we could ever expect the social media landscape to be dominated by so-called “ethical” companies. The answer is “no”, for one simple reason: We would not use those companies’ platforms. We would gravitate to the platforms that do a better job of delivering that sweet, sweet trash.
Personally, the main ethic I want social media companies to follow is “butt the fuck out”. Precisely because Mark Zuckerberg sucks, I don’t want him being highly active in deciding who’s allowed to say what. Empowering companies to have a major role in deciding what speech is allowed is a terrible idea, and getting the government involved in those decisions might be the only idea that's worse.
Many people will argue that that will result in a social media-verse filled with false, inflammatory, and otherwise unsavory content. And those people are right: It's going to be a shit show. But it’s going to be a shit show no matter what. The societal change that we’re experiencing is driven by technology, not by companies’ decisions. No tweak to the News Feed algorithm or carefully-crafted change to Section 230 will substantially alter the world we’re living in. We’ve created this social media universe, and it’s made in our image. If we don’t like what we see, then the change lies within.
I’m going to end with something that YouTube’s algorithm recommended to me: “Welcome to the Internet” by Bo Burnham. Another upper-deck home run, algorithm — you’ve got me completely pegged! Besides being musically sharp (which Burnham’s stuff always is), I like the song’s message: The internet is a weird, interesting, kinda fucked-up place. In that regard, it’s a lot like the world itself. And I think we need to stop imagining that an enlightened despot tech executive will come to our rescue and accept that, in some form or another, this is what it’s going to be.
This is a little bit of a garbage stat; political affiliation is an obviously confounding variable here (old people are more likely to be conservative). So, I’d take that number with a grain of salt. But it’s still an interesting number.