Here’s the challenge of writing about the hot mess that is Facebook in 2021 — the mess is so hot, your take is soon out of date.
So it was with An Ugly Truth: Inside Facebook’s Battle for Domination, by New York Times reporters Sheera Frenkel and Cecilia Kang. Two days after the book was published, we learned Facebook seems to be burying data that highlights its right-wing echo chamber. It just came under fire from President Biden for “killing people” with vaccine lies, and from English footballers for allowing racist abuse on their pages.
Oh, and the FTC’s antitrust case against the social media giant, which fills the first chapter, was just denied by a federal judge. Oops.
But with the exception of the FTC case, which may yet be refiled, none of this news will come as a surprise to the book’s readers. Frenkel and Kang’s careful reporting shows a company whose leadership is institutionally ill-equipped to handle the Frankenstein’s monster it built. Ignoring hate speech and lies on the platform, even when experts are sounding alarms about harmful effects up to and including genocide, is par for the course. As is trying to control the media narrative or silence employees rather than, y’know, doing something about the underlying problem.
It’s not that CEO Mark Zuckerberg and COO Sheryl Sandberg come across as inherently evil; they seem genuinely shaken by each new crisis, evolve their thinking a little, and belatedly order fixes. It’s more that their airy idealism, total faith in algorithms, plus endless profits, put them in bubbles where they are blind to bad actors and bad ideas.
“People were not paying attention,” Facebook’s former security chief Alex Stamos tells the authors. He’s talking about engineers who were able to stalk their dates with god-like access to their profiles (and were belatedly fired for doing so). But it could apply to everything from election interference to stolen user data to the company’s ongoing lack of diversity (only 3.8 percent of employees are Black, barely up from 2 percent in 2014).
So if you need more reasons to delete Facebook apps from your life, here are the main points we gleaned from An Ugly Truth — a book that mostly focuses on the years 2016 to 2020, but doesn’t ignore the original sins baked into Facebook from the very beginning.
1. Zuckerberg, an ‘intellectual lightweight,’ was easy to manipulate.
To be fair, plenty of us thought dumb stuff when we were 20. But few of us were laying the foundations for a history-changing global addiction at the time. From the start, Zuckerberg designed the platform for mindless scrolling: “I kind of want to be the new MTV,” he told friends. His mantra for employees was “company over country”: Do what is good for Facebook, not for America.
Forget The Social Network; a more up-to-date cautionary tale would cut straight from this declaration to the Jan. 6 insurrection being plotted in Facebook groups.
It was 20-year-old Zuckerberg who fell under the sway of Peter Thiel and Marc Andreessen, Silicon Valley libertarians and free-speech fundamentalists. He didn’t read books at the time (veteran Valley journo Kara Swisher found him an “intellectual lightweight”) and inherited a view of the First Amendment that one NAACP voting rights expert calls a “dangerous misunderstanding.” Instead of protecting people from government censorship, Zuck’s platform would protect and amplify the speech of authoritarian leaders. Conservatives would learn they could work the referee simply by claiming they were being censored.
Once set in place, Zuckerberg’s views were hard to shift — and when they did, they often managed to morph into more disastrous forms. His response to an epidemic of fake news on the site in 2016 was to downgrade all news in the algorithm’s eyes. He’d long allowed for maximum data collection on users; one longtime employee says that any other path presented to him was “antithetical to Mark’s DNA.” Then, when the Cambridge Analytica scandal laid bare the need for privacy, he pushed for private Facebook groups — which provided safe harbor for murderous militias, QAnon believers, and insurrectionists.
But Zuck’s unexamined privilege hurts Facebook users in everyday ways too. “He couldn’t identify the systemic biases of the world: how, if you were Black, you might attract ads for predatory loans, or if you were lower income, you might attract ads for junk food and soda,” the authors write. The full effects of Facebook addiction may take decades to unpack.
2. Sheryl Sandberg is a Pollyanna, with less power than we knew.
Former Google exec Sandberg was long seen as the adult supervision at Facebook. She took on all the leadership roles that didn’t interest Zuckerberg — including growing the ad business that supercharged revenue. Behind the scenes, it was assumed that the Lean In author was a moderating influence on Zuck’s more clueless or dangerous designs for the service.
Not so, it turns out. “I’ve been consistently puzzled about the public portrayal of them as amazing partners,” one business-side employee tells the book’s authors. “Meetings I’ve been in with Mark and Sheryl, he was explicitly dismissive of what she said.” Indeed, the book provides examples where Sandberg was afraid of getting fired, or of being labeled as politically biased, and didn’t even try to push back — as in the case of the doctored video of Nancy Pelosi that Zuckerberg decided to allow on the site. Pelosi still won’t return Sandberg’s calls.
“To friends and peers, Sandberg tried to disassociate herself from Zuckerberg’s positions, but within the company she executed his decisions,” the authors write. They quote a Sandberg friend: “Her desire to appear perfect gets in the way of defending herself.”
Part of her perfectionism manifests in a desire to focus on the positive — to a Pollyanna-ish degree. Sandberg’s conference room is named “Only Good News,” and it’s a fair summary of what she wants to hear from underlings. She quickly dismissed the notion that Russian-bought election ads spread further on the platform than was initially known, and screamed “you threw us under the bus” at then-CSO Stamos for telling the Facebook board what he knew about the situation.
Like a lot of execs who didn’t bring only good news, Stamos was gradually pushed out.
3. Joel Kaplan may be the most dangerous man at Facebook.
A friend (and, briefly, a boyfriend) of Sandberg’s, Joel Kaplan is a veteran of the George W. Bush administration who heads up Facebook’s lobbying arm in Washington, D.C. He’s a very close friend of Justice Brett Kavanaugh, and showed up to support the controversial judge at his Supreme Court confirmation hearings while on the clock for Facebook. He has Zuckerberg’s ear on political matters, and on the evidence of An Ugly Truth he is more influential than Sandberg when it counts.
Kaplan may have been a Never Trump Republican in his personal beliefs, but in practical fact he was the best friend Donald Trump had at Facebook. It was Kaplan who, in Dec. 2015, persuaded Zuckerberg not to take down Trump’s first post calling for a Muslim ban. “Don’t poke the bear,” he advised, so instead Facebook carved out an exemption for hate speech if it was “newsworthy.” This brand-new standard for what billions of people could see from a candidate arguably helped hand a close election in 2016 to Trump.
That was just the beginning of the lobbyist’s reign. It was Kaplan who downplayed Stamos’ report on Russian interference, Kaplan who argued for keeping the Pelosi video up, Kaplan who helped kill a “Trending” news section when conservative politicians complained about it — laying the groundwork for the domination of Facebook’s algorithm by right-wing radio hosts like Dan Bongino and Ben Shapiro.
In 2019, Kaplan engineered Zuckerberg’s two meetings with Trump. In 2020, he argued that Trump didn’t actually suggest injecting bleach as a COVID-19 cure, and therefore didn’t fall foul of Facebook’s rules on medical misinformation. He likewise defended Trump’s “when the looting starts, the shooting starts” comment during the George Floyd protests, and when Facebook belatedly started cracking down on QAnon groups in August, it was Kaplan who made sure that antifa-related groups were also banned. Engineers admitted that such “both sides” behavior was entirely political.
All of that appeasement, and for what? Trump was banned indefinitely anyway, but not before doing potentially irreparable harm to democracy by spreading the Big Lie. Government regulation of Facebook, the one thing Kaplan was trying to avoid, is now just about the only proposal that unites left and right in Washington. And yet Kaplan’s position seems more secure than ever — cemented by an alliance with policy chief Nick Clegg, who knows President Biden from his time as UK Deputy Prime Minister.
4. Employees are fighting the good fight.
If anything can save Facebook from itself, it’s the ground-level employees of Facebook. Time and again, in posts on the company’s internal Workplace groups known as “Tribes,” and in Zuckerberg’s weekly Q&As, they are the ones forcing the uncomfortable questions.
In 2016, News Feed team members were furious that blatantly fake news sites didn’t run afoul of their rules, prompting a memo from early executive Andrew “Boz” Bosworth that gave the book its title. Boz wrote that the “ugly truth” was that connecting the world’s people might lead to more deaths, but that Facebook would continue no matter what. Boz later claimed he was only trying to “inspire debate,” but employees held his feet to the fire regardless.
Next to face the workers’ fury was Kaplan, for his overt support of Kavanaugh; employees dismissed Zuckerberg’s falsehood that he had taken time off to do so. When Trump’s post about shooting looters was allowed to stand, engineers on Tribe boards were openly asking if there were jobs at companies “willing to take a stance on their moral responsibility to the world—because Facebook doesn’t seem to be it.” An internal poll showed thousands of employees believed Zuck had made the wrong decision, and the company’s first employee walkout followed.
The one positive trend in An Ugly Truth is that employee statements, actions, and leaks to the media are getting bolder and clearer. “Stop letting people be racist on Facebook, maybe then we will believe that you support racial justice,” a member of the Black@Facebook internal group wrote in 2020. We’re a long way from Zuckerberg having to tell his employees not to deface Black Lives Matter signs on campus. These days, the education appears to be flowing the other way.
5. Facebook must listen to more experts — inside and out.
Employee intentions are great, but they’re nothing without management taking action and creating guidelines. In 2008, the first team responsible for moderating Facebook posts had only vague rules that there should be “no nudity, no terrorism” and a general desire that was summed up as “don’t be a radio for a future Hitler.”
How’d that work out?
The most heartbreaking section in An Ugly Truth concerns Myanmar, where fake news about Muslim minorities on Facebook snowballed into riots and a military-led genocide in 2018. An expert in the country, Matt Schissler, was sounding the alarm to Facebook representatives back in 2014. Visiting the Menlo Park HQ, Schissler showed a post that had gone viral with a picture of a man feeding Muslim refugees; the man had become a target for death threats for doing so. To take down the picture, Facebook told him, the man would have to complain himself — even though he wasn’t on Facebook.
Schissler was equally horrified to learn that the Facebook reps “seemed to equate the harmful content in Myanmar to cyberbullying,” and that there was one Burmese speaker hired by the company to police the whole country’s content. (Burmese is only one of dozens of languages spoken in the country.) “It would be like them saying, well, we have one German speaker, so we can monitor all of Europe,” Schissler tells the authors.
In the end, his complaints got about as much traction as those of Stamos and the other election security experts hired and then eased out by Facebook. Which is to say, none at all.
Zuckerberg appears belatedly aware of the value of expertise — at least insofar as it can take tough decisions off his plate. That was the thinking behind the Facebook Oversight Board, which rightly put the question of whether Trump should be banned from the platform permanently back in the CEO’s court.
Depending on what action Zuckerberg takes next, and its impact on the 2022 and 2024 elections, An Ugly Truth may end up being a relatively lightweight prologue to a coming Facebook dystopia.
Source: Mashable