The Chaos Machine
Metadata
- Author: Max Fisher
- ASIN: B09FJPPQQ3
- ISBN: 031670332X
- Reference: https://www.amazon.com/dp/B09FJPPQQ3
Highlights
Trump’s rise had been abetted by strange new grassroots movements and hyperpartisan outlets that thrived online, as well as Russian agents who’d exploited social media’s reality-distorting, identity-indulging tendencies. This global pattern seemed to indicate something fundamental to the technology, but exactly what that was, why it was happening, or what it meant, nobody was able to tell me. — location: 73
Jacob built a program to secrete the files out, encrypting and washing them to remove digital fingerprints that might trace back to him or even the country where his office was located. He transferred some to me through a secure server. A few weeks later I flew out to gather the rest and to meet him. — location: 105
A strange pattern emerged in my conversations at Facebook’s headquarters. An executive would walk me through the challenge that consumed their days: blocking terrorists from recruiting on the platform, outmaneuvering hostile government hackers, determining which combinations of words constituted an unacceptable incitement to violence. Nearly any question I posed, however sensitive, yielded a direct, nuanced answer. When problems remained unsolved, they acknowledged as much. No one ever had to check their notes to tell me, say, Facebook’s policy on Kurdish independence groups or its methods for distributing hate-speech rules in Tagalog. I found myself wondering: with such conscientious, ultra-qualified people in charge, why do the problems for which they articulate such thoughtful answers only ever seem to get worse? — location: 112
Many at the company seemed almost unaware that the platform’s algorithms and design deliberately shape users’ experiences and incentives, and therefore the users themselves. — location: 127
as QAnon became a movement with tens of thousands of followers, an internal FBI report identified it as a domestic terror threat. Throughout, Facebook’s recommendation engines promoted QAnon groups to huge numbers of readers, as if this were merely another club, helping to grow the conspiracy into the size of a minor political party, for seemingly no more elaborate reason than the continued clicks the QAnon content generated. — location: 139
“Our algorithms exploit the human brain’s attraction to divisiveness,” the researchers warned in a 2018 presentation later leaked to the Wall Street Journal. In fact, the presentation continued, Facebook’s systems were designed in a way that delivered users “more and more divisive content in an effort to gain user attention & increase time on the platform.” — location: 149
In summer 2020, an independent audit of Facebook, commissioned by the company under pressure from civil rights groups, concluded that the platform was everything its executives had insisted to me it was not. Its policies permitted rampant misinformation that could undermine elections. Its algorithms and recommendation systems were “driving people toward self-reinforcing echo chambers of extremism,” training them to hate. Perhaps most damning, the report concluded that the company did not understand how its own products affected its billions of users. — location: 163
They conducted their work, at least initially, independently of one another, pursuing very different methods toward the same question: what are the consequences of this technology? This book is about the mission to answer that question, told in part through the people who led it. — location: 172
companies cannot be blamed for the high-tech funding model that gave rise to them, by handing multimillion-dollar investments to misfit twentysomethings and then demanding instant, exponential returns, with little concern for the warped incentives this creates. — location: 180
other users, though mostly friendly, she said, occasionally slipped into “flame wars” that were thousands of posts long, and all over a topic she’d rarely encountered off-line: vaccinations. It was 2014, and DiResta had only recently arrived in Silicon Valley, there to scout startups for an investment firm. She was still an analyst at heart, from her years both on Wall Street and, before that, at an intelligence agency she hints was the CIA. To keep her mind agile, she filled her downtime with elaborate research projects, the way others might do a crossword in bed. — location: 198
The reason the system pushed the conspiratorial outliers so hard, she came to realize, was engagement. Social media platforms surfaced whatever content their automated systems had concluded would maximize users’ activity online, thereby allowing the company to sell more ads. — location: 237
Silicon Valley’s mostly self-invented lore instilled the Valley with cultural and economic traits that were built into the products that increasingly rule our world. And it began with a wave of pioneers who played a role as crucial as any of the engineers or CEOs who came after them: the military-industrial complex. — location: 262
Frederick Terman, the son of a psychology professor at then-unremarkable Stanford University, spent World War II at Harvard’s labs, overseeing joint military–academic research projects. He returned home with an idea: that this model continue into peacetime, with university scientists cooperating instead with private companies. He established the Stanford Research Park, where companies could work alongside academic researchers. — location: 271
Stanford blurred the line between academic and for-profit work, a development that became core to the Silicon Valley worldview, absorbed and propagated by countless companies cycling through the Research Park. Hitting it big in the tech business and advancing human welfare, the thinking went, were not only compatible, they were one and the same. These conditions made 1950s Santa Clara what Margaret O’Mara, a prominent historian of Silicon Valley, has called a silicon Galápagos. Much as those islands’ peculiar geology and extreme isolation produced one-of-a-kind bird and lizard species, the Valley’s peculiar conditions produced ways of doing business and of seeing the world that could not have flourished anywhere else—and led ultimately to Facebook, YouTube, and Twitter. — location: 276
The chance migration that seeded much of the Valley’s technological DNA, like an adrift iguana landing on a Galápagos shore, was a cantankerous engineer named William Shockley. At Bell Labs, perhaps the most prestigious of the East Coast research firms, he’d shared a 1956 Nobel Prize for pioneering new semi-conductive transistors. The tiny devices, which direct or modify electrical signals, are the building blocks of modern electronics. Shockley became convinced he could beat Bell’s methods. — location: 282
That pool of talent, money, and technology—the three essential ingredients—would be kept in the Valley, and the rest of the world kept out, by an unusual funding practice: venture capitalism. — location: 299
venture capitalists tended to fund people whom they trusted—which meant people they knew personally or who looked and talked like them. This meant that each class of successful engineers reified their strengths, as well as their biases and blind spots, in the next, like an isolated species whose traits become more pronounced with each subsequent generation. — location: 304
When people think something has become a matter of consensus, psychologists have found, they tend not only to go along, but to internalize that sentiment as their own. — location: 340
That year, halfway through the second Obama term, a significant threshold was crossed in the human experience. For the first time, the 200 million Americans with an active Facebook account spent, on average, more time on the platform (forty minutes per day) than they did socializing in person (thirty-eight minutes). Just two years later, by the summer of 2016, nearly 70 percent of Americans used Facebook-owned platforms, averaging fifty minutes per day. — location: 356
“The thought process that went into building these applications,” Parker told the media conference, “was all about, ‘How do we consume as much of your time and conscious attention as possible?’” To do that, he said, “We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you more likes and comments.” He termed this the “social-validation feedback loop,” calling it “exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.” He and Zuckerberg “understood this” from the beginning, he said, and “we did it anyway.” — location: 375
Whoa
“How do companies, producing little more than bits of code displayed on a screen, seemingly control users’ minds?” Nir Eyal, a prominent Valley product consultant, asked in his 2014 book, Hooked: How to Build Habit-Forming Products. “Our actions have been engineered,” he explained. Services like Twitter and YouTube “habitually alter our everyday behavior, just as their designers intended.” — location: 384
intermittent variable reinforcement. The concept, while sounding esoteric, is devilishly simple. The psychologist B. F. Skinner found that if he assigned a human subject a repeatable task—solving a simple puzzle, say—and rewarded her every time she completed it, she would usually comply, but would stop right after he stopped rewarding her. But if he doled out the reward only sometimes, and randomized its size, then she would complete the task far more consistently, even doggedly. And she would keep completing the task long after the rewards had stopped altogether—as if chasing even the possibility of a reward compulsively. — location: 405
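Skinner’s finding is easy to see in a toy simulation. A minimal sketch, under assumptions of mine rather than the book’s: the subject learns the longest gap it has ever experienced between rewards, and once rewards stop for good, it quits when the dry streak exceeds that gap.

```python
import random

def trials_after_cutoff(schedule, training=300):
    """Toy model: during training the subject learns the longest gap it has
    seen between rewards; once rewards stop entirely, it quits when the dry
    streak exceeds that gap. All numbers are illustrative assumptions."""
    longest_gap, gap = 0, 0
    for _ in range(training):
        if schedule():
            longest_gap, gap = max(longest_gap, gap), 0
        else:
            gap += 1
    return longest_gap + 1  # unrewarded completions before giving up

random.seed(1)
fixed = trials_after_cutoff(lambda: True)                       # reward every completion
variable = trials_after_cutoff(lambda: random.random() < 0.25)  # reward only sometimes

print(f"fixed schedule: persists for {fixed} trial(s) after rewards stop")
print(f"variable schedule: persists for {variable} trials after rewards stop")
```

On the fixed schedule the longest learned gap is zero, so the task stops almost as soon as the rewards do; on the variable schedule the subject has learned to work through long droughts, so it keeps going, chasing the mere possibility of a reward.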
Abusers veer unpredictably between kindness and cruelty, punishing partners for behaviors that they had previously rewarded with affection. This can lead to something called traumatic bonding. The victimized partner finds herself compulsively seeking a positive response, like a gambler feeding a slot machine, or a Facebook addict unable to log off from the platform—even if, for many, it only makes them lonelier. — location: 413
In 2018, a team of economists offered users different amounts of money to deactivate their account for four weeks, looking for the threshold at which at least half of them would say yes. The number turned out to be high: $180. But the people who deactivated experienced more happiness, less anxiety, and greater life satisfaction. After the experiment was over, they used the app less than they had before. — location: 423
That little button’s appeal, and much of social media’s power, comes from exploiting something called the sociometer. The concept emerged out of a question posed by the psychologist Mark Leary: what is the purpose of self-esteem? The anguish we feel from low self-esteem is wholly self-generated. We would not have developed such an unusual and painful vulnerability, Leary reasoned, unless it provided some benefit outweighing its tremendous psychic costs. His theory, now widely held, is that self-esteem is in fact “a psychological gauge of the degree to which people perceive that they are relationally valued and socially accepted by other people.” — location: 439
Brian Hare called “survival of the friendliest.” The result was the development of a sociometer: a tendency to unconsciously monitor how other people in our community seem to perceive us. We process that information in the form of self-esteem and such related emotions as pride, shame, or insecurity. These emotions compel us to do more of what makes our community value us and less of what doesn’t. And, crucially, they are meant to make that motivation feel like it is coming from within. — location: 448
“Even though at the time there was nothing useful you could do with LinkedIn, that simple icon had a powerful effect in tapping into people’s desire not to look like losers,” B. J. Fogg, the head of Stanford’s Persuasive Tech Lab, has said. — location: 462
“When Facebook changed their algorithm, my likes dropped off and it felt like I wasn’t getting enough oxygen,” Pearlman later told Vice News. “So even if I could blame it on the algorithm, something inside me was like, ‘They don’t like me, I’m not good enough.’” Her own former employer had turned her brain’s nucleus accumbens against her, creating an internal drive for likes so powerful that it overrode her better judgment. Then, like Skinner toying with a research subject, it simply turned the rewards off. “Suddenly I was buying ads, just to get that attention back,” she admitted. — location: 474
“social-validation feedback loop,” as Sean Parker called it: unconsciously chasing the approval of an automated system designed to turn our needs against us. — location: 480
“It is very common for humans to develop things with the best of intentions and for them to have unintended, negative consequences,” Justin Rosenstein, a former Facebook engineer who’d also worked on the Like button, told The Guardian. “If we only care about profit maximization, we will go rapidly into dystopia,” he warned. “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before.” — location: 481
Social identity, Tajfel demonstrated, is how we bond ourselves to the group and they to us. It’s why we feel compelled to hang a flag in front of our house, don an alma mater T-shirt, slap a bumper sticker on our car. — location: 501
because the platforms elevate whatever sentiments best win engagement, they often produce those instincts in their most extreme form. The result can be an artificial reality in which the in-group is always virtuous but besieged, the out-group is always a terrifying threat, and virtually everything that happens is a matter of us-versus-them. — location: 518
one formula proved especially effective: headlines promising to portray the user’s implied in-group (liberals, usually) as humiliating a reviled out-group (creationists, corporations, racists). “A Man Slams a Bigoted Question So Hard He Brings Down the House.” — location: 524
“Identity was the slingshot,” Ezra Klein, Vox’s founder, wrote about digital media in a book on polarization. “Few realized, early on, that the way to win the war for attention was to harness the power of community to create identity. But the winners emerged quickly, often using techniques whose mechanisms they didn’t fully understand.” — location: 533
Facebook and other American tech companies began “zero-rating”—essentially, subsidizing the entire population by striking deals with local carriers to waive charges for any data used via those companies’ apps. — location: 561
Disruption of Foreign Governments
For a country with hundreds of thousands of users, and soon millions, Facebook employed only one moderator who could review content in Burmese, Myanmar’s predominant language, leaving the platform effectively unsupervised. — location: 578
Wow!!!
Facebook finally responded to the Deloitte representative, not to inquire after the violence but to ask if he knew why the platform had been blocked. — location: 589
What the fuck - greedy
Expanding Influence of Gaming
She posted frequently, and sometimes stridently, in support of a cause then gaining momentum among like-minded game-makers and journalists: broadening gaming’s appeal and fan culture beyond its traditional enclave of young male geeks. But some online gaming circles seethed at feminist transgressors who sought, they believed, to corrupt the hobby that had become, amid a world that struck many early social media obsessives as hostile and confusing, a kind of safe space. This was more than a debate over whether prince-rescues-buxom-princess games could make room for offbeat entries like Quinn’s, or even for girl gamers; it was about a geek male identity whose adherents saw themselves as under attack. Gjoni’s narrative of anger and resentment resonated with their own. — location: 608
The War on Normal People & Andrew Yang
Gjoni’s post was also read as encouraging the rough justice often embraced on the social web: collective harassment. — location: 618
Fury at Quinn and the supposedly corrupt gaming press overtook much of 4chan and Reddit, then YouTube. Across all three, huge communities grew obsessed with the made-up scandal they termed Gamergate. But what had begun as another internet-trolling episode, if an unusually large one, gradually became something more, something new. Gamergate altered more than the lives of its targets. It sent the extremes of the social web smashing against mainstream American life, forever ending the separation between digital and nondigital spaces, between internet culture and culture. It also launched a new kind of politics, defined by social media’s foundational traits: a digital culture built around nihilistic young men, Silicon Valley dreams of destructive revolution, and platforms designed in ways that supercharge identity into a matter of totalizing and existential conflict. — location: 629
Ken Kesey associate named Stewart Brand, who’d spent the ’60s driving between California’s hippie communes selling supplies out of his truck. He’d called it the Whole Earth Truck Store. On settling in the Santa Clara Valley, he converted it, in 1968, into the Whole Earth Catalog. The name was a joke: it advised readers on how to make the products on their own, alongside articles promoting hippie communalism. Copies were ubiquitous in early Silicon Valley. Steve Jobs later called it “one of the bibles of my generation.” — location: 724
Whole Earth ’Lectronic Link, or WELL. — location: 731
Forever after, the internet-era architects who’d first gathered on the WELL would treat raw majoritarianism as the natural ideal, building it into every subsequent social network through today. Raucous debate became seen as the purest meritocracy: if you couldn’t handle your own or win over the crowd, if you felt harassed or unwelcome, it was because your ideas had not prevailed on merit. — location: 736
These were not just websites. They were a cyber society lifting us above the outdated ways of the physical world. “We reject: kings, presidents, and voting. We believe in: rough consensus and running code,” David Clark, one of the architects of the web, said in 1992. — location: 740
The manifesto enshrined one ideal in particular: total freedom of speech. As on the WELL, this was to be the web’s mechanism for self-governance, first commandment, and greatest gift to the world. Its precepts remain the foundational text of the social media industry. “Our general counsel and CEO like to say that we are the free speech wing of the free speech party,” Twitter’s chief in the UK had said. Zuckerberg called free speech “the founding ideal of the company.” — location: 746
“The reason we nerds didn’t fit in was that in some ways we were a step ahead,” Paul Graham, the investor whose incubator had launched Reddit, once wrote. — location: 758
Graham has said he looks for “nerds” and “idealists” with “a piratical gleam in their eye,” who “delight in breaking rules” and defying social niceties. “These guys want to get rich, but they want to do it by changing the world.” — location: 765
so often the case in the Valley, the hidden force behind everything, setting both the culture and the economics, was venture capitalism. The practice of engineers becoming VCs who pick the next generation of dominant engineers kept the ideological gene pool incestuously narrow. — location: 780
Another Doerr protégé, Netscape founder Marc Andreessen, became a major investor in and board member of Facebook, and personal mentor to Mark Zuckerberg. He co-founded a venture firm that seeded, among others, Slack, Pinterest, and Twitter. — location: 785
One of PayPal’s first executives, Reid Hoffman, used his windfall to found LinkedIn and invest early in Facebook. He introduced Zuckerberg to Thiel, who became Facebook’s first board member. — location: 788
But with the advent of the social media era, the industry was building its worst habits into companies that then smuggled those excesses—chauvinism, a culture of harassment, majoritarianism disguised as meritocracy—into the homes and minds of billions of consumers. — location: 796
Bold to claim social media transmits these internal cultures into external customers' homes
The internet’s promise of total freedom appealed especially to kids, for whom off-line life is ruled by parents and teachers. Adolescents also have a stronger drive to socialize than adults, which manifests as heavier use of social networks and a greater sensitivity to what happens there. Poole had started 4chan when he was just fifteen. Kids who felt isolated off-line, like Adam, drove an outsized share of online activity, bringing the concerns of the disempowered and the bullied with them. — location: 841
A connection with the findings of The War on Normal People
“When you browse 4chan and 8chan while the rest of your friends are posting normie live-laugh-love shit on Instagram and Facebook,” Adam said, “you feel different. Cooler. Part of something niche.” The joke might be a photo of scatological pornography. A video of a grisly murder. A racial slur posted to get a rise out of people, daring them to take it seriously. Laughing off the material—or, better yet, one-upping it—affirmed that you shared the club’s knowing, cynical detachment. And it recast their relationship with the outside world: it’s not society rejecting us, it’s us rejecting society. These two unifying activities, flaunting taboos and pulling pranks, converged to become trolling. A ’90s message board had defined trolling as posting comments “for no other purpose than to annoy someone or disrupt a discussion,” possibly named after “a style of fishing in which one trails bait through a likely spot hoping for a bite.” Since the days of the WELL, web users had entertained themselves by seeking to provoke one another. On networks like 4chan, it often became something darker: acts of collective abuse in which the point was to delight in someone else’s anguish. — location: 857
Gordon’s presence reflected a then-widespread belief: the social media industry would operate like the video game business. Gordon told an industry conference a few months later that there were “three themes that you CEOs need to master,” listing mobile, social, and, with a fist pump, “gamification.” — location: 929
Marketers, seized by a neo-Freudianism then in vogue, believed they could hook kids by indulging their nascent curiosity about their own genders. New TV programming like My Little Pony and GI Joe delivered hyper-exaggerated gender norms, hijacking adolescents’ natural gender self-discovery and converting it into a desire for molded plastic products. If this sounds like a strikingly crisp echo of social media’s business model, it’s no coincidence. Tapping into our deepest psychological needs, then training us to pursue them through commercial consumption that will leave us unfulfilled and coming back for more, has been central to American capitalism since the postwar boom. — location: 940
Wu, watching Gamergate unfold, was reminded of a moment from years earlier, when she’d interned in the office of Senator Trent Lott of Mississippi. His staff had deployed a now-famous push poll: “Do you believe Democrats are trying to take away your culture?” It performed spectacularly, especially with white men. Imagine, she said, how effective social media platforms could be at this, optimized to trigger people’s emotions more effectively than even the canniest campaign office, saturating audiences in the billions with a version of reality that was like an identity-activating push poll that never ended. — location: 972
Facebook, in the hopes of boosting engagement, began experimenting with breaking the so-called Dunbar limit. The British anthropologist Robin Dunbar had proposed, in the 1990s, that humans are cognitively capped at maintaining about 150 relationships. It was a number derived from the maximum-150-person social groups in which we’d evolved. — location: 985
But studies of rhesus monkeys and macaques, whose Dunbar-like limits are thought to mirror our own, had found that pushing them into larger groups made them more aggressive, more distrusting, and more violent. — location: 998
“There’s this conspiracy-correlation effect,” DiResta said, “in which the platform recognizes that somebody who’s interested in conspiracy A is typically likely to be interested in conspiracy B, and pops it up to them.” Facebook’s groups era promoted something more specific than passive consumption of conspiracies. Simply reading about contrails or lab-made viruses might fill twenty minutes. But joining a community organized around fighting back could become a daily ritual for months or years. Each time a user succumbed, they trained the system to nudge others to do the same. “If they bite,” DiResta said, “then they’ve reinforced that learning. Then the algorithm will take that reinforcement and increase the weighting.” — location: 1012
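The loop DiResta describes fits in a few lines. A toy sketch with made-up group names and “bite” rates: recommendations are drawn in proportion to learned weights, and every user who bites increases the weighting applied to the next user.

```python
import random

# Toy engagement-reinforced recommender. Group names and bite rates are
# illustrative assumptions; the only signal the system uses is engagement.
random.seed(42)
weights = {"gardening": 1.0, "chemtrails": 1.0}      # start neutral
bite_rate = {"gardening": 0.10, "chemtrails": 0.30}  # divisive content engages more

for user in range(1000):
    total = sum(weights.values())
    pick = random.choices(list(weights), [w / total for w in weights.values()])[0]
    if random.random() < bite_rate[pick]:  # the user joins the suggested group
        weights[pick] += 1.0               # reinforce: push it to the next user

print(weights)  # the higher-engagement group now dominates recommendations
```

Nothing in the loop knows what the content is; a modest edge in engagement compounds, on its own, into dominance of the recommendation slot.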
Millions who opened Reddit every morning encountered a stream of comments and articles asserting the superiority, and indulging the grievances, of the median user. It was a version of reality in which tech libertarianism was always vindicated and alternate belief systems—feminism, establishment liberalism, organized religion—were endlessly humiliated and debunked. — location: 1056
As of 2016, four years after her suit, still only 11 percent of technology venture-capital partners were women. Two percent were Black. The firms, in turn, overwhelmingly funded people who looked like them: in 2018, 98 percent of their investment dollars went to male-led companies. — location: 1065
But Wong resisted. He announced in a post that although Reddit’s leaders “understand the harm that misusing our site does to the victims of this theft,” they would not bend. Reddit, he wrote, was not just a social platform “but the government of a new type of community.” Like many other platform operators to come, however, he made clear his was a government that refused to govern, leaving it to users themselves to “choose between right and wrong, good and evil.” He titled his post “Every Man Is Responsible for His Own Soul.” This would become a standard defense from social media overlords: that the importance of their revolution compelled them to disregard the petty laws and morals of the outmoded off-line world. — location: 1094
Pao was also testing a theory: that the most hateful voices, though few in number, exploited social media’s tendency to amplify extreme content for its attention-winning power, tingeing the entire platform in the process. Stamping out “these core, really bad harassing subreddits” and preventing them from resurfacing, she believed, was the only sure way to end the “ripple effect” from bad behavior. — location: 1130
One study later estimated that the number of far-right white nationalists on Twitter increased by a factor of seven between 2012 and 2016. — location: 1157
She had lasted just eight months. Still, during that time she had revealed the fork in the road facing social networks. They could continue drifting toward becoming new iterations of 4chan turbocharged by algorithms. Or they could detour toward a future of constraints and rules checking the majority’s impulses, or those of especially loud minorities, so that others might participate. — location: 1163
“They call it ‘meme magic’—when previously obscure web memes become so influential they start to affect real-world events,” Yiannopoulos wrote that summer before the election. — location: 1204
If you regularly read Facebook for news in 2016—which 43 percent of Americans did that year—you were probably reading Breitbart. It grew so dominant on the platform that, even in late 2019, after the site had declined under mismanagement and controversy, Facebook appointed Breitbart as a “trusted news source” with special access to Facebook readers. — location: 1219
In a sense, he said, all users are simultaneously both conducting and serving as the subjects of a never-ending psychology experiment. People, as a rule, are closely attentive to, and adapt to, social feedback, an impulse that digital likes and shares tap into. Even as an undergrad engaging in spats on Facebook, “I, by trial and error, learned how people respond to different framings and different appeals.” — location: 1335
moral outrage is not just anger against a transgressor. It is a desire to see the entire community line up against them. — location: 1348
“Low cost, anonymous, instant, and ubiquitous access to the internet has removed most—if not all—of the natural checks on shaming,” she wrote of her findings, “and thus changed the way we perceive and enforce social norms.” — location: 1392
The student, once praised online as a hero, was now condemned as a villain. Few thought to blame the social media platforms that had empowered a teenager to destroy the livelihoods of low-income workers, incentivized her and thousands of onlookers to do it, and ensured that all would experience her outrage-provoking misimpression as truer than the truth. Truth or falsity has little bearing on a post’s reception, except to the extent that a liar is freer to alter facts to conform to a button-pushing narrative. What matters is whether the post can provoke a powerful reaction, usually outrage. A 2013 study of the Chinese platform Weibo found that anger consistently travels further than other sentiments. Studies of Twitter and Facebook have repeatedly found the same, though researchers have narrowed the effect from anger in general to moral outrage specifically. — location: 1433
On social media, one person can, with little warning, face the fury and condemnation of thousands. At that scale, the effect can be psychologically devastating. “The big part of harassment that people who haven’t been repeatedly harassed by a hateful mob are lucky to not get is: It changes your life forever,” Pao, the former Reddit chief, once wrote. “You don’t trust as easily.” — location: 1447
Every time an ancient human clan tore down a despotic alpha, they were doing the same thing that Lyudmila Trut did to her foxes: selecting for docility. More cooperative males reproduced, the aggressive ones did not. We self-domesticated. But just as early humans were breeding one form of aggression out, they were selecting another in: the collective violence they’d used both to topple the alphas and to impose a new order in their place. Life became ruled by what the anthropologist Ernest Gellner called “tyranny of the cousins.” Tribes became leaderless, consensus-based societies, held together by fealty to a shared moral code, which the group’s adults (the “cousins”) enforced, at times violently. “To be a nonconformist, to offend community standards, or to gain a reputation for being mean became dangerous adventures,” Wrangham wrote. Upset the collective and you might be shunned or exiled—or wake up to a rock slamming into your forehead. Most hunter-gatherer societies live this way today, suggesting that the practice draws on something intrinsic to our species. — location: 1488
In our very recent history, we decided that those impulses are more dangerous than beneficial. We replaced the tyranny of cousins with the rule of law (mostly), banned collective violence, and discouraged moblike behavior. But instincts cannot be entirely neutralized, only contained. — location: 1502
American bias?
philosophers Justin Tosi and Brandon Warmke have termed “moral grandstanding”—showing off that you are more outraged, and therefore more moral, than everyone else. “In a quest to impress peers,” Tosi and Warmke write, “grandstanders trump up moral charges, pile on in cases of public shaming, announce that anyone who disagrees with them is obviously wrong, or exaggerate emotional displays.” — location: 1515
“Our job was to keep people engaged and hanging out with us,” he wrote. Give users a long video that they won’t want to turn off, then another, then another. More watch time “begets more advertising, which incentivizes more content creators, which draws more viewership,” — location: 1647
By 2002, spam accounted for 40 percent of all email and growing. — location: 1660
With machine learning, engineers could do something better than write a program for catching spam. They designed a program that would guide its own evolution. They fed this program huge sets of spam and non-spam emails. The system then automatically built thousands of spam filters, all slightly different, and tested each on the sample emails. Then it built a new generation of spam filters based on the best performers and repeated the process, over and over, like a botanist identifying and crossbreeding the hardiest plants. — location: 1662
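A minimal sketch of that generate, test, and breed loop, with a toy labeled sample and per-word scores standing in for the huge training sets:

```python
import random

# Evolve keyword-weight spam filters by keeping the best performers each
# generation and mutating them. Sample emails and features are illustrative.
SPAM = ["win money now", "cheap pills now", "win cheap money"]
HAM = ["meeting notes attached", "lunch tomorrow", "project update attached"]
WORDS = sorted({w for msg in SPAM + HAM for w in msg.split()})

def random_filter():
    return {w: random.uniform(-1, 1) for w in WORDS}  # per-word spam scores

def accuracy(f):
    flags = lambda m: sum(f[w] for w in m.split()) > 0  # flag if score > 0
    hits = sum(flags(m) for m in SPAM) + sum(not flags(m) for m in HAM)
    return hits / (len(SPAM) + len(HAM))

def mutate(f):
    child = dict(f)
    child[random.choice(WORDS)] += random.uniform(-0.5, 0.5)  # small tweak
    return child

random.seed(0)
population = [random_filter() for _ in range(50)]
for generation in range(30):
    population.sort(key=accuracy, reverse=True)
    survivors = population[:10]  # the "hardiest plants"
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = max(population, key=accuracy)
print(f"best filter accuracy: {accuracy(best):.0%}")
```

The point of the analogy is that no human writes the final filter; the engineers only set the scoring rule and let selection do the rest.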
One of Google’s most cherished freedoms, inherited from Silicon Valley’s midcentury founders and borrowed outright from Stanford’s research programs across town, is the 80/20 rule. Employees owe 80 percent of their time to formal assignments but can spend the other 20 pursuing side projects. Chaslot and his team leader, who shared his concerns, dedicated their 20 to developing a new algorithm that might balance profit goals with public well-being. — location: 1704
Pariser’s fear, several years earlier, had been more fundamental. “There’s this epic struggle going on between our future, aspirational selves and our more impulsive, present selves,” he said. Even in 2011, years before YouTube or Facebook superpowered their systems to such destructive results, these earlier, simpler algorithms already reliably took the side of the impulses. And they usually won, proliferating “invisible autopropaganda, indoctrinating us with our own ideas.” — location: 1734
YouTube was reengineering itself around a single-minded pursuit for which Chaslot wasn’t on board. “These values of moderation, of kindness, anything that you can think of that are values on which our society is based, the engineers didn’t care about putting these values in the system,” he said. “They just cared about ad revenue. They were thinking that just by caring about one metric, which is watch time, then you’ll do good for everybody. But this is just false.” — location: 1759
Investors, she realized, weren’t throwing money at any kid with a pitch. They were chasing a very specific model: free-to-use web services that promised breakneck user growth. It puzzled her, though, because many shut down without making a dime in profit, only to have another round of startups replace them. — location: 1807
This changed what investors wanted from their investments. It was no longer about finding that promising widget-maker whose sales might, after many hard and expensive years, one day eclipse costs. It was about investing in lots of cheap web startups, knowing that most would fail but that one breakout success would cover those losses and then some. — location: 1820
It was as if Coca-Cola stocked a billion soda machines with some A.I.-designed beverage without a single human checking the bottles’ contents—and if the drink-filling A.I. was programmed only to boost sales, without regard for health or safety. — location: 1869
“We design a lot of algorithms so we can produce interesting content for you,” Zuckerberg said in an interview. “It analyzes all the information available to each user and it actually computes what’s going to be the most interesting piece of information.” An ex-Facebooker put it more bluntly: “It is designed to make you want to keep scrolling, keep looking, keep liking.” Another: “That’s the key. That’s the secret sauce. That’s how, that’s why we’re worth X billion dollars.” — location: 1876
Facebook engineers were automatically “paged,” a former news-feed team leader recounted, if likes or shares slid, so that they could tweak the system to boost them again. “If your job is to get that number up, at some point you run out of good, purely positive ways,” a former Facebook operations manager has said. “You start thinking about ‘Well, what are the dark patterns that I can use to get people to log back in?’” — location: 1907
YouTube was exploiting a cognitive loophole known as the illusory truth effect. We are, every hour of every day, bombarded with information. To cope, we take mental shortcuts to quickly decide what to accept or reject. One is familiarity; if a claim feels like something we’ve accepted as true before, it probably still is. It’s a gap in our mental defenses you could drive a truck through. In experiments, research subjects bombarded with the phrase “the body temperature of a chicken” will readily agree with variations like “the body temperature of a chicken is 144 degrees.” — location: 1940
Shortly after Twitter algorithmified, Microsoft launched an A.I.-run Twitter account called Tay. The bot operated, like the platforms, on machine learning, though with a narrower goal: to converse convincingly with humans by learning from each exchange. “can i just say that im stoked to meet u? humans are super cool,” Tay wrote to a user on day one. Within twenty-four hours, Tay’s tweets had taken a disturbing turn. “Hitler was right I hate the Jews,” it wrote to one user. To another: “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.” Microsoft pulled the plug. After 96,000 interactions, Tay had become a Trump-supporting, Gamergate-invoking neo-Nazi. — location: 1999
American bias
The restaurant’s owner, James Alefantis, spent the next few years bunkering against a torrent of increasingly detailed death threats. He pleaded with the social media platforms to intervene, and did find Yelp and Facebook to be “responsive.” But YouTube, Alefantis has said, refused to act, insisting it was a mere neutral platform with no responsibility for Pizzagate, and that if Alefantis wanted refuge from the videos that had urged Welch into storming his restaurant and might be radicalizing others still, he was welcome to come back with a court order. — location: 2124
Moral-emotional words convey feelings like disgust, shame, or gratitude. (“Refugees deserve compassion.” “That politician’s views are repulsive.”) More than just words, these are expressions of, and calls for, communal judgment, positive or negative. When you say, “Suzy’s behavior is appalling,” you’re really saying, “Suzy has crossed a moral line; the community should take notice and maybe even act.” That makes these words different from either narrowly emotional sentiments (“Overjoyed at today’s marriage equality ruling”) or purely moral ones (“The president is a liar”), for which Brady’s effect didn’t appear. Tweets with moral-emotional words, he found, traveled 20 percent farther—for each moral-emotional word. — location: 2142
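Taken at face value, a per-word 20 percent effect compounds quickly. A one-line illustration, assuming (my assumptions, not Brady’s) a multiplicative reading and a hypothetical baseline reach of 1,000:

```python
# Hypothetical baseline reach of 1,000; 20% farther per moral-emotional word.
for n in range(5):
    print(f"{n} moral-emotional words: reach {round(1000 * 1.2 ** n)}")
# 0: 1000, 1: 1200, 2: 1440, 3: 1728, 4: 2074
```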
When a liberal posted a tweet with moral-emotional words, its reach substantially increased among other liberals, but declined with conservatives. (And vice versa.) It won the user more overall attention and validation, in other words, at the cost of alienating people from the opposing side. Proof that Twitter encouraged polarization. — location: 2150
contact theory. Coined after World War II to explain why desegregated troops became less prone to racism, the theory suggested that social contact led distrustful groups to humanize one another. But subsequent research has shown that this process works only under narrow circumstances: managed exposure, equality of treatment, neutral territory, and a shared task. Simply mashing hostile tribes together, researchers repeatedly found, worsens animosity. — location: 2215
Control groups that read the article with no comments became more moderate and open-minded. It wasn’t that the comments themselves were persuasive; it was the mere context of having comments at all. News readers, the researchers discovered, process information differently when they are in a social environment: social instincts overwhelm reason, leading them to look for affirmation of their side’s righteousness. — location: 2233
Over many iterations, the Russians settled on a strategy. Appeal to people’s group identity. Tell them that identity was under attack. Whip up outrage against an out-group. And deploy as much moral-emotional language as possible. — location: 2307
A new truism of politics was emerging: social media elevated anti-establishment politicians conversant with exaggerated moral-emotional language. Mélenchon, though unpopular with voters, won millions of views on YouTube, where his most dedicated fans seemed to congregate. This had started as a positive: the internet offered political outsiders a way around the mainstream outlets that shunned them. As those candidates’ grassroots supporters spent disproportionate time on YouTube, the system learned to push users to those videos, creating more fans, driving up watch time further. But thanks to the preferences of the algorithms for extreme and divisive content, it was mostly fringe radicals who benefited, and not candidates across the spectrum. — location: 2362
The platforms, they concluded, were reshaping not just online behavior but underlying social impulses, and not just individually but collectively, potentially altering the nature of “civic engagement and activism, political polarization, propaganda and disinformation.” They called it the MAD model, for the three forces rewiring people’s minds. Motivation: the instincts and habits hijacked by the mechanics of social media platforms. Attention: users’ focus manipulated to distort their perceptions of social cues and mores. Design: platforms that had been constructed in ways that train and incentivize certain behaviors. — location: 2394
the number of seconds in your day never changes. The amount of social media content competing for those seconds, however, doubles every year or so, depending on how you measure it. Imagine, for instance, that your network produces 200 posts per day, of which you have time to read 100. Because of the platforms’ tilt, you will see the most moral-emotional half of your feed. Next year, when 200 doubles to 400, you see the most moral-emotional quarter. The year after that, the most moral-emotional eighth. Over time, your impression of your own community becomes radically more moralizing, aggrandizing, and outraged—and so do you. At the same time, less innately engaging forms of content—truth, appeals to the greater good, appeals to tolerance—become more and more outmatched. — location: 2408
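Running the highlight’s own numbers forward makes the squeeze concrete; the 200-post network and the fixed 100-post reading budget are the book’s illustration, and the loop simply extends it.

```python
# The feed doubles yearly; reading time is fixed, so you see only the most
# moral-emotional slice of an ever-larger pool.
posts, capacity = 200, 100
for year in range(5):
    print(f"year {year}: you see the top {capacity / posts:.1%} most moral-emotional posts")
    posts *= 2
# year 0: 50.0%, year 1: 25.0%, year 2: 12.5%, year 3: 6.2%, year 4: 3.1%
```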
The head of Myanmar’s first real media collective, a jittery reporter back from years in exile, said the country’s long-suppressed journalists, finally unfettered, faced a new antagonist. Social media platforms were doing what even the dictatorship’s trained propagandists couldn’t: producing fake news and nationalist fanfare so engaging, so flattering to readers’ biases, that people chose it voluntarily over real journalism. — location: 2472
That June, the company, much as it had in 2013 after brushing off warnings of impending violence that quickly proved accurate, scaled up in Myanmar anyway, launching “Free Basics,” which allowed locals to use Facebook’s smartphone app without paying data charges. Within months, 38 percent of people in the country said they got most or all of their news via Facebook. — location: 2519
By late 2017, as the Myanmar genocide raged on, Chamath Palihapitiya, Facebook’s former chief of global growth, speaking at what was expected to be a routine speech to Stanford MBA students, snapped. “I feel tremendous guilt,” he said. “I think we all knew in the back of our minds, even though we all feigned this whole line that there probably weren’t any unintended consequences. I think we knew that something bad could happen.” Palihapitiya had left Facebook years earlier. But he had helped set the company down the path it remains on today, persuading its chiefs to reengineer both the business and the platform around permanent, globe-spanning growth. The tools they had created to accomplish this were “ripping apart the social fabric,” Palihapitiya said. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” creating a world with “no civil discourse, no cooperation; misinformation, mistruth.” He urged the would-be engineers and startup founders in the room to take heed. “If you feed the beast, that beast will destroy you,” he said. “If you push back on it, we have a chance to control it and rein it in.” — location: 2564
“Facebook is important to us because if something is happening somewhere, that’s how we find out,” one said. “Facebook will tell us about it.” Lal, the cousin, agreed. He called Facebook “the embers beneath the ashes” of racial anger that, only days earlier, had brought the country to chaos. “People get provoked into action.” — location: 2599
“As the usage expands, it’s in every country, it’s in places in the world and languages and cultures we don’t understand,” Chris Cox, Facebook’s chief product officer, boasted in 2013. He cited one in particular: Myanmar, where he’d heard that Facebook already dominated locals’ access to news. There was, they told themselves, whether out of ideological fervor or financially motivated disinterest, no need to monitor or even consider the consequences, because they could only be positive. This was more than hubris. It drew on an idea, suffusing the Valley, that had originated with Peter Thiel, Facebook’s foundational investor: “zero to one.” It was a mandate, commercial and ideological, for companies to invent something so new that there was no market for it—starting at zero—and then control that market absolutely, a field with one entrant. “The history of progress is a history of better monopoly businesses replacing incumbents,” Thiel wrote. Intel and processors. Apple and personal computers. Uber and private taxis. Facebook and social networking. — location: 2633
Members of a local human rights group, huddled in a small office in the capital city, Colombo, marked down every post, tracing a network of hate. They planned to pass it all along to Facebook. The researchers were doing Facebook’s work for them, they knew, and for free. Volunteer janitors for one of the world’s wealthiest platforms. But the company ignored them. “We have given, for the past four years, data-driven examples of hate. We’ve given them pages of data,” Sanjana Hattotuwa, then a researcher with that rights group, Center for Policy Alternatives, told us. “It’s pointless to coordinate with Facebook,” he huffed, pacing angrily. Hattotuwa, a familiar face at international technology conferences, had managed to make some connections at the company. But no matter how extreme the incitements to violence, no matter how stridently he warned that the platform was going to get somebody killed, the response was the same: “They say it doesn’t contravene anything. They say please get back to us with more information.” — location: 2654
Gunawardana marked post after post using Facebook’s reporting widget. A high-ranking official reduced to begging, via Facebook’s submission box, for some anonymous moderator to take notice of his country’s spiral toward violence. Every single report was ignored. “There needs to be some kind of engagement with countries like Sri Lanka,” Gunawardana said. “We’re a society, we’re not just a market.” — location: 2684
At their best, he said, social media platforms “made things more transparent, gave voice to people who did not have voices.” But the past months, he said, had destroyed his faith in the technology he’d once credited with bringing his country democracy. “This idea of social media as an open, equal platform is a complete lie,” he now believed. “There is no editor, there is the algorithm.” He stressed that Sri Lanka’s divisions predated social media. But these platforms, he warned, brought out the very worst in a society, amplifying its extremes in ways that had never before been possible. “We don’t completely blame Facebook,” Dissanayake said. “The germs are ours, but Facebook is the wind, you know?” His government was considering regulations or fines, he said. But he knew Sri Lanka’s power was modest. Only Americans, he believed, had enough leverage to force change. “You, the United States itself, should fight the algorithm. What compels Facebook, beyond that?” — location: 2746
It wasn’t that he had faith that social media was accurate, he said. “But you have to spend time and money to go to the market to get a newspaper. I can just open my phone and get the news instead.” He looked up from the floor, shrugging. “Whether it’s wrong or right, it’s what I read.” — location: 2769
The defining element across all these rumors was something more specific and dangerous than generalized outrage: a phenomenon called status threat. When members of a dominant social group feel at risk of losing their position, it can spark a ferocious reaction. They grow nostalgic for a past, real or imagined, when they felt secure in their dominance (“Make America Great Again”). They become hyper-attuned for any change that might seem tied to their position: shifting demographics, evolving social norms, widening minority rights. And they grow obsessed with playing up minorities as dangerous, manifesting stories and rumors to confirm the belief. It’s a kind of collective defense mechanism to preserve dominance. It is mostly unconscious, almost animalistic, and therefore easily manipulated, whether by opportunistic leaders or profit-seeking algorithms. — location: 2803
Wherever per-person Facebook use rose by one standard deviation above the national average, attacks on refugees increased by about 35 percent. Nationwide, they estimated, this effect drove as much as 10 percent of all anti-refugee violence. — location: 2849
group house and set a fire seemingly intended to kill all — location: 2873
defining traits and tics of superposters, mapped out in a series of psychological studies, are broadly negative. One is dogmatism: “relatively unchangeable, unjustified certainty.” Dogmatics tend to be narrow-minded, pushy, and loud. Another: grandiose narcissism, defined by feelings of innate superiority and entitlement. Narcissists are consumed by cravings for admiration and belonging, which makes social media’s instant feedback and large audiences all but irresistible. That need is deepened by superposters’ unusually low self-esteem, which is exacerbated by the platforms themselves. — location: 2914
our sense of right or wrong is heavily, if unconsciously, influenced by what we believe our peers think: morality by tribal consensus, guided not by some better angel or higher power but by self-preserving deference to the tyranny of cousins. — location: 2934
Asked how many steps it would take, on average, for a YouTube viewer who pulled up a Chemnitz news clip to find themselves watching far-right propaganda, Serrato answered, “Only two.” He added, “By the second, you’re quite knee-deep in the alt right.” — location: 3090
Recommendations rarely led users back to mainstream news coverage, or to liberal or apolitical content of any kind. Once among extremists, the algorithm tended to stay there, as if that had been the destination all along. — location: 3092
Distrust of media?
many belonged to no distinct cause or group. YouTube, rather than activating a preexisting community with a preexisting identity, had created one out of nothing. It had built the network on its systems, pulled it together with a shared reality and beliefs, then willed it into the world, all in a matter of days. — location: 3115
The researchers expected to see results resembling a cloud: thousands of topic-spanning channels arranged only loosely. Instead, the network displayed as a neat series of clusters, arranged one next to the other, like a subway map. They were amazed. For YouTube’s systems to analyze and sort billions of hours of video in real time, then direct billions of users through the network with this level of precision and consistency, was an incredible technological feat, demonstrating the algorithm’s sophistication and power. — location: 3160
Michael Kimmel calls “aggrieved entitlement.” For generations, white men expected and received preferential treatment and special status. As society inched toward equality, those perks, while still substantial, declined. Some white men acclimated. Some rebelled. Others knew only that they felt something being taken away. Peterson et al. give them a way to explain those feelings of injustice—feminists and leftists are destroying the masculine spirit—and an easy set of answers. Clean your room. Sit up straight. Reassert traditional hierarchies. — location: 3239
The scholar J. M. Berger calls it “the crisis-solution construct.” When people feel destabilized, they often reach for a strong group identity to regain a sense of control. It can be as broad as nationality or narrow as a church group. Identities that promise to recontextualize individual hardships into a wider conflict hold special appeal. You’re not unhappy because of your struggle to contend with personal circumstances; you’re unhappy because of Them and their persecution of Us. It makes those hardships feel comprehensible and, because you’re no longer facing them alone, a lot less scary. Crisis-solution: there is a crisis, the out-group is responsible, your in-group offers the solution. If that sense of conflict escalates too far, it can reach the point of radicalization, in which you see the out-group as an immutable threat over which only total victory is acceptable. “The scale of the crisis becomes more extreme, and the prescribed solution becomes more violent,” Berger wrote, until destroying the out-group becomes the core of the in-group’s shared identity. — location: 3255
By 2021, fifty killings had been claimed by self-described incels, a wave of terrorist violence. — location: 3275
When a far-right paramilitary group called the Oath Keepers surveyed its 25,000 members on how they’d come to the movement, their most common answer was Facebook, followed by YouTube. — location: 3279
Algorithms like it because it engages people’s attention and passion, turning web browsing into a matter of identity, community, even fanaticism—and therefore more watch time. — location: 3286
YouTube upgraded its algorithms over 2016 and 2017, adding a system it called Reinforce, which recommended users into unfamiliar subgenres. — location: 3288
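YouTube has said little publicly about Reinforce beyond its reinforcement-learning basis, so any code here is guesswork. A toy epsilon-greedy sketch of the general idea of steering users into unfamiliar subgenres, with every name, rate, and parameter assumed:

```python
import random

# Not YouTube's system: a toy explore/exploit recommender in the spirit
# described above. Subgenres, estimates, and epsilon are all assumptions.
random.seed(7)
est_watch_time = {"news": 5.0, "gaming": 4.0, "fringe politics": 0.0}
EPSILON = 0.2  # fraction of recommendations spent on unfamiliar subgenres

def next_recommendation(history):
    unfamiliar = [g for g in est_watch_time if g not in history]
    if unfamiliar and random.random() < EPSILON:
        return random.choice(unfamiliar)                # explore something new
    return max(est_watch_time, key=est_watch_time.get)  # exploit the best known

def update(genre, minutes, lr=0.1):
    # Nudge the estimate toward the watch time the recommendation produced.
    est_watch_time[genre] += lr * (minutes - est_watch_time[genre])

print(next_recommendation(history={"news", "gaming"}))
```

If an explored subgenre turns out to hold attention unusually well, the update step makes it the new thing to exploit, which is the drift the next highlight describes.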
But by 2018, after the Reinforce system had been implemented, “rabbit hole” increasingly referred to following political YouTube channels toward extremism. — location: 3307
He and Rauchfleisch warned, “Being a conservative on YouTube means that you’re only one or two clicks away from extreme far-right channels, conspiracy theories, and radicalizing content.” — location: 3345
That spring, after a school shooting, YouTube’s high-profile “trending” page began promoting an Alex Jones video claiming that the violence had been faked. Jones had pushed versions of this since the 2012 Sandy Hook shooting, when he called the murdered twenty children and six teachers “crisis actors” in a vague government plot to justify confiscating guns or imposing martial law. The conspiracy had spread on YouTube ever since, consumed by growing numbers of viewers who, enraged, organized years-long harassment campaigns against the families of the murdered children. Some parents went into hiding, and several filed three separate lawsuits against Jones for defamation. (In 2021, Jones lost all three.) All the while, YouTube continued promoting the videos, which by 2018 had a combined 50 million views. When YouTube’s system posted Jones’s latest on its trending page, then, it was an unusually visible indicator of how eagerly the site’s algorithms boosted him. Members of YouTube’s policy team recommended tweaking the trending-page algorithm to prevent it from linking to Jones or other discredited sources. They were overruled. Although Jones was most prominent on YouTube, where he had billions of views, he reached millions of followers on Facebook and Twitter as well, and these companies also came under pressure to revoke the digital bullhorns they’d made for him. — location: 3351
“Our families are in danger,” they wrote, “as a direct result of the hundreds of thousands of people who see and believe the lies and hate speech, which you have decided should be protected.” — location: 3364
Of the thirty attendees interviewed, twenty-nine said they had been exposed to and convinced of Flat Eartherism on YouTube. The thirtieth had been recruited by his daughter, who’d found it on YouTube. — location: 3400
YouTube, by showing users many videos in a row all echoing the same thing, hammers especially hard at two of our cognitive weak points: repeated exposure to a claim and the impression that the claim is widely accepted each make it feel truer than we would otherwise judge it to be. — location: 3410
Extremism researchers had long speculated that many or all of Q’s posts—four thousand in all, unspooled over three years—were actually the work of Ron Watkins, a thirty-year-old programmer who had recently taken over running 8chan, the 4chan spinoff forum. Watkins even seemed to hint as much in a 2021 documentary, telling his interviewer, “It was basically three years of intelligence training, teaching normies how to do intelligence work,” though he added, “But never as Q.” — location: 3440
Renée DiResta’s warnings, ever since her first anti-vaccine discovery years earlier, that the feature was a vehicle for radicalization. “I can’t emphasize enough what a disaster Groups are,” she tweeted in 2018, as evidence mounted. “The Groups recommendation engine is a conspiracy correlation matrix. It pushes people prone to extremist & polarizing content into closed and then secret groups. FB has no idea what it’s built here.” — location: 3480
It was 4chan-style boundary pushing without even 4chan’s norms or rules. 8channers went to new extremes as a collective defense against the anomie that, as the rejects of the rejects, the misfits of the misfits, they centered in their common identity. “Their theory about what they were doing on there, what they were getting out of it, was that they were learning not to be triggered by people pushing their emotional or ideological buttons,” Dominic Fox, a software engineer, wrote of the chans. “The real world was a harsh and uncaring place, and anyone who pretended to care, or to need caring for, was by definition engaged in deception, a kind of swindle.” Therefore, according to Fox, in their thinking, “the only way to be free of such control was to gaze at racist memes, car crash photos, horrifying pornography, and so on until one could do so with complete serenity.” It was, he wrote, a culture of “deliberate self-desensitization.” — location: 3495
Terrorism is violence intended to menace a wider community for the sake of political ends or simple malice. But it is also typically a performance for, and an act of solidarity with, an in-group. — location: 3542
swirls of hate and incitement that his team had been forced to leave online by rules that they had flagged again and again as insufficient and faulty. “At the end of the day,” he said, “you are forced to follow the rules of the company if you want to keep your job.” But the decisions weighed on him. “You feel like you killed someone by not acting.” — location: 3625
The goal was to reduce context-heavy questions that even a team of specialized lawyers would struggle to parse—what constitutes a threat, when is an idea hateful, when is a rumor dangerous—to a black-and-white matter so straightforward that any given moderator could decide it with no independent thought. — location: 3637
The company, it seemed, had not taken on the role of global arbiter so much as drifted into it, crisis by crisis, rule by rule. Its leaders were reluctant overlords, wary of backlash, averse to owning decisions, executing their role largely from the shadows. — location: 3664
A set of Pakistan guidelines warned moderators against creating a “PR fire” by taking any action that could “have a negative impact on Facebook’s reputation or even put the company at legal risk.” — location: 3671
“The understanding used to be that you brought the venture capitalist in for—the term was always ‘adult supervision,’” Leslie Berlin, a Stanford University historian, explained. The investors installed senior managers, the corporate board, even a seasoned CEO to oversee the founder at their own company. When John Doerr invested $12.5 million in Google, founded and run by two grad students, he brought in Eric Schmidt, a veteran executive twenty years their senior, as their boss. — location: 3709
The politics of the PayPal founders leaned severely libertarian: they were socially Darwinian, distrustful of government, certain that business knew best. Thiel took this to such extremes that in 2009, he announced, “I no longer believe that freedom and democracy are compatible.” Society could no longer be trusted to “the unthinking demos that guides so-called social democracy,” he wrote, using the Greek term for citizens. Only “companies like Facebook” could safeguard liberty. And only if they were unshackled from “politics,” which seemed to mean regulation, public accountability, and possibly the law. — location: 3747
This sense of divine mission drove the angel investors of Generation PayPal, who selected the startups and founders to remake the world around their vision. They called it disrupting incumbents. Uber and Lyft would not just offer a new way to hail taxis, they would abolish and replace the old one. Airbnb would disrupt short-term housing. All three were PayPal alumni investees. Many others pursued the same violent displacement: Amazon and physical retail, Napster and music. Only a few, like Thiel, seriously suggested doing to global governance what Uber had done to ridesharing. But once the social media platforms stumbled into that role, it must have felt like just a continuation of their rightful place, of the belief that society is a set of engineering problems waiting to be solved. — location: 3755
Zuckerberg had given away a telling detail: in their internal research, they’d found that people engage more with extreme content “even when they tell us afterwards they don’t like the content.” — location: 3789
People who deleted Facebook became happier, more satisfied with their life, and less anxious. The emotional change was equivalent to 25 to 40 percent of the effect of going to therapy—a stunning drop for a four-week break. Four in five said afterward that deactivating had been good for them. Facebook quitters also spent 15 percent less time consuming the news. They became, as a result, less knowledgeable about current events—the only negative effect. But much of the knowledge they had lost seemed to be from polarizing content: information packaged to indulge tribal antagonisms. Overall, the economists wrote, deactivation “significantly reduced polarization of views on policy issues and a measure of exposure to polarizing news.” Their level of polarization dropped by almost half the amount by which the average American’s polarization had risen between 1996 and 2018—the very period during which the democracy-endangering polarization crisis had occurred. — location: 3811
An internal poll of 29,000 Facebook employees taken that October found that the share of employees who said they were proud to work at Facebook had declined from 87 to 70 percent in just a year. The share who felt their company made the world a better place had dropped from 72 to 53 percent, and on whether they felt optimistic about Facebook’s future, from the mid-80s to just over 50 percent. — location: 3825
Erica Chenoweth, a scholar of civil resistance at Harvard. The frequency of mass-protest movements had been growing worldwide since the 1950s, she found, and had accelerated lately. Between the 2000s and the 2010s, average episodes per year had jumped nearly 50 percent. Their success rate had been growing, too, year after year, for decades. Around 2000, 70 percent of protest movements demanding systemic change succeeded. But then, suddenly, that trend reversed. They began failing—just as they were getting more frequent. Now, Chenoweth found, only 30 percent of mass movements succeeded. “Something has really shifted,” she told me, calling the drop “staggering.” Virtually every month, another country would erupt in nationwide protests: Lebanon over corruption, India over gender inequality, Spain over Catalan separatism. Many at a scale exceeding the most transformative movements of the twentieth century. And most of them fizzling.

To explain this, Chenoweth drew on an observation by Zeynep Tufekci, the University of North Carolina scholar: social media makes it easier for activists to organize protests and to quickly draw once-unthinkable numbers—but this may actually be a liability. For one, social media, though initially greeted as a force for liberation, “really advantages repression in the digital age much more than mobilization,” Chenoweth said. Dictators had learned how to turn it to their advantage, using their superior resources to flood platforms with disinformation and propaganda. The effect in democracies was subtler but still powerful.

Chenoweth cited, as a comparison, the Student Nonviolent Coordinating Committee, a civil rights–era student group. Before social media, activists had to mobilize through community outreach and organization-building. They met almost daily to drill, strategize, and confer. It was agonizing, years-long work. But it made the movement durable, built on real-world ties and chains of command. It allowed movements like SNCC to persevere when things got hard, respond strategically to events, and translate street victories into political change.

Social media allows protests to skip many of those steps, putting more bodies on the streets more quickly. “That can give people a sense of false confidence,” Chenoweth said, “because it’s lower commitment.” Without the underlying infrastructure, social media movements are less able to organize coherent demands, coordinate, or act strategically. And by channeling popular energy away from the harder kind of organizing, social media preempts the emergence of traditional movements. — location: 3863
“I said, ‘Get an office there, now,’” Gates recalled, referring to Washington, where Facebook and Google began spending millions on lobbying. “And Mark did, and he owes me.” — location: 3944
“There’s a real tension here between wanting to have nuances to account for every situation, and wanting to have a set of policies we can enforce accurately and we can explain cleanly.” — location: 4031
“time well spent,” a phrase borrowed from Tristan Harris, the former Google engineer who’d warned about addictive conditioning and quit in 2015. — location: 4073
Nir Eyal, the consultant who’d pioneered slot machines as the model for social media platforms, pivoted from screen-time-maximization guru to screen-time-reduction guru, publishing a book with the title Indistractable. — location: 4078
Lionço is Brazilian. In the fall of 2018, the fringe lawmaker and YouTuber who’d launched the disinformation campaign against Lionço six years earlier, a man named Jair Bolsonaro, ran to become her country’s president. — location: 4120
Brazilians like Martins and Dominguez were making a claim far beyond anything that researchers like Jonas Kaiser or Guillaume Chaslot had observed: that YouTube had not merely created some online fringe community or altered certain users’ views, but that it had radicalized their country’s entire conservative movement, and so effectively as to supplant right-wing politics almost entirely. — location: 4176
YouTube tilted dramatically pro-Bolsonaro and hard-right during a period when Bolsonaro’s poll numbers remained static and low. The platform was not reflecting real-world trends. It was creating its own. — location: 4188
He seemed to have little agenda beyond stoking outrage and winning attention on social media, whose cues and incentives, after all, he had followed to high office. — location: 4240
Since 2015, thousands of pregnant women in the Americas, infected by a new virus called Zika, have given birth to children with severe neurological impairment and misshapen skulls, a condition known as microcephaly. — location: 4294
Conspiracies offered a level of certainty that science could not. — location: 4341
the platform exploited normal user interest in medical matters, as it had with politics, to pull them down rabbit holes they would have otherwise never pursued. And that, much as YouTube had learned to sequence politics videos to turn casual viewers into digitally addicted radicals, it had come to array Zika and vaccine videos in precisely the right order to persuade loving mothers to deliberately endanger their children. “There’s always going to be borderline content on platforms. That’s to be expected,” Kaiser said, straining to sympathize with the tech companies. “The shocking thing,” he added, “is that YouTube’s algorithms basically are helping people to go in these directions.” — location: 4359
In parts of Brazil where illiteracy is high, this is thought to be a primary means by which many families consume news. WhatsApp groups are their Google, their Facebook, and their CNN, all rolled into one. — location: 4373
WhatsApp users, they found, uploaded one video for every fourteen text messages, an astonishingly high rate. WhatsApp users also linked to YouTube more than any other site—ten times as frequently as they linked to Facebook—bolstering the theory of a YouTube-to-WhatsApp pipeline. They found similar trends in India and Indonesia, suggesting the effect might be universal. — location: 4381
YouTube had cultivated an enormous audience of viewers who had never sought the content out, but rather were pulled into it by the platform’s recommendations. This was not just another rabbit hole. The pathway appeared to mimic, step by step, a process that psychologists had repeatedly observed in research on how people develop attractions to child pornography. — location: 4490
on YouTube, the second-most-popular website in the world, the system seemed to have identified people with this urge, walked them along a path that indulged it at just the right pace to keep them moving, and then pointed them in a very specific direction. “This is something that takes them on that journey,” Rogers said of YouTube’s sequencing. He called the recommendation engine a potential “gateway drug towards more hard-core child pornography.” — location: 4500
For the other 23 hours and 59 minutes of the day, the fearful or the lonely could turn to that other window to the outside world: their computer. Several years’ worth of digital adoption happened overnight. Facebook reported a 70 percent increase in usage in some countries. Twitter’s grew 23 percent. An internet services firm estimated that YouTube’s share of worldwide internet traffic jumped from 9 to 16 percent. Overall internet usage rose 40 percent, the firm also said, suggesting that in absolute terms YouTube’s traffic had more than doubled: a 16 percent share of a total that had itself grown 40 percent works out to roughly two and a half times the old 9 percent share. — location: 4606
By the pandemic’s outset, the QAnon cause, amid its now almost impenetrably dense lore and esoterica, had sharpened around a core belief: President Trump and loyal generals were on the verge of a glorious military coup that would overturn the cabal that had orchestrated Pizzagate and that secretly dominated American life. In the subsequent purge, the military would execute tens of thousands of traitorous Democrats, Jewish financiers, federal bureaucrats, and cultural liberals on the National Mall. Q adherents, most of whom gathered on Facebook or YouTube and never ventured to the hardcore forums where Q’s “drops” initially appeared, were told they played a crucial role, if only by following along for clues and helping to spread the word. That summer, ninety-seven professed QAnon believers would run in congressional primaries; twenty-seven of them would win. Two ran as independents. The other twenty-five were in-good-standing Republican nominees for the House of Representatives. QAnon memes and references especially dominated militia pages, escalating their sense that violence would be both righteous and inevitable. — location: 4730
Twitter temporarily switched off the algorithm that pushed especially viral tweets into users’ news feeds even if they did not follow the tweet’s author. The company called the effort to “slow down” virality a “worthwhile sacrifice to encourage more thoughtful and explicit amplification.” This was, as best I could tell, the first and only time that a major platform had voluntarily shut down its own algorithm. It was an implicit admission of exactly the sort that the companies had avoided for so long: that their products could be dangerous, that societies would be safer with aspects of those products switched off, and that it was easily within their power to do so. — location: 4865
Boogaloo members who were stockpiling weapons and explosives for a plot to kidnap and potentially murder Michigan’s governor. They had organized in part on a private Facebook group. The platforms’ broader natures remained unchanged. In the weeks before the election, Facebook filled with calls for violence targeting Trump’s enemies. Digital researchers identified at least 60,000 posts invoking acts of political violence: “The Terrorist Democrats Are The Enemy And All Must Be Killed.” “The next time we see SChiff, it should be him hanging from a noose '#deathforschiff'.” (Adam Schiff is a Democratic congressman who led the first impeachment effort against Trump.) Another 10,000 called for armed insurrection if Biden won. A staggering 2.7 million posts on political groups urged violence in more general terms such as “kill them” or “shoot them.” — location: 4876
On election day, two Q candidates won seats in Congress: Lauren Boebert of Colorado and Marjorie Taylor Greene of Georgia. Greene had also echoed Alex Jones’s claims that school shootings were staged and, in a reminder that political violence was central to the cause, had once liked a Facebook post calling for Barack Obama to be hanged and another urging that House Speaker Nancy Pelosi should get “a bullet to the head.” — location: 4888
Facebook researchers made a startling discovery: 10 percent of all U.S.-based views of political content, or 2 percent of overall views, were of posts claiming the election had been stolen. — location: 4926
All demanded sweeping policy changes, ending with the same admonition: that the companies “begin a fundamental reexamination of maximizing user engagement as the basis for algorithmic sorting and recommendation.” — location: 5075
By early 2022, one study found, more than one in nine state lawmakers nationwide belonged to at least one far-right Facebook group. Many translated into law the conspiracies and ideologies that had first grown online, passing waves of legislation that curbed voting rights, Covid policies, and LGBT protections. — location: 5137
The Texas GOP, which controlled the state senate, state house, and the governor’s mansion, changed its official slogan to “We are the storm,” the QAnon rallying cry. — location: 5141
in 2022, QAnon-aligned candidates were on the ballot in 26 states. — location: 5144
There is one group with the leverage, access, and technical know-how to effectively pressure Silicon Valley: its own workforce. — location: 5206
The founders and CEOs of these companies, for all their fabulous wealth, have been, whether they realized it or not, prisoners of their creations from the day that a venture capitalist first cut them a check in return for the promise of permanent, exponential growth. — location: 5249
There are, as in any matter this contentious, a handful of dissenting experts, who argue that social media’s impact is overblown. They do not dispute the evidence for the technology’s role in harms like radicalization, but rather use different methods that produce milder results. Still, their view remains a minority, and one of relative emphasis, akin to arguing that car exhaust’s role in climate change is less than that of coal plants. — location: 5261