Opinion

Big Tech algorithms that fueled Paul Pelosi attack must be stopped

The attack on Paul Pelosi last week by suspect David DePape is symptomatic of a relatively new phenomenon.

Humanity has had no shortage of political violence in our long history — from Brutus to John Wilkes Booth to Lee Harvey Oswald. And there are many instances of the mentally ill acting violently toward political leaders, such as when a Jodie Foster-obsessed John Hinckley Jr. shot Ronald Reagan in 1981; or when paranoid schizophrenic Jared Lee Loughner shot Rep. Gabrielle Giffords and killed six others in 2011; or when unhinged James Hodgkinson shot Rep. Steve Scalise — and five others — at a Congressional softball game in 2017. 

But recently, as with the attack on Paul Pelosi, we’re seeing a different variant of political violence. This new strain still contains ideological zealotry and mental illness, but we are also witnessing how social media can act as an accelerant to inflame and incite these acts of violence. We saw this social media effect last May when Payton S. Gendron, an 18-year-old self-described “white replacement” conspiracy theorist, allegedly killed 10 people at a supermarket in Buffalo. Gendron had written in his “manifesto” that he’d been radicalized on the notorious social media platform 4chan while “bored” during COVID.

The attack on Paul Pelosi is what can happen when mental illness, political extremism and social media converge. AFP via Getty Images

From what we know of Paul Pelosi’s alleged attacker DePape, he was a homeless, delusional 42-year-old who posted hundreds of blog entries that included rants about an “invisible fairy” and screeds against black and Jewish people, as well as other social media-fueled fanatical ideas. 

In the past, we’ve had mass media that had the capacity to spread hateful ideas or incite the unstable — think “Mein Kampf.” But now, the power of modern technology not only casts a wider digital net, it employs insidious algorithms that, like heat-seeking missiles, specifically target the vulnerable and the susceptible, many of whom live in insulated social media echo chambers and are sent increasingly provocative content. 

DePape (above) had an extensive history of strange behavior, such as shooting the nude wedding of his former partner, Gypsy Taub. AP
Racist Buffalo supermarket shooter Payton S. Gendron said his hatred of blacks and Jews was fueled by web communities such as 4chan. AP

Toxic content and political vitriol are what feed the social media beast, which then feeds them back to users in turbocharged fashion in a continuous extremification loop.

We already know that social media makes emotionally or psychologically unwell people more unwell. There is ample research showing that social media can make people more depressed. Indeed, thanks to the “Facebook Whistleblower” Frances Haugen and the internal documents she obtained, we know that Instagram had its own research indicating that the platform increased suicidality in teen girls and inflamed their eating disorders. And in September, a British coroner concluded, for the first time ever, that social media had contributed to a teenager’s suicide. 

This year, for the first time ever, a coroner ruled that social media contributed to a person’s suicide. British teen Molly Russell killed herself in 2017. Molly Rose Foundation

After examining her digital profile, it was determined that Molly Russell, 14, was a depressed and vulnerable teen targeted by predatory algorithms that specifically fed her suicide-inflaming content because, as Big Tech knows all too well, toxic content drives increased engagement. Sick people can’t help rubbernecking content that makes them sicker. Yet the Big Tech oligarchs, in a classic example of profit over people, refuse to adjust their harmful algorithms for fear of losing market share. 

Some today argue that we need to censor certain psychologically harmful speech or hate speech or even speech labeled with the catch-all of “misinformation.” But that becomes a very slippery slope. Who would be the arbiter of what is considered harmful speech or even “misinformation”? Recent history shows that the Big Tech gatekeepers are not the best stewards of the hallowed ground of free speech. 

Facebook “whistleblower” Frances Haugen revealed that Instagram knew its platform was increasing suicidality among young girls. AP

Yet there is a way to defang the most harmful aspects of social media — not necessarily the content itself, but the predatory algorithms that seek out the vulnerable. That’s exactly what the Kids Online Safety Act, proposed in February by Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), seeks to do: allow the user to disable the algorithms that continuously recommend content — harmful or otherwise — to the user. Though originally geared toward kids and parents, this bold new initiative, I believe, can be applied to adults as well. As the bill’s sponsors put it, “Big Tech has brazenly failed . . . and betrayed its trust, putting profits above safety . . . Algorithms driven by eyeballs and dollars will no longer hold sway.”

Haugen’s data also showed that Instagram was well-aware the platform caused spikes in eating disorders among young female users. AP

In this Brave New World, we can keep our beloved tech, but opt out of Big Brother’s predatory, manipulative and potentially harmful algorithms.

The Kids Online Safety Act may be the first step of a multipronged initiative to help us tame the most harmful aspects of Big Tech and social media. Other initiatives can include creating a version of the FCC to regulate and oversee Big Tech, in addition to antitrust legislation to effectively break up what are essentially monopolies. Finally, a repeal of Section 230 is long overdue. Section 230 currently protects Big Tech from legal liability because these companies are mistakenly considered to be neutral platforms rather than content publishers.

We may never be able to entirely stop dangerous ideologues intent on committing acts of violence — or cure all mental illness for that matter — but we can tamp down the social media algorithms that are fueling both those fires. 

Dr. Nicholas Kardaras is the founder and chief clinical officer of treatment centers in Austin, Texas, and in Hawaii. A former clinical professor at Stony Brook Medicine, he is the author of the new book “Digital Madness” (St. Martin’s), out now.