OPINION

Juries, Not Politicians, Will Soon Decide the Fate of Child-Harming Social Media Platforms


Over the last four years, anybody with the tiniest amount of morality has been repeatedly horrified by the relentless drumbeat of jaw-dropping scandals revealing how social media platforms treat our children.

These scandals have revealed that men so unimaginably rich that their great-great-great-grandchildren will not be able to spend their fortunes have, apparently, with full knowledge and to different extents, allowed their platforms to facilitate the following: pedophilia and child sexual assault-for-hire (including child bestiality content); child suicides; addiction engineered by using advanced neuroscience on children; unprecedented rates of major depression, especially in girls; grotesque amounts of anorexia, again especially in girls; and relentless, permanently traumatizing cyberbullying.

We have a word for this: evil.

There’s hope, however. 

Not from Washington lawmakers who, across two presidencies, have slathered themselves in shame with their amateur-hour inability to pass even one law broadly protecting America’s dying and damaged children from the social media horrors revealed in their own congressional hearings.

Not from state lawmakers, either, though many have compassionately enacted good laws. First, brave state lawmakers are shackled by Congress’s failure to clarify that Section 230, the immunity provision of the federal Communications Decency Act, cannot be perverted by platforms’ high-priced lawyers into a blanket immunity for hurting children.

Second, state lawmakers keep getting lured by Big Tech lobbyists and legislative compromise culture into passing the kinds of laws that are either most likely to be tied up in court or enforceable only by state and local government lawyers, whom multi-billionaires with wealth greater than entire state budgets do not fear.

Where is the hope, then? In courts.

The first source of hope is the court of public opinion. If, four years ago, somebody talking about social media platforms had mentioned “the algorithm,” how many of us would have instantly known that the phrase was shorthand for the Artificial Intelligence (AI)-written, content-delivery algorithms that determine who sees what?

Now, we all know. We all know that platform AI writes and rewrites individually tailored algorithms in real time to deliver content that keeps us online so we see as many money-making ads as possible.

And, we know that what keeps us online the longest is content that makes us angry or anxious.

The second source of hope is real court. In an underreported development, in the next few months, juries will begin deciding whether social media platforms will have to pay for at least some of the child harm they have publicly acknowledged is occurring.

This is very new. Just three years ago, it was common for platforms to get lawsuits thrown out of court right at the beginning of the case. They would, without proof, assert in briefs that their platforms worked in this way or that, and courts, even though they weren’t supposed to, would dismiss cases based on these untested claims as barred by Section 230 or the First Amendment.

No more. Judges have gotten educated, too, and are everywhere converging on a set of Section 230 rulings permitting some personal injury lawsuits to get to juries. Plus, these rulings mean lawyers for children are right now obtaining discovery that will likely reveal even more horrors.

When it comes to the First Amendment, it gets better for us, worse for the child-harming multi-billionaires.

The U.S. Supreme Court has strongly signaled that the First Amendment will not shield platforms from laws or lawsuits based upon harmful content delivery if AI makes those who-sees-what decisions, and it does. It’s all AI.

Soon, it won’t be D.C. politicians lobbied by ex-colleagues, their egos stoked and campaign coffers filled by the world’s wealthiest men, who will decide whether to hold platforms responsible for harming America’s children.

Soon, it will be juries made up of people like us who will decide. Regular people whose moral compasses are uncompromised by wanting the world’s largest yacht or reelection. Their verdicts won’t be the end of the story of how an entire generation of children agonizingly suffered at the hands of a tiny few whose limitless greed and vanity eclipse the sun. But, maybe, to paraphrase Churchill, their verdicts, or the mega-settlements that the prospect of accountability to regular people will prompt, will be the turning point: the end of the beginning.

Robert C. Fellmeth is the retired founder and Executive Director of the Children’s Advocacy Institute at the University of San Diego School of Law.