Justice Oliver Wendell Holmes, Jr., famously observed that "hard cases make bad law." As Market Institute President Charles Sauer writes in a recent article at RealClear Markets, few cases are harder than one in which a grieving parent seeks to hold someone accountable for the death of their child.

Sauer highlights the lawsuit brought by Norma Nazario against TikTok’s parent company ByteDance and Instagram’s parent company Meta. Nazario’s 15-year-old son, Zachery, tragically died while “subway surfing.” The lawsuit claims that Zachery attempted the stunt after becoming “addicted” to watching subway surfing videos on social media platforms.

“Mrs. Nazario’s lawsuit is the latest challenge to Section 230 of the 1996 Communications Decency Act. Section 230 exempts tech companies from liability for materials posted on their sites if the company does not exercise editorial control over the users’ posts. Section 230 is responsible for the rise of social media, which is why the law is a target of those seeking to impose new regulations on platforms like Instagram and TikTok.”

At the heart of the case is whether recommendation algorithms amount to "active" promotion of harmful content. The plaintiff's attorneys argue that because the platforms' algorithms surfaced the videos, Section 230's protections should not apply. But Sauer points out that this logic would undermine the entire law.

“This view would create an exemption to Section 230 that would swallow the entire law, making tech companies vulnerable to legal action if someone committed a crime and blamed it on videos suggested to them by the company.”

Sauer draws on legal experts to reinforce the point. Santa Clara University Law Professor Eric Goldman explained:

“So long as the content is third-party content, it doesn’t matter whether the service ‘passively’ displayed it or ‘actively’ highlighted it–either choice is an editorial decision fully protected by Section 230.”

And as Reason magazine’s Elizabeth Nolan Brown noted:

“The fact that a particular dangerous or reckless thing might be showcased on social media platforms doesn’t mean social media platforms caused or should be held liable for their [the viewer’s] death. We don’t blame bookstores, or movie theaters, or streaming platforms if someone dies doing something they read about in a book or witnessed in a movie or TV show.”

Sauer concludes that the way to protect children is not by weakening Section 230, but by encouraging parents to use the many tools available to manage their children’s online experience:

“The way to prevent future tragedies like that of Zachery Nazario is for more parents to use one of the many tools available to protect their children from the dangers of social media. These provide an effective way to keep children safe while they learn how to responsibly use social media.”

📌 As Sauer makes clear in RCM, dismantling Section 230 in response to tragedy risks making bad law that undermines both free speech and innovation online.