Social Media’s Legal Dilemma: Curated Harmful Content

Walking the Line Between Immunity and Liability: How Social Media Platforms May Be Liable for Harmful Content Specifically Curated for Users

As the proliferation of harmful content online has become easier and more widespread through social media, review websites, and other online public forums, businesses and politicians have pushed to reform and limit the sweeping protections afforded by Section 230 of the Communications Decency Act, often described as the law that created the Internet. Congress enacted Section 230 of the Communications Decency Act of 1996 “for two basic policy reasons: to promote the free exchange of information and ideas over the Internet and to encourage voluntary monitoring for offensive or obscene material.” Congress intended for the Internet to flourish, and the goal of Section 230 was to promote the unhindered development of internet businesses, services, and platforms.

To that end, Section 230 immunizes online service providers and interactive computer services from liability for posting, re-publishing, or allowing public access to offensive, damaging, or defamatory information or statements created by a third party. Specifically, Section 230(c)(1) provides,

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

[47 U.S.C. § 230(c)(1)]

Section 230 has been widely interpreted to protect online platforms from being held liable for user-generated content, thereby promoting the free exchange of information and ideas over the Internet. See, e.g., Hassell v. Bird, 5 Cal. 5th 522 (2018) (Yelp not liable for defamatory reviews posted on its platform and cannot be forced to remove them); Doe II v. MySpace Inc., 175 Cal. App. 4th 561, 567–575 (2009) (§ 230 immunity applies to tort claims against a social networking website, brought by minors who claimed that they had been assaulted by adults they met on that website); Delfino v. Agilent Technologies, Inc., 145 Cal. App. 4th 790, 804–808 (2006) (§ 230 immunity applies to tort claims against an employer that operated an internal computer network used by an employee to allegedly communicate threats against the plaintiff); Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 826–836 (2002) (§ 230 immunity applies to tort and statutory claims against an auction website, brought by plaintiffs who allegedly purchased forgeries from third-party sellers on the website).

Thus, under § 230, lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone, or alter content—are barred. Under the statutory scheme, an “interactive computer service” qualifies for immunity so long as it does not also function as an “information content provider” for the portion of the statement or publication at issue. Even users or platforms that “re-post” or “publish” allegedly defamatory or damaging content created by a third party are exempt from liability. See Barrett v. Rosenthal, 40 Cal. 4th 33, 62 (2006). Additionally, merely compiling false and/or misleading content created by others, or otherwise providing a structured forum for the dissemination and use of that information, is not enough to impose liability. See, e.g., Gentry v. eBay, Inc., 99 Cal. App. 4th 816 (the critical issue is whether eBay acted as an information content provider with respect to the information claimed to be false or misleading); Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1122–1124 (9th Cir. 2003) (Matchmaker.com not liable for a fake dating profile of a celebrity who began receiving sexual and threatening emails and voicemails).

Recently, however, the Third Circuit Court of Appeals found that Section 230 did not immunize the popular social media platform TikTok from a suit arising from a ten-year-old’s death after she attempted the “Blackout Challenge” based on videos she watched on her TikTok “For You Page.” See Anderson v. TikTok, Inc., 116 F.4th 180 (3d Cir. 2024). TikTok is a social media platform where users can create, post, and view videos. Users can search for specific content or watch videos recommended by TikTok’s algorithm on their “For You Page” (FYP). This algorithm customizes video suggestions based on a range of factors, including a user’s age, demographics, interactions, and other metadata—not solely on direct user inputs. Some videos on TikTok’s FYP are “challenges” that encourage users to replicate the actions shown. One such video, the “Blackout Challenge,” urged users to choke themselves until passing out. TikTok’s algorithm recommended this video to a ten-year-old girl who attempted it and tragically died from asphyxiation.

The deciding question was whether TikTok’s algorithm, and the inclusion of the “Blackout Challenge” video on a user’s FYP, crosses the threshold between an immune publisher and a liable creator. Plaintiff argued that TikTok’s algorithm “amalgamat[es] [] third-party videos,” which results in “an expressive product” that “communicates to users . . . that the curated stream of videos will be interesting to them.” The Third Circuit agreed, finding that a platform’s algorithm reflecting “editorial judgments” about “compiling the third-party speech it wants in the way it wants” is the platform’s own “expressive product,” and that TikTok’s algorithm, which recommended the Blackout Challenge on the decedent’s FYP, was therefore TikTok’s own “expressive activity.” As such, Section 230 did not bar claims against TikTok arising from its FYP-algorithm recommendations, because Section 230 immunizes only information “provided by another,” and here the claims concerned TikTok’s own expressive activity.

The Court was careful to note that it reached its conclusion specifically because TikTok’s promotion of the Blackout Challenge video on the decedent’s FYP was not contingent on any specific user input; that is, the decedent did not search for and view the Blackout Challenge video through TikTok’s search function. TikTok has taken issue with the Court’s ruling, contending that if websites lose § 230 protection whenever they exercise “editorial judgment” over the third-party content on their services, the exception would swallow the rule. Perhaps websites seeking to avoid liability will refuse to sort, filter, categorize, curate, or take down any content, which may result in unfiltered and randomly placed objectionable material on the Internet. On the other hand, some websites may err on the side of removing any potentially harmful third-party speech, which would chill the proliferation of free expression on the web.

The aftermath of the ruling remains to be seen, but for now social media platforms and interactive websites should take note and re-evaluate the purpose, scope, and mechanics of their user-engagement algorithms.

COPYRIGHT © 2024, STARK & STARK
By Gene Markin of Stark & Stark
For more on Social Media, visit the NLR Communications Media Internet section.
