Day v. TIKTOK, Inc.

Court: District Court, N.D. Illinois
Decided: February 28, 2022
Docket: 3:21-cv-50129
Status: Unknown

This text of Day v. TikTok, Inc. is published on Counsel Stack Legal Research, covering District Court, N.D. Illinois primary law.

Bluebook
Day v. TikTok, Inc., No. 3:21-cv-50129 (N.D. Ill. Feb. 28, 2022).

Opinion

IN THE UNITED STATES DISTRICT COURT
FOR THE NORTHERN DISTRICT OF ILLINOIS
WESTERN DIVISION

Ashley Day, Plaintiff, v. TikTok, Inc., Defendant.
Case No. 21 C 50129
Judge Philip G. Reinhard

ORDER

For the reasons stated below, plaintiff's complaint is dismissed without prejudice. Plaintiff is granted until April 15, 2022 to file an amended complaint. If no amended complaint is timely filed, the dismissal will become with prejudice and this case will be terminated.

STATEMENT-OPINION

Plaintiff, Ashley Day, pro se, a citizen of Illinois, brings this action against defendant, TikTok, Inc., a California corporation with its principal place of business in California, seeking $1,000,000 in damages. Jurisdiction is proper under 28 U.S.C. § 1332(a). Defendant moves to dismiss plaintiff's amended complaint [5] with prejudice for failure to state a claim upon which relief can be granted. Fed. R. Civ. P. 12(b)(6).

The complaint alleges that on March 13, 2021, plaintiff discovered four videos which had been uploaded to the TikTok profile of a nine-year-old boy. The videos were of plaintiff's two-year-old daughter. One video had been uploaded on September 30, 2020, two on October 29, 2020, and one on November 5, 2020. Two videos depicted acts of violence against plaintiff's daughter, one had a song with sexually explicit lyrics playing, and one showed plaintiff's daughter with a swollen lip and redness to the cheek and had a song with sexually explicit lyrics playing. Upon discovering the videos, plaintiff immediately called the police, subsequently made a report, and notified child protective services. The videos were removed from the boy's TikTok profile the same day.
Plaintiff alleges defendant did not put any warning on any of the videos claiming they might contain sensitive material; did not remove any of the videos from its platform; did not report the videos to any child abuse hotline; did not sanction, prevent, or discourage the videos in any way from being viewed, shared, downloaded or disbursed in any other way; and “failed to act on their own policies and procedures along with State and Federal Statutes and Regulations.” Plaintiff does not identify which laws or regulations defendant failed to follow. Based on these allegations, the complaint alleges defendant “knowingly, intentionally, and maliciously committed negligence, failure to act, publicly depicting child abuse, and publicly depicting sexual exploitation of a 2-year-old child.”

To survive a Rule 12(b)(6) motion to dismiss for failure to state a claim, a complaint must contain a “short and plain statement of the claim showing that the pleader is entitled to relief.” Fed. R. Civ. P. 8(a)(2). If the complaint (1) describes the claim in sufficient detail to give the defendant fair notice of what the claim is and the grounds upon which it rests and (2) plausibly suggests that the plaintiff has a right to relief above a speculative level, this requirement is met. Bell Atlantic Corp. v. Twombly, 550 U.S. 544, 555 (2007); see also Ashcroft v. Iqbal, 556 U.S. 662 (2009).

Defendant argues that any claim based on the allegations in the complaint is precluded by 47 U.S.C. § 230(c) (“Act”).1 The Act provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). “The term ‘information content provider’ means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” 47 U.S.C. § 230(f)(3).
An “interactive computer service” is “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.” 47 U.S.C. § 230(f)(2). “What § 230(c)(1) says is that an online information system must not ‘be treated as the publisher or speaker of any information provided by’ someone else.” Chicago Lawyers’ Committee for Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666, 671 (7th Cir. 2008).

In Chicago Lawyers’, plaintiff sought to hold Craigslist liable for postings made by others on its platform that violated the anti-discrimination in advertising provision of the Fair Housing Act (42 U.S.C. § 3604(c)). The court held 47 U.S.C. § 230(c)(1) precluded Craigslist from being liable for the offending postings because “[i]t is not the author of the ads and could not be treated as the ‘speaker’ of the posters’ words, given § 230(c)(1).” Id. The court rejected plaintiff’s argument that Craigslist could be liable as one who caused the offending post to be made, stating “[a]n interactive computer service ‘causes’ postings only in the sense of providing a place where people can post.” Id. “Nothing in the service craigslist offers induces anyone to post any particular listing or express a preference for discrimination.” Id. “If craigslist ‘causes’ the discriminatory notices, then, so do phone companies and courier services (and, for that matter, the firms that make the computers and software that owners use to post their notices online), yet no one could think that Microsoft and Dell are liable for ‘causing’ discriminatory advertisements.” Id. at 672.
The court concluded the opinion by stating that plaintiff could use the postings on Craigslist to identify targets to investigate and “assemble a list of names to send to the Attorney General for prosecution. But given § 230(c)(1) it cannot sue the messenger just because the message reveals a third party’s plan to engage in unlawful discrimination.” Id.

Plaintiff’s complaint does not allege defendant created or posted the videos. It only alleges defendant allowed and did not timely remove the videos posted by someone else. This is clearly a complaint about “information provided by another information content provider” for which defendant cannot be held liable by the terms of Section 230(c)(1). Id.

1 Section 230(c) provides an affirmative defense, Doe v. GTE Corp., 347 F.3d 655, 657 (7th Cir. 2003), so the proper procedure was to raise the defense and then move for judgment on the pleadings. Walczak v. Chicago Board of Education, 739 F.3d 1013, 1016 n.2 (7th Cir. 2014).

