NetChoice v. Anthony G. Brown, in his official Capacity as the Maryland Attorney General, William D. Gruhn, in his official Capacity as the Chief of the Division of Consumer Protection of the Maryland Office of the Attorney General
This text of NetChoice v. Brown is published on Counsel Stack Legal Research, covering District Court, D. Maryland primary law.
Opinion
IN THE UNITED STATES DISTRICT COURT FOR THE DISTRICT OF MARYLAND
NETCHOICE,
Plaintiff,

v.          Civil Action No. RDB-25-0322

ANTHONY G. BROWN, in his official Capacity as the Maryland Attorney General, WILLIAM D. GRUHN, in his official Capacity as the Chief of the Division of Consumer Protection of the Maryland Office of the Attorney General,
Defendants.

MEMORANDUM OPINION

Seeking to protect the online privacy of children, the Maryland General Assembly in 2024 enacted the Age-Appropriate Design Code Act, MD. CODE ANN., COM. LAW §§ 14-4801–14-4813, familiarly referred to as the “Kids Code.”1 According to its preamble, the Kids Code governs “the protection of online privacy of children” in Maryland by regulating data collection and sharing practices. (ECF No. 35 Ex. 1 at 2, 3.) Plaintiff NetChoice, LLC (“NetChoice”)2 is a self-described “nonprofit trade association for Internet companies,” including Amazon, Google, Meta, Nextdoor, Pinterest, Netflix, Reddit, and X, among others. (ECF No. 27 ¶¶ 22, 25.) Consistent with its filing of approximately 22 lawsuits in 14 different states, NetChoice initiated this action in this Court on February 3, 2025, challenging the Kids Code on the basis that it violates First Amendment freedom of speech protections and is preempted by existing federal law. (ECF No. 1.)

1 The Court adopts the naming convention used in Defendants’ briefing. See generally (ECF No. 35-1).
2 According to its website, NetChoice is a plaintiff in at least 22 civil suits challenging legislation in 14 states, including Arkansas, California, Colorado, Florida, Georgia, Louisiana, Maryland, Minnesota, Mississippi, Ohio, Tennessee, Texas, Utah, and Virginia. See Litigation Center, NETCHOICE, https://netchoice.org/litigation/ (last visited Nov. 24, 2025). In 2024, the Supreme Court issued a decision on appeals of preliminary decisions in two of these cases. See Moody v. NetChoice, LLC, 603 U.S. 707 (2024).

After filing an initial, six-Count Complaint
against Maryland Attorney General Anthony G. Brown and Chief of the Division of Consumer Protection of the Maryland Office of the Attorney General William D. Gruhn (collectively, “Defendants” or “State Defendants”), NetChoice filed the operative eight-Count Amended Complaint on April 28, 2025, seeking declaratory and injunctive relief against the enforcement of the Kids Code. (ECF No. 27.) In short, NetChoice alleges that the Kids Code unconstitutionally restricts protected speech of its members and their users in a manner
that is vague, overbroad, and inconsistent with existing federal law. See (id.). State Defendants filed a Motion to Dismiss the Original Complaint (ECF No. 20) and, subsequently, a Motion to Dismiss the Amended Complaint (ECF No. 35). At this early stage of litigation, this Court addresses solely whether NetChoice has stated a plausible cause of action like the actions brought in thirteen other states. This case raises the constitutional balance between the State’s significant authority to enact legislation to
protect children, and the public’s equally consequential right under the First Amendment to the United States Constitution to be free of laws that burden their speech. NetChoice asserts that Maryland’s attempt to protect children’s privacy in the Kids Code overreaches by requiring data-collection processes that will infringe the speech and expression of online service providers and the members of the public who use them. State Defendants respond that the Kids Code falls squarely within their authority to limit children’s access to harmful
content. While the First Amendment “leaves undisturbed States’ traditional power to prevent minors from accessing” some legitimately harmful speech, states cannot overly burden access to speech in their efforts to protect children. Free Speech Coal., Inc. v. Paxton, 606 U.S. 461, 478 (2025) (citing Ginsberg v. State of N.Y., 390 U.S. 629, 641 (1968)); see Reno v. Am. Civ. Liberties
Union, 521 U.S. 844, 875 (1997). For the reasons set forth in detail below, NetChoice has set forth plausible claims, but this Court makes no determinations about the merits of those claims. Indeed, any conclusions as to the merits of NetChoice’s claims await a period of discovery and further analysis. Pending such litigation, the Kids Code remains in effect. Presently pending before this Court are Defendants’ Motion to Dismiss as to the original Complaint (ECF No. 20) (“Original Motion to Dismiss”) and Defendants’ Motion to
Dismiss Amended Complaint (ECF No. 35) (“Defendants’ Motion” or “Motion to Dismiss”). NetChoice filed its Amended Complaint (ECF No. 27) after Defendants filed their Original Motion to Dismiss, and the parties did not fully brief that Motion. NetChoice has responded in Opposition (ECF No. 40) to Defendants’ Motion to Dismiss Amended Complaint, Defendants have replied (ECF No. 45), and the parties have briefed several Notices of Supplemental Authority, see (ECF Nos. 46, 47, 49, 54, 55, 56). On November 13, 2025, the
Court heard oral argument from the parties regarding the pending Motions. As explained on the record and agreed by all parties, Defendants’ Original Motion to Dismiss (ECF No. 20) is DENIED AS MOOT.3 Additionally, for the reasons set forth below, Defendants’ Motion to Dismiss (ECF No. 35) is DENIED.
3 “Ordinarily, an amended complaint supersedes those that came before it.” Goodman v. Diggs, 986 F.3d 493, 498 (4th Cir. 2021). Thus, this Court has held that the filing of an Amended Complaint renders moot pending motions to dismiss the original complaint where the Amended Complaint addresses the issues raised in the prior motion to dismiss. See Howard v. Ocwen Loan Servicing, Inc., RDB-18-3296, 2019 WL 4750333, at *2 (D. Md. Sept. 30, 2019); Verderamo v. Mayor & City Council of Balt., 4 F. Supp. 3d 722, 724 n.3 (D. Md. 2014). On the record, all parties agreed that Defendants’ Original Motion to Dismiss (ECF No. 20) is MOOT because NetChoice subsequently filed its Amended Complaint (ECF No. 27).

BACKGROUND

In ruling on a motion to dismiss, this Court “accept[s] as true all well-pleaded facts in a complaint and construe[s] them in the light most favorable to the plaintiff.” Wikimedia Found.
v. Nat’l Sec. Agency, 857 F.3d 193, 208 (4th Cir. 2017) (citing SD3, LLC v. Black & Decker (U.S.) Inc., 801 F.3d 412, 422 (4th Cir. 2015)). Except where otherwise indicated, the following facts are derived from Plaintiff’s Amended Complaint (ECF No. 27) and accepted as true for the purpose of Defendants’ Motion to Dismiss (ECF No. 35).

A. NetChoice and online data

“Plaintiff NetChoice is a District of Columbia nonprofit trade association for Internet
companies” whose “mission is to promote online commerce and speech and to increase consumer access and options via the Internet, while minimizing burdens that could prevent businesses from making the Internet more accessible and useful.” (ECF No. 27 ¶ 22.) NetChoice alleges that its members—including social media platforms; search services; and digital services that disseminate speech, such as video- and audio streaming services, news services, online libraries or journals, hosting services, and blogs—host “immense amounts of
First Amendment activity.” (Id. ¶¶ 36, 37, 38, 39.) It alleges that social media platforms allow users to engage in protected speech in daily posts, facilitate expressive activities of third parties, and directly engage in expression by displaying, compiling, and curating content created by others. (Id. ¶ 37.) Similarly, it alleges that search services engage in protected speech by selecting and choosing how to display Internet search results to allow users to find protected
expression and information. (Id. ¶ 38.) Finally, it alleges that streaming services, blog sites, and hosting- and news services allow users to create websites and access information online. (Id. ¶ 39.) According to NetChoice, these online services “use information to engage in
editorial functions to publish, disseminate, and display protected speech to users,” while websites “collect and use data to facilitate, curate, and publish” such speech. (Id. ¶ 40.) NetChoice alleges data collection is a necessary step in disseminating protected speech to users. (Id. ¶ 41.) Specifically, collecting data about IP addresses, device type, operating system, screen resolution, browser type, language preferences, and time zone allows websites to provide functional services and content. (Id. ¶¶ 42–43.) Members collect other information,
such as activity and account changes, to detect malicious or suspicious activity and protect users’ security. (Id. ¶ 44.) Relatedly, account-based services must collect personal data from users to offer them an account. (Id. ¶¶ 45–46.) NetChoice members, particularly social media and streaming websites, also collect information to editorialize and personalize content for users. (Id. ¶¶ 47–48.) NetChoice alleges that such content is protected speech produced via algorithms or information-reliant means. (Id. ¶¶ 48–52.) NetChoice alleges that its members
utilize different approaches to online content, with some members restricting publication of potentially harmful content and others broadly allowing such publication. (Id. ¶ 63.)

B. Maryland’s Age-Appropriate Design Code Act (“Kids Code”), MD. CODE ANN., COM. LAW §§ 14-4801–14-4813

Maryland’s General Assembly enacted the Kids Code, MD. CODE ANN., COM. LAW §§ 14-4801–14-4813, on April 6, 2024, and it went into effect on October 1, 2024. (Id. ¶ 64; ECF No. 35 at 1 ¶ 1.) The Kids Code applies to “covered entities” that are “[r]easonably likely to be accessed by children.” MD. CODE ANN., COM. LAW § 14-4801(h), (s). (ECF No. 27 ¶ 67.) It defines a “covered entity” to include any legal entity that:

(i) Is organized or operated for the profit or financial benefit of its shareholders or other owners;
(ii) Collects consumers’ personal data or uses another entity to collect consumers’ personal data on its behalf;
(iii) Alone, or jointly with its affiliates or subsidiaries, determines the purposes and means of the processing of consumers’ personal data;
(iv) Does business in the State; and
(v) 1. Has annual gross revenues in excess of $25,000,000, adjusted every odd-numbered year to reflect adjustments in the Consumer Price Index; 2. Annually buys, receives, sells, or shares the personal data of 50,000 or more consumers, households, or devices, alone or in combination with its affiliates or subsidiaries, for the covered entity’s commercial purposes; or 3. Derives at least 50% of its annual revenues from the sale of consumers’ personal data.

(Id. ¶ 68 (quoting MD. CODE ANN., COM. LAW § 14-4801(h)(1)).) Certain financial institutions and health entities are excluded from covered entity status. (Id. ¶ 72.) Personal data includes “information that is linked or reasonably able to be linked to an identified or identifiable individual,” but excludes “[d]e-identified data[] . . . or [p]ublicly available information.” (Id. ¶ 70 (quoting MD. CODE ANN., COM. LAW § 14-4801(n)).)

Publicly available information excludes biometric data collected without a consumer’s knowledge but includes information lawfully made available in government records or that a “covered entity has a reasonable basis to believe is made available to the general public by the consumer or widely distributed media.” (Id. ¶ 71 (quoting MD. CODE ANN., COM. LAW § 14-4801(r)).) Collecting “personal data means to ‘buy, rent, gather, obtain, receive, or access personal data relating to a consumer,’ including ‘[r]eceiving data from the consumer’” and observing their behavior. (Id. ¶ 69 (quoting MD. CODE ANN., COM. LAW § 14-4801(f)).) The Kids Code includes a provision explaining the General Assembly’s intent that:

(1) Children should be afforded protections not only by online products specifically directed at them, but by all online products they are reasonably likely to access;
(2) Covered entities that develop and provide online products that children are reasonably likely to access shall ensure the best interests of children when designing, developing, and providing those online products;
(3) All covered entities that operate in the State and process children’s data in any capacity shall do so in a manner consistent with the best interests of children;
(4) If a conflict arises between commercial interests and the best interests of children, covered entities that develop online products likely to be accessed by children shall prioritize the privacy, safety, and well-being of children;
(5) Nothing in this subtitle may be construed to require a covered entity to monitor or censor third-party content or otherwise impact the existing rights and freedoms of any person; and
(6) Nothing in this subtitle may be construed to discriminate against children on the basis of race, color, religion, national origin, disability, gender identity, sex, or sexual orientation.

MD. CODE ANN., COM. LAW § 14-4803; (ECF No. 27 ¶ 80).
NetChoice alleges that the Kids Code thereby imposes an obligation for websites to act “in the best interests of children.” (ECF No. 27 ¶ 81.) It challenges this obligation, the general definition of covered entities, and the collection restrictions and reporting requirements under the Kids Code.

i. Best interests of children standard and associated restrictions, MD. CODE ANN., COM. LAW §§ 14-4801, 14-4806

NetChoice alleges that the Kids Code’s restrictions are based on a “best interests of children” standard that requires consideration of online content. (Id. ¶¶ 77–79.) “‘Best interests of children’ means a covered entity’s use of the personal data of children or the design of an online product [is carried out] in a way that does not:”

(1) Benefit the covered entity to the detriment of children; and
(2) Result in: i. Reasonably foreseeable and material physical or financial harm to children; ii. Severe and reasonably foreseeable psychological or emotional harm to children; iii. A highly offensive intrusion on children’s reasonable expectation of privacy; or iv. Discrimination against children based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation.

(Id. ¶ 82 (citing MD. CODE ANN., COM. LAW § 14-4801(c)).) NetChoice alleges that the Kids Code restricts the use of information to disseminate protected speech and establishes presumptions against covered entities’ ability to receive, use, handle, and delete user data unless they can satisfy the “best interests of children” standard. (Id. ¶ 83.) This includes restrictions on processing or profiling minors’ personal data in a manner inconsistent with the best interests of children. (Id. ¶¶ 84–87 (citing MD. CODE ANN., COM. LAW §§ 14-4806(a)(1), (3); 14-4801(p), (q)).) NetChoice alleges that such restrictions affect members’ ability to personalize content because they limit access to information needed to deliver such content. (Id. ¶¶ 87–89.)
It alleges that covered entities can only overcome such restrictions if they meet the subjective standard that their services will not benefit them “to the detriment of children” or “result in” downstream harms. (Id. ¶ 90.) Relatedly, NetChoice alleges that the Kids Code prohibits use of “dark patterns,”4 as defined by the Federal Trade Commission, to (1) cause a child to provide personal data beyond that reasonably expected for the product; (2) circumvent privacy protections; or (3) “[t]ake any action that the covered entity knows, or has reason to know, is not in the best interest of children who access or are reasonably likely to access the online product.” (Id. ¶ 92 (quoting MD. CODE ANN., COM. LAW § 14-4806(a)(7)).)

4 According to NetChoice, a “dark pattern” is “a user interface that is designed or manipulated with the purpose of subverting or impairing user autonomy, decision making or choice,” and may include features such as “autoplay or newsfeed functions that recommend personalized content.” (ECF No. 27 ¶ 93 (quoting MD. CODE ANN., COM. LAW § 14-4801(i)).)
ii. Reasonably likely to be accessed by minors, MD. CODE ANN., COM. LAW § 14-4801(s)

As explained above, the Kids Code generally applies to products that are “reasonably likely to be accessed by children.” See, e.g., MD. CODE ANN., COM. LAW §§ 14-4806, 14-4804. An online product satisfies this standard where it is “reasonable to expect that the online product would be accessed by children, based on satisfying any of the following criteria:”

(1) The online product is directed to children as defined in the federal Children’s Online Privacy Protection Act;
(2) The online product is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children;
(3) The online product is substantially similar or the same as an online product that satisfies item (2) of this subsection;
(4) The online product features advertisements marketed to children;
(5) The covered entity’s internal research findings determine that a significant amount of the online product’s audience is composed of children; or
(6) The covered entity knows or should have known that a user is a child.

(ECF No. 27 ¶ 74 (citing MD. CODE ANN., COM. LAW § 14-4801(s)).) A “consumer” is any individual resident of Maryland. (Id. ¶ 75 (citing MD. CODE ANN., COM. LAW § 14-4801(g)).)

iii. Data Protection Impact Assessments, MD. CODE ANN., COM. LAW §§ 14-4804(a)–(b), 14-4805

The Kids Code requires covered entities to submit by April 1, 2026, a Data Protection Impact Assessment (“DPIA requirement” or “DPI Assessment”) “for every online product, service, or feature they offer ‘that is reasonably likely to be accessed by children.’” (Id. ¶ 94 (quoting MD. CODE ANN., COM. LAW § 14-4804(a)(1)); ¶ 66.)
The DPI Assessment is a “systematic survey to assess compliance with the duty to act in the best interests of children,” and requires covered entities to identify the purpose for which they use minors’ data, determine whether such use is designed in a manner consistent with the best interests of children, and describe steps that they have taken or will take to comply with the best interests
of children. (Id. ¶¶ 95–96 (quoting MD. CODE ANN., COM. LAW §§ 14-4801(j); 14-4804(b)).) DPI Assessments require covered entities to assess four categories of harm: (1) reasonably foreseeable and material physical or financial harm; (2) reasonably foreseeable and extreme psychological or emotional harm; (3) highly offensive intrusion on children’s reasonable expectation of privacy; and (4) discrimination against children based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation. (Id. ¶¶ 97–98
(citing MD. CODE ANN., COM. LAW § 14-4804(b)(3)).) Covered entities must determine whether each online product “is designed in a manner consistent with the best interest of children reasonably likely to access the online product through consideration of . . . any other factor that may indicate that the online product” is not so designed. (Id. ¶ 99 (quoting MD. CODE ANN., COM. LAW § 14-4804(b)(3)(viii)).) Finally, the Kids Code imposes obligations to: (1) “[m]aintain documentation of assessments for as long as the product is likely to be accessed
by children;” (2) review each assessment within 90 days of any material change “to processing pertaining to the online product;” and (3) make assessments available to Defendants. (ECF No. 27 ¶¶ 102–106); MD. CODE ANN., COM. LAW §§ 14-4805(1)–(2), 14-4807(a)–(c).

iv. Enforcement of Kids Code, MD. CODE ANN., COM. LAW §§ 14-4808, 14-4809

NetChoice alleges that the Kids Code provides that violations of its requirements are “subject to” the enforcement provisions of Maryland’s Consumer Protection Act, which is codified in Title 13 of Maryland’s Commercial Law Article. (ECF No. 27 ¶ 108 (citing MD. CODE ANN., COM. LAW § 14-4808(a)).) Under Maryland’s Consumer Protection Act, MD. CODE ANN., COM. LAW §§ 13-301 et seq., the Attorney General may investigate purported violations and seek declaratory, injunctive, and monetary relief, and entities may be subject to
administrative proceedings before the Division of Consumer Protection. (Id. ¶ 109 (citing MD. CODE ANN., COM. LAW §§ 13-405–13-409)); ¶ 111.) The Kids Code specifically subjects covered entities to civil penalties not to exceed (1) $2,500 per affected child for each negligent violation and (2) $7,500 per affected child for each intentional violation. (Id. ¶ 110 (citing MD. CODE ANN., COM. LAW § 14-4808(b)).) If covered entities commit violations while remaining in substantial compliance with the Kids Code, however, they must receive notice and an
opportunity to cure prior to enforcement of penalties against them. (Id. ¶ 112 (citing MD. CODE ANN., COM. LAW § 14-4809).) NetChoice alleges that these provisions make violations prohibitively expensive for covered entities. (Id. ¶ 113.)

C. Procedural History

On February 3, 2025, NetChoice initiated this action by filing in this Court a six-Count Complaint (ECF No. 1) for declaratory and injunctive relief alleging various First Amendment
and preemption claims under 42 U.S.C. § 1983 against Defendants Anthony G. Brown, in his official capacity as Maryland Attorney General and William D. Gruhn, in his official capacity as Chief of the Division of Consumer Protection of the Maryland Office of the Attorney General (collectively, “Defendants”). On March 28, 2025, Defendants filed a Motion to Dismiss (ECF No. 20) the original Complaint. On April 28, 2025, NetChoice filed the operative, eight-Count Amended Complaint (ECF No. 27) against Defendants. In its
Amended Complaint, NetChoice alleges under 42 U.S.C. § 1983 four First Amendment challenges; two due process challenges based on First Amendment vagueness doctrine; and two preemption challenges to the Kids Code. See generally (ECF No. 27 at 24–52.) Specifically, NetChoice alleges 42 U.S.C. § 1983 claims that: (1) the Kids Code’s “best
interests of children” standard violates the First Amendment, as incorporated against the states by the Fourteenth Amendment, both facially and as-applied to NetChoice’s members and their current and prospective users (Count I); (2) the Kids Code’s “reasonably likely to be accessed by children” standard violates the First Amendment, as incorporated against the states by the Fourteenth Amendment, both facially and as-applied to NetChoice’s members and their current and prospective users (Count II); (3) the Kids Code’s “best interests of children”
standard is void for vagueness under the First and Fourteenth Amendments (Count III); (4) the Kids Code’s “reasonably likely to be accessed by children” standard is void for vagueness under the First and Fourteenth Amendments (Count IV); (5) the Kids Code’s Data Protection Impact Assessment requirement, MD. CODE ANN., COM. LAW § 14-4804, violates the First Amendment, as incorporated against the states by the Fourteenth Amendment, both facially and as-applied to NetChoice’s members and their current and prospective users (Count V);
(6) the Kids Code’s restrictions on data, dark patterns, and monitoring, MD. CODE ANN., COM. LAW § 14-4806, violate the First Amendment, as incorporated against the states by the Fourteenth Amendment, both facially and as-applied to NetChoice’s members and their current and prospective users (Count VI); (7) under Ex Parte Young,5 the Kids Code is preempted by the federal Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501 et seq.
(Count VII); and (8) under Ex Parte Young, Kids Code §§ 14-4804 and 14-4806 are preempted by Section 230 of the federal Communications Decency Act, 47 U.S.C. § 230 (Count VIII). (ECF No. 27 at 23–51).

5 Under Ex Parte Young, 209 U.S. 123 (1908), “if an individual claims federal law immunizes him from state regulation, the court may issue an injunction upon finding the state regulatory actions preempted.” Armstrong v. Exceptional Child Center, Inc., 575 U.S. 320, 327, 328 (2015) (citing Ex parte Young, 209 U.S. at 155–56).
As explained above, Defendants filed a Motion to Dismiss Amended Complaint (ECF No. 35) on June 20, 2025. In their Motion (ECF No. 35), Defendants argue that NetChoice’s as-applied First Amendment challenges in Counts I, II, V, and VI should be dismissed for lack of standing under Federal Rule of Civil Procedure 12(b)(1) and all claims should be dismissed for failure to state a claim under Federal Rule of Civil Procedure 12(b)(6). On November 13, 2025, the Court heard oral argument from the parties at a motions hearing. This matter is
now ripe for review.

STANDARDS OF REVIEW

I. Motion to Dismiss Under Rule 12(b)(1)

Courts commonly address motions to dismiss for lack of standing under Rule 12(b)(1) of the Federal Rules of Civil Procedure. See, e.g., CGM, LLC v. BellSouth Telecomms., Inc., 664 F.3d 46, 52 (4th Cir. 2011) (recognizing that Article III “standing . . . is generally associated
with Civil Procedure Rule 12(b)(1) pertaining to subject matter jurisdiction.”); accord White Tail Park, Inc. v. Stroube, 413 F.3d 451, 459 (4th Cir. 2005). A motion to dismiss under Rule 12(b)(1) of the Federal Rules of Civil Procedure for lack of subject matter jurisdiction challenges a court’s authority to hear the matter brought by a complaint. See Davis v. Thompson, 367 F. Supp. 2d 792, 799 (D. Md. 2005). This jurisdictional attack may proceed either as a facial challenge, asserting that the allegations in the complaint are insufficient to establish subject-matter
jurisdiction, or as a factual challenge, asserting “that the jurisdictional allegations of the complaint [are] not true.” Kerns v. United States, 585 F.3d 187, 192 (4th Cir. 2009) (quoting Adams v. Bain, 697 F.2d 1213, 1219 (4th Cir. 1982)). In a facial challenge like the one at issue in this case, a court will grant a motion to dismiss for lack of subject-matter jurisdiction “where
a claim fails to allege facts upon which the court may base jurisdiction.” Davis, 367 F. Supp. 2d at 799. In making this determination, “all the facts alleged in the complaint are assumed to be true and the plaintiff, in effect, is afforded the same procedural protection as he would receive under a Rule 12(b)(6) consideration.” Adams, 697 F.2d at 1219. Where the plaintiff has invoked federal jurisdiction, it bears the burden to establish subject matter jurisdiction. Lovern v. Edwards, 190 F.3d 648, 654 (4th Cir. 1999).
II. Motion to Dismiss Under Rule 12(b)(6)

A complaint must contain a “short and plain statement of the claim showing that the pleader is entitled to relief.” FED. R. CIV. P. 8(a)(2). Rule 12(b)(6) of the Federal Rules of Civil Procedure authorizes the dismissal of a complaint if it fails to state a claim upon which relief can be granted. “[T]he purpose of Rule 12(b)(6) is ‘to test the sufficiency of a complaint’ and not to ‘resolve contests surrounding the facts, the merits of a claim, or the applicability of
defenses.’” Presley v. City of Charlottesville, 464 F.3d 480, 483 (4th Cir. 2006) (quoting Edwards v. City of Goldsboro, 178 F.3d 231, 243 (4th Cir. 1999)). To survive a motion under Rule 12(b)(6), a complaint must contain facts sufficient to “state a claim to relief that is plausible on its face.” Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009) (quoting Bell Atl., Corp. v. Twombly, 550 U.S. 544, 570 (2007)). Under the plausibility standard, a complaint must contain “more than labels and conclusions” or a “formulaic recitation of the
elements of a cause of action.” Twombly, 550 U.S. at 555; see Painter’s Mill Grille, LLC v. Brown, 716 F.3d 342, 350 (4th Cir. 2013). A complaint need not include “detailed factual allegations.” Iqbal, 556 U.S. at 678 (quoting Twombly, 550 U.S. at 555). A complaint must, however, set forth “enough factual matter (taken as true) to suggest” a cognizable cause of action, “even
if . . . [the] actual proof of those facts is improbable and . . . recovery is very remote and unlikely.” Twombly, 550 U.S. at 556 (internal quotations omitted). “Threadbare recitals of the elements of a cause of action, supported by mere conclusory statements, do not suffice” to plead a claim. Iqbal, 556 U.S. at 678; see A Soc’y Without a Name v. Virginia, 655 F.3d 342, 346 (4th Cir. 2011).

ANALYSIS
In their Motion to Dismiss (ECF No. 35), Defendants argue that NetChoice’s Amended Complaint (ECF No. 27) should be dismissed on two bases. First, they seek dismissal under Federal Rule of Civil Procedure 12(b)(1) of the as-applied challenges in Counts I, II, V, and VI on the basis that NetChoice lacks associational and third-party standing. (ECF No. 35-1 at 10–16.) Second, they seek dismissal of all Counts under Rule 12(b)(6). (Id. at 17–48.) The Court addresses the jurisdictional issue of standing before evaluating sufficiency of
the pleadings as to each Count.

I. Rule 12(b)(1) Challenge to Standing

“As the Supreme Court has consistently emphasized, Article III of the Constitution limits the jurisdiction of federal courts to Cases and Controversies.” Hutton v. Nat’l Bd. of Exam’rs in Optometry, Inc., 892 F.3d 613, 619 n.5 (4th Cir. 2018) (internal quotation marks omitted) (quoting Lujan v. Defs. of Wildlife, 504 U.S. 555, 559 (1992)). “The requirement that a
[p]laintiff possess ‘standing to sue’ emanates from that constitutional provision.” Id. The burden to establish standing lies with the party invoking federal jurisdiction—here, NetChoice. Lujan, 504 U.S. at 561. In this case, NetChoice invokes organizational standing on behalf of its members and
third-party standing on behalf of their users. (ECF No. 27 ¶ 114.) An organization can assert standing under two distinct theories: (1) standing in its own right to seek judicial relief for injury to itself; or (2) standing as a representative of its members who have been harmed. See S. Walk at Broadlands Homeowner’s Ass’n, Inc. v. OpenBand at Broadlands, LLC, 713 F.3d 175, 182 (4th Cir. 2013); Students for Fair Admissions Inc. v. Pres. & Fellows of Harvard Coll., 600 U.S. 181, 199 (2023). NetChoice asserts the latter option, known as representational or associational
standing.6 Additionally, as Judge Hollander of this Court has explained, “[t]he Supreme Court has recognized that ‘there may be circumstances where it is necessary to grant a third party standing to assert the rights of another.’” Am. Fed. Of State, Cnty., & Mun. Emps. v. Soc. Sec. Admin., 778 F. Supp. 3d 685, 720–21 (D. Md. 2025) (quoting Kowalski v. Tesmer, 543 U.S. 125, 129–30 (2004)). Usually, third-party standing requires a plaintiff to “demonstrate ‘(1) an injury-in-fact; (2) a close relationship between [itself] and the person whose right [it] seeks to
assert; and (3) a hindrance to the third-party’s ability to protect his or her own interests.’” Wikimedia Found. v. Nat’l Sec. Agency/Central Sec. Serv., 14 F.4th 276, 288 (4th Cir. 2021) (quoting Freilich v. Upper Chesapeake Health, Inc., 313 F.3d 205, 215 (4th Cir. 2002)). These conventional requirements are relaxed, however, in First Amendment challenges and cases where a vendor
raises the rights of its customers. See Craig v. Boren, 429 U.S. 190, 195 (1976); Peterson v. Nat’l Telecomms. & Info. Admin., 478 F.3d 626, 633–34 (4th Cir. 2007).

6 As Judge Hollander of this Court recently noted, associational standing—sometimes called organizational standing—is generally considered to be a form of representational standing. Am. Fed. Of State, Cnty., & Mun. Emps. v. Soc. Sec. Admin., 778 F. Supp. 3d 685, 720 & n.21 (D. Md. 2025).

Importantly, “standing in no way depends on the merits of the plaintiff’s contention
that particular conduct is illegal.” Am. Fed. Of State, Cnty., & Mun. Emps., 778 F. Supp. 3d at 720 (quoting Warth v. Seldin, 422 U.S. 490, 500 (1975)). Rather, the Court “assumes the merits of a dispute will be resolved in favor of the party invoking . . . jurisdiction in assessing standing and, at the pleading stage, ‘presumes that general allegations embrace those specific facts that are necessary to support the claim.’” Equity in Athletics, Inc. v. Dep’t of Educ., 639 F.3d 91, 99 (4th Cir. 2011) (quoting Lujan v. Nat’l Wildlife Fed’n, 497 U.S. 871, 889 (1990)). As Judge Bredar
and Judge Xinis of this Court have put it, “the court determines whether the allegations in the Complaint, taken as true, are sufficient to establish standing under the plausibility standard of Rule 12(b)(6) and Iqbal/Twombly.” Evans v. Am. Collection Enter., 624 F. Supp. 3d 593, 598 (D. Md. 2022) (quoting Allah-Mensah v. Law Off. Of Patrick M. Connelly, P.C., Civ. No. PX-16-1053, 2016 WL 6803775, at *2 (D. Md. Nov. 17, 2016)). Keeping in mind that at this pleading stage NetChoice need only allege facts sufficient to establish that standing is plausible, the Court
addresses each theory of standing below.

A. NetChoice has sufficiently alleged associational standing

To invoke associational standing, an organization must demonstrate that “(a) its members would otherwise have standing to sue in their own right; (b) the interests it seeks to protect are germane to the organization’s purpose; and (c) neither the claim asserted nor the relief requested requires the participation of individual members in the lawsuit.” Hunt v. Wash.
State Apple Adver. Comm’n, 432 U.S. 333, 343 (1977). In this case, Defendants dispute both the first and third elements of associational standing.7 As to the first element, they argue that NetChoice has not sufficiently alleged that all its members have suffered an injury-in-fact. (ECF No. 35-1 at 10, 11.) As to the third element, they contend that NetChoice’s ability to
establish Article III standing hinges on participation of individual members. (Id. at 13–16.) In Opposition, NetChoice asserts that it only needs to establish that some of its members have standing, and, at this stage, it has sufficiently alleged that its claims raise industry-wide issues not dependent on the participation of any individual member. (ECF No. 40 at 54–55.)

i. NetChoice members have Article III standing

NetChoice has sufficiently alleged that at least some of its members have Article III
standing to sue in their own right. “To satisfy the first prong of th[e standing] analysis, ‘an organization suing as representative [must] include at least one member with standing.’” Nat’l Fed’n of the Blind, Inc. v. Wal-Mart Assocs., Inc., 566 F. Supp. 3d 383, 395 (D. Md. 2021) (quoting United Food & Com. Workers Union v. Brown Grp., Inc., 517 U.S. 544, 555 (1996)). An entity has Article III standing where it establishes that it has (1) suffered an injury-in-fact that is (2) traceable to the defendant’s actions and (3) is likely to be redressed by judicial intervention.
See Lujan, 504 U.S. at 560–61. Judge Gallagher of this Court recently explained that “[w]here a government action ‘require[s] or forbid[s] some action by the plaintiff . . . standing is usually easy to establish.’” Am. Fed. Of Teachers v. Dep’t of Educ., 779 F. Supp. 3d 584, 607 (D. Md. 2025) (quoting Food & Drug Admin. v. All. For Hippocratic Med., 602 U.S. 367, 382 (2024)). “If
a defendant’s action causes an injury, enjoining the action or awarding damages for the action will typically redress that injury.” Am. Fed. Of State Cnty. & Mun. Emps., 778 F. Supp. 3d at 724 (quoting All. For Hippocratic Med., 602 U.S. at 380).

7 NetChoice has sufficiently alleged that “the interests it seeks to protect are germane” to its purpose because it has alleged that its “mission is to promote online commerce and speech and to increase consumer access and options via the Internet, while minimizing burdens that could prevent businesses from making the Internet more accessible and useful.” (ECF No. 27 ¶¶ 22, 26.)

In this case, Defendants dispute
whether NetChoice has sufficiently alleged an injury based on violations of its members’ First Amendment rights. “A plaintiff can show an ‘injury in fact’ when [it] suffers ‘an invasion of a legally protected interest which is concrete and particularized, as well as actual or imminent.’” Piney Run Preservation Ass’n v. Cnty. Comm’rs of Carroll Cnty., MD, 268 F.3d 255, 263 (4th Cir. 2001) (quoting Friends of the Earth, Inc. v. Gaston Copper Recycling Corp., 204 F.3d 149, 154 (4th Cir.
2000)). Plaintiffs can establish standing to allege First Amendment claims in two ways:

‘First, they may show that they intend to engage in conduct at least arguably protected by the First Amendment but also proscribed by the policy they wish to challenge, and that there is a “credible threat” that the policy will be enforced against them when they do so.’ Second, they may make a sufficient showing of self-censorship, establishing a chilling effect on their free expression that is objectively reasonable.8

Menders v. Loudon Cnty. Sch. Bd., 65 F.4th 157, 165 (4th Cir. 2023) (internal citations omitted) (quoting Abbott v. Pastides, 900 F.3d 160, 176 (4th Cir. 2018)). As to the as-applied challenges in Counts I, II, V, and VI, NetChoice appears to focus its allegations on the first theory.9
8 As Judge Gallagher of this Court recently noted, “standing requirements are relaxed in First Amendment cases” such that demonstration of an injury-in-fact under a self-censorship theory requires only an “objectively reasonable and not subjective or speculative” chilling effect. Am. Fed. Of Teachers, 779 F. Supp. 3d at 608 (citing Cooksey v. Futrell, 721 F.3d 226, 236 (4th Cir. 2013)).

9 NetChoice also references concerns that the Kids Code will unconstitutionally chill protected speech in Counts III and IV, which allege due process challenges based on vagueness doctrine, and generally refers to restrictions to “a range of commonplace expressive activities that depend on” data collection practices restricted under the Kids Code. See (ECF No. 27 ¶¶ 89, 142, 264). These allegations give rise to standing under the second theory discussed above, and Defendants do not challenge standing as to Counts III and IV at this stage.

The First Amendment “at least arguably protect[s]” provision of curated content and the right to be free from governmentally compelled speech. Menders, 65 F.4th at 165. The Supreme Court recently recognized that “[a]n entity ‘exercis[ing] editorial discretion in the
selection and presentation’ of content is ‘engage[d] in speech activity.’” Moody v. NetChoice, LLC, 603 U.S. 707, 731 (2024) (quoting Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666, 674 (1998)).10 Thus, presentation of editorialized or curated content “is itself protected speech.” Id. at 744. Such First Amendment protections apply even where the government seeks to protect the interests of children. “Speech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young
from ideas or images that a legislative body thinks unsuitable for them.” Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 795 (2011) (quoting Erznoznik v. Jacksonville, 422 U.S. 205, 213–14 (1975)); see also NetChoice, LLC v. Bonta, 113 F.4th 1101, 1121 (9th Cir. 2024) [Bonta I] (applying strict scrutiny where statute compelled speech and deputized private entities as state censors). Even regulations that indirectly burden protected speech may effectuate an injury-in-fact under the First Amendment. See Amador v. Mnuchin, 476 F. Supp. 3d 125, 153 (D. Md. 2020); Minneapolis
Star & Trib. Co. v. Minn. Comm’r of Revenue, 460 U.S. 575, 585 (1983) (holding tax on ink and paper products violated First Amendment by burdening freedom of press). In this case, NetChoice has sufficiently alleged an injury-in-fact in Counts I, II, and VI that the Kids Code burdens its members’ protected provision of curated content and in Count V that the DPIA requirement subjects its members to compelled speech and may require them to censor content according to state proscriptions.

10 Although the Supreme Court in Moody v. NetChoice, LLC, 603 U.S. 707 (2024), evaluated a facial challenge, its definition of protected speech applies equally to NetChoice’s as-applied challenge in this case.

First, it alleges that “[b]ased on the [Kids Code’s] definitions, § 14-801(h), many—if not most—of NetChoice’s members with online services are directly subject to and regulated by the Act and could face serious legal
consequences if they violate it.” (ECF No. 27 ¶ 25.) Second, it alleges that “it is undeniable that the covered members create, curate, and disseminate enormous amounts of protected speech, including, e.g., Amazon, Google, Meta, Nextdoor, Pinterest, Netflix, Reddit, and X.” (Id.) Finally, it alleges that the Kids Code “restrict[s] access to essential information needed to deliver content to users and a range of commonplace expressive activities that depend on such access.” (Id. ¶ 89.) These general allegations applicable to all Counts in the Amended
Complaint sufficiently allege that at least some of NetChoice’s members engage in “arguably protected” speech activity in the form of providing curated content. As to Counts I, II, and VI, NetChoice plausibly alleges that the Kids Code’s “best interest of children” standard, “reasonably likely to be accessed by children” standard, and data collection restrictions, respectively, create a “credible threat” of enforcement against members if they continue to engage in the data collection practices allegedly required to
provide protected speech. See (id. ¶¶ 120–22, 128, 138–39). It alleges that “[s]peech on the Internet requires” data collection, and the restrictions of “dark patterns,” “monitoring,” and data will limit members’ ability to “process data to curate and disseminate compilations of protected speech on their services” including the “use [of] ‘algorithms’ (i.e., ‘profiling’) to implement editorial policies, even if based in part on ‘user’s expressed interests.’” (Id. ¶¶ 254–256 (quoting Moody, 603 U.S. at 734–35).) Thus, it has alleged that some members engage in
“arguably protected” speech and face “a credible threat” of enforcement if they continue to engage in the data collection practices necessary for such speech, which is sufficient to establish an injury-in-fact on behalf of its members in Counts I, II, and VI. Menders, 65 F.4th at 165.
Relatedly, in its challenge to the Kids Code’s Data Protection Impact Assessment requirement in Count V, NetChoice has sufficiently alleged an injury-in-fact to its members on two bases. First, it has sufficiently alleged that the requirement unconstitutionally compels speech from its members. See Md. Shall Issue, Inc. v. Anne Arundel Cnty., 662 F. Supp. 3d 557, 568 (D. Md. 2023). NetChoice has alleged that the DPI Assessments due by April 1, 2026 “require covered businesses to opine on potential harm to children,” “compel speech and
require covered entities to opine [o]n all manner of potential harms to minors,” and “require a company to publicly condemn itself.” (ECF No. 27 ¶¶ 221–22, 243.) It alleges that this requirement “compels speech that covered entities would not otherwise make and thus necessarily operates as a content-based regulation because it alters the content of speech.” (Id. ¶ 215.) Second, it has sufficiently alleged that the DPIA requirement infringes its members’ protected editorialization of content. As explained above, “the editorial function itself is an
aspect of speech.” Moody, 603 U.S. at 731 (quoting Denver Area Educ. Telecomms. Consortium Inc. v. FCC, 518 U.S. 727, 737 (1996) (plurality opinion)). NetChoice alleges that the DPIA requirement “deputizes covered businesses into serving as censors for the State” in a manner that “compel[s] covered entities to remove content from their services.” (Id. ¶¶ 9, 223, 225, 226 (quoting Bonta I, 113 F.4th at 1118).) At this stage, such allegations are sufficient to allege an injury-in-fact. See Equity in Athletics, Inc., 639 F.3d at 99. Thus, NetChoice has sufficiently alleged that some of its members have standing in their own right to challenge the Kids Code in Counts I, II, V, and VI.

ii. NetChoice has sufficiently alleged that individual members’ participation is not required

As explained above, the third element of associational standing requires that “neither the claim asserted nor the relief requested requires the participation of individual members in the lawsuit.” Hunt, 432 U.S. at 343. Where, as here, “an association seeks prospective or
injunctive relief for its members,” “‘individual participation’ is not normally necessary.” United Food & Com. Workers v. Brown Grp., Inc., 517 U.S. 544, 546 (1996) (citing Hunt, 432 U.S. at 34). As Judge Hollander of this Court has explained, this element “is best seen as focusing on . . . matters of administrative convenience and efficiency, not on elements of a case or controversy within the meaning of the Constitution.” Am. Fed’n of State, County, & Mun. Emps., 778 F. Supp. 3d at 727 (quoting United Food & Com. Workers, 517 U.S. at 557). Standing is in
the interest of judicial economy where individual suits “would generally implicate the same facts, the same defendants, and the same data . . . .” Id. at 727; accord City of Columbus v. Kennedy, -- F. Supp. --, Civ. No. BAH-25-2114, 2025 WL 2426382, at *9 (D. Md. Aug. 22, 2025). Defendants contend that NetChoice’s claims would require individual participation of members because to suffer an injury-in-fact under the First Amendment, NetChoice’s members must demonstrate that they are engaged in protected speech or expression. (ECF
No. 45 at 3; ECF No. 35-1 at 13–15); see also Menders, 65 F.4th at 165. Defendants argue that such a demonstration requires fact-intensive, individualized inquiry into (1) the algorithms and speech or expression practices of each member; and (2) the Kids Code’s effect on such algorithms and practices. (ECF No. 35-1 at 13–14.) Additionally, Defendants emphasize that an as-applied challenge requires a plaintiff to “show that the regulations are unconstitutional as applied to their particular speech activity,” which demands “fact-specific inquiries.” Fusaro v. Howard, 19 F.4th 357, 368 (4th Cir. 2021) (quoting Edwards v. Dist. Of Columbia, 755 F.3d
996, 1001 (D.C. Cir. 2014)); (ECF No. 35-1 at 15–16). At this pleading stage, however, the Court lacks a developed factual record and “presumes that general allegations embrace those specific facts that are necessary to support the claim.” Equity in Athletics, Inc., 639 F.3d at 99 (quoting Lujan, 497 U.S. at 889). Under this standard, NetChoice has sufficiently alleged that “the attributes that make the [Kids Code] unlawful as to every regulated website apply in substantially similar ways across NetChoice’s
covered members” such that “[i]ndividualized participation in the lawsuit is unnecessary.” (ECF No. 27 ¶ 27.) It has alleged that “[b]eyond social media and search services exist a panoply of other digital services that disseminate speech” and “[o]nline services—including NetChoice’s members—use information to engage in editorial functions to publish, disseminate, and display protected speech to users.” (Id. ¶¶ 39–40; see id. ¶¶ 41–52.) Accordingly, NetChoice has sufficiently alleged associational standing.
B. NetChoice has sufficiently alleged third-party standing

“Generally, courts require that plaintiffs assert their own legal rights and interests and do not permit plaintiffs to rest their claim to relief on the rights or interests of third parties.” Lucas v. Curran, 856 F. Supp. 260, 265–66 (D. Md. 1994) (citing Sec. of State of Md. v. Joseph H. Munson Co., 467 U.S. 947, 955 (1984)). The Supreme Court has recognized, however, that “vendors and those in like positions have been uniformly permitted to resist efforts at
restricting their operations by acting as advocates of the rights of third parties who seek access to their market or function.” Craig, 429 U.S. at 195. Similarly, the Fourth Circuit has recognized that courts have “consistently held that a vendor has third-party standing to pursue claims on behalf of its customers, regardless of whether a vendor’s customers are hindered in
bringing their own claims.” Md. Shall Issue, Inc. v. Hogan, 971 F.3d 199, 216 (4th Cir. 2020) (collecting cases). Accordingly, where a vendor can establish that a third party will “satisfy the [Article] III case-or-controversy requirement,” it may assert third-party standing on behalf of its customers even without establishing any hindrance to customers’ ability to sue. Id. at 215 (quoting Joseph H. Munson Co., 467 U.S. at 956). In this case, NetChoice has sufficiently alleged third-party standing on behalf of its members’ users.11
11 As noted on the record and above, NetChoice has been remarkably active in litigation challenging state laws seeking to regulate online services’ interactions with children. Courts evaluating many of these challenges have repeatedly applied third-party standing precedents to hold that NetChoice has standing to raise the First Amendment rights of its members’ users “who seek access to [the members’] . . . function.” See, e.g., NetChoice v. Carr, 789 F. Supp. 3d 1200, 1216 (N.D. Ga. 2025) (quoting Craig, 429 U.S. at 195). Relatedly, courts in those cases have almost uniformly determined that NetChoice has associational standing to challenge similar state laws where NetChoice alleged First Amendment injuries based on allegations that such similar statutes chilled, altered, and/or compelled its members’ speech. See, e.g., Computer & Comms. Indus. Ass’n v. Paxton, 747 F. Supp. 3d 1011, 1030–31 (W.D. Tex. 2024) (holding NetChoice and additional trade association plaintiff had associational standing to challenge Texas statute imposing age registration and “monitoring and filtering” requirements); Carr, 789 F. Supp. 3d at 1213–16 (holding NetChoice had associational and third-party standing to challenge Georgia statute regulating how social media companies serve users under the age of sixteen); NetChoice, LLC v. Reyes, 748 F. Supp. 3d 1105, 1118–19 (D. Utah 2024) (holding NetChoice had associational standing to raise First Amendment interest of its members in challenge seeking preliminary injunction against Utah law requiring social media companies to restrict minors’ access to accounts); NetChoice LLC v. Fitch, 738 F. Supp. 3d 753, 763, 766–68 (S.D. Miss. 2024), vacated in part on other grounds, 134 F.4th 799 (5th Cir. 2025) (holding NetChoice had associational and third-party standing to challenge Mississippi statute limiting minors’ access to digital services and requiring service providers to make “reasonable efforts” to prevent harmful material from reaching minor users). But see NetChoice v. Murrill, 2025 WL 2656063, at *7 (M.D. La. Sep. 16, 2025) (suggesting without determining at discovery stage that NetChoice may face difficulty showing standing for as-applied challenges); NetChoice, LLC v. Bonta, 152 F.4th 1002, 1014 (9th Cir. 2025) (holding NetChoice lacked associational standing to raise as-applied challenges due to need for individual member participation). Although this Court evaluates only NetChoice’s allegations specific to this case and the Kids Code, it nonetheless notes the relative uniformity with which other federal district courts have addressed standing in similar challenges.

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective.” Free Speech Coal., Inc. v. Paxton, 606 U.S. 461, 478 (2025) (citing Ginsberg v. State of N.Y., 390 U.S. 629, 641 (1968)). This power also extends to non-obscene speech that is legitimately harmful to children. See, e.g., Reno v. Am. Civ. Liberties Union, 521 U.S. 844, 875 (1997). Where such laws overly burden adults’
access to such speech, however, they may effectuate a constitutional injury. Id.; see Paxton, 606 U.S. at 487–88 (surveying cases). Moreover, “[m]inors are entitled to a significant measure of First Amendment protection, and . . . [s]peech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.” Brown, 564 U.S. at 794, 795 (quoting Erznoznik, 422 U.S. at 212–13, 214). In this case, for the reasons discussed above,
NetChoice has sufficiently alleged that the Kids Code will burden its members’ provision of curated content in a manner that may abridge or limit users’ access to protected speech. See, e.g., (ECF No. 27 ¶¶ 14, 15, 20, 28, 37–41, 44, 46–47, 52, 79, 89, 113). This is sufficient to allege an injury-in-fact to members’ users. See Hogan, 971 F.3d at 216. Relatedly, for the reasons discussed above, NetChoice has sufficiently alleged that the injury to its members’ users is traceable to the Kids Code and may be redressed by the declaratory and injunctive
relief requested. Defendants’ Motion (ECF No. 35) under Rule 12(b)(1) is DENIED.

II. Rule 12(b)(6) Challenge to all Counts

A. First Amendment Claims: Counts I, II, V, VI

Constitutional challenges to a statute may be facial or as applied. Facial challenges generally “attack . . . a statute itself as opposed to a particular application,” City of Los Angeles, Calif. v. Patel, 576 U.S. 409, 415 (2015), and require a plaintiff to allege that “a substantial
number of [the law’s] applications are unconstitutional, judged in relation to the statute’s plainly legitimate sweep,” Moody, 603 U.S. at 723 (quoting Ams. For Prosperity Found. v. Bonta, 594 U.S. 595, 615 (2021)). An as-applied challenge under the First Amendment, however, requires plaintiffs to “show that the regulations are unconstitutional as applied to their
particular speech activity.” Fusaro, 19 F.4th at 368 (quoting Edwards, 755 F.3d at 1001). In this case, NetChoice alleges both facial and as-applied First Amendment challenges to the Kids Code in Counts I, II, V, and VI. As explained below, it has alleged facts sufficient to state facial and as-applied claims for relief in each of those Counts. Under the First Amendment, “Congress shall make no law . . . abridging the freedom of speech.” U.S. CONST. amend. I. The Fourteenth Amendment incorporates this
proscription against the states. Fusaro, 19 F.4th at 368 (citing Edwards v. City of Goldsboro, 178 F.3d 231, 245 n.10 (4th Cir. 1999)). As Judge Hollander of this Court has explained, First Amendment protections extend to laws that regulate conduct “where the conduct itself communicates a message; the conduct has an expressive element; or where the conduct is intertwined with protected First Amendment activity.” Amador v. Mnuchin, 476 F. Supp. 3d 125, 153 (D. Md. 2020) (internal citations omitted); see also Minneapolis Star & Trib. Co. v. Minn.
Comm’r of Revenue, 460 U.S. 575, 585 (1983) (holding tax on newspapers’ ink and paper products violated First Amendment by burdening freedom of press). Just as the First Amendment protects speech, it also protects the freedom to associate with others. See Amador, 476 F. Supp. 3d. at 153–54. Constitutional analysis of First Amendment claims proceeds in two steps. First, a court must determine “whether any protected First Amendment right is involved.” Id. at 154
(quoting Billups v. City of Charleston, 961 F.3d 673, 682 (4th Cir. 2020)). As the Supreme Court has explained, “First Amendment scrutiny . . . can, but do[es] not necessarily,” apply to “[l]aws that directly regulate expressive conduct” or speech. TikTok, Inc. v. Garland, 604 U.S. 56, 67 (2025). It also applies “in ‘cases involving governmental regulation of conduct that has an
expressive element,’ and to ‘some statutes which, although directed at activity with no expressive component, impose a disproportionate burden upon those engaged in protected First Amendment activities.’” Id. (quoting Arcara v. Cloud Books, Inc., 478 U.S. 697, 703–04 (1986)). Put differently, not every law that appears to regulate speech necessarily invokes First Amendment protections. Id. If a statute implicates no protected right, then no First Amendment claim arises. Billups, 961 F.3d at 682.
If, however, a statute reaches a right protected under the First Amendment, the court must proceed to the second step of the analysis, which requires it to “ascertain ‘whether the Governmental action in question infringes that right.’” Amador, 476 F. Supp. 3d at 154 (quoting Billups, 961 F.3d at 682). Under this second step, the court must (1) determine which degree of First Amendment scrutiny applies; and (2) evaluate whether the challenged regulation satisfies that scrutiny. See Paxton, 606 U.S. at 471–72. Typically, this inquiry depends on
whether the law is content- or viewpoint-based or whether it is content-neutral but nonetheless burdens speech. See id. at 471. “‘Content-based laws—those that target speech based on its communicative content—are presumptively unconstitutional and may be justified only if’ they satisfy strict scrutiny.” Id. (quoting Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015)). “Content-neutral laws . . . ‘are subject to an intermediate level of scrutiny because in most cases they pose a less substantial risk of excising certain ideas or viewpoints from the
public dialogue.’” Id. (quoting Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 642 (1994)). As this Court has repeatedly recognized, application of this framework in full may be premature at the pleading stage, where courts must accept as true all well-pleaded facts and draw all reasonable inferences to favor the plaintiff. Thus, this Court generally holds First
Amendment challenges adequately alleged where the plaintiff has sufficiently alleged that the challenged statute applies to protected speech such that it triggers First Amendment scrutiny. See, e.g., Sammons v. McCarthy, 606 F. Supp. 3d 165, 213–14 (D. Md. 2022) (explaining public forum heightened scrutiny analysis of speech restriction but stopping short of applying scrutiny after determining plaintiff adequately alleged viewpoint discrimination); Amador, 476 F. Supp. 3d at 152–53, 156–57 (treating Fifth Amendment and First Amendment claims as
adequately alleged where plaintiff sufficiently alleged laws were subject to heightened scrutiny). Significantly, where a plaintiff has adequately alleged that a statute reaches protected speech or expression, determination of the precise form of applicable scrutiny is unnecessary at the pleading stage. See Amador, 476 F. Supp. 3d at 152–53. In this case, this Court does not determine which specific degree of First Amendment scrutiny applies to the Kids Code at this pleading stage. Rather, the Court cabins its evaluation
to the first step of the First Amendment analysis: whether NetChoice has sufficiently alleged that the Kids Code at least burdens protected speech or expression such that First Amendment scrutiny applies.12

12 NetChoice acknowledged on the record that the determination of the appropriate degree of scrutiny is a merits analysis that awaits later stages of litigation.

Defendants argue that NetChoice fails to sufficiently allege that the Kids Code reaches protected speech. Alternatively, they contend that even if the Kids Code implicates protected speech, it does so only incidentally in a content-neutral manner such that it is subject to and survives intermediate scrutiny.13 In Opposition, NetChoice argues that the Kids Code regulates speech in a content-based manner. At this pleading stage, NetChoice has sufficiently alleged that the Kids Code applies to protected speech, and, therefore, it has
adequately alleged Counts I, II, V, and VI.

i. As-Applied Challenges

NetChoice has adequately alleged that the Kids Code regulates constitutionally protected speech or expression of its members and their users. In Moody v. NetChoice, LLC, 603 U.S. 707 (2024), the Supreme Court held that to the extent online services “‘exercis[e] editorial discretion in the selection and presentation’ of content,” they are “engage[d] in speech
activity” entitled to First Amendment protection. Id. at 731 (quoting Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666, 674 (1998)). The Florida and Texas laws at issue in Moody directly limited speech by type of user or speaker, see NetChoice, LLC v. Moody, 546 F. Supp. 3d 1082, 1083–88 (N.D. Fla. 2021), vacated in part by NetChoice, LLC v. Att’y Gen. of Fla., 34 F.4th 1196 (11th Cir. 2022), and limited online providers’ ability to “censor” a user based on that user’s viewpoint, subject to content-based exceptions, NetChoice, LLC v. Paxton, 573 F. Supp.
3d 1092, 1099–1100 (W.D. Tex. 2021), vacated by 49 F.4th 439 (5th Cir. 2022).14 In this case, NetChoice alleges that (1) the Kids Code regulates data collection in a manner that requires evaluation of constitutionally protected content; (2) evaluation of content is essential to its members’ protected expression; and, therefore, (3) it is a content-based regulation of speech.

13 As discussed above, this determination is premature at the pleading stage. As an initial matter, the Court does not determine on the pleadings the appropriate degree of First Amendment scrutiny. Moreover, even assuming without deciding that intermediate scrutiny were applicable, to justify a law under intermediate scrutiny, the State must demonstrate that the challenged regulation is proportional to the substantial state interest it is intended to serve. 360 Virtual Drone Servs., LLC v. Ritter, 102 F.4th 263, 277 (4th Cir. 2024); Central Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n of N.Y., 447 U.S. 557, 564 (1980). Demonstration of proportionality outside the context of “professional-conduct-focused regulations” requires the State to provide “actual evidence supporting its assertion that a speech restriction does not burden substantially more speech than necessary.” Ritter, 102 F.4th at 276, 277 (quoting Reynolds v. Middleton, 779 F.3d 222, 229 (4th Cir. 2015)). There is very limited actual evidence in the record before the Court at this pleading stage.

14 As noted above, the challenges at issue in Moody were facial challenges, whereas NetChoice in this case raises as-applied and facial challenges. Nevertheless, the Supreme Court’s definition of protected speech is equally applicable to facial and as-applied challenges.
As explained below, NetChoice has sufficiently alleged that the Kids Code regulates protected speech or expression. The Court reaches no conclusion as to the merits of the claims or the applicable scrutiny other than to determine that NetChoice has alleged facts sufficient to state as-applied claims in Counts I, II, V, and VI.

1. Counts I, II, and VI

As explained above, provision of curated content, including editorial decisions, is a
form of speech entitled to First Amendment protection. Moody, 603 U.S. at 731–33; Ark. Educ. Television Comm’n, 523 U.S. at 674. Regulations that indirectly burden protected speech, including those that burden speech “in its practical operation,” invoke First Amendment protections. Sorrell v. IMS Health, Inc., 564 U.S. 552, 567 (2011); see also Minneapolis Star & Trib. Co., 460 U.S. at 582–83 (discussing taxes that burden First Amendment rights); TikTok, Inc., 604 U.S. at 67–69; Arcara, 478 U.S. at 703–04. In this case, NetChoice has sufficiently alleged
that, as applied to its members and their users, the Kids Code standards and restrictions challenged in Counts I, II, and VI burden protected speech. In Counts I and II, NetChoice has sufficiently alleged that the Kids Code’s standards to define covered features and entities, respectively, burden protected speech as applied to its members. As to Count I, it alleges that the “best interest of children” standard that the Kids Code requires covered entities to apply to their content unconstitutionally deputizes private
actors to serve as censors of the state and necessarily requires such private actors to evaluate speech on their services in a content-based manner. (ECF No. 27 ¶¶ 119–20.) Specifically, it alleges that covered entities must make content- and speaker-based evaluations to determine whether their data collection practices serve “the best interest of children.” (Id. ¶¶ 121–22.)
In Count II, NetChoice raises the same allegations in relation to the Kids Code’s “reasonably likely to be accessed by children” standard, which it argues will require its members to differentiate online content in a content- or speaker-based manner such that it unconstitutionally discriminates against protected speech. (Id. ¶¶ 137–38.) Thus, NetChoice has sufficiently alleged that the Kids Code requires members to evaluate whether their data collection practices serve the “best interests of children” or apply to products that are
“reasonably likely to be accessed by children,” which in turn requires members to evaluate and potentially alter moderated content. At this stage, therefore, NetChoice has alleged that (1) its members engage in protected speech at least to the extent they provide moderated content; and (2) the Kids Code will burden such protected speech by requiring covered entities to make decisions about data collection based on the protected speech they produce. Similarly, in Count VI, NetChoice has sufficiently alleged that, as applied to its
members, the Kids Code’s restrictions on data, “dark patterns,” and monitoring, see MD. CODE ANN., COM. LAW § 14-4806, at minimum burden protected speech. It alleges that data collection, “dark patterns,” and monitoring are essential to its members’ ability to engage in protected speech in the form of curated or editorialized content provided to users. (ECF No. 27 ¶¶ 253–57.) It alleges that such practices are limited under the same standards challenged in Counts I and II such that the limitations are content- or speaker-based and function as prior
restraints on protected speech. (Id. ¶¶ 257, 260.) In short, NetChoice has alleged facts sufficient to state a First Amendment claim that, as applied to its members, the Kids Code will burden protected speech by burdening content moderation “in its practical operation.” Sorrell, 564 U.S. at 567. Specifically, the Kids Code will require members to alter products “reasonably
likely to be accessed by children” to ensure that their practices serve the “best interests of children,” as alleged in Counts I and II, and will limit members’ ability to engage in the data collection practices necessary to provide such content in the first instance, as alleged in Count VI. Id.; TikTok, Inc., 604 U.S. at 67–68 (quoting Arcara, 478 U.S. at 703–04). Relatedly, NetChoice has alleged that, by burdening its members’ ability to provide protected speech in the form of curated content, the provisions challenged in Counts I, II,
and VI burden protected speech for its members’ users. As explained above, “only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to [children].” Brown, 564 U.S. at 794 (quoting Erznoznik, 422 U.S. at 212–13). Additionally, laws intended to protect minors cannot “burn the house to roast the pig.” Butler v. State of Mich., 352 U.S. 380, 383 (1957). That is, even where a State acts within its authority to regulate content provided to minors, it cannot overly burden the provision of such content
to adults. Id.; Paxton, 606 U.S. at 475. In this case, NetChoice has sufficiently alleged in Counts I, II, and VI that the Kids Code will require its members who provide services “reasonably likely to be accessed by children” to suppress or alter data collection practices necessary to protected speech—editorialized, moderated, or curated content—to suppress content that is not in the “best interests of children” for all users. These allegations are sufficient to state a claim that the Kids Code burdens protected speech as applied to NetChoice’s members and their users in Counts I, II, and VI.15 Accordingly, Defendants’ Motion (ECF No. 35) is DENIED as to its Rule 12(b)(6) challenge to the as-applied claims in Counts I, II, and VI.

2. Count V
Additionally, NetChoice has sufficiently alleged that the Data Protection Impact Assessment requirement challenged in Count V is unconstitutional as applied to its members and their users. As noted above, the speech protections enshrined in the First Amendment “prohibit[] the government from telling people what they must say,” including compelling people to speak at all. Rumsfeld v. F. for Acad. & Inst’l Rts., Inc., 547 U.S. 47, 62 (2006); Wooley v. Maynard, 430 U.S. 705, 714 (1977). Thus, statutorily compelled speech may trigger First
Amendment scrutiny even where it applies only to commercial speech or requires disclosure only to the government. See, e.g., Zauderer v. Off. of Disciplinary Couns. of Sup. Ct. of Ohio, 471 U.S. 626, 629 (1985); Ams. For Prosperity Found. v. Bonta, 594 U.S. 595, 616 (2021). As the Ninth Circuit recognized when considering NetChoice’s challenge to a similar California law,16 speech that requires “covered businesses to opine on potential harm to children” constitutes compelled speech. See Bonta I, 113 F.4th 1101, 1117 (9th Cir. 2024) (collecting cases). The
Ninth Circuit also applied heightened scrutiny because the reporting provision “deputized private actors into determining whether material is suitable for kids.” Id. at 1118 (citing Ams. For Prosperity Found., 594 U.S. at 616). In this case, NetChoice has sufficiently alleged in Count V that the Kids Code’s DPI Assessments require covered entities to opine on their products in a manner that triggers First Amendment scrutiny as applied to members and their users.

15 Although Defendants argue that the General Assembly’s stated intent not to burden speech should control the analysis of whether the Kids Code reaches such speech, (ECF No. 35-1 at 7–8, 23–24); MD. CODE ANN., COM. LAW § 14-4803, “a statute based on a nonexpressive activity [that] has the inevitable effect of singling out those engaged in expressive activity” is subject to First Amendment scrutiny. Arcara, 478 U.S. at 706–07. At this stage, NetChoice has sufficiently alleged that the Kids Code at least burdens data collection in a manner that “singl[es] out those engaged in expressive activity” in the form of content moderation. Id. at 707.

16 The California law challenged in Bonta I also contained a “DPIA report requirement” remarkably similar to the DPIA requirement in the Kids Code. See (ECF No. 27 ¶ 219 (comparing requirements under both statutes)); see also Bonta I, 113 F.4th at 1109–10 (listing California DPIA report requirements, including disclosures regarding whether product designs would harm children).
NetChoice alleges that the DPI Assessments mandate opinionated speech by requiring covered entities to “[d]etermine whether the online product is designed in a manner consistent with the best interests of children reasonably likely to access the online product through consideration of” whether it is reasonably foreseeable that “the data management or processing practices of the online product” could result in enumerated harms to children. See MD. CODE ANN., COM. LAW § 14-4804(b)(3); (ECF No. 27 ¶¶ 215–22). Similarly, for the
same reasons discussed above as to Counts I, II, and VI, NetChoice has sufficiently alleged that the DPIA requirement burdens its members’ protected speech by requiring them to identify steps to ensure that their data collection practices will serve the “best interests of children.”17 At this stage, NetChoice has sufficiently alleged that altering data collection practices will concomitantly alter the services themselves, including protected speech. Such requirements that private actors “curtail[] their editorial choices must meet the First
Amendment’s requirements.” Moody, 603 U.S. at 717. Relatedly, it alleges that minor users’ free speech rights are burdened because the Kids Code requires its members to editorialize content “solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.” (ECF No. 27 ¶ 117 (quoting Brown, 564 U.S. at 795).) Defendants’ Motion (ECF No. 35) is DENIED as to the as-applied claims in Count V.

17 At the Motions Hearing, Defendants argued that covered entities are not required to alter their content moderation under the Kids Code. At this stage, however, NetChoice has sufficiently alleged that DPI Assessments require covered entities to state whether their services are in compliance with the “best interests of children” standard and identify action steps if they are not in compliance. NetChoice has alleged that because data collection is essential to such services, altering data collection will alter the services, including moderated content. It has also alleged that the Kids Code’s enforcement requirement includes monetary fines for each affected child depending on whether the violation of the Kids Code was negligent or intentional. See (ECF No. 27 ¶ 110.) Drawing all reasonable inferences to favor NetChoice, the Kids Code may require entities who have identified services not in the best interests of children to alter their use of children’s data relative to those services or face the Kids Code’s enforcement provision for intentional violations.
ii. Facial Claims

NetChoice has also sufficiently alleged Counts I, II, V, and VI to the extent they raise facial challenges to the Kids Code. Generally, to state a facial constitutional challenge at the pleading stage, a plaintiff “must establish that no set of circumstances exists under which the Act would be valid.” United States v. Hosford, 843 F.3d 161, 165 (4th Cir. 2016) (quoting United States v. Salerno, 481 U.S. 739, 745 (1987)). In the First Amendment context, however, a
plaintiff may raise a facial challenge under a slightly lower standard by alleging that “‘a substantial number of the law’s applications are unconstitutional,’ such that these applications ‘substantially outweigh its constitutional ones.’” Lynchburg Republican City Comm. v. Va. Dep’t of Elections, 793 F. Supp. 3d 765, 789 (W.D. Va. 2025) (quoting Moody, 603 U.S. at 723, 724); see Wash. State Grange v. Wash. State Republican Party, 552 U.S. 442, 449 n.6 (2008). Here, Defendants contend that NetChoice has not sufficiently alleged that most of the Kids Code’s
applications reach protected speech.18 (ECF No. 35-1 at 21–22; ECF No. 45 at 6, 7.)
18 Defendants also argue that the Court should decline to consider NetChoice’s facial challenges for prudential reasons. (ECF No. 35-1 at 17–20; ECF No. 45 at 7–8.) They argue that the Kids Code’s application is heavily dependent on facts specific to each covered entity such that as-applied challenges are more appropriate. (ECF No. 35-1 at 17–19.) “The generally applicable rule is that a federal court, whose jurisdiction has been invoked, must exercise that jurisdiction and address the matter before it.” Sonda v. W. Va. Oil & Gas Conservation Comm’n, 92 F.4th 213, 219 (4th Cir. 2024); accord Deakins v. Monaghan, 484 U.S. 193, 203 (1988). Nevertheless, facial constitutional challenges “are disfavored” because they are often speculative and “threaten to short circuit the democratic process by preventing laws embodying the will of the people from being implemented in a manner consistent with the Constitution.” Wash. State Grange, 552 U.S. at 450–51. At this pleading stage, however, the only inquiry before this Court is whether NetChoice has sufficiently alleged a facial challenge, and Defendants have identified no cases in which a court has dismissed a facial challenge on the pleadings for prudential reasons.

As an initial matter, “the distinction between facial and as-applied challenges is not so well defined that it has some automatic effect or that it must always control the pleadings and disposition in every case involving a constitutional challenge.” Citizens United v. Fed. Election Comm’n, 558 U.S. 310, 331 (2010); accord White Coat Waste Project v. Greater Richmond Transit Co., 35 F.4th 179, 203–04 (4th Cir. 2022). Rather, “[t]he distinction is both instructive and necessary, for it goes to the breadth of the remedy employed by the Court, not what must be pleaded in a complaint.” Citizens United, 558 U.S. at 332 (citing United States v. Treasury Emps., 513 U.S. 454, 477–78 (1995)). Thus, to the extent a party has sufficiently alleged at least some constitutional claim, it is not clear that facial and as-applied theories of relief face separate
evaluation at the pleading stage. See Thorpe v. Clarke, 37 F.4th 926, 947 (4th Cir. 2022). Even assuming separate evaluation is required, however, NetChoice has sufficiently alleged its facial claims in Counts I, II, V, and VI. “In a facial challenge to the overbreadth . . . of a law, a court’s first task is to determine whether the enactment reaches a substantial amount of constitutionally protected conduct.” Vill. of Hoffman Ests. v. Flipside, Hoffman Ests., Inc., 455 U.S. 489, 494 (1982). In this case, as to
the “best interests of children” and “reasonably likely to be accessed by children” standards challenged in Counts I and II, respectively, NetChoice has sufficiently alleged that these standards permeate all applications of the statute by defining the applicable standard and the entities subject to the Kids Code. Thus, any challenge to these provisions arises in every application of the statute, making such allegations appropriate for a facial challenge “that the law is unconstitutional in all of its applications.” Wash. State Grange, 552 U.S. at 449 (citing
Salerno, 481 U.S. at 745). Furthermore, for the reasons discussed above in reference to the as-applied claims, NetChoice has sufficiently alleged that the Kids Code “reaches a substantial amount of constitutionally protected conduct” because it regulates protected speech of NetChoice members and their users. Vill. of Hoffman Ests., 455 U.S. at 494.
Similarly, NetChoice has met its burden as to its facial challenge in Count V to the Kids Code’s DPIA requirement, MD. CODE ANN., COM. LAW § 14-4804, because it has alleged that, in every case, such assessments constitute compelled speech and deputize covered entities to act as censors for the State. (ECF No. 27 ¶¶ 13–14, 213–22.) As explained above, laws that compel speech—or silence—are subject to First Amendment scrutiny. Riley v. Nat’l Fed’n of the Blind of N.C., 487 U.S. 781, 797 (1988). NetChoice has alleged that the DPI
Assessments are required from every covered entity such that every application of § 14-4804 will compel speech, regardless of whether a covered entity is one of its members. Therefore, NetChoice has alleged that a “substantial number of [§ 14-4804’s] applications are unconstitutional, judged in relation to [its] plainly legitimate sweep.” Moody, 603 U.S. at 723. Finally, NetChoice has adequately alleged a facial constitutional challenge in Count VI to Kids Code § 14-4806 by alleging that restrictions on data collection and usage will “chill
programmed editorial decisions to select, promote, and moderate content to audiences,” which applies to news sites, content-sharing sites, social media sites, and independent blogging sites. (ECF No. 27 ¶ 264.) As a result, NetChoice has sufficiently alleged that § 14-4806 is unconstitutional whenever it is applied to websites that engage in protected speech or expression by editorializing or moderating content. See Moody, 603 U.S. at 731. Furthermore, NetChoice has alleged that § 14-4806 is overbroad in a substantial number of applications as
compared to its plainly legitimate sweep because, in every application, it limits the protected speech available to users. See Vill. of Hoffman Ests., 455 U.S. at 494 (explaining facial challenge inquires whether law reaches substantial amount of protected speech). Thus, Defendants’ Motion (ECF No. 35) is DENIED as to Counts I, II, V, and VI.
b. Due Process Claims: Counts III, IV

“Vagueness doctrine is an outgrowth not of the First Amendment, but of the Due Process Clause of the Fifth Amendment[,]” which recognizes that “[a] conviction fails to comport with due process if the statute under which it is obtained fails to provide a person of ordinary intelligence fair notice of what is prohibited, or is so standardless that it authorizes or encourages seriously discriminatory enforcement.”19 United States v. Williams, 553 U.S. 285,
304 (2008) (citing Hill v. Colorado, 530 U.S. 703, 732 (2000)). Distinct from overbreadth, vagueness “is basically directed at lack of sufficient clarity and precision in the statute[.]” Willis v. Town of Marshall, N.C., 426 F.3d 251, 261 (4th Cir. 2005) (quoting United States v. Morison, 844 F.2d 1057, 1070 (4th Cir. 1988)). A regulation is void for vagueness where it “fails to give a person of ordinary intelligence fair notice that his contemplated conduct is forbidden by the statute” or where “it encourages arbitrary and erratic arrests and convictions.” Papachristou v. City of Jacksonville, 405 U.S. 156, 162 (1972). “Less clarity is required in purely civil statutes because the ‘consequences of imprecision are qualitatively less severe.’” Manning v. Caldwell for City of Roanoke, 930 F.3d 264, 272 (4th Cir. 2019) (quoting Vill. of Hoffman Ests., 455 U.S. at 499); see also Greenville Women’s Clinic v. Comm’r, S.C. Dep’t of Health and Env’t, 317 F.3d 357, 366 (4th Cir. 2002). In this case, NetChoice has sufficiently alleged in Counts III and IV that the
“best interest of children” and “reasonably likely to be accessed by children” standards, respectively, are unconstitutionally vague.

19 In the context of a challenge to a state law under the First Amendment, void for vagueness claims arise under the Due Process Clause of the Fourteenth Amendment. Fusaro, 19 F.4th at 371.

i. Count III
NetChoice alleges in Count III that the “best interests of children” standard is unconstitutionally vague in violation of the Due Process Clause of the Fourteenth Amendment. It alleges that this standard treats all “children” without regard for age; requires covered entities to evaluate children’s interests based on “benefit” to covered entities and “detriment” to children but fails to define “benefit” and “detriment;” and does not provide a clear structure for the benefit-detriment balancing it requires of covered entities. (ECF No.
27 ¶¶ 155–169.) Nor does the Kids Code define how covered entities are supposed to determine whether conduct will “[r]esult in . . . [r]easonably foreseeable and material physical or financial harm;” “[s]evere and reasonably foreseeable psychological or emotional harm;” “highly offensive intrusion on children’s reasonable expectation of privacy;” or discrimination. (Id. ¶¶ 82, 170, 171 (quoting MD. CODE ANN., COM. LAW §§ 14-4801(c)(2)(i)–(iv)).) NetChoice alleges that this lack of clarity presents compounding vagueness: covered entities
must determine the best interests of children in these broadly identified areas without any metric to determine, for example, what constitutes “severe” harm. As the Supreme Court has explained, “[v]agueness and [its] attendant evils . . . are not rendered less objectionable because the regulation of expression is one of classification rather than direct suppression.” Interstate Cir., Inc. v. City of Dallas, 390 U.S. 676, 688 (1968). Thus, an ordinance that tied licensing to whether films were “suitable for children” was
unconstitutionally vague because it depended on subjective “moral judgment” that failed to provide “narrowly drawn, reasonable and definite standards for the officials to follow.” Id. at 687, 690 (quoting Niemotko v. Maryland, 340 U.S. 268, 271 (1951)). Notably, when evaluating a “best interests” standard in a similar California law, the Northern District of California
recognized that this standard—which California also defined based on whether practices are “‘detrimental’ to [a] child’s ‘physical health, mental health, or well-being’”—was likely unconstitutionally vague because “[r]easonable minds may differ on what is ‘detrimental to a child[]’ . . . .” NetChoice, LLC v. Bonta, 770 F. Supp. 3d 1164, 1205, 1206 (N.D. Cal. 2025). The “best interests” standard comes from family law, in which courts evaluate extraordinarily fact-specific circumstances for individual children. See id. at 1205–06. Unlike the family law
context, however, covered entities allegedly must follow the “best interests of children” standard on a much broader scale. NetChoice has sufficiently alleged that this standard depends on subjective determinations not defined in the Kids Code such that it fails to provide businesses or enforcing officials reasonable, definite standards of practices that meet it. Defendants’ Motion (ECF No. 35) is DENIED as to Count III.

ii. Count IV
Similarly, NetChoice has sufficiently alleged in Count IV that the “reasonably likely to be accessed by children” standard is unconstitutionally vague because it does not provide entities enough information to determine whether they are covered under the Kids Code. In the context of regulations that limit speech, “reasonably likely” standards may be unconstitutionally vague where they depend on ill-defined, subjective determinations of a given outcome’s likelihood. See, e.g., Hirschkop v. Snead, 594 F.2d 356, 371 (4th Cir. 1979)
(holding unconstitutionally vague rule prohibiting a lawyer from making statements about “matters that are reasonably likely to interfere with a fair trial” because it depended “entirely on what statements the disciplinary authority believes reasonably endangers a fair trial”). In this case, whether a product is “reasonably likely to be accessed by children” appears to depend
at least in part on what the State deems to be a “significant number of children” or a “substantially similar product,” but NetChoice has sufficiently alleged that these terms lack clear definitions. MD. CODE ANN., COM. LAW § 14-4801(s). Such unclear terms “force[] [entities] to ‘guess at [the standard’s] contours’” in a manner that raises a cognizable vagueness claim. In re Murphy Brown, LLC, 907 F.3d 788, 800 (4th Cir. 2018) (quoting Gentile v. State Bar of Nev., 501 U.S. 1030, 1048 (1991)).
The Kids Code defines “reasonably likely to be accessed by children” to include services or products (1) “directed to children as defined in the federal Children’s Online Privacy Protection Act,” MD. CODE ANN., COM. LAW § 14-4801(s)(1); (2) “determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children,” id. § 14-4801(s)(2); (3) “substantially similar or the same as an online product that satisfies item (2),” id. § 14-4801(s)(3); (4) that “feature[] advertisements marketed to children,” id. § 14-4801(s)(4); (5) as to which “[t]he covered entity’s internal research findings determine that a significant amount of the online product’s audience is composed of children,” id. § 14-4801(s)(5); or (6) as to which “the covered entity knows or should have known that a user is a child,” id. § 14-4801(s)(6). As NetChoice alleges, this standard leaves unclear whether websites must measure “significant number of children” by proportional percentage or by number of child users, even as similar laws in other states have defined “significant
number of children” directly in the statute. (Id. ¶ 200 (citing Vt. S. 289 § 1 (2024)), ¶ 201.) Relatedly, NetChoice has sufficiently alleged that the Kids Code does not provide notice of what it means for products to be “substantially similar” in this context because it does not define substantial similarity. At this stage, NetChoice has sufficiently alleged that a reasonable
entity “reading the regulation in its entirety and in the context of [Maryland] statutes, would” not be able to determine whether it is subject to the Kids Code. Greenville Women’s Clinic, 317 F.3d at 366. Defendants’ Motion (ECF No. 35) is DENIED as to Count IV.

c. Preemption Claims: Counts VII, VIII

Finally, NetChoice has adequately stated claims for relief based on preemption in Counts VII and VIII. Under the Supremacy Clause of the U.S. Constitution, federal law is
the “supreme Law of the Land.” U.S. CONST. art. VI, cl. 2. Thus, any conflicting state law is preempted and “without effect.” Wash. Gas Light Co. v. Prince George’s Cnty. Council, 711 F.3d 412, 419 (4th Cir. 2013) (citing AES Sparrows Point LNG, LLC v. Smith, 527 F.3d 120, 125 (4th Cir. 2008)). “Congress may indicate pre-emptive intent through a statute’s express language or through its structure and purpose.” Altria Grp. Inc. v. Good, 555 U.S. 70, 76 (2008) (citing Jones v. Rath Packing Co., 430 U.S. 519, 525 (1977)). Express preemption may arise where a
statute contains an express preemption clause stating Congress’s intent to preempt state regulations. Id. Alternatively, preemption may be implied in two circumstances: (1) “conflict preemption,” which arises when the federal and state laws conflict; and (2) “field preemption,” which is not at issue in this case. Metro. Life Ins. Co. v. Massachusetts, 471 U.S. 724, 747–48 (1985). Conflict preemption may exist where (1) “it is ‘impossible for a private party to comply with both state and federal requirements;’” or (2) a “state law ‘stands as an obstacle to the
accomplishment and execution of the full purposes and objectives of Congress.’” Freightliner Corp. v. Myrick, 514 U.S. 280, 287 (1995) (first quoting English v. Gen. Elec. Co., 496 U.S. 72, 78– 79 (1990); and then quoting Hines v. Davidowitz, 312 U.S. 52, 67 (1941)). Courts presume “that the historic police powers of the States [are] not to be superseded
by the Federal Act unless that was the clear and manifest purpose of Congress.” Altria Grp. Inc., 555 U.S. at 77 (quoting Rice v. Santa Fe Elevator Corp., 331 U.S. 218, 230 (1947)). “The purpose of Congress is the ultimate touchstone in every preemption case,” id. at 76 (quoting Medtronic, Inc. v. Lohr, 518 U.S. 470, 485 (1996)), and “when the text of a pre-emption clause is susceptible to more than one plausible reading, courts ordinarily ‘accept the reading that disfavors preemption,’” id. at 77 (quoting Bates v. Dow Agrosciences, LLC, 544 U.S. 431, 449
(2005)). Thus, courts apply a presumption against implied preemption. See Wyeth v. Levine, 555 U.S. 555, 565, n.3 (2009) (collecting cases). As this Court has recognized, however, “a court should not apply a presumption against preemption when a ‘statute contains an express preemption clause.’” In re Smith & Nephew Birmingham Hip Resurfacing (BHR) Hip Implant Prods. Liab. Litig., 300 F. Supp. 3d 732, 742 n.8 (D. Md. 2018) (quoting Puerto Rico v. Franklin Cal. Tax-Free Tr., 579 U.S. 115, 125 (2016)). NetChoice raises express and implied preemption
under two federal laws.20
20 As district courts within the Fourth Circuit have explained, preemption is a matter of law that may be suitable for decision at the pleading stage. See, e.g., Covert v. Stryker Corp., 2009 WL 2424559, at *9 n.8 (M.D.N.C. Aug. 5, 2009) (collecting cases). Where determination of preemption claims would not finally resolve the case and implicates significant substantive concerns or factual nuances, however, district courts in this Circuit have declined to adjudicate such claims at the pleading stage. See Woodson v. U.S. Airways, Inc., 67 F. Supp. 2d 554, 557 (M.D.N.C. 1999); Signature Flight Support LLC v. Carroll, 2021 WL 4352564, at *4–5 (W.D. Va. Sep. 24, 2021). In this case, it appears that neither this Court nor the Fourth Circuit has previously considered preemption of a state statute in this context, and courts reviewing similar preemption challenges have reached differing conclusions. Compare, e.g., H.K. through Farwell v. Google, LLC, 595 F. Supp. 3d 702 (C.D. Ill. 2022) (deeming Illinois law regulating collection of data from children preempted by COPPA); with Jones v. Google, LLC, 73 F.4th 636 (9th Cir. 2023) (holding COPPA did not preempt state-law claims based on data collection in alleged violation of children’s privacy). NetChoice’s preemption claims raise “important and difficult question[s]” that appear to be “of first impression within this circuit” such that, to the extent those claims are adequately alleged, their adjudication should await development of a full factual record. Woodson, 67 F. Supp. 2d at 557.

i. Count VII

NetChoice has sufficiently alleged in Count VII that the Kids Code is expressly or impliedly preempted by the federal Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501 et seq. (“COPPA”). To state a claim that a state statute is expressly preempted by federal law, a plaintiff must allege that “Congress expressly state[d] its intent to preempt state law.” See Ass’n of Am. Publishers, Inc. v. Frosh, 586 F. Supp. 3d 379, 388 (D. Md. 2022) (quoting Decohen v. Cap. One, N.A., 703 F.3d 216, 223 (4th Cir. 2012)). Also known as the “clear statement rule,” this requirement emphasizes that “[e]xpress preemption is a question of statutory construction, requiring a court to look to the plain wording of the statute and the surrounding statutory framework to determine whether Congress intended to preempt state law.” Jones v. Google, LLC, 73 F.4th 636, 641 (9th Cir. 2023). COPPA’s preemption clause states:

No State or local government may impose any liability for commercial activities or actions by operators in interstate or foreign commerce in connection with an activity or action described in this chapter that is inconsistent with the treatment of those activities or actions under this section.

15 U.S.C. § 6502(d). This provision allows states to create “consistent” parallel legislative schemes covering online privacy of children. Jones, 73 F.4th at 643, 644. At this pleading stage, NetChoice has sufficiently alleged that the Kids Code is expressly preempted as inconsistent with COPPA to the extent that (1) the statutes cover the same issues;21 and (2) the Kids Code holds entities liable for activity permissible under COPPA.

21 COPPA applies only to collection of data from children under the age of thirteen, while the Kids Code applies to data collection and entities engaging children under the age of 17. Thus, there is some overlap in the statutes’ applicability, but the Kids Code sweeps more broadly than COPPA. As discussed below, NetChoice alleges that the Kids Code’s regulation of entities interacting with children between the ages of 13 and 17 is another basis for preemption. (ECF No. 27 ¶ 282.)

An online provider is only subject to liability under COPPA where it “has actual knowledge that it is collecting personal information from a child,” 15 U.S.C. § 6502(a)(1), but a provider may be subject to liability under the Kids Code where its products are “reasonably likely to be accessed by children” regardless of the provider’s actual knowledge that it is collecting data from children, MD. CODE ANN., COM. LAW § 14-4801(s). The Kids Code applies to products “reasonably likely to be accessed by children,” which it defines to include circumstances in which a “covered entity knows or should have known that a user is a child.” Id. § 14-4801(s)(6) (emphasis added). Thus, to the extent that COPPA only imposes liability for activity covered thereunder where websites have “actual knowledge that [they are] collecting
personal information from a child” and expressly preempts inconsistent state laws, 15 U.S.C. §§ 6502(b)(1)(A), 6502(d), NetChoice has sufficiently alleged that the Kids Code’s broader “reasonably likely to be accessed by children” standard imposes liability for conduct that would be permissible under COPPA. See, e.g., New Mexico ex rel. Balderas v. Tiny Lab Prods., 457 F. Supp. 3d 1103, 1110, 1120–21 (D.N.M. 2020) (holding claims expressly preempted by COPPA where plaintiff failed to allege defendants acted with “actual knowledge that they were
collecting personal information from users of child-directed apps”). Express preemption determinations require nuanced analysis of both the state and federal statutes at issue, but at this pleading stage NetChoice has sufficiently alleged an express preemption claim based on the differing mens rea requirements in the Kids Code and COPPA. For substantially the same reasons, NetChoice has also sufficiently alleged that the Kids Code is impliedly preempted by COPPA via obstacle or conflict preemption. In an obstacle
preemption analysis, a court should analyze the whole text of the statute to determine whether any obstacles exist, looking beyond that text only if the statute’s language is ambiguous. Crosby v. Nat’l Foreign Trade Council, 530 U.S. 363, 373 (2000). Conflict preemption applies where it is not possible for a party to comply with both state and federal
law. See Gade v. Nat’l Solid Wastes Mgmt. Ass’n, 505 U.S. 88, 98 (1992) (quoting Florida Lime & Avocado Growers, Inc. v. Paul, 373 U.S. 132, 142–43 (1963)). To the extent that NetChoice alleges implied preemption based on the Kids Code’s creation of an enforcement scheme relative to children under the age of 13 that is inconsistent with COPPA’s enforcement scheme, it has sufficiently stated a claim based on implied preemption. Specifically, to the extent NetChoice alleges that the Kids Code’s mens rea requirement would impose liability for conduct that is
permissible under COPPA, it has sufficiently stated a claim for relief based on implied preemption at this pleading stage. Finally, as noted above, COPPA only regulates data collection from children under 13 and is silent as to data collection from children between the ages of 13 and 17. 15 U.S.C. § 6501(1) (defining “child” to mean “an individual under the age of 13”). NetChoice alleges that the Kids Code is thus preempted to the extent it applies to children between the ages of
13 and 17 because such regulation is inconsistent with Congress’s decision in COPPA to regulate collection of data only from children under 13. (ECF No. 27 ¶ 282.) As noted above, states are permitted to have parallel enforcement schemes that are consistent with federal law, and COPPA’s express preemption provision by implication permits imposition of state liability where it is “[]consistent with the treatment of . . . activities or actions under” COPPA. 15 U.S.C. § 6502(d); see also Jones, 73 F.4th at 644. At this stage, NetChoice has sufficiently
alleged that the Kids Code is inconsistent with COPPA as to at least some of its requirements and thus may be preempted. Accordingly, Defendants’ Motion (ECF No. 35) is DENIED as to Count VII. ii. Count VIII
Similarly, NetChoice has sufficiently alleged in Count VIII that Section 230 of the federal Communications Decency Act, 47 U.S.C. § 230, expressly or impliedly preempts Kids Code § 14-4804, which requires DPI Assessments, and § 14-4806, which restricts data processing and profiling, to the extent those Sections would hold covered entities liable for the dissemination of third-party speech. See (ECF No. 27 ¶ 287.) “Under § 230 of the Communications Decency Act of 1996, interactive computer service providers are immune
from state law liability when plaintiffs seek to treat those providers as publishers of third-party content.” Doe v. Grindr, Inc., 128 F.4th 1148, 1151 (9th Cir. 2025) (citing 47 U.S.C. § 230(c)(1)); accord Zeran v. Am. Online, Inc., 129 F.3d 327, 330–31 (4th Cir. 1997). Section 230 “only protects from liability (1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.” Doe, 128 F.4th at 1151 (quoting Barnes v.
Yahoo!, Inc., 570 F.3d 1096, 1100–01 (9th Cir. 2009)). Generally, “if a duty imposed by state law ‘requires that [the defendant] moderate content to fulfill its duty, then § 230 immunity attaches.’” NetChoice v. Bonta, 790 F. Supp. 3d 798, 808 (N.D. Cal. 2025) (quoting Est. of Bride by & through Bride v. Yolo Techs., Inc., 112 F.4th 1168, 1177 (9th Cir. 2024)). Similarly, if a “state law ‘obliges the defendant to monitor third-party content—or else face liability—then that too is barred’ by Section 230.” Id. (quoting Calise v. Meta Platforms, Inc., 103 F.4th 732, 742 (9th
Cir. 2024)). Like COPPA, Section 230 contains an express preemption clause, which provides: “Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. §
230(e)(3). At this pleading stage, NetChoice has sufficiently alleged in Count VIII that Kids Code § 14-4804 and § 14-4806 are expressly or impliedly preempted by Section 230. As an initial matter, it has sufficiently alleged that its members who are covered entities are providers of interactive computer services. Next, it has sufficiently alleged that § 14-4804(b)(4) of the Kids Code requires covered entities to take steps “to comply with the duty to act in a manner
consistent with the best interests of children,” which NetChoice has sufficiently alleged could include altering presentation of third-party speech to children. (ECF No. 27 ¶¶ 96, 102, 225 (citing MD. CODE ANN., COM. LAW § 14-4804(b)(4)).) Similarly, NetChoice has sufficiently alleged that § 14-4806 of the Kids Code restricts the use of dark patterns and other data in a manner that may alter which third-party content members offer children and others using their products. At this stage, therefore, NetChoice has sufficiently alleged that § 14-4806 restricts
websites’ “exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone, or alter [third-party] content” in a manner that plausibly contradicts Section 230. Zeran, 129 F.3d at 330; (ECF No. 27 ¶¶ 287–88). To the extent that websites would then face liability for editorial choices regarding third parties’ content, therefore, the Kids Code may be expressly preempted by Section 230.22 Thus, Defendants’ Motion (ECF No. 35) is DENIED as to Count VIII.
CONCLUSION For the reasons stated above, Defendants’ Motion to Dismiss (ECF No. 35) is DENIED, and Defendants shall file an Answer within fourteen (14) days of the date of this Memorandum Opinion and Order. A separate Order follows.
Date: November 24, 2025 /s/ Richard D. Bennett United States Senior District Judge
22 Defendants have emphasized that the Kids Code contains language that “[n]othing in this subtitle may be construed to require a covered entity to monitor or censor third-party content or otherwise impact the existing rights and freedoms of any person . . . .” MD. CODE ANN., COM. LAW § 14-4803(5). As noted above, preemption analyses require nuanced evaluation of the federal and state laws at issue, which best awaits further development of this case. At this pleading stage, NetChoice has met its burden to allege facts sufficient to state a claim for relief—namely, that its members may face state liability in a manner inconsistent with Section 230— and the Court looks no further than that standard as required under Rule 12(b)(6).