
State Laws Regulating Privacy Practices of Social Media Providers

08.04.2025

State legislatures have recently begun directly regulating the privacy practices of companies that provide social media solutions to consumers.

Common among these new laws are age verification requirements and parental control and consent requirements. Some laws go further, such as requiring ‘deplatforming’ in certain circumstances and restricting ‘shadow banning’ practices. Many of these statutes define ‘social media’ broadly to include any technology that facilitates interactions among end users, and they can carry significant penalties for noncompliance.

Below is a summary of key provisions of laws passed in recent years that are aimed directly at social media providers and are intended to regulate social media content and user data. If your business provides social media solutions to residents of these states, Morris, Manning & Martin, LLP is pleased to offer an initial consultation to new and existing clients.

Last updated: August 4, 2025

____________________________________________________________________________________________

Arkansas

SB 396 – Ark. Code Ann. §§ 4-88-1402 – 4-88-1403 (effective September 1, 2023): This law requires certain social media companies to verify that all users are either at least 18 years old or have parental consent to create an account. Age verification must be performed by a third-party vendor and must adhere to prescribed verification methods. Note that enforcement has been enjoined as of the date of this article.[1]

California

AB 587 – Cal. Bus. & Prof. Code §§ 22675 – 22681 (effective January 1, 2023): This law requires certain social media companies to post their terms of service in a specified manner and with specified content and to make a semiannual terms of service report to the California Attorney General describing how the company flags content, groups, or users, the company’s related response processes, and a list of actions the company may take such as removal, demonetization, or shadow banning. This report must include any definitions of hate speech, racism, extremism, radicalization, disinformation, misinformation, harassment, and foreign political interference, if the companies include such terms in their terms of service. The requirement to include definitions of terms such as hate speech and racism in the report is under a preliminary injunction as of the date of this article.[2]

Florida

SB 7072 – Fla. Stat. § 501.2041 (effective April 22, 2022): This law requires certain social media platforms to apply censorship and deplatforming standards in a consistent manner, to notify a user before shadow banning them, and to allow users to opt out of algorithm-driven feeds in favor of a purely chronological feed. Additionally, social media platforms cannot censor, deplatform, or shadow ban political candidates during their candidacy, nor censor, deplatform, or shadow ban “journalistic enterprises” on the basis of their content. Social media platforms must also, upon request, provide a user with the number of other users who were shown the requesting user’s content or posts. The law was preliminarily enjoined and ultimately remanded to the 11th Circuit by the Supreme Court for further review.[3]

HB 3 – Fla. Stat. §§ 501.1736 – 501.1737 (effective January 1, 2025): This law requires social media platforms to delete accounts of users that are younger than (or that the platform “treats or categorizes” as belonging to an account holder who is “likely younger than”) 14 years old. For users who are 14 or 15, the platform must acquire parental consent to allow such users to create an account. This law is currently under preliminary injunction, which, as of the date of this article, the Florida Attorney General has appealed to the 11th Circuit Court of Appeals.[4]

Georgia

SB 351 - O.C.G.A. §§ 39-6-1 to 39-6-5 (effective July 1, 2025): This law requires certain social media companies to implement age verification practices and prevents children under the age of 16 from creating social media accounts without parental consent. Enforcement is preliminarily enjoined as of the writing of this article.[5]

Louisiana

HB 61 – La. Stat. Ann. § 9:2712.2 (effective May 8, 2024): This law requires age verification. No “interactive computer service,” broadly defined to encompass all social media companies, may allow a minor to create an account without the consent of a legal representative of the minor. 

HB 577 – La. Stat. Ann. §§ 51:1761 – 1763 (effective July 1, 2025): This law prohibits certain social media companies from sending targeted advertisements to minor account holders and from selling certain personal data of the minor account holder. Such restricted data includes the account holder’s race, religion, gender, citizenship status, medical history, biometric data, and geolocation data.

Mississippi

HB 1126 – “Walker Montgomery Protecting Children Online Act” – Miss. Code Ann. §§ 45-38-1 – 45-38-13 (effective July 1, 2024): This law requires age verification and parental consent before minors can create social media accounts. Covered businesses must practice data minimization and purpose restriction regarding the data of minors. This law is currently under a preliminary injunction and cannot be enforced against Dreamwidth, Meta, Nextdoor, Pinterest, Reddit, Snap, Inc., X, and YouTube.[6] The injunction is currently being appealed.

New York

S7694 – NY Gen Bus Law §§ 1500-1508 (effective December 17, 2024): This law makes it illegal to provide an “addictive feed” to an individual without using age verification to confirm that the individual is not a minor or, for minors, without first obtaining parental consent. It also prohibits certain social media platforms from withholding non-addictive feed products or services where that consent is not obtained. In addition, the law prohibits “addictive social media platforms” from sending certain notifications to minors between midnight and 6:00 AM.

Tennessee

HB 1891 – “Protecting Children from Social Media Act,” Tenn. Code Ann. §§ 47-18-5701 – 5706 (effective January 1, 2025): This law requires certain social media companies to verify the age of account holders and requires parental consent before a minor is allowed to become an account holder. Such social media companies are also required to allow parents to view privacy settings, set daily time restrictions, and implement periods in which the minor cannot access the account. There is active litigation contesting the legality of the law. The law is still in force as of the writing of this article.

Texas

HB 18 – Tex. Bus. & Com. Code Ann. §§ 509.002 – 509.152 (effective September 1, 2024): This law requires age verification for social media users. In addition, the law restricts minors from making purchases on social media sites, prohibits the sites from collecting certain identifying information about minors, requires the sites to implement a strategy to prevent exposing minors to content that glorifies bullying, self-harm, and grooming, and requires the sites to build parental tools that allow parents to control minors’ access to the site. Finally, the terms of service must include an explanation of how the algorithm functions, including what personal identifying information is used to provide certain content.

The harm prevention, parental tools, and algorithm requirements are currently enjoined.[7] The Texas Attorney General has appealed the injunction.

Utah

SB 194 – “Utah Minor Protection in Social Media Act” - Utah Code Ann. §§ 13-71-101 – 13-71-401 (effective October 1, 2024): This law requires social media companies to verify the age of all users and requires companies to take certain steps to maintain the privacy of minors and to restrict collection of minors’ data. The law requires parental consent to bypass these requirements. Enforcement of the law is under preliminary injunction as of September 10, 2024.[8] That injunction is being appealed. 

Virginia

SB 854 – Va. Code Ann. § 59.1-577.1 (effective January 1, 2026): This law requires social media companies to perform age verification to determine if users are minors and to limit such minors to one hour per day of use. Only verified parental consent can increase or decrease the one-hour limit.

____________________________________________________________________________________________________


Morris, Manning & Martin, LLP notes that the above list may not be exhaustive or complete. The above information is provided by Morris, Manning & Martin, LLP for general information purposes only and does not constitute legal advice or establish an attorney-client relationship.

____________________________________________________________________________________________________

[1] See NetChoice, LLC v. Griffin, No. 5:23-CV-5105, 2025 WL 978607 (W.D. Ark. Mar. 31, 2025) (granting NetChoice, LLC’s Amended Motion for Summary Judgment on First Amendment vagueness claims).

[2] See X Corp. v. Bonta, 116 F.4th 888 (9th Cir. 2024) (remanding to the U.S. District Court for the Eastern District of California for further analysis of the law’s requirement that social media companies must report whether and how they define six categories of content).

[3] See Moody v. NetChoice, LLC, 603 U.S. 707 (2024).

[4] See generally Computer & Communications Industry Ass’n v. Uthmeier, No. 4:24cv438-MW/MAF, 2025 WL 1570007 (N.D. Fla. Jun. 3, 2025).

[5] See generally NetChoice, LLC v. Carr, No. 1:25-cv-2422-AT, 2025 WL 1768621 (N.D. Ga. Jun. 26, 2025).

[6] See generally NetChoice, LLC v. Fitch, No. 1:24-cv-170-HSO-BWR, 2025 WL 1709668 (S.D. Miss. Jun. 18, 2025) (granting the preliminary injunction in part, to the extent it applies to the eight covered members listed above).

[7] See Computer & Communications Industry Ass’n v. Paxton, 747 F. Supp. 3d 1011 (W.D. Tex. Aug. 30, 2024) (granting a preliminary injunction against HB 18’s monitoring-and-filtering requirements but stating that the remainder of the law is not subject to the injunction).

[8] See generally NetChoice, LLC v. Reyes, 748 F. Supp. 3d 1105 (D. Utah Sept. 10, 2024).