
Social media age verification laws in the United States

From Wikipedia, the free encyclopedia


Social media age verification laws in the United States are laws ostensibly designed to limit young people's access to online content deemed harmful, such as pornography. The design and ultimate intent of such laws are the subject of considerable controversy.

Status of social media age verification laws in the United States

Laws


Many state legislatures have considered or enacted legislation pertaining to young people and social media.

In 2022, California passed the California Age-Appropriate Design Code Act (AB 2273) requiring websites that are likely to be used by minors to estimate visitors' ages.[1][2][3]

On March 23, 2023, Utah Governor Spencer Cox signed SB 152 and HB 311, collectively known as the Utah Social Media Regulation Act, which requires social media platforms to verify user ages; users under 18 must obtain parental consent before creating an account.[4][5][6][7]

Few of these laws have gone into effect, partly due to court challenges.[8][9][10][11][12][13][14][15][16][17][18][19][20]

Arkansas

On April 11, 2023, Arkansas enacted SB 396, the Social Media Safety Act. The law requires certain social media companies that make over $100 million per year to verify the age of new users through a third party and to obtain parental consent for users under 18. It excludes social media companies that allow users to generate short video clips, as well as games.[20][18] The law was set to go into effect in September 2023.[21]

On June 29, 2023, NetChoice sued Arkansas Attorney General Tim Griffin in the U.S. District Court for the Western District of Arkansas to block enforcement of the law,[22][23] supported by the American Civil Liberties Union and the Electronic Frontier Foundation (EFF).[24][25] On July 7, 2023, NetChoice filed a motion for a preliminary injunction to block enforcement of the law.[26] On July 27, Griffin and Tony Allen filed briefs in opposition to the preliminary injunction.[27][28] Judge Timothy L. Brooks granted the preliminary injunction on August 31, reasoning that the law was too vague, that NetChoice's members would suffer irreparable harm if the act went into effect, and that age restrictions were ineffective.[29][30][31]

California

California Age-Appropriate Design Code (AB 2273)

On September 15, 2022, California enacted AB 2273, the California Age-Appropriate Design Code Act.[32][33][3] Its most controversial provisions required online services likely to be used by those under 18 to estimate the age of child users with a "reasonable level of certainty". It also required these services to file Data Protection Impact Assessments (DPIAs) certifying whether an online product, service, or feature could harm children, including by exposing them to potentially harmful content. The law does not define harmful content.[1] Before the bill was signed, EFF sent a veto request to Governor Gavin Newsom.[34]

On December 14, 2022, NetChoice sued.[35] On September 18, 2023, Federal Judge Beth Labson Freeman granted a preliminary injunction.[36][37][10][38] On August 16, 2024, the Ninth Circuit affirmed the injunction against the DPIA section of the law and remanded the rest, because the arguments before the Ninth Circuit had focused mainly on the DPIA.[39][9][40][41]

Protecting Our Kids from Social Media Addiction Act (SB 976)

On September 20, 2024, California enacted SB 976, Protecting Our Kids from Social Media Addiction.[42][43] The law requires online platforms to exclude users under 18 from "addictive" feeds unless parental consent is given. It also bars platforms from sending notifications to users under 18 without parental consent between 12:00 AM and 6:00 AM year-round, or between 8:00 AM and 3:00 PM from September through May (the law does not define "notification"). The law took effect on January 1, 2025, with age verification required as of December 31, 2026.[44][45]

On November 12, NetChoice sued in the Northern District of California; the case was assigned to Judge Edward John Davila.[46][47][48][49] On December 31, Davila blocked the sections of SB 976 imposing time-of-day restrictions. He also enjoined requirements to report the number of minor users and the number of parental assents to access an addictive feed.[50]

He did not block the age assurance requirement or the provision barring minors from seeing addictive feeds without parental consent, reasoning that age assurance running in the background does not restrict adult access to speech and that regulating feeds does not violate the First Amendment because the regulation is content neutral and does not remove any content.[50][51]

On January 1, 2025, NetChoice filed a motion to block the law in full as part of its appeal to the Ninth Circuit, claiming that the court had erred in its reading of the Supreme Court case Moody v. NetChoice by focusing mainly on the concurring opinions rather than the deciding opinion.[52] The same day, Davila ordered that California's response was due by 11:59 pm.[53] California responded that day, arguing that the court should not block the full law, that NetChoice had misread Moody v. NetChoice, and that NetChoice's members would not likely face any harm from the act because members such as X (formerly Twitter) already offer feeds that are not personalized.[54]

On January 2, Davila granted NetChoice's motion to block the full law during the appeals process by delaying the law's effective date from January 1, 2025, to February 1, 2025.[55] That day, NetChoice appealed the case to the Ninth Circuit Court of Appeals.[56]

Florida

On January 5, 2024, Tyler Sirois introduced HB 1, which would ban anyone under 16 from using any social media platform and would require platforms to verify the age of users.[57][58] After the bill passed, the American Civil Liberties Union (ACLU) published a blog post opposing it, arguing that the bill violates the rights of minors and adults.[59][60] Governor Ron DeSantis vetoed the bill on March 1, 2024, saying the State Legislature would enact a better alternative.[61][62] Its successor, HB 3, lowered the minimum age from 16 to 14, allowing minors aged 14 and 15 to create social media accounts with parental consent. Florida enacted HB 3 on March 25, 2024, and it took effect on January 1, 2025.[63][64] A 1,150% surge in VPN demand in Florida was detected after the law took effect; VPN services can be used to circumvent the law.[65]

On October 28, 2024, NetChoice and the Computer and Communications Industry Association sued; the case was assigned to Chief Judge Mark E. Walker.[66][67][68] On February 28, 2025, arguments were heard on the motion for a preliminary injunction. Walker seemed skeptical of Florida's argument that the law did not violate the First Amendment and said the State would have a hard time justifying a complete ban on youth under 14 using social media.[69][70][71] On March 13, Walker denied the motion because the plaintiffs had not proven that at least one of their members operated a platform on which at least 10 percent of users under 16 spent at least two hours per day.[72] The plaintiffs filed an amended complaint and a renewed motion for a preliminary injunction, which was granted on June 3 on the ground that the law fails First Amendment intermediate scrutiny. The injunction left in force the provision allowing parents to request termination of their child's social media account.[73]

Georgia

On April 23, 2024, Georgia enacted SB 351, which became Act 463.[74][75] Act 463 requires social media platforms to verify users' ages and requires users under 16 to have parental consent before creating an account. It also requires schools to ban all social media platforms, including YouTube.[76][77] Before the law was signed, NetChoice sent Governor Brian Kemp a veto request claiming the law was unconstitutional and bad policy.[78] After the bill was enacted, the ACLU and NetChoice criticized it.[79][80]

NetChoice sued two months before the law's effective date; the case was assigned to Judge Amy Totenberg. The suit claims that the law violates the First and Fourteenth Amendments.[81]

Louisiana

Secure Online Child Interaction and Age Limitation Act (SB 162)

On June 28, 2023, Louisiana enacted SB 162, the Secure Online Child Interaction and Age Limitation Act.[82] It requires social media platforms to verify user age and get parental consent for users under 16, prohibits account holders under 16 from messaging adults unless they are already connected, and prohibits the display of advertising based on user data and the unnecessary collection of personal information. A parent or guardian of young users is permitted to monitor the child's account.[83][84][85]

The law excludes online email, video games, streaming services, news, sports, and entertainment, as long as the content is not user generated. The law is administered by the Department of Justice of Louisiana, effective July 1, 2024.[15][86][85] However, HB 577, signed on June 18, 2024, delayed the effective date to July 1, 2025, and amended the Act to ban targeted advertising to minors under 16.[87][88]

On March 18, 2025, NetChoice sued Attorney General Liz Murrill and Mike Durpee, Director of the Public Protection Division of the Louisiana Department of Justice, in the U.S. District Court for the Middle District of Louisiana.[89]

HB 61

Louisiana enacted HB 61 the same day as SB 162. The law requires parental consent for anyone under 18 before making an account on an "interactive computer service". It took effect on August 1, 2024.[90][91][92][93]

NetChoice testified in opposition to both laws and sent a veto request for HB 61.[94][95]

Mississippi

On April 30, 2024, Mississippi enacted HB 1126, the Walker Montgomery Protecting Children Online Act.[96]

Section 4 requires digital service providers (DSPs) to make a commercially reasonable effort to verify the age of anyone who wants to make an account in the state of Mississippi and requires parental consent if under 18.[97]

Section 5 requires DSPs to limit collection of minors' personal data, and not collect a minor's geolocation data or display targeted advertising not suitable for minors.[97]

Section 6 requires DSPs to "prevent and mitigate" the posting of harmful content about issues such as eating disorders and substance abuse as well as any illegal activity.[97]

On June 7, 2024, NetChoice sued in the U.S. District Court for the Southern District of Mississippi to block enforcement of the law before it took effect on July 1, 2024.[98] On June 18, EFF filed a brief in the case in favor of a preliminary injunction.[99] The state responded the same day.[100] On June 21, NetChoice filed its reply brief.[101] On July 1, Federal Judge Halil Suleyman Ozerden granted a preliminary injunction.[102][103][104][105]

The case was appealed to the 5th Circuit Court of Appeals on July 5.[106] On April 17, 2025, the Fifth Circuit lifted the preliminary injunction and remanded the case because the District Court did not apply the right standard of review for a facial challenge under Moody v. NetChoice. The opinion was written by Patrick Higginbotham and was joined by Don Willett. Judge James Ho wrote a concurring opinion saying he thought Clarence Thomas' dissent in Brown v. Entertainment Merchants Association was correct; however, he could not apply it since it was not the opinion of the court.[107]

Nebraska

On May 20, 2025, Nebraska enacted LB 383, the Parental Rights in Social Media Act, which requires social media companies to verify age and obtain parental consent for anyone under 18. Parents are allowed to view the profiles of their minor children. Fines on violators are up to $2,500 per violation, accompanied by a private right of action. The law takes effect July 1, 2026.[108][109]

New York

On June 20, 2024, New York enacted S7694A, the SAFE For Kids Act.[110][111][112] The law requires operators to use age determination technology and not give addictive feeds to anyone under 18 absent parental consent. It bars operators from sending notifications to minors' accounts between 12:00 AM and 6:00 AM without parental consent.[113] The law takes effect 180 days after the Attorney General of New York issues the necessary rules and regulations. Violators face up to a $5,000 fine per violation.[113][111]

The law was criticized by EFF and NetChoice because of its age verification requirement; however, neither has yet sued New York over it.[114][115][116]

Ohio

HB 33

On July 4, 2023, Ohio enacted HB 33, the state budget bill. One part of that bill was the Social Media Parental Notification Act, which requires online gaming and social media platforms likely to be used by children under 16 to obtain parental consent before such users can make a contract on the platform. The law took effect on January 15, 2024.[117][118][119][120] Governor Mike DeWine and Lieutenant Governor Jon Husted both advocated for adding the measure to the 2024–2025 budget bill.[121]

On January 5, 2024, NetChoice sued in the U.S. District Court for the Southern District of Ohio, claiming the law was unconstitutionally vague and violated the First Amendment and the Due Process Clause of the Fourteenth Amendment.[122] On January 9, Chief Judge Algenon L. Marbley granted a temporary restraining order, temporarily blocking the law.[123][124][125] On January 19, Husted filed a brief in opposition to a preliminary injunction, claiming that the law protects minors' mental health and privacy, protects them from predators, and that Ohio had a compelling interest in it.[126] On January 26, NetChoice filed another brief.[127] The Attorney General then submitted a reply brief.[128]

On February 7, a hearing on NetChoice's motion was held.[129] On February 12, Chief Judge Algenon L. Marbley granted NetChoice's motion.[130][131][8][132]

HB 96

On June 30, 2025, Ohio enacted HB 96, the state budget bill for fiscal years 2026–2027. The bill contained the Ohio Innocence Act, which requires websites to verify user ages to prevent minors from viewing harmful content.[133][134] The law targets pornography sites and other sites whose content is primarily explicit and that make a significant amount of money from that content.[135] Age verification takes effect on September 30, 2025.[136]

Tennessee

On May 2, 2024, Tennessee enacted HB 1891, the Protecting Kids From Social Media Act.[137][138][139] The law requires social media companies to verify, through a third party, the age of all users within 14 days of their attempting to access an existing account; users under 18 must obtain parental consent. Parents are allowed to view privacy settings on their children's accounts, set time restrictions, and implement breaks during which the minor cannot access the account. The law took effect January 1, 2025.[140][141]

On October 3, 2024, NetChoice sued in the U.S. District Court for the Middle District of Tennessee.[142][143][144] The case was assigned to Chief Judge William L. Campbell Jr.[145][146]

Texas

On June 13, 2023, Texas enacted HB 18, the SCOPE Act.[147][148][149][150] The law requires minors to obtain parental consent using a commercially reasonable method.[151]

Minors are not allowed to make purchases or engage in other financial transactions. DSPs are not allowed to collect minors' precise location or display targeted advertising to them.[151]

DSPs are required to prevent minors' exposure to harmful material and content that promotes, glorifies, or facilitates suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography, or other sexual exploitation or abuse.[151]

The bill was criticized by the Chamber of Progress, which argued that the requirement to filter out "grooming" content could sweep in LGBTQ content and that the bill would have an isolating effect on LGBTQ minors.[152]

On July 30, 2024, the Computer and Communications Industry Association and NetChoice sued in the U.S. District Court for the Western District of Texas.[153]

On August 16, 2024, the Foundation for Individual Rights and Expression (FIRE) helped four plaintiffs file a separate suit.[154]

On August 30, Federal Judge Robert Pitman granted the Computer and Communications Industry Association and NetChoice a preliminary injunction against the law's harmful-to-minors provisions.[155][156][157][158]

Utah

On March 23, 2023, Utah enacted SB 152 and HB 311, collectively the Utah Social Media Regulation Act.[4][6][7][5] SB 152 requires social media platforms with at least 5 million accounts to verify the age of all account holders. Users under 18 must obtain parental consent, and the parent is allowed to view all posts and messages sent to the youth.[5] SB 152 prohibits direct messaging with users not linked to the minor's account, prohibits the display of targeted advertising to minors, and bars minors from accessing social media between 10:30 PM and 6:30 AM Mountain Standard Time.[5]

HB 311 creates a private right of action allowing parents to sue social media companies for causing addiction or harm to minors, with a presumption that, if the minor is under 16, the platform caused the harm.[4]

On December 18, 2023, NetChoice sued, arguing that the law was preempted by federal law, was unconstitutionally vague, and was in violation of the First Amendment and Due Process Clause of the Fourteenth Amendment. On December 20, NetChoice requested a preliminary injunction.[159][160][161][162] On January 19, 2024, Attorney General Sean Reyes announced that the law's effective date was delayed from March 2024 to October 2024 and that they would repeal and replace the law.[163]

SB 194 and HB 464, amending the act, were enacted on March 13, 2024. The amendments removed the curfew restriction and required parental consent only if a minor changed their privacy settings. They also replaced age verification with age assurance that is at least 95% accurate.[164][165]

NetChoice updated its complaint and motion on May 3, 2024.[166][167] Utah filed its opposition to the motion on May 31.[168] On July 22, Chief Judge Robert J. Shelby granted in part the state's motion to dismiss, ruling that the law did not violate Section 230 and therefore was not preempted by federal law.[169][170] On September 10, 2024, Shelby granted NetChoice's motion for a preliminary injunction.[171][172][11][173] The case was appealed to the 10th Circuit Court of Appeals on October 11; a week later, the district court stayed the injunction.[174][175]

Virginia

On May 2, 2025, Virginia enacted SB 854, an amendment to the Virginia Consumer Data Protection Act that requires social media platforms to use commercially reasonable efforts to determine a user's age; users under 16 are limited to one hour per day per application without parental consent. The parent can increase or decrease the amount of time allowed on a given platform. Violators face a fine of up to $7,500 per incident. Because it is part of the Virginia Consumer Data Protection Act, the law allows a 30-day cure period. The law is scheduled to take effect on January 1, 2026.[176][177][178][179][180]


Vetoed legislation


Colorado

On January 23, 2025, SB 25-086, Protections for Users of Social Media, was introduced. The bill requires certain social media platforms to report on how they respond to selected conduct on their services, such as the illegal sale of drugs or guns; covered conduct includes users misrepresenting their age.[181][182] The bill passed the Senate on February 26 by a vote of 28–5 and the House on March 31 by a vote of 46–18.[183][184]

On April 24, Governor Jared Polis vetoed the bill, calling it flawed and an erosion of privacy. The next day, the Senate voted 29–6 to overturn the veto.[185] The override attempt failed in the House on a 51–13 vote.[186][187]

Vermont

On January 9, 2024, H. 712, the age-appropriate design code, was introduced to the Vermont General Assembly.[188]

The bill requires services likely to be accessed by minors to act in the best interest of minors, which means that such services should process minors' data in a way that does not cause them physical, psychological or financial harm, and does not discriminate based on race, color, religion, national origin, disability, sex, or sexual orientation.[189]

The bill requires covered entities to conduct impact assessments for services that are likely to be accessed by minors to determine whether they would lead minors to become exposed to harmful or discriminatory contacts or conduct. The entities are required to provide any privacy information, terms of service, policies, and community standards concisely and use language that minors can understand.[189]

Covered entities may not collect a minor's precise geolocation information, use dark patterns, or profile a minor by default unless necessary, and must estimate user ages.[189]

Penalties are up to $2,500 per affected user and up to $7,500 per affected user for intentional violations.[189]

On January 17, S. 289, a modified companion bill, was introduced in the Vermont Senate.[190] On March 19, S. 289 passed the Senate by a vote of 27–0.[191][192][193] On June 7, H. 121 passed the House. On June 13, Governor Phil Scott vetoed H. 121 because it included a private right of action and because the Kids Code section of the bill was similar to a California law that had been enjoined for likely violating the First Amendment.[194][195] On June 17, his veto was sustained.[196]


Proposed legislation


Alabama

On February 6, 2025, HB 235 was introduced in the Alabama House of Representatives.[197][198] The bill covers any online service that allows users to upload content or view other users' content and that employs algorithms analyzing user data or information on users to present content. Services meeting both criteria must prohibit anyone under 16 from using the service and must verify user ages.[198]

Violations are considered a deceptive trade practice actionable under Chapter 19 of Title 8 of the Code of Alabama. Violations can result in fines of up to $25,000 per violation and a Class A misdemeanor, which can carry up to one year of imprisonment.[198][199][200] An online service can be fined an additional $50,000 per violation.[197][198]

The bill would take effect January 1, 2026.[198]

Alaska

On January 16, 2024, HB 271, the Alaska Social Media Regulation Act, was introduced. The bill requires social media platforms to verify user ages and requires minors to have parental consent.[201]

The bill prohibits an online platform from displaying, sending, or targeting an advertisement to a minor or using data collected from a minor for advertising purposes.[202][203]

The bill prohibits an online platform from:

  • using an algorithm, artificial intelligence, machine learning, or other technology to select, recommend, rank, or personalize content for a minor based on the minor's profile, preferences, behavior, location, or other data.[202][203]
  • employing a feature, design, or mechanism that encourages or rewards a minor's excessive or compulsive usage, or that exploits minors' psychological vulnerabilities.[202][203]
  • allowing minors to use the platform between 10:30 pm and 6:30 am Alaska Standard Time.[202][203]

The bill would be enforced by a private right of action and by the state.[202][203]

The bill received its first reading on January 16 but died in the Labor & Commerce and Judiciary Committees.[201][204]

The R Street Institute opposed HB 271, claiming it "would almost certainly be found unconstitutional on several different counts" as well as that it would be governmental intrusion and was overly broad in its definitions.[205]

Arizona

On February 8, 2024, Seth Blattman introduced HB2858, the Protecting Children on Social Media Act, in the House, where it failed. The bill would have required social media platforms to:

  • establish default settings for the online service, product, or feature that provide the maximum degree of privacy protection to each user,
  • allow minors to opt out of the collection and use of their personal information,
  • prohibit using a minor's personal information in targeted advertising,
  • develop content filters for users to limit cyberbullying (later removed),[206][207][208][209]
  • prohibit non-minor users from sending a message to a minor (later removed),[207][210][208][209]
  • obtain parental consent for users under 16 (later removed).[207][210][208][209]

On January 13, 2025, Nick Kupper introduced HB2112 in the House.[211] Arizona Governor Katie Hobbs signed the bill into law on May 16, 2025.[212][213] The Act requires any commercial website where over one third of content is "sexual material harmful to minors" to:

  • Require users to either provide digital identification or utilize a commercial age verification system, that may use either government-issued identification or transactional data,[214][215]
    • Those who perform the age verification may not retain any identifying information on individuals,
    • Those who perform age verification may not cause or allow any identifying information of the individual to be directly or indirectly transmitted to any federal, state, or local entity (later added),
  • Display mandatory notices on their websites describing the harms of pornography and include helpline contact information (later removed).

The Act authorizes a court to impose civil penalties against an entity in violation of this Act, in an amount up to:[214][215]

  1. $10,000 per day for failure to comply with age verification;
  2. $10,000 per instance of retaining identifying user data;
  3. $250,000 if one or more minors accesses pornography due to the entity's violation of the age verification requirements of the Act.

Penalties against an entity in violation of this Act are awarded to a successful plaintiff (later added).[215]

The Act specifies that it doesn't apply to news organizations, internet service providers, search engines and cloud services. It also specifies that animated and simulated sexual acts, displayed or described, are included under what is deemed "sexual material that is harmful to minors."[214]

Connecticut

On February 5, 2025, HB 6587 was introduced to the General Assembly.[216] A public hearing was held on February 10.[216] Attorney General William Tong submitted testimony in support, saying that the bill was needed to protect kids from addictive algorithms, that the bill would require social media companies to delete information from the age verification and parental consent process, and that many companies already have age verification for some of their services, easing compliance.[217]

The bill requires operators of social media platforms to:

  • use commercially reasonable methods to determine whether a user is a minor and, if so, obtain parental consent before the user may access any recommendation system,
  • delete any data used to determine age or obtain parental consent,[218][219]
  • not send notifications to minors between 12 am and 6 am Eastern Standard Time, and give parents the option to stop notifications entirely,[218][220]
  • report annually to the Attorney General the number of covered users, the number of users who obtained parental consent, the number of users who had default settings turned on and off, and the average amount of time those users spend on the platform, broken down by age and hours of use.[218]

The bill would take effect on July 1, 2026, except that the transparency section would take effect on March 1, 2027.[218]

Idaho

Senate Bill 1417, the Parental Rights in Social Media Act, was introduced on March 8, 2024.[221]

The bill applies to social media platforms with at least five million users that allow users to create a profile, upload posts, and interact with others' posts, while excluding email, streaming services, online gaming, cloud storage services, academic or scholarly research, and professional news, sports, or entertainment. Minor users must obtain parental consent.[222][223]

The bill would be enforced by a private right of action or by the state. Violators can be fined up to $5,000 per violation, or $2,500 for each incident of harm or actual damages, for addiction-related, financial, physical, and emotional harm incurred by a minor user.[222][223]

The bill died in the State Affairs committee. It would have taken effect on January 1, 2025.[221][222][223]

Illinois

On February 8, 2024, Willie Preston introduced SB 3440, the Parental Consent for Social Media Act. The bill would have required social media companies that make more than $100 million per year to:

  • perform reasonable age verification by a third party either by a government-issued identification or any commercially reasonable age verification method,
  • obtain parental consent for minor accounts,[224][225]
  • enforce a curfew for minors between 10 pm and 6 am Central Standard Time.[225][224]

It excludes email, direct messaging, streaming services, online shopping or e-commerce, cloud storage, visualization platforms, libraries or hubs, technical support for a social media company's platform, products, or services, academic or scholarly research, and professional news, sports, entertainment, or other content. The bill permits comments on a digital news website, as long as content is posted only by the provider.[224]

The bill had its first reading and was referred to Assignments.[225]

Minor User of Social Media Protection Act (SB 3510)

On February 9, Laura Fine introduced SB 3510, the Minor User of Social Media Protection Act.[225] The bill requires social media platforms with sales greater than $100 million per year to:

  • verify the age of users and, if a user is under 13 (a child), obtain parental consent,
  • not use children's information for targeted advertising,
  • enforce a curfew for children between 10 pm and 6 am Central Standard Time.[226]

The bill differs from SB 3440 in that it applies a lower age limit.[224][226] The bill had its first reading, but did not pass.[225]

Indiana

HB 1314

On January 10, 2024, Johanna King introduced HB 1314 to the House.[227] The bill requires social media services to:

  • verify user ages via a reasonable method,
  • require parental consent for minors after 14 days,[228]
  • not recommend content to minors' accounts or disseminate advertising to them,
  • enforce a curfew for minors from 10:30 pm to 6:30 am Eastern Standard Time,
  • prevent minors from changing or configuring an account,[228]
  • allow a parent to view all account activity, configure the account, and limit the number of daily usage hours.[228]

The bill would be enforced by a private right of action and by the state.[228] The bill died in committee.[229]

SB 11

On January 8, 2025, Mike Bohacek introduced SB 11.[230][231] The bill requires social media operators to:

  • identify all users,
  • obtain parental consent for users who are under 16 and notify the parent that consent can be revoked,
  • secure and encrypt data collected for parental consent.[232]

Each violation would have faced a fine of up to $250,000, plus the cost of the investigation, following a 90-day period in which the defendant could cure the violation.[232][233][234] The law would have taken effect on July 1, 2025.[232]

On January 15, the bill passed out of Indiana's Judiciary Committee by a vote of 10–1.[235] The bill passed the Indiana Senate by a vote of 42–7 on January 23.[236][237]

Iowa

On February 14, 2024, House File 2523 was introduced.[238] The bill applies to social media platforms, excluding interactive gaming, virtual gaming, and online services that allow the creation and uploading of content for the purposes of interactive gaming, educational entertainment, or associated entertainment, along with communication related to such content.[239]

The bill requires social media platforms to:

  • obtain parental consent for minors to become users,[239]
  • allow the parent or guardian to view all posts created by the minor, view all messages sent or received, control the privacy and account settings, and monitor and limit the amount of time the minor spends on the platform.[239]

The bill would be enforced by the state and by a private right of action.[239]

The bill passed the House by a vote of 88–6 but died in the Senate.[238][240][241]

Kentucky

On February 1, 2024, House Bill 450 was introduced in the Kentucky Legislature.[242]

The bill requires social media platforms (excluding email, search engines, cloud storage, product review sites, broadband internet services, and services that consist primarily of information or content that is not user-generated)[243] to:

  • verify user ages via a digitized identification card, including a copy of a driver's license, government-issued identification, financial documents, or other documents that are reliable proxies for age, or via any other reliable method through a trusted third-party vendor,
  • obtain parental consent for minor users,[243]
  • allow parents to view all posts and messages sent or received by the minor, control privacy and account settings, and monitor and limit the amount of time the minor spends on the platform.[243]

The bill would be enforced by the state and by a private right of action.[243]

Michigan

On September 11, 2024, Mark Tisdel, Donni Steele, and Tom Kuhn introduced HB 5920 to the Michigan Legislature.[244]

The bill requires social media companies that have at least 5 million accounts to:

  • verify the age of all users within 14 days,
  • obtain parental consent for minor users,[245][246]
  • ensure that minors' accounts appear in search results only for users with whom they are friends,[245][246]
  • not target or suggest groups, services, products, posts, accounts, or users to minors' accounts,
  • make no use of minors' personal information,[245][246]
  • allow parents to access minors' accounts and view all posts and messages sent or received,[245][246]
  • enforce a curfew for minors from 10:30 pm to 6:30 am.[245][246]
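Several of these bills specify curfew windows such as 10:30 pm to 6:30 am, which wrap past midnight, so a naive `start <= t <= end` comparison does not work. A minimal sketch of the containment test (purely illustrative; the function and constant names are not taken from any bill):

```python
from datetime import time

# Curfew window that wraps past midnight (10:30 pm – 6:30 am);
# values are illustrative, matching the window described above.
CURFEW_START = time(22, 30)
CURFEW_END = time(6, 30)

def in_curfew(t: time, start: time = CURFEW_START, end: time = CURFEW_END) -> bool:
    """Return True if t falls inside the curfew window."""
    if start <= end:
        # Ordinary window contained within a single day.
        return start <= t < end
    # A window that wraps midnight is the union of [start, midnight)
    # and [midnight, end), so membership is an OR, not an AND.
    return t >= start or t < end
```

The wrap-around branch is the easy-to-miss case: 11 pm and 3 am are both inside the 10:30 pm–6:30 am window even though neither satisfies `start <= t <= end`.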

The Attorney General (AG) is responsible for establishing rules implementing these requirements, but may not limit age verification to requiring a valid government identification card.[245][246] The bill includes a private right of action.[245][246]

The bill would take effect 180 days after enactment.[245][246]

Minnesota

On February 24, 2022, HF 3724 was introduced to the House.[247]

It would require social media platforms with at least 1 million users to:

  • avoid using algorithmic recommendation systems to present content to any minor, save for government or school-created content,[248] except to block access to inappropriate or harmful content, or comply with parental controls that filter content for age-appropriate materials.[249]

The bill made it through its first and second readings in the House.[250][251] Enforcement would be by the state or county.[252][253][254]

Missouri

On January 2, 2024, Josh Hurlbert introduced HB 2157 to the House.[255]

The bill requires social media platforms[256] to:

  • verify the age of all users,
  • obtain parental consent for minors,[256]
  • enforce a 10:30 pm – 6:30 am curfew for minors,
  • prevent minors' accounts from appearing in search results for users who are not linked to those accounts, and not show minors any advertising, collect or use minors' personal information, or promote, target, or suggest groups, services, products, posts, accounts, or users to them,[256]
  • allow parents and guardians to view all posts and messages sent or received,[256] set time-of-day restrictions, including modifying curfew hours, and limit the number of hours per day the minor can use the account.[256]

The bill excludes from its definition of social media platform: electronic mail; direct messaging services; streaming services; news, sports, entertainment, or other content that is preselected by the provider and not user-generated; online shopping or e-commerce; interactive or virtual gaming; photo editing services; professional creative networks made for artistic content; single-purpose community groups for public safety; cloud storage and document collaboration services; access to or interaction with data visualization platforms, libraries, or hubs; providing or obtaining technical support for a platform, product, or service; and academic, scholarly, or genealogical research.[256]

It would have directed the Attorney General to establish rules for age verification, establish requirements for retaining, protecting, and securely disposing of information obtained by the age verification process, require that the information from the age verification process is retained for the purpose of compliance and not be used for any other purposes.[256]

The bill was to be enforced by the state and by a private right of action, with a penalty of $2,500 per violation.[256]

It would have taken effect on July 1, 2025.[256]

Nevada

On November 20, 2024, SB 63 was introduced to the Nevada Legislature.[257]

The bill defines social media platforms as those that allow users to establish an account, and create, share, and view user-generated content.[258] The bill requires platforms to:

  • verify user ages,
  • obtain parental consent for users aged 13–17,
  • prevent individuals under 13 from creating an account,[258]
  • not use minors' personal information in algorithmic recommendation systems,[258]
  • suspend notifications to minors from 12 am to 6 am, and from 8 am to 3 pm Pacific Standard Time on weekdays from September to May, absent parental consent,[258]
  • disable features such as infinite scrolling, the display of interactive metrics (icons or emoticons that tally or reveal user interactions with content), autoplay, and livestreaming.[258]

The bill would task the Nevada Department of Health and Human Services with recommending methods for obtaining parental consent and assessing whether age verification systems are at least 95 percent effective at determining user ages.[258]

The bill would be enforced by the state and would go into effect October 1, 2025.[258]

New Mexico

On February 2, 2023, SB 319 was introduced in the Senate.[259] The bill requires online service providers that are likely to be accessed by minors to:

  • estimate the age of their users with a "reasonable level of certainty",
  • conduct data protection impact assessments to assess whether their services expose minors to harmful content or contacts,
  • not collect or share personal data or profiles of minors, including geolocation.[260][261]

Negligent violations would incur fines of up to $2,500 per affected minor, and intentional violations fines of up to $7,500 per affected minor. Enforcement is by the state.[260][261]

North Carolina

On April 17, 2023, HB 644, the Social Media Algorithmic Control in IT Act, was introduced in the House.[262]

The bill requires social media platforms hosting over one million users to:[263]

  • verify user ages,
  • exclude minors' user data from content recommendations,
  • allow content recommendations that are a direct result of explicit user actions,
  • exclude North Carolina minor users from algorithmically targeted advertising or promotions, while allowing those based upon explicit user actions, such as a search,
  • make a succinct (fewer than 250 words) privacy policy accessible and disclose how user data will be used by the platform,
  • incorporate user data in algorithmic recommendations only after the user has been notified and consents,
  • fully disclose data access to be used in algorithmic recommendations, including by third parties, separately from terms-of-service information,
  • provide full functionality for users who do not consent to their user data being used in algorithmic recommendations,
  • provide the Consumer Protection Division of the Department of Justice with the platform's privacy policy and certify that the platform complies with the law.

The bill would have established the North Carolina Data Privacy Task Force within the Department of Justice and would have been enforced by the state.[263]

The bill died in the legislature.[263]

Oklahoma


On February 5, 2024, Chad Caldwell introduced HB 3914 to the House.[269] The bill defines social media platforms as services with annual revenues exceeding $100 million that enable users to create public profiles, establish accounts, and create, upload, or interact with content, excluding subscription-based services focused on gaming, entertainment, email, or cloud storage where social interaction accounts for less than 25% of revenue.[264] The bill mandates that social media platforms:

  • Prohibit users under 16 from creating accounts,
  • Obtain parental consent for users aged 16–17,
  • Verify user ages via third-party digital identification,
  • Safeguard minors’ personal information,
  • Avoid profiling minors unless essential for service provision and without posing harm,
  • Refrain from collecting, selling, or sharing minors’ personal data beyond what is necessary for the service, unless justified without risk to minors,
  • Avoid collecting precise geolocation data unless strictly necessary, with clear notification to minors during collection,
  • Prohibit dark patterns that encourage minors to share excessive personal data or forgo privacy protections,
  • Limit use of personal data for age estimation and delete it promptly after verification.[264]

The state would enforce the bill, with a 45-day compliance period; non-compliant platforms would face fines of up to $2,500 per violation, plus court costs, attorney fees, or damages resulting from a minor accessing a platform without parental consent.[264] The bill passed the Oklahoma House of Representatives on March 14, 2024, by a 69–16 vote,[265][266][267] and reached its second reading in the Senate but died there.[268] It would have taken effect on July 1, 2024.[264]

Oregon

On January 9, 2023, Senator Chris Gorsek introduced SB 196 to the Oregon Senate.[270] The bill mandates that online businesses likely to be accessed by children under 18:

  • Estimate user ages with reasonable accuracy,
  • Conduct Data Protection Impact Assessments to evaluate whether their services use children's personal information, expose them to harmful content, enable harmful interactions, or deliver harmful advertisements.[271]

The bill would have established an Age-Appropriate Design Task Force, comprising eight members with expertise in children's health, legal rights, data privacy, or internet science to study best practices to:[271]

  • Prioritize children's interests,
  • Assess how online service designs affect children,
  • Mitigate risks from data practices,
  • Publish clear privacy policies for children.

The state would have enforced the bill, imposing fines up to $2,500 per negligent violation and $7,500 per intentional violation.[271] The bill reached the Senate Judiciary Committee on January 13, 2023, but failed to pass.[270][271]

Pennsylvania

On February 20, 2024, Carolyn Comitta introduced H 2017 to the Pennsylvania House of Representatives.[272] The bill mandates social media companies to:

  • Verify account holders' ages,
  • Obtain parental consent for users under 16, maintaining documentation of consent,
  • Allow parents of minors under 16 to view privacy settings, receive notifications of reported content, and revoke consent at any time.[273]

Companies must remove harmful content, enable users to report such content, and allow minors under 18 to opt out of personalized recommendation systems. They cannot use manipulative design tactics, mine data unnecessarily, process geolocation data by default, or permit unknown adults to contact minors without consent.[273] Companies must delete minors' collected personal data upon request within 30 days and notify the minor of deletion within 90 business days.[273]

Rhode Island

On February 5, 2025, Joseph McNamara introduced H 5291 to the General Assembly.[274] The bill mandates social media services with over 5 million users to:

  • Verify user ages within 14 days,
  • Obtain parental consent for users under 18,
  • Exclude minors from search results, unlinked messaging, advertisements, and recommendations of accounts, groups, posts, products, or services,
  • Prohibit selling minors' personal data,[274]
  • Provide parents or guardians with a password to access a minor's account, allowing them to view all posts and messages,[274]
  • Enforce a curfew for minors from 10:30 p.m. to 6:30 a.m., modifiable by parents.[274]

The state enforces the bill. Violators face fines up to $2,500 per violation, enforceable through a private right of action.[274]

South Carolina

On December 5, 2024, Representative Micah Caskey introduced HB 3431 as a prefile for the 126th General Assembly.[15] The bill comprises two sections: Age-Appropriate Design and Social Media Regulation.

Age-appropriate design

Covered social media platforms likely accessed by minors must:

  • Design features to reduce risks of compulsive use, mental health issues (e.g., anxiety, depression, self-harm), privacy intrusions, identity theft, or discrimination,
  • Provide minors with tools to limit time, hide location, or restrict communication,
  • Prevent notifications to minors from 10 p.m. to 6 a.m. EST or 8 a.m. to 3 p.m. during August to May,
  • Avoid profiling minors or using manipulative design tactics unless essential,
  • Offer parents tools to manage minor account settings,
  • Publish annual transparency reports by July 1.[86]

Social media regulation

Covered platforms must:

  • Verify user ages, requiring parental consent for those under 18,
  • Prohibit adults from messaging minors directly or targeting them with advertisements,
  • Ban collection of minors’ personal data,
  • Implement policies to block content promoting self-harm or illegal activities,
  • Prevent minors from bypassing restrictions via VPNs or proxies.[86]

The bill also directs the South Carolina Department of Education to develop programs to educate students on online safety.[86]

Violations of the Age-Appropriate Design section constitute deceptive practices under South Carolina law. Social Media Regulation violations incur fines up to $2,500 per violation, plus damages for financial, physical, or emotional harm, enforced by the state and a private right of action.[86] On February 20, 2025, the bill passed the House by an 89–14 vote.[275][276][277]

South Dakota

In October 2024, South Dakota lawmakers announced plans to introduce legislation requiring age verification and parental consent for minors to access any app on an app store, with supporters calling it a good approach to regulating social media.[278]

On January 30, 2025, Senator Michael Rohl introduced SB 180 to the South Dakota Legislature.[280] The bill requires app stores to have four age categories:[279]

  • Adult – 18 years of age or older
  • Older teen – 16 or 17 years of age
  • Younger teen – 13 to 15 years of age
  • Child – Under 13 years of age
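The four categories above amount to a simple age-bucketing rule; a minimal sketch (the function and label names are illustrative, not taken from the bill text):

```python
# Age buckets defined by South Dakota SB 180; the function and
# label names here are illustrative rather than statutory.
def age_category(age: int) -> str:
    """Map an age in years to the bill's four app-store categories."""
    if age >= 18:
        return "adult"
    if age >= 16:
        return "older teen"
    if age >= 13:
        return "younger teen"
    return "child"
```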

The bill requires app stores to verify the age of anyone who attempts to download an app. Minors under 18 must have parental consent, their parents will be notified when they download an app, and the store must not enforce its contract or terms of service against a minor without parental consent. Companies must not knowingly misrepresent information collected through the parental consent process or share age category data.

The bill is enforced by a private right of action. However, an amendment to the bill would make some sections enforceable under Section 37-24-6 of South Dakota code.[280][281]

A violation under Section 37-24-6 of South Dakota code is a Class 1 misdemeanor, carrying up to a year in county jail, if the violation is under $1,000; a Class 6 felony, carrying up to 2 years' imprisonment, if the violation is between $1,000 and $100,000; and a Class 5 felony, carrying up to 5 years' imprisonment, if the violation exceeds $100,000.[282][283] The Attorney General of South Dakota can enforce Section 37-24-6 of South Dakota Code.[284]

The bill takes effect on January 1, 2026, if enacted into law.[280]


Washington

On February 4, 2025, Representative Shelley Kloba introduced HB 1834 to the Washington Legislature.[288] The bill mandates social media platforms likely accessed by minors to:

  • Restrict minors’ access to addictive feeds, regardless of parental consent,
  • Verify user ages before providing addictive feeds,
  • Avoid profiling minors unless essential,
  • Prohibit collecting or sharing minors’ personal or geolocation data,
  • Prevent using manipulative design tactics on minors,
  • Block notifications to minors from 12 a.m. to 6 a.m. and from 8 a.m. to 3 p.m., absent parental consent.[289]

If enacted, the bill would take effect on January 1, 2026.[289] On February 21, 2025, the bill advanced from committee.[290]


Criticism

Groups such as EFF, ACLU and NetChoice have criticized social media age verification laws due to privacy risks, free speech burdens, and ineffective operation.[291][292][293][294]

  • Age verification has been criticized over concerns of lost privacy and anonymity, unconstitutionality, and ineffectiveness, including the exclusion of those without sufficient documentation.[295][296][297][298]
  • Technologies such as VPN can be used to circumvent geography-based protections.[299][300]
  • Loss of anonymity can chill free speech.[301][300][296][297]
  • EFF claimed that improved platform design could protect youth without compromising privacy.[300]
  • Enforcement difficulties, cultural and global variations in age norms, and potential overreach may limit beneficial access.[302] Education, dialogue, and platform accountability have been recommended over technical approaches.[298]


References
