The year 2025 is certain to be a watershed for social media legislation and litigation. As it continues to shape how we connect, share, and consume information, social media remains at the forefront of public discourse due to privacy concerns as well as its potentially addictive nature and impact on mental health, particularly among young users. Lawmakers and regulators around the globe have responded with a flurry of regulatory efforts and lawsuits aimed at mitigating these risks and implementing age-related protections.
While tech moves at the speed of light, legislation moves much more slowly. As a result, both federal and state lawmakers find themselves scrambling to play catch-up with an ever-changing tech landscape, resulting in a patchwork of well-intentioned but often vague and overly burdensome regulations. In some cases, neither the end users, the online platforms, nor even the lawmakers themselves fully understand what is or is not legally compliant on any given day. Look no further than the recent TikTok ban for a textbook example.
With a new U.S. administration in place and many unresolved legal cases set for high court rulings, we may see a bit more standardization of these laws in the coming months. Let’s look at where we are and what may be in store.
ONLINE SAFETY FOR THE AGES
In 2025, a major area of focus is the push for online safety measures, including age verification or restrictions, parental consent mandates, data privacy requirements, and limits on data collection and targeted advertising.
Legislative efforts generally fall into several categories. Age verification involves implementing systems to verify a user’s age before allowing them to access specific online content or services. Consumer privacy laws impose restrictions on processing personal data, often requiring safeguards such as opt-in consent and data processing requirements. COPPA-style laws and age-appropriate design laws expand protections for minors by mandating stricter data collection limits and platform design changes tailored to children’s best interests. Social media laws focus on regulating platform features, such as algorithmic recommendations, engagement-driven design, and parental oversight tools.
Supporters argue that these laws serve as a crucial safeguard to protect the public, and especially minors, from the harms posed by digital platforms and online services, such as exposure to addictive design elements, inappropriate content, and negative mental health effects, while also giving parents more control over their children’s online activities. Such supporters liken these measures to existing restrictions on purchasing alcohol or tobacco.
Critics, in contrast, argue that these measures could curtail constitutionally protected speech and contribute to an expansive surveillance system, potentially normalizing digital monitoring in ways that undermine personal freedom and autonomy.
The push for age verification and other privacy-focused laws has not been confined to theoretical debate; it has prompted a wave of legislative initiatives. In the past few years alone, numerous bills focusing on online safety for minors have been introduced across nearly every state. These proposals encompass a wide range of measures, including age verification requirements, parental consent mandates, algorithm controls, and restrictions on targeted advertising. This trend reflects a growing focus among state legislatures on children’s online activities and on implementing safeguards that protect minors in the digital environment.
As of February 2025, numerous lawsuits related to age verification and content control have been filed against major social media companies and state regulators. Many of these cases have been consolidated in federal court and remain in the pretrial or discovery stages, while others have already resulted in significant rulings, with challenges still ongoing.
This piece captures the status of each listed law as of the publication date, but the legal landscape is constantly evolving. Readers should stay informed and remain up to date on the latest developments.
Here are some of the key pieces of legislation, broken down by jurisdiction:
U.S. FEDERAL LEGISLATION
Kids Off Social Media Act (S. 278)
Status: Proposed
The Kids Off Social Media Act, co-sponsored by Senators Ted Cruz, Brian Schatz, Chris Murphy, and Katie Britt, is the latest in a series of federal congressional efforts to require social media platforms to provide enhanced protection standards for children online. The bill aims to establish a minimum age of 13 for social media use and would prohibit platforms from using algorithms to push targeted and addictive content to users under 17. It would also require federally funded schools to limit social media access on their networks or risk losing broadband subsidies. The bill empowers the Federal Trade Commission and state attorneys general to enforce its provisions. On February 5, 2025, the U.S. Senate Commerce Committee advanced S. 278, sending it to the full Senate for consideration. While it has gained traction, the legislation faces opposition from digital rights organizations and other civil liberties groups, with legal challenges expected.
Kids Online Safety Act (KOSA) (S. 2073)
Status: Failed
The Kids Online Safety Act (KOSA), initially introduced by Senators Richard Blumenthal and Marsha Blackburn back in February 2022, has faced significant challenges in its passage. The bill seeks to impose a duty of care on social media platforms and allows the FTC to sue platforms that don’t undertake sufficient measures to mitigate harms to minors. KOSA would mandate social media platforms to provide safety and privacy protections for users a platform knows are underage, prohibit algorithmic recommendation systems and other addictive design features for minors, require parental consent for users under 18, and implement strict safety and privacy protections for users under 13. After passing the Senate in July 2024 with overwhelming bipartisan support, KOSA faced obstacles in the House, where concerns over regulatory power and opposition from tech companies complicated its path forward. The bill had not passed by the end of 2024 and thus failed, but Senator Blumenthal has expressed his intent to reintroduce it in the 119th Congress. With a new group of lawmakers in Congress and a new administration in the White House, this one is worth watching.
Protecting Kids on Social Media Act (S. 1291)
Status: Failed
Introduced in April 2023, S. 1291 is similar to the KOSA bill, but with a couple of key differences and some fairly vague language. The bill would mandate social media companies to take “reasonable steps” to verify the age of their users and to ensure that children under 13 do not access these platforms. For minors aged 13 and older, the bill would require platforms to obtain affirmative parental or guardian consent before allowing account creation and would prohibit the use of algorithmic recommendation systems for individuals under 18. Additionally, S. 1291 would establish a voluntary pilot program under the Department of Commerce to provide secure digital identification credentials for age verification. The Federal Trade Commission and state attorneys general would be empowered to enforce its provisions. While the bill did not pass in the 118th Congress, its key provisions could potentially be included in future legislative efforts or companion bills.
STATE LEGISLATION
Alabama
Alabama Social Media Bill (HB 235)
Status: Proposed
Alabama HB 235, which was introduced on February 6, 2025, and is yet to be titled, would require social media platforms to prevent individuals under 16 from creating accounts and to implement a commercially reasonable age verification process to ensure compliance. The bill broadly defines social media platforms as any online service that allows users to upload content or view the content or activity of others while also employing algorithms that analyze user data to select content for users. Platforms meeting both criteria would be required to verify users’ ages and prohibit access to anyone under 16. Any knowing or reckless violation of this law by a social media platform would be deemed a deceptive trade practice, subject to civil penalties, including significant fines. The bill explicitly grants enforcement authority to the Alabama Attorney General. If passed, the bill would take effect on January 1, 2026.
Alaska
Liability for Publishing or Distributing Pornography to Minors on the Internet (HB 254)
Status: Failed
Alaska House Bill 254, which aimed to restrict minors’ access to adult content by imposing age verification requirements on commercial entities that publish or distribute such material, passed in the Alaska House and was referred to the Senate Judiciary Committee on May 6, 2024. However, the bill failed to progress further in the Senate and did not become law.
Alaska Social Media Regulation Act (HB 271)
Status: Failed
The Alaska House Bill 271, which sought to impose regulations on minors’ access to social media platforms by requiring parental consent and a curfew, was introduced on January 16, 2024 and referred to the House Labor and Commerce Committee on the same day. However, the bill did not progress beyond this point and became inactive by the end of the legislative session.
Arizona
Protecting Children on Social Media Act (HB 2858)
Status: Failed
Arizona’s HB 2858 was introduced in the Arizona State Legislature on February 8, 2024. The bill aimed to impose several requirements on social media platforms, such as implementing default privacy settings for users, allowing minors to opt out of personal information collection, prohibiting targeted advertising based on minors’ data, and developing content filters to limit cyberbullying. Additionally, the bill sought to prevent adults from messaging minors under 18 and required parental consent for minors under 16 to use the platform. The bill was referred to committee later that same month and subsequently failed to progress further.
Arkansas
Social Media Safety Act (SB 396)
Status: Enacted but enjoined
The Arkansas Social Media Safety Act, which requires social media platforms controlled by a business entity with at least $100 million in revenue to verify the age of users and obtain parental consent for minors, has been blocked by a preliminary injunction. In June 2023, NetChoice filed suit to block enforcement of the law due to First Amendment concerns regarding websites’ ability to collect and verify sensitive personal information about their users (see NetChoice v. Griffin). On August 31, 2023, U.S. District Court Judge Timothy L. Brooks issued an injunction the day before the law was set to take effect, halting enforcement of the law on the basis that it was likely unconstitutional. The court held that the law’s vague definition of “social media company” could lead to arbitrary enforcement and that requiring age verification via government-issued IDs could deter both minors and adults from using online platforms, thus violating the First Amendment.
Arkansas Children and Teens’ Online Privacy Protection Act (HB 1082)
Status: Proposed
HB 1082, introduced on January 13, 2025 in the current 95th Arkansas General Assembly, is a proposed law that would impose strict limitations on the collection, use, and sharing of personal information from minors. It includes provisions that prohibit operators of websites, online services, or mobile applications directed toward children from collecting personal information unless necessary for providing a service or fulfilling a transaction. Additionally, it would require these operators to obtain verifiable parental consent before collecting or disclosing personal information from minors and would mandate clear disclosures regarding the collection and use of such data. The bill also sets requirements for operators to allow parents and teens to review, correct, or delete personal information. Violations of the law could result in civil penalties under the Deceptive Trade Practices Act, with the attorney general authorized to bring enforcement actions.
Arkansas Kids Online Safety Act (HB 1083)
Status: Proposed
Also introduced in the current 95th Arkansas General Assembly, HB 1083 places a duty of care on platforms to mitigate potential harms to minors. The bill mandates that covered platforms take reasonable measures in the design and operation of their services to prevent and reduce risks associated with mental health disorders, addiction-like behaviors, online bullying, harassment, sexual exploitation, and exposure to harmful content. It also would require platforms to implement safeguards such as parental controls, privacy settings, and content restrictions by default, ensuring that minors are provided the highest level of protection. Additionally, HB 1083 would enforce transparency in personalized recommendation systems and advertising, prohibiting targeted ads for substances like narcotics, tobacco, alcohol, and gambling. The legislation would further establish the Kids Online Safety Council to identify emerging risks and recommend best practices for protecting minors online. Under the enforcement provisions, violations of the bill would be treated as unfair and deceptive acts, granting the attorney general authority to pursue legal action against noncompliant platforms. This bill is one to watch as the legislative session progresses.
California
California Age-Appropriate Design Code Act (AB 2273)
Status: Enacted but enjoined
AB 2273, signed into law in September 2022 and set to take effect on July 1, 2024, was challenged in court before implementation (see NetChoice v. Bonta). The law imposes obligations on online platforms “likely to be accessed by children,” requiring Data Protection Impact Assessments (DPIAs) to identify certain risks to children and enhanced default privacy settings for online products and services used by or aimed at children. It also authorizes the attorney general to impose civil penalties for noncompliance. On September 18, 2023, the district court granted a preliminary injunction against enforcement in a lawsuit brought by NetChoice. The Ninth Circuit’s subsequent holding on August 16, 2024 affirmed the injunction as to the DPIA provision and remanded the other sections to the district court. The district court heard further oral arguments on January 23, 2025, and litigation is ongoing.
Protecting Our Kids from Social Media Addiction Act (SB 976)
Status: Enacted but enjoined
In a separate but related case, NetChoice challenged California’s SB 976, which was signed into law on September 20, 2024. The law prohibits online platforms from providing “addictive” feeds to users under the age of 18 without parental consent. It also bans platforms from sending notifications to minors between midnight and 6:00 a.m., or during school hours from September through May, unless parental consent is obtained. An injunction was granted in late December 2024, blocking the provisions requiring platforms to restrict notifications at specific times and to report annually on the number of minors using their services and instances of parental consent granted for access to addictive feeds. However, the court declined to enjoin the provision prohibiting minors from accessing addictive feeds without parental consent, reasoning that the restriction was content-neutral and did not require platforms to remove any content. On January 1, 2025, NetChoice filed a motion seeking to block the full law, arguing that the provisions would harm its members. On January 2, 2025, Judge Davila granted NetChoice’s motion to delay the law’s effective date from January 1 to February 1, 2025, while the case proceeded. On January 28, 2025, the Ninth Circuit fully enjoined SB 976 from taking effect via a temporary injunction, blocking enforcement of the law in its entirety while the appeal is pending. Litigation remains ongoing, with the case placed on the calendar for April 2025.
Social Media Warning Labels (AB 56)
Status: Proposed
California’s proposed AB 56, introduced in December 2024, seeks to address the growing mental health crisis among young people by requiring social media platforms to display warning labels about the potential risks to kids and teens. If passed, the bill would mandate that platforms show warning labels to all new users upon their first use of the service. The warning would be displayed for at least 90 seconds and would reappear weekly thereafter. This proposal follows mounting pressure on various social media companies over their addictive features and negative impact on mental health. This bill is one to watch as the legislative session progresses.
Let Parents Choose Protection Act (SB 1444)
Status: Failed
SB 1444, introduced on February 16, 2024, would have required large social media platforms to provide third-party safety software providers with API access to manage a child’s online interactions, viewable content, and account settings upon parental consent. The bill sought to impose registration, compliance, and auditing requirements on these providers, authorizing enforcement by the attorney general and limiting liability for social media platforms that comply in good faith. The bill was held in the Senate Appropriations Committee and did not advance further in the legislative process.
Colorado
Healthier Social Media Use by Youth (HB 24-1136)
Status: Enacted and in force
Governor Jared Polis signed HB 24-1136 into law on June 6, 2024, and the law became effective on August 7, 2024. The law requires social media platforms, on or after January 1, 2026, to display notifications to users under 18 after one hour of use in a day, or when the platform is used between 10 p.m. and 6 a.m., providing information regarding the mental health risks associated with excessive screen time. Notably, the law relies on users’ self-attestation that they are under 18. The bill further mandates the Department of Education to create a resource bank for schools with evidence-based materials on social media’s mental health impacts on youth and to expand student wellness programs to address problematic technology use.
Social Media Protect Juveniles Disclosure Reports (SB 24-158)
Status: Failed
SB 24-158, introduced in February 2024, would have required social media companies to publish and update policies outlining content moderation, prohibited activities, and law enforcement reporting, while also mandating annual reports to the attorney general detailing enforcement actions, age verification practices, and juvenile user data. Additionally, platforms would be required to implement age verification, parental control tools, content warnings, and safeguards against harmful content while prohibiting deceptive design tactics, with violations enforceable under the Colorado Consumer Protection Act. The bill faced significant opposition regarding concerns over free speech and privacy, and failed to pass before the end of the 2024 state legislative session.
Connecticut
Online Privacy, Data and Safety Protections Act (SB 3)
Status: Enacted and in force
Connecticut’s general consumer privacy law, the CT Data Privacy Act, effective since July 1, 2023, mandates social media platforms to remove minors’ accounts upon request and limits the processing of their personal data. Additional geolocation and direct messaging restrictions took effect on October 1, 2024. The Act prohibits the processing of personal data of minors between the ages of 13 and 16 for targeted advertising, the sale of such data, and certain types of profiling. The law also bans the collection of precise geolocation data and the use of system features that encourage excessive use. Platforms must conduct data protection assessments for minors’ data processing. Additionally, the law requires online dating platforms to offer safety advice and reporting tools to address harmful behavior.
Youth Social Media Legislation (HB 6857)
Status: Proposed
Introduced on February 5, 2025, HB 6857 would require social media companies to implement default settings that limit minors’ exposure to engagement-driven algorithms, restrict data collection, and curb addictive design tactics. The bill, modeled after similar laws in New York, California, and Utah, would set automatic restrictions on account privacy, time limits, and notifications. If passed, it would prohibit notifications between midnight and 6 a.m. and limit social media use to one hour per day unless a parent provides consent to modify these settings. Additionally, the legislation would require social media companies to submit annual reports detailing the number of minors using their platforms and their average daily screen time. Connecticut Attorney General William Tong recently announced his plans to collaborate with the legislature on this bill.
Florida
Online Protection for Minors (HB 3)
Status: Enacted and in force
On January 1, 2025, Florida’s HB 3 took effect, requiring social media platforms to prohibit children under the age of 14 from creating or maintaining social media accounts and requiring parental consent for users aged 14 and 15. Social media platforms are required to verify user ages and terminate accounts belonging to users under 14, though account owners may dispute such terminations within 90 days. The law also mandates age verification for adult websites to prevent minors from accessing harmful material, a provision that is already in force. Florida’s HB 1, introduced in early 2024, sought to prohibit minors under 16 from using social media platforms, requiring age verification for account creation and mandating the termination of accounts for minors. Despite passing the Florida House, the bill faced significant opposition and died in the Senate due to concerns over its restrictive nature, particularly its lack of provisions for parental consent. Critics argued it violated minors’ rights to free expression, leading lawmakers to revise the bill and introduce HB 3, which provided a more balanced approach by allowing minors aged 14 and 15 to use social media with parental consent.
The Florida Department of Legal Affairs is authorized to enforce violations under the Florida Deceptive and Unfair Trade Practices Act, with penalties of up to $50,000 per violation and additional damages. On January 6, 2025, Free Speech Coalition requested a preliminary injunction to block enforcement of the law. However, on January 16, 2025, the United States District Court for the Northern District of Florida granted a motion to stay the lawsuit until the Supreme Court issues its opinion in the Free Speech Coalition, Inc. v. Paxton case out of Texas, expected later this year. The legal challenge to HB 3 centers on whether arguments in support of the law justify state intervention or whether they unlawfully override parental decision-making, with critics pointing out the law’s selective application to certain platforms while exempting others with similar “addictive” features.
Florida Social Media Platforms Bill (SB 7072)
Status: Enacted but enjoined
Florida’s SB 7072 lies at the center of Moody v. NetChoice, a legal battle over the state’s attempt to regulate social media platforms. Among other things, SB 7072 defines social media companies as “common carriers” and limits their discretion regarding content moderation, particularly regarding political content. Passed in May 2021, the law prohibits platforms from de-platforming political candidates, imposes fines for violations, and establishes detailed transparency and user notice requirements under Florida’s Deceptive and Unfair Trade Practices Act. While the law was preliminarily enjoined by a district court in June 2021, the Eleventh Circuit upheld most of the injunction in May 2022. In July 2024, the Supreme Court vacated the Eleventh Circuit’s decision and remanded the case for further proceedings. As of February 2025, SB 7072 remains unenforced, pending the outcome of litigation.
Georgia
Protecting Georgia’s Children on Social Media Act of 2024 (SB 351)
Status: Enacted but not yet in force
Georgia teens under the age of 16 will soon be required to obtain parental consent to use social media platforms, according to a new law signed by Governor Brian Kemp on April 23, 2024. The law, known as SB 351, mandates various social media platforms to make “commercially reasonable efforts” to verify users’ ages, with the option to obtain parental consent via methods such as signed forms or video calls. Set to take effect on July 1, 2025, the law aims to limit social media access for young people and reduce mental health risks associated with online use. However, similar parental consent laws in other states have faced legal obstacles, with federal judges halting their implementation due to concerns over First Amendment rights. Legal challenges are expected, with criticisms citing potential constitutional issues and privacy concerns over the collection of age verification data. The law also includes provisions to limit data collection or use of personal information of minors, as well as a requirement for the Georgia Department of Education to create programs that address the impacts of social media on students’ physical and emotional well-being. The state attorney general will be responsible for enforcing the law, which gives businesses a 90-day window to correct violations before penalties are imposed. This law is one to watch as the year progresses.
Idaho
Parental Rights in Social Media Act (SB 1417)
Status: Failed
SB 1417 was introduced in the Idaho State Legislature on March 8, 2024. The bill defined social media companies as platforms with at least five million account holders worldwide. The bill specified that social media platforms include forums allowing users to create profiles, upload posts, and interact with others, while excluding platforms like email, streaming services, online gaming, cloud storage services, and sites focused on news, sports, or entertainment, provided the content was not user-generated. The proposed legislation would have required social media companies to obtain explicit parental or guardian consent before allowing anyone under the age of 18 to create an account. It also provided for enforcement through a private right of action for harm to a minor, with a rebuttable presumption of harm in favor of Idaho minors seeking recovery, enabling individuals or the Idaho attorney general to bring lawsuits against violators. Penalties would have included fines of up to $5,000 per violation or $2,500 for each incident of harm, with potential compensation for actual damages caused by addiction, financial loss, or physical or emotional harm suffered by minors. The bill was referred to the State Affairs Committee on March 11, 2024 but died in committee and did not advance further in the legislative process. Had it been enacted, the law would have taken effect on January 1, 2025.
Illinois
Illinois Age-Appropriate Design Code Act (SB 51)
Status: Proposed
The Illinois Age-Appropriate Design Code Act would require businesses offering online services, products, or features likely to be accessed by children to implement specific protections, including completing a data protection impact assessment. Businesses must conduct these assessments by July 1, 2026, for existing services and before launching any new offerings thereafter. Violations of the Act could result in injunctions and civil penalties of up to $2,500 per affected child for negligent violations and up to $7,500 per affected child for intentional violations. The Act would also establish the Children’s Data Protection Working Group, which would be responsible for delivering a report to the General Assembly on best practices for implementation. This Act was introduced on January 13, 2025, and is one to keep an eye on, as similar design code acts have been challenged.
Parental Consent for Social Media Act (SB 3440)
Status: Failed
SB 3440 was introduced to the Illinois Senate on February 8, 2024. The bill would have required social media companies earning over $100 million annually to verify users’ ages through third-party systems, such as government-issued ID, and obtain parental consent for users under the age of 18. It also imposed a curfew barring minors from accessing social media platforms between 10 p.m. and 6 a.m. Central Standard Time. Exemptions included email, direct messaging, streaming services, e-commerce, cloud storage, and platforms with preselected content or academic purposes. The bill was referred to the Assignments Committee on the day it was introduced but died at the end of the legislative session.
Minor User of Social Media Protection Act (SB 3510)
Status: Failed
SB 3510, introduced on February 9, 2024, would have required social media companies with Illinois account holders to develop and publicly share a written policy in compliance with the procedures outlined in the Act. It would have also mandated the establishment of a reporting function allowing account holders to report if an Illinois account holder is a minor. Upon receiving such a report, the social media company would have been required to verify the age of the account holder, and if a reasonable verification confirmed that the account holder was a minor, the company would have had to remove the account. The Act would have also addressed liability for social media companies and for commercial entities or third-party vendors. SB 3510 is similar to SB 3440 but applies only to users under 13, whereas SB 3440 covers minors under 18. After its first reading and referral to the Assignments Committee on the day of introduction, the bill gained a second co-sponsor on February 22, 2024. However, it did not advance further and ultimately died at the end of the legislative session.
Indiana
Minor Access and Use of Social Media (SB 11)
Status: Proposed
On January 8, 2025, SB 11 was introduced to regulate minors’ access to social media. If passed, the bill would require social media platforms to identify accounts created by individuals under 16 and to obtain verifiable parental consent before allowing such accounts to access the platform. Parents could revoke consent at any time, and collected data would have to be encrypted. SB 11 would be enforced by the Indiana attorney general, with fines up to $250,000 and a 90-day cure period for violators. Earlier drafts of the bill included a private right of action for minors subject to social media bullying, but the bill was amended to remove this provision, redefine social media operators, and add a severability clause. The amended version prohibits minors under 16 from creating or accessing accounts without parental consent. The Senate Judiciary Committee approved the bill on January 15, 2025, and the full Senate passed it on January 23, 2025. The bill now heads to the House. If enacted, the law is set to take effect July 1, 2025.
Social Media Use by Minors (HB 1314)
Status: Failed
On January 10, 2024, HB 1314 was introduced in the Indiana State House. The bill would have required social media services to implement a reasonable age verification method for users who wish to create an account, as well as for existing accounts created before July 1, 2024. Under the proposal, if a user is under 18, their account must either be suspended within 14 days, or the social media service must receive consent from a parent or guardian. The bill also would have prohibited social media services from recommending content to minor accounts or disseminating advertisements to minors and enforced a curfew for minors, restricting their use of the platform. HB 1314 was referred to the Committee on Judiciary, where it died.
Iowa
Social Media Parental Authorization Act (HF 2523)
Status: Failed
Iowa House File 2523 aimed to restrict minors under 18 from creating social media accounts without parental consent and would have required platforms to provide parents with oversight of their children’s activity. Under the bill, parents or guardians would have been able to view all posts and messages sent or received by their child, control privacy and account settings, and monitor or limit screen time. The legislation defined “social media platforms” based on features such as profile creation, friend connections, content sharing, and private messaging, while excluding interactive gaming and educational entertainment services. The bill granted enforcement authority to the Iowa attorney general and provided a private right of action for violations. After passing the Iowa House, the bill stalled in the Senate and ultimately failed to advance.
Kentucky
Kentucky Act Relating to the Protection of Minors (HB 450)
Status: Failed
HB 450, introduced on February 1, 2024, would have prohibited social media companies from allowing minors to create accounts without parental consent and would have required platforms to verify users’ ages through third-party authentication methods, such as digitized identification cards or financial documents. Social media companies would also have been required to provide parents with tools to supervise their child’s account, including access to posts and messages, control over privacy settings, and the ability to monitor and limit screen time. Platforms would have been prohibited from retaining personal information obtained during the verification and consent process, with enforcement authorized by the Kentucky attorney general and a new private right of action against companies that violate these requirements. The bill died in committee.
Louisiana
Secure Online Child Interaction and Age Limitation Act (SB 162)
Status: Enacted and in force
On June 28, 2023, Louisiana Governor John Bel Edwards signed SB 162, which mandates strict age verification for social media platforms with over 5 million global users. Platforms must verify the age of users under 16 and obtain explicit parental consent through methods like forms, video calls, or government-issued IDs. The law also prohibits adults from messaging Louisiana minors unless already connected and bans most advertising based on minors’ data. It gives parents tools to supervise their children’s accounts and restricts the collection of unnecessary data. SB 162 excludes certain platforms like email, gaming, and streaming services, and is enforced by Louisiana’s Department of Justice, with fines for non-compliance. The law took effect on July 1, 2024.
Louisiana Parental Consent Law (HB 61)
Status: Enacted and in force
On June 28, 2023, Louisiana Governor John Bel Edwards signed HB 61 into law, requiring parental consent for minors under 18 to create online accounts or enter into agreements on any “interactive computer service.” Effective as of August 1, 2024, the law covers any platform that provides internet access or allows users to share content, including social media, gaming, and educational sites. Minors must have explicit consent from a parent or legal representative before engaging with these platforms. Despite opposition from groups like NetChoice, which sent a veto request to Governor Edwards, HB 61 has not yet faced legal challenges.
Maryland
Maryland Kids Code
Status: Enacted and in force
The Maryland Kids Code, enacted on May 9, 2024, requires large online products and services likely to be accessed by children and teens under 18 to be age-appropriate and prioritize children’s safety and privacy. The law applies to for-profit companies with annual gross revenue exceeding $25 million, those that process the personal data of at least 50,000 consumers or devices, or those that derive more than 50% of their annual revenues from selling personal data. The law prohibits companies from profiling minors unless such profiling can be shown to be in their best interests, from designing features that are detrimental to minors’ well-being, and from harvesting location or other personal information unless necessary to deliver the service. Key provisions include making new social media accounts private by default, limiting autoplay videos for children, and preventing anonymous adults from contacting minors. Notably, unlike the similar California law, the Maryland Kids Code does not require online service providers to estimate the ages of their users. In addition to the Maryland Kids Code, Maryland has enacted the Maryland Online Data Privacy Act of 2024, which prohibits processing personal data for targeted advertising and selling personal data if the entity knew or should have known that a consumer is under the age of 18. On February 3, 2025, NetChoice filed a lawsuit challenging portions of the law, seeking to block the requirement for platforms to file reports on their services’ impact on minors.
Michigan
Michigan Age-Appropriate Design Code Act (HB 5823)
Status: Failed
The Michigan Age-Appropriate Design Code Act aimed to establish strict standards for online services, products, and features likely to be accessed by children. It would have required online platforms to prioritize child privacy and safety by implementing protections such as default high-privacy settings, restrictions on data collection and targeted advertising, and safeguards against harmful content and addictive design features. The bill also proposed civil penalties for violations and the creation of a fund to support enforcement. However, it failed to pass before the end of the legislative session and was not enacted into law.
Social Media Age Verification Bill (HB 5920)
Status: Failed
HB 5920, introduced in September 2024 in Michigan, sought to require social media companies with at least 5 million accounts to verify the age of users within 14 days of attempting to access an account. For users under 18, platforms would have been required to obtain consent from a parent or guardian. The bill also would have prohibited platforms from allowing minor accounts to appear in search results unless specific conditions were met. It would have established a curfew from 10:30 p.m. to 6:30 a.m., during which social media companies would have been required to prevent access to minor accounts. Additionally, the bill would have mandated parental controls to allow parents or guardians to manage and monitor their children’s accounts. The attorney general of Michigan would have had enforcement authority, with the ability to issue fines for non-compliance. The bill was referred to the House Committee on Health Policy, where it died.
Minnesota
Social Media Use by Minors (HF 5452)
Status: Failed
Minnesota’s HF 5452, introduced in May 2024, sought to regulate social media use for minors under 16 by requiring anonymous age verification for platforms that host harmful content. The bill would have prohibited individuals under 14 from creating accounts without parental consent, while exempting certain entities from the age verification requirement. It would have granted the attorney general enforcement authority, with violations subject to civil penalties of up to $50,000, and would have created a private right of action, with claimants able to seek damages of up to $10,000. If passed, the bill would have taken effect on August 1, 2024. However, it died at the end of the legislative session.
Minnesota Age-Appropriate Design Code Act (HF 2257)
Status: Failed
The Minnesota Age-Appropriate Design Code Act, originally introduced in February 2022 and re-introduced in subsequent legislative sessions, aimed to require online services likely to be accessed by children to implement design features and settings that prioritize privacy and data protection for minors. Initially, the bill would have prohibited social media platforms with at least 1 million users from using algorithmic recommendation systems for minors under 18, unless the content was created by a government entity or an educational institution. The bill was later amended to allow platforms to use algorithmic recommendation systems for minors if the systems were designed to block harmful content or were used in conjunction with parental controls to ensure age-appropriate content. The bill has thus far failed to make its way through the legislative process.
Missouri
Children’s Internet Safety Act (HB 2157)
Status: Failed
HB 2157, introduced to the Missouri House of Representatives in January 2024, would have required social media companies to verify the age of Missouri residents and obtain parental consent before allowing minors to create or maintain accounts starting on July 1, 2025. Social media companies would also have been required to restrict direct messaging to minors for any user not already linked to the minor’s account and to prevent minors’ accounts from appearing in search results or being served ads. The bill passed in the House but did not progress further through the legislative process, and thus died at the end of the legislative session. A similar bill focused on the age verification mandate, Missouri HB 1993, was also introduced, but failed to pass by the end of the legislative session.
Mississippi
Walker Montgomery Protecting Children Online Act (HB 1126)
Status: Enacted but enjoined
The Walker Montgomery Protecting Children Online Act, signed into law by Mississippi Governor Tate Reeves on April 30, 2024, imposes several significant restrictions on digital service providers aimed at better protecting children online and requiring enhanced platform accountability in their interactions with children. Under the law, platforms are required to make “commercially reasonable efforts” to verify the age of users and obtain parental consent for users under 18. It also limits the collection of minors’ personal and geolocation data, bans targeted advertising aimed at minors, and mandates the prevention of harmful content, including materials related to self-harm, eating disorders, substance abuse, and illegal activities. However, the law has been challenged in court by NetChoice (NetChoice v. Fitch). On June 7, 2024, NetChoice filed a lawsuit against Mississippi Attorney General Lynn Fitch, seeking to block the law from taking effect and claiming that the law infringes on First Amendment rights by placing undue restrictions on online speech and expression. On July 1, 2024, a federal judge granted a preliminary injunction, halting the law’s implementation pending further legal proceedings. This decision was appealed to the Fifth Circuit Court of Appeals, with oral arguments scheduled for February 2025. The outcome of this case could have significant implications for state-level regulation of online platforms and children’s access to digital spaces.
Nebraska
Parental Rights in Social Media Act (LB 383)
Status: Proposed
LB 383, introduced on January 17, 2025, would require social media platforms to verify the age of their users and would prohibit minors from creating social media accounts unless a parent provides verified consent, with verification conducted through digitized identification cards or other commercially reasonable measures. Social media companies would have to offer parental supervision tools allowing parents to monitor posts, messages, privacy settings, and screen time. If passed, violations could result in civil actions, damages, and penalties of up to $2,500 per offense, with enforcement by the attorney general. The law is intended to protect children from the risks associated with excessive social media use and would go into effect on January 1, 2026.
Adopt the Age-Appropriate Online Design Code Act (LB 504)
Status: Proposed
Introduced on January 21, 2025, at the request of Governor Jim Pillen, LB 504 would require covered online services to implement measures that protect personal data, prevent compulsive usage, and mitigate psychological harm. These services would be required to provide tools that limit communication, prevent data misuse, and restrict access to harmful content, while giving users the ability to manage their experience, including by opting out of certain features. The bill would also mandate that online services enforce privacy settings for minors by default, provide parental oversight tools, and restrict targeted advertising. Platforms would need to ensure the security of minors’ data, limit its use to essential purposes, and avoid harmful notifications or profiling. Furthermore, the bill would require clear communication of privacy protections, parental controls, and transparency in the use of recommendation systems. If passed, the law would go into effect on January 1, 2026. LB 504 would be enforced by the Nebraska attorney general.
Nevada
Social Media Platforms Age Verification (SB 63)
Status: Proposed
Nevada’s SB 63, introduced on November 20, 2024, on behalf of the attorney general, seeks to enhance online safety for teenagers by implementing strict age verification requirements for social media users. The bill would prohibit minors under 13 from using social media platforms without parental consent and would require platforms to establish a system to verify the age of prospective users. Additionally, it would require social media companies to disable certain features on minors’ accounts, restrict the delivery of notifications during specific hours, and prevent the use of minors’ personal data in algorithmic recommendation systems. The bill would also grant civil enforcement authority and impose hefty penalties for noncompliance. On February 3, 2025, the bill was read for the first time before being referred to committee for further review.
New Mexico
New Mexico Age-Appropriate Design Code Act (SB 319)
Status: Failed
Originally introduced in February 2023, SB 319 would have required “controller[s] that provide[] an online service, product or feature likely to be accessed by minors” to estimate the age of child users with “a reasonable level of certainty” and to configure enhanced default privacy settings for minors. In addition, the bill would have required online services that allow a child’s parent to monitor their online activity to send an “obvious signal” when the child’s online activity is monitored or tracked. The bill was re-introduced in the 2024 legislative session but did not make it further through the legislative process.
New York
Stop Addictive Feeds Exploitation (SAFE) for Kids Act (S7694A)
Status: Enacted but not yet in force
On June 20, 2024, New York Governor Kathy Hochul signed S7694A into law. The law mandates that operators of digital platforms use age determination technology to verify users’ ages and restricts the use of “addictive” feeds for users under 18 unless they have parental consent. It also prohibits sending notifications to minors’ accounts between 12:00 a.m. and 6:00 a.m. Eastern Standard Time without verified parental consent. The SAFE Act will go into effect 180 days after the New York attorney general finalizes regulations necessary for implementation.
On January 22, 2025, Governor Hochul unveiled follow-on proposed legislation to create a statewide standard for limiting use of smartphones and other internet-enabled personal devices in K–12 schools.
North Carolina
Social Media Algorithmic Control in IT Act (HB 644)
Status: Failed
North Carolina’s HB 644 aimed to enhance protections for minors on social media platforms by requiring age verification, ensuring that platforms provide age-appropriate content, and prohibiting the use of minors’ data for advertising and algorithmic recommendations. It included provisions for clear privacy policies, full disclosure of data usage, and user consent for data used in algorithmic recommendations. The bill failed to progress through the legislative process and died in committee.
Let Parents Choose Act (HB 773)
Status: Failed
North Carolina House Bill 773, also known as the “Let Parents Choose/Sammy’s Law of 2023,” was introduced on April 18, 2023, with the goal of enhancing parental control over children’s social media use. The bill aimed to implement several key provisions, including requiring large social media platforms to create and maintain real-time application programming interfaces that could be accessed by third-party safety software providers. Additionally, it would have mandated that these platforms make the necessary information available to enable the use of these APIs, and sought to empower parents or legal guardians with the ability to actively manage and monitor their children’s social media interactions. The bill failed to progress beyond the committee stage and was not enacted into law.
Ohio
Social Media Parental Notification Act (HB 33)
Status: Enacted but enjoined
HB 33, passed in July 2023, requires social media platforms to obtain parental consent before allowing users under 16 to create accounts. The law imposes requirements on platforms that are targeted at or likely to be accessed by minors, including those that feature child-oriented content, with penalties for non-compliance that escalate over time. In January 2024, the law was challenged in court by NetChoice. A federal judge issued a preliminary injunction, agreeing with NetChoice that the law violated users’ free speech rights and was overly broad. The judge emphasized that the law would impose undue restrictions on speech and create significant compliance burdens for platforms. Both parties subsequently filed motions for summary judgment, and as of now, the law is blocked and cannot be enforced.
Oklahoma
Oklahoma Social Media Law (HB 3914)
Status: Failed
Oklahoma’s proposed HB 3914 aimed to require social media companies to verify that account holders are over 18 or confirm parental consent for users aged 16 to 18. The bill included an emergency clause, meaning it would take effect immediately upon being signed into law by the governor. Although the bill passed in the House, it ultimately failed to progress in the Senate and did not become law.
Pennsylvania
Protecting the Mental Health of Young Users on Social Media Platforms (HB 2017)
Status: Failed
HB 2017 sought to protect the mental health of young users on social media by requiring social media companies to verify the age of all users and by mandating that minors under 16 obtain express consent from a parent or legal guardian before creating an account. Additionally, the bill would have allowed parents or guardians to view and manage the privacy settings of their minor children’s accounts. It also aimed to limit the types of data social media platforms could collect from minors. A key aspect of the bill would have required platforms to adopt and publish a “Hateful Conduct Prohibited” policy, outlining how they would address online speech that could “vilify, humiliate, or incite violence” against any protected class. The bill proposed creating a mechanism for users to file complaints about harmful content, with platforms obligated to respond directly to these complaints. Violations of these provisions would have been subject to investigation, subpoenas, and daily fines of $1,000 per violation, enforceable by the attorney general. HB 2017 was referred to the Communications and Technology Committee on May 28, 2024. However, it did not advance beyond this stage and ultimately failed to become law.
South Carolina
Child Online Safety Act (H. 3424)
Status: Enacted and in force
The Child Online Safety Act aims to protect minors from harmful online content by requiring websites containing 33.33% or more material deemed harmful to minors to implement an age verification system, preventing users under 18 from accessing such content. The law, which took effect on January 1, 2025, defines harmful online content as material or performances depicting sexually explicit nudity or sexual activity that, when evaluated by an average adult applying contemporary community standards, is deemed to appeal to the prurient interest of minors in sex. Additionally, the law holds websites that produce obscene material or promote child pornography and sexual exploitation liable for damages, court costs, and reasonable attorney’s fees as determined by the court. It also allows for class action lawsuits and creates a private right of action against websites that fail to prevent minors from accessing pornographic material.
South Carolina Age-Appropriate Design Code Act (H. 3402)
Status: Proposed
H. 3402 was introduced and read for the first time on January 14, 2025. The proposed bill would require online services likely to be accessed by children to complete data protection impact assessments, enforce high-default privacy settings, and provide clear, age-appropriate privacy disclosures. It would prohibit profiling by default, excessive data collection, and the use of dark patterns to manipulate children’s behavior. Platforms would be barred from processing children’s personal data in ways that could cause harm, collecting precise geolocation data without clear notice, or using algorithms that pose foreseeable risks. Violations could result in civil penalties of up to $7,500 per affected child for intentional breaches. The bill has been referred to the Committee on Judiciary.
South Carolina Social Media Regulation (H. 3431)
Status: Proposed
Introduced on the first day of the 2025 session, the South Carolina Social Media Regulation Act seeks to establish legal safeguards for minors interacting with social media platforms. The bill would require platforms to verify users’ ages with a level of certainty that reflects the risks associated with their data practices. It also would mandate that South Carolina residents under 18 be unable to hold social media accounts without the express consent of a parent or legal guardian, and that parents be provided with tools to supervise their children’s use of the service. The bill would further require platforms to block adult users from contacting minors, and would limit advertising served to minors and data collection unless essential for the service’s functionality. Additionally, the bill would enforce access restrictions to prevent minors from viewing harmful content related to violence or self-harm, or from viewing sexual material. If passed, the bill would take effect on March 1, 2026. While similar bills have been introduced in the South Carolina legislature, none have passed into law to date.
Tennessee
Protect Tennessee Minors Act (HB 1614)
Status: Enacted and in force
Tennessee’s law requiring pornographic websites to verify the age of their visitors was blocked by the U.S. District Court for the Western District of Tennessee on December 31, 2024, on grounds that the law likely violates the First Amendment. However, on January 13, 2025, the Sixth Circuit Court of Appeals granted a stay of the injunction, allowing the law to go into effect while the Supreme Court considers the Paxton case out of Texas. The law, designed to protect children from harmful content, requires websites offering a significant amount of content deemed harmful to minors to verify users’ ages every hour through “reasonable” methods.
Protecting Children from Social Media Act (HB 1891)
Status: Enacted and in force
HB 1891 is a Tennessee law passed on May 2, 2024, that took effect on January 1, 2025. The law requires social media platforms to verify the age of all users; if a user is under 18, parental consent must be obtained before they can access the platform. Existing users must verify their age within 14 days of the law’s implementation. The law also grants parents the ability to monitor and control privacy settings, set daily usage limits, and enforce breaks from social media, and it prohibits companies from retaining data gathered during the age verification or consent process. While the law passed with minimal opposition, NetChoice filed a lawsuit against Tennessee’s attorney general in October 2024, claiming it violates the First Amendment. The law falls under the enforcement authority of the Tennessee attorney general.
Texas
Age-Verification for Adult Content (HB 1181)
Status: Enacted but partially enjoined
Texas HB 1181 requires adult-oriented websites to implement age-verification measures for visitors and to display health warnings about pornography’s potential effects. While the law aims to protect minors, critics argue it imposes undue burdens on adult viewers and infringes on First Amendment rights. The Fifth Circuit previously upheld the law’s age-verification requirement under rational basis review but struck down the health warning provisions as unconstitutional compelled speech. The plaintiffs appealed to the U.S. Supreme Court, which granted certiorari in Free Speech Coalition v. Paxton on July 2, 2024. Oral arguments were held on January 15, 2025, focusing on whether the Fifth Circuit erred in applying rational basis review instead of strict scrutiny to the age-verification requirement. As of February 2025, a decision is pending, with significant implications for online content regulation and First Amendment rights; a ruling is expected by mid-2025.
Securing Children Online through Parental Empowerment (SCOPE Act) (HB 18)
Status: Enacted but partially enjoined
The Texas SCOPE Act, which aims to protect minors from harmful content and data collection practices, partially took effect on September 1, 2024. It requires digital services facilitating online interactions, such as social media platforms, to implement strict age verification methods and limit data collection for minors under 18. The law also mandates parental consent for users under 18, restricts targeted advertising, and imposes content filtering to prevent minors from accessing harmful material such as content that promotes self-harm, eating disorders, substance abuse, or bullying. Digital service providers must also give parents tools to monitor and control their children’s online activity. Certain exemptions apply, including for small businesses and higher education institutions. On August 30, 2024, the U.S. District Court for the Western District of Texas issued a partial injunction temporarily halting the “monitoring and filtering” requirements due to concerns over privacy, free speech, and the feasibility of enforcement. While the injunction does not affect other provisions, such as those limiting data collection and banning targeted advertising for minors, the overall status of the law remains in flux as legal challenges continue. Notably, in October 2024, Texas Attorney General Paxton sued TikTok for violating HB 18. On February 7, 2025, a federal judge struck down provisions that include content monitoring and filtering requirements, age verification for platforms with a certain amount of harmful content, and restrictions on data collection for targeted ads. The ruling likely weakens Texas’s lawsuit against TikTok, as the company no longer faces legal pressure to comply with the SCOPE Act’s content filtering and age verification requirements.
Texas Anti-Deplatforming Law (HB 20)
Status: Enacted but enjoined
Texas HB 20, also known as the Texas anti-deplatforming law, was signed into law on September 9, 2021. It prohibits large social media platforms from removing, moderating, or labeling posts based on a user’s viewpoint, with exceptions for unlawful content or content directly inciting criminal activity, and it requires transparency from platforms about their algorithmic and moderation practices. The law applies to platforms with over 50 million monthly active users in the U.S. and aims to ensure that political speech is not censored. It has been challenged in NetChoice v. Paxton, where industry groups argue that it violates the First Amendment by restricting platforms’ ability to moderate content based on viewpoint. In December 2021, a U.S. district judge blocked the law on First Amendment grounds, and the injunction was subsequently lifted and restored in a series of appellate rulings. The case was appealed to the U.S. Supreme Court, which agreed to hear it alongside a similar case from Florida (NetChoice v. Moody), addressing whether these state laws comply with the First Amendment. In July 2024, the Supreme Court ruled in favor of NetChoice, allowing the injunction against enforcement of HB 20 to remain in place while the case returned to the lower courts for further review.
Utah
Utah Minor Protection in Social Media (SB 194)
Status: Enacted but enjoined
In NetChoice v. Reyes, NetChoice sued to block enforcement of Utah’s SB 194, a law aimed at protecting children’s mental health by requiring social media platforms to verify users’ ages and apply enhanced privacy settings to minors’ accounts. SB 194 partially replaced a previous state law that had been repealed after NetChoice sued to block it. On September 10, 2024, the U.S. District Court for the District of Utah granted a preliminary injunction, blocking the enforcement of SB 194. The court found that NetChoice was likely to succeed in demonstrating that the law violated the First Amendment by imposing content-based restrictions on social media companies’ speech. The case remains on appeal in the Tenth Circuit, and litigation is ongoing.
Utah Social Media Amendments (HB 464)
Status: Enacted and in force
In March 2024, Utah enacted H.B. 464, the Social Media Amendments, which repealed and replaced the Utah Social Media Regulations Act previously blocked in the wake of a lawsuit by NetChoice. The law empowers parents of Utah minors to sue social media platforms for mental health harm caused by the design and content of their services, targeting negative health outcomes associated with excessive use caused by algorithmically curated content. The law establishes a rebuttable presumption that social media platforms cause harm to minors when they employ curated algorithms or engagement-driven design features. Social media companies can rebut the presumption of harm by obtaining parental consent for a minor’s use, removing features that promote excessive use, such as autoplay and push notifications, and limiting the amount of time a minor may spend on the platform.
Vermont
Vermont Age-Appropriate Design Code (H. 121)
Status: Vetoed
H. 121 was a broad privacy bill in Vermont that included provisions from the age-appropriate design bill; state legislators had previously introduced privacy and age-appropriate design as separate bills (H. 712 and S. 289). The new bill would have required online services likely to be accessed by anyone under 18 to act in the best interest of children and minimize harm. It also mandated that covered entities conduct data protection impact assessments and provide privacy information, terms, and policies in a way that children could understand. H. 121 passed in the Vermont legislature in May 2024 but was vetoed by Governor Phil Scott, who cited concerns about the age-appropriate design provisions given that a similar bill in California had been stayed.
West Virginia
Child Social Media Protection Bill (HB 5226)
Status: Failed
West Virginia HB 5226, introduced in 2024, would have prohibited social media platforms from allowing minors under 18 to create accounts without explicit parental consent, starting July 1, 2025. Platforms would also be required to verify the age of new or existing users and confirm parental consent for minors within 14 days of account access. If age verification was not completed within the timeframe, access would be denied. The bill also aimed to restrict data collection and targeted advertising for minor users and mandated parental access to their child’s account. However, the bill did not progress past the committee stage and was not enacted into law.
NOTABLE INTERNATIONAL EFFORTS
Australia
Enacted in November 2024, Australia’s law bans individuals under 16 from accessing social media platforms and imposes penalties on companies that fail to enforce the restriction. The law grants social media companies a one-year transition period to introduce controls that prevent users under 16 from accessing their platforms, with stiff penalties and strict enforcement measures expected to take effect later this year.
European Union (Spain’s Proposal)
In January 2025, Spain proposed EU-wide regulations, including ending social media anonymity, increasing algorithm transparency, and holding CEOs personally accountable for compliance violations. These proposals are under discussion at the EU level and have not yet been enacted.
France (TikTok Litigation)
In January 2025, families in France filed legal action against TikTok, alleging that the platform exposed minors to harmful content promoting suicide, self-harm, and eating disorders. This case is in its initial stages in French courts.
Brazil Looks to Regulate Social Media
The Supreme Court of Brazil is examining a number of cases that turn on how far social media should be regulated and what responsibilities platforms bear for cracking down on illegal content. The court temporarily suspended a social media platform in 2024, and further litigation is expected this year.
SO WHAT’S NEXT?
That’s an awful lot of legislation and litigation centering on what has become a primary source of global communication. While many online entities and content carriers have gone all-in on protracted legal battles, some (particularly adult content sites) have simply decided to go dark in states with age verification or obscenity laws on the books rather than risk the wrath of the judicial system. With so much going on in so many places, it’s impossible to accurately predict what will happen, other than to say that lawyers and judges will remain busy trying to sort it all out. Each side of the issue is passionate about its cause and shows no signs of backing down. Is protecting children online more important than protecting the First Amendment? Are age-verification laws so critical that the entire online community should be required to submit a government ID before logging on? Reasonable people can disagree—and they’re doing so in courts around the world.
One thing you can be sure of is this: the fine folks at Socially Aware will be paying close attention to all these cases. It’s going to be an interesting year.