
Shazam! Idaho lawmakers still have time to fix the social media bill

Idaho lawmakers are right to take the harms of social media seriously. Parents across the state see how addictive design features, targeted advertising, and algorithm-driven feeds can affect children. The impulse behind House Bill 542—the Stop Harms from Addictive Social Media Act (which lawmakers are calling the Shazam Act)—comes from a legitimate concern: protecting kids online.


But good intentions don’t automatically produce good policy.


The bill passed a Senate committee on Friday but was sent to the 14th Order for amendments, giving legislators an important opportunity to address serious technical, legal, and privacy concerns before it moves forward.


If lawmakers truly want to protect children without creating new risks, here are the most important fixes the Legislature should make before final passage.


1. Eliminate the Mandatory Usage-Tracking Infrastructure


One of the bill’s biggest problems is the requirement that platforms track cumulative user activity at specific thresholds—such as 25, 50, and 100 hours—to trigger escalating age verification requirements.


To comply, companies would have to build systems that log and store detailed usage histories tied to identifiable users.


That creates a massive data honeypot, potentially including sensitive information like biometric data or government identification. Large centralized databases like this are prime targets for hackers, identity thieves, and foreign adversaries.


A bill designed to protect children should not require companies to collect more sensitive data about them than they do today. Lawmakers could remove the cumulative tracking requirement and allow privacy-preserving age assurance tools that do not require perpetual monitoring.


2. Remove the Ongoing Re-Verification Mandate


The bill also requires repeated re-verification every 100 hours of use, effectively turning age verification into a permanent surveillance system.


Protecting children should not mean requiring users—especially minors—to repeatedly prove their identity to private companies.


Age verification should be event-based, not continuous. Once a user is verified, the law should not require ongoing monitoring that creates additional points of failure for data breaches.


Eliminating the recurring verification trigger would dramatically reduce privacy risks while still allowing age safeguards.


3. Replace Arbitrary “Confidence Thresholds” with Flexible Standards


HB 542 requires platforms to meet 80% and 90% statistical confidence thresholds when estimating a user's age.


These numbers sound precise, but in reality they are arbitrary technical mandates. Different verification technologies produce different confidence scores, and those scores can vary based on demographics, device quality, or algorithmic models.


Mandating numerical thresholds will likely push companies toward the most invasive verification methods available, such as:

  • Facial recognition scans

  • Government ID uploads

  • Biometric analysis


Instead of dictating statistical quotas, lawmakers should require reasonable, risk-based age assurance methods that minimize data collection while protecting minors.


4. Remove or Narrow the Private Right of Action


The bill currently includes a private right of action, allowing individuals to sue platforms for violations.


Combined with complex technical mandates, this provision risks creating a wave of opportunistic litigation rather than meaningful child protection.


High statutory damages and strict compliance rules can easily produce a “sue-and-settle” environment that benefits trial lawyers while diverting resources away from building better safety tools.


If enforcement is necessary, it should primarily come through the Attorney General’s office, which can focus on real harms rather than technical gotchas.


5. Narrow the Scope to Actual Social Media Platforms


As written, the bill could sweep in many services that are not traditional social media platforms.


Sites that allow comments, reviews, or limited user interaction—including news outlets, e-commerce platforms, educational tools, and professional networking sites—could face the same compliance burdens as major social media companies.


This risks harming Idaho’s digital economy and discouraging startups from launching new online services.


The Legislature should clearly define large social media platforms and exempt incidental interactive features.


6. Reinforce Parental Authority


Finally, lawmakers should ensure the bill strengthens—not replaces—parental authority.


Parents should have access to tools that help them manage their children’s online activity. But the state should not dictate rigid technical triggers that override family judgment.


A better approach would focus on transparency tools, parental controls, and clear opt-in mechanisms rather than prescriptive monitoring requirements.


The concerns about social media and children are real. Lawmakers deserve credit for trying to address them.


But legislation that touches the internet, privacy, and child safety must be carefully designed. Poorly structured mandates can easily produce unintended consequences—more surveillance, more litigation, and more data risks for the very children the law is meant to protect.


The bill’s placement on the 14th Order gives Idaho lawmakers a valuable opportunity to improve it.


With thoughtful amendments, the Legislature can protect kids while also safeguarding privacy, encouraging innovation, and respecting the role of parents.


Idaho should take that opportunity.


