Google and Meta at Odds Over Child Safety Online

The debate over children’s online safety has escalated into a public feud between Google and Meta. Each company is shifting blame to the other rather than taking accountability, raising concerns about how tech giants prioritize user protection. While Meta supports legislation that places age-verification responsibility on app stores, Google strongly opposes this approach, arguing that it fails to address the real risks children face online.
Google Condemns Meta’s Approach
Google’s criticism follows Utah’s recent enactment of a law requiring app stores to verify users’ ages and obtain parental consent before allowing minors to download apps. Meta, along with Snap and X, has backed this legislation, while Google has labeled it “concerning.”
According to Kareem Ghanem, Google’s public policy director, the law unfairly allows social media platforms to sidestep responsibility, even though those platforms are the primary spaces where children engage with digital content. Google maintains that “age verification should be the duty of the platforms themselves, not the app stores that distribute them.”
Google’s Alternative Proposal
In response to Utah’s law, Google introduced an alternative framework suggesting that age verification should apply only to specific high-risk apps rather than all applications. Furthermore, Google proposes that developers—not app stores—should determine what protections are needed for minors.
Critics argue this strategy creates loopholes that could allow unsafe apps to bypass proper verification. Apple has also raised concerns, warning that the approach might lead to excessive data collection from children, since developers could demand sensitive personal information to comply with the regulations.
Meta’s Stand: Shifting Responsibility to App Stores
Meta argues that app stores are better positioned to implement age verification and obtain parental consent. The company welcomed Google’s recognition that app stores could share age data with developers but questioned how Google would decide which apps require such information.
Despite Meta’s stance, the company itself has faced numerous legal challenges regarding child safety. Critics point out that Meta has a history of failing to protect young users from harmful content and online predators. Instead of enforcing stricter policies on its own platforms, Meta supports laws that shift responsibility elsewhere.
Are Children Any Safer?
As Google and Meta continue their dispute, children’s online safety remains at risk. Utah’s law lacks a standardized age-verification process, leaving app stores to decide which “commercially available methods” to use. Without clear guidelines, enforcement may be weak, potentially exposing minors to harmful content and online threats.
The ongoing conflict between these tech giants highlights a deeper issue: the prioritization of business interests over user safety. Until companies take direct responsibility for protecting young users, children will continue to face digital dangers while corporations play the blame game.