Unmonitored Minors on Dangerous Devices
Better regulation on devices and in app stores is common sense.
Device manufacturers and app developers pretend that existing age-ratings of apps are effective; they turn a blind eye to the inappropriate ads that have pestered children for over a decade; they tell us that devices, in themselves, are child-safe. The engaged parent recognizes the deceitfulness of these assurances.
On November 16, the Institute for Family Studies (IFS) and the Ethics and Public Policy Center (EPPC) released a policy brief on device-level regulation that answers these three pretenses, among others. The brief couldn’t be timelier. Concerns regarding children’s online experiences drove 42 states to sue Meta over its addictive and harmful designs in October 2023. These lawsuits pose powerful challenges to unnecessarily dangerous and addictive features of Meta apps, and sustained litigation will remain essential. Yet these efforts, however critical, leave a primary issue unaddressed: the dealers of these substances, the device makers and app store operators, have thus far avoided legislative and legal attention. The IFS/EPPC brief recommends several ways to close that loop.
Until now, devices and app stores, a market dominated by Apple and Google, have gone unregulated where children are concerned. The burden of responsibility for user safety rests disproportionately on parents’ shoulders, and only recently have companies like Meta begun to feel pressure. Legislators and policymakers have largely ignored the role of device manufacturers, focusing their child-safety efforts on regulating dangerous (pornographic, violent) and highly trafficked (social media) sites. Several federal bipartisan bills that legislators hope to bring to the floor before the end of the year, including the Kids Online Safety Act and COPPA 2.0, follow this pattern.
Until now, we have hoped that smartphones and tablets might be neutral tools, applied for good or ill. Given the utter lack of regulation of device and app store makers, any such neutrality exists only at the manufacturers’ convenience. The legal environment for devices and the app stores accessed on them is unjustly permissive for Big Tech, thanks to Reno v. ACLU (a 1997 ruling against child protections on the grounds that harmful material does not surface “unbidden” online), Ashcroft v. ACLU (a 2004 ruling against age verification for adult sites), and the overexpansion of Section 230 of the Communications Decency Act, which excuses social media platforms from liability for the posting of pornographic and illegal material.
If we ask Apple and Google to protect minors even before apps are downloaded, we ask them to do something well within their capabilities. The elegance of many of their products, contrasted with the bugginess of the parental controls on offer, suggests that these companies have no motive to resolve their child safety issues.
It’s not that Apple and Google don’t have enough data to recognize whether users are underage and must be guarded from mature content. These companies know enough about your children to recommend games and other apps, but they claim ignorance when it comes time to judge whether material is safe and suitable for them.
Apple and Google don’t want their app stores regulated externally because they, as the brief explains, enjoy a 30% commission on every app downloaded from their stores. And that is before factoring in their profits from ads both in-app and in-store.
Parents simply don’t know what their kids are up to online. Fewer still know that U.S. Surgeon General Vivek Murthy issued an advisory against social media for all minors, going so far as to suggest that the minimum age for account holders be raised. App stores should ensure that parents see this advisory when they purchase certain apps for their kids, just as other consumer goods carry Surgeon General warnings when they are hazardous to public health, especially that of children.
Just imagine, for a second, walking your kid into a toy store plastered with posters of Cardi B, erotic literature, and alluring factoids on how to attract lovers. No brick-and-mortar store could get away with it. Imagine picking up a movie rated for a General Audience, only for its graphic violence to haunt your child’s dreams for the next ten years. In other markets, we do not tolerate such a complete lack of regard for children’s welfare and routinely correct it via regulation.
It is, without a doubt, good that 42 states are suing Meta. But we must also go after the stores that distribute Meta’s products. Litigation by private right of action will prove crucial. As Sen. Hawley (R-MO) highlighted during the November 7 hearing of Meta whistleblower Arturo Béjar, the doors for litigation against these companies must be thrown open: “They fear parents going into court and holding them accountable. That’s the hammer. That’s what happened with Big Tobacco, that’s what happened with opioids.”
Legislators must open the door for litigation by amending the Federal Trade Commission Act, specifically its prohibition against “unfair or deceptive acts or practices in or affecting commerce,” as well as the various state “Little FTC Acts.” The amendments would explicitly prohibit marketing goods to minors with deceptive age ratings and advertisements for abusive, age-inappropriate products. The change could be as simple as clarifying that digital commerce falls under the existing prohibition, which would give state and federal regulators new litigation tools to punish Big Tech for noncompliance.
The brief makes yet another commonsense proposal: age verification for all who possess an individual device.
Imagine the ease of it: when you buy a phone, you show your ID to verify your age. This solution is a no-brainer. Or let the buyer age-verify at home by uploading their ID, as they already do for various Apple and Google features such as Apple Wallet. Or, if a buyer doesn’t want their ID on their device, Apple and Google could host in-store verification. But, to reiterate, Google and Apple already have precisely what they need to age-verify users. They already do it elsewhere—just not for child safety.
If lawmakers choose not to require age verification, the brief continues, they can still require device makers to ship child-friendly default settings. Our devices should be as child-friendly as these companies claim they are. The brief offers detailed directions for what these settings can cover, incorporating recommendations from Protect Young Eyes and the National Center on Sexual Exploitation, such as a model obscenity filter that activates automatically whenever a user’s age can be determined during account setup, even without formal age verification.
At bottom, Apple and Google’s duopoly over devices and app stores must be broken. We need alternatives. As of right now, these Big Tech behemoths make it all but impossible for device holders to access app store competitors. A federal bill, the Open App Markets Act (OAMA), introduced by Senators Richard Blumenthal, a Connecticut Democrat, and Marsha Blackburn, a Tennessee Republican, aims, among other things, to decentralize the market and make it possible for other app stores to break in. We must not forget that the world made in Apple’s image is not the only possible world.
All these measures—alternative device and app stores, child-safe default settings, age ratings for apps—must be navigable and legible for parents. For example, one would expect to see the age rating of an app before approving or declining a child’s request to download. Placement matters—parents should not have to go searching for information that, in the spirit of these devices, should be right at their fingertips.
Restaurants offer non-smoking areas; quiet hours are enforced across whole neighborhoods. The devices we use to communicate, navigate, and socialize have proven unworthy of our trust and do not deserve exemption from our basic standards of living.