The fight over children's access to social media
... and other child welfare news for January 9, 2024
A new Ohio law requires social media apps to verify that users are at least 16 years old or have parental permission. The state joins others such as Utah and Arkansas in making efforts to protect children from the widely recognized harmful effects of social media.
Unfortunately, the tech giants don’t like parents having a say in whether their children are addicted to TikTok, exposed daily to its harmful content, and drowning in social media-induced depression. An organization claiming to represent “some of the most popular websites, applications, and digital services” including TikTok, Meta (Facebook, Instagram), and Twitter has sued Ohio for violating these companies’ rights to “publish valuable expression to minors and adults alike” and for violating the rights of children to do whatever they want online free from parental oversight.
A federal judge has been asked to block the law temporarily. An Arkansas federal judge temporarily blocked that state’s law. The organization has filed a similar suit to block the Utah law, arguing that “No social media law is as overburdensome and restrictive as this . . . . There’s no comparison to Utah’s expansive and almost complete government control on how Utahns access the internet.”
[At this point, I’d be strongly tempted to point out the bizarre nature of a Chinese company, TikTok, that willingly allows the Chinese government to censor its apps in China along with the entire internet, arguing that it has the right to show US children whatever it desires even over parental objection. But I won’t.]
The Arkansas ruling held that the state’s requirements were overly broad and vague and emphasized that there are less restrictive ways that parents can limit their children’s access to harmful material. That ruling also cited the Supreme Court’s 2011 ruling in Brown v. Entertainment Merchants Association, where the Court struck down legislative prohibitions on the sale of violent video games to minors.
I’d predict that this issue — whether governments can require that social media companies ensure that children have parental consent to access their content — will eventually reach the Supreme Court. In the Brown case, one of Justice Scalia’s arguments against the constitutionality of the video game prohibition was that it wasn’t needed because the industry had already created a video game rating system that alerted children (and caring parents) to the age-appropriateness of games. Here, it appears the tech industry exercises very little control over what independent contractors (i.e., “influencers”) put out there for children to see. And while First Amendment jurisprudence in this area has often put the burden on the parent to limit the child’s information intake, rather than on the information producer to limit what it releases to the public, one has to ask whether that rubric applies in a world where social media companies have algorithms that can specifically target children, often providing them with harmful content. What’s wrong with requiring the company to know whether its content consumer is a child so that its algorithm can avoid filling the child’s feed with age-inappropriate material?
Here’s an article on an issue that probably deserves its own deep dive. The U.S. Center for SafeSport, which was formed after the Larry Nassar gymnastics scandal, appears overwhelmed with its responsibilities. In April 2022, a 13-year-old swimmer from the Denver area with a promising competitive future received notification from SafeSport that someone had filed a complaint against him. “Between approximately 2019 and 2022,” it said, “you allegedly engaged in a pattern of behavior which constitutes Sexual Misconduct.” The allegation, he discovered three months later, was that he had slapped a fellow swimmer on the butt. He denies it occurred, and the police found no reason to take any action. Almost two years later, the complaint still hasn’t been resolved, and he’s still not able to compete until it is. One problem, the article notes, is that the agency receives around “7,000 complaints annually and only has about 65 employees in its response and resolution department. Cases take too long because there are too many and too few people to handle them.” I’d suggest the problems may be much deeper if you’re taking two years to address one teenager’s popping another’s rear. Does the organization not have a triage system? Is it treating this kind of allegation the same way it treats sexual assaults by coaches? If so, there’s a serious problem, as this article also posits.
Merry Christmas, foster youth! We don’t have any place to put you, so you get to stay in a locked facility while we go enjoy the holiday (New Mexico).
Parenting classes — are they just another box to tick in a reunification or family preservation case plan with no real impact?
Colorado considers narrowing its definition of child abuse.
Maine considers making child welfare a stand-alone agency.
Arizona legislators are so fed up with the Department of Child Safety that they considered abolishing it. (No, not that kind of abolition).
Illinois DCFS has a new leader.