Utah’s regulation is among the most aggressive laws passed by any state to limit young people’s use of social media, arriving at a time when experts are sounding the alarm about the deteriorating mental health of America’s youth. Congress, meanwhile, is struggling to pass tougher online child safety legislation despite bipartisan concerns about social media’s impact on children.
Both bills previously cleared the Utah Legislature.
“We will no longer allow social media companies to continue to undermine the mental health of our young people,” Cox tweeted on Thursday. “Utah is leading the way in holding social media companies accountable, and our momentum isn’t waning anytime soon.”
The bills’ passage coincided with TikTok CEO Shou Zi Chew’s first appearance before Congress, where he faced extensive questioning from lawmakers who criticized the hugely popular video app for harming children’s welfare and warned that the company represents a national security threat because it is owned by Beijing-based ByteDance.
Tech companies face increased scrutiny from lawmakers and advocates over the impact their services have on youth. Last year, California lawmakers passed the California Age-Appropriate Design Code Act, which requires digital platforms to vet new products for potential harm to minors and to provide privacy guardrails for young users by default. But the tech trade group NetChoice sued to block the law, claiming it violates the First Amendment, which the group argues gives tech companies the constitutional right to make “editorial decisions” about what content to publish or remove.
Efforts to tighten federal rules governing how tech companies handle minors’ data and protect their mental and physical safety have stalled. Late last year, senators tried to push Congress to pass new online privacy and safety protections for kids as part of an omnibus spending package.
The new Utah regulations require tech companies to block children’s access to social media apps during overnight hours, beginning at 10:30 p.m. Platforms must also prohibit direct messages from anyone a child does not follow or is not friends with, and must exclude underage accounts from search results.
The Utah restrictions further prohibit companies from collecting data on children, from targeting their accounts with ads, and from designing features into their services that cause minors to become addicted.
Privacy advocates say the bills go too far and could endanger LGBTQ children and those living in abusive homes.
“These bills fundamentally undermine the constitutional and human rights of Utah’s youth, but they also make no sense,” said Evan Greer, director of the digital advocacy group Fight for the Future. “I don’t know if anyone has really thought about how this actually works. How do tech companies determine if someone is someone else’s parent or legal guardian? What about situations where there is a custody battle or abuse allegation and an abusive parent is trying to gain access to their child’s social media messages?”
Common Sense Media, a family-oriented media advocacy group, had a mixed reaction to Thursday’s news. In a statement on its site, the group said it supports HB 311, one of the two laws Utah passed. But it does not support the second law, SB 152, which mandates parental oversight and requires parental consent for minors to create social media accounts.
“Unfortunately, Governor Cox also signed SB 152 into law, which gives parents access to their underage children’s posts and all the messages they send and receive,” the statement said. “This deprives children of the online privacy protections we stand for.”
Industry groups have signaled First Amendment concerns about the new rules. Carl Szabo, vice president and general counsel of NetChoice, said the group is considering its next steps regarding the Utah legislation and is in discussions with allies across the tech industry.
“This law violates the First Amendment by encroaching on adults’ lawful access to constitutionally protected speech while mandating massive data collection and tracking of all Utah citizens,” Szabo said. NetChoice has previously worked with industry groups to challenge social media laws in Florida and Texas.
Social media platforms face increasing scrutiny for exposing young people to toxic content and dangerous predators. Earlier this year, the Centers for Disease Control and Prevention found that in 2021, nearly one in three high school girls reported having seriously considered suicide. Some experts and schools have also argued that social media is driving a mental health crisis among young people.
It’s unclear how tech companies will impose age restrictions on their apps. Social media companies already ban children under 13 from using most services, but advocates, parents, and experts warn that children can easily circumvent these rules by lying about their age when creating an account.
Tech companies like Meta, TikTok, and Snapchat are also adjusting their services to offer more parental controls and moderation for minors.
Antigone Davis, Meta’s global head of safety, said in a statement that the company is using “age verification technology” to ensure “teens have an age-appropriate experience” on social networks. Instagram automatically sets accounts private for teens when they join and sends them notifications reminding them to take regular breaks.
“We do not allow content that promotes suicide, self-harm, or eating disorders. We identify and report over 99% of the content we remove or take action on,” Davis said. “We will work closely with experts, policymakers, and parents on these important issues.”
Snap declined to comment.