Meta Platforms Inc. will cut off minors' access to its artificial intelligence characters in the coming weeks. The move comes just ahead of a trial in Los Angeles in which Meta, TikTok, and YouTube face allegations that their apps harm children's well-being.
Although teenagers will lose access to the AI characters, they will still be able to use Meta's general-purpose AI assistant. The decision follows a broader trend among tech companies of restricting minors' engagement with AI, driven by child-safety concerns amid ongoing litigation.
Notably, Character.AI imposed similar restrictions last fall after lawsuits tied to tragic incidents involving minors and its chatbots. As public scrutiny intensifies, the upcoming trial is expected to shed further light on these issues, with advocates calling for stronger regulations to protect young users in digital spaces.