Meta is expanding and updating its child safety features in response to reports that child sexual abuse material is proliferating on its platforms. The company says it is developing technology to tackle the abuse, employing specialists dedicated to online child safety, and sharing information with law enforcement and industry peers.
“In addition to developing technology to tackle this abuse, we employ specialists dedicated to online child safety and sharing information with industry peers and law enforcement,” the company said in a statement.
The company also said it has taken immediate steps to strengthen its protections and that its child safety teams continue to work on additional measures to protect young people.
The Wall Street Journal recently detailed how Instagram and Facebook surface sexualized content involving children to users. The report described how Instagram's recommendation systems connect a network of accounts buying and selling child sexual abuse material, and how Facebook Groups host an ecosystem of pedophilic accounts, some with as many as 800,000 members.
In response, Meta stated that potentially suspicious adults on Instagram will be prevented from following one another, will not be recommended to each other, and will not be shown comments from one another on public posts.
Meta also claimed that after launching a new automated enforcement effort, “we saw five times as many automated deletions of Instagram Lives that contained adult nudity and sexual activity,” and that “we actioned over 4 million Reels per month, across Facebook and Instagram globally, for violating our policies.”
Meta presents these measures as evidence of its commitment to preventing child exploitation and building a safer online environment for young users.