Today, people of all ages, from 8 to 80, have smartphones in their hands, and social media use is growing along with it. Tech giant Meta operates some of the biggest platforms, including Facebook and Instagram. Even children who don’t have their own accounts often use social media through the accounts of parents or close relatives. On these platforms, wholesome and harmful content sit side by side, and that content affects children. On December 1, Meta confirmed in a blog post that it is expanding its child safety measures.
Main Objective of Meta’s Child Safety Measures
There has been an extended investigation into child sexual abuse content on Meta’s platforms. Meta said it has begun expanding and updating the features designed to protect children.
Why did Meta make such a decision?
Pressure from the European Union contributed to this decision. On November 10, the European Union (EU) asked Meta, as well as social media firm Snap, to take tougher measures to stop the circulation of illegal content online, giving the companies until December 1 to respond. The investigation, conducted under the EU’s Digital Services Act, was at an early stage.
In October, the EU launched an investigation into TikTok, X and Meta over the spread of disinformation amid the upsurge in violence between Israel and the Palestinian militant group Hamas. After Hamas’s attacks on Israel, regulators found that Meta’s platforms were being used to spread illegal content and disinformation. EU Internal Market Commissioner Thierry Breton also wrote to Mark Zuckerberg on the matter and shared the letter on his X account.
“I would ask you to be very vigilant to ensure strict compliance with the DSA rules on terms of service, on the requirement of timely, diligent and objective action following notices of illegal content in the EU, and on the need for proportionate and effective mitigation measures,” he wrote in the letter.
EU regulators made the same request of Elon Musk’s X and TikTok, along with Meta, to tackle child sexual abuse content. They emphasized that Meta’s platforms must provide detailed insight into the actions taken.
According to a report by The Verge, The Wall Street Journal has detailed how Instagram and Facebook served inappropriate and sexualized child-related content to users. In June, a report described how Instagram’s recommendation systems connected a network of accounts buying and selling child sexual abuse material.
Meta said on Friday that it would place restrictions on how “suspicious” adult accounts can communicate with each other.
What Steps Will Meta Take?
Meta said it takes recent complaints about the effectiveness of its work very seriously, and that it has created a task force to review existing policies and enforcement systems and to propose changes that will strengthen protections for young people.
“Predators do not limit their efforts to harm children to online spaces, so it is imperative that we work together to stop predators and prevent the exploitation of children,” the company said in a statement.
As per CNN reports, 33 U.S. attorneys general have accused Meta of receiving more than one million reports of users under age 13 on Instagram between early 2019 and mid-2023 from parents, friends and online community members. The tech giant is also alleged to have violated several state consumer protection laws as well as the Children’s Online Privacy Protection Act (COPPA).
A follow-up investigation by The Wall Street Journal showed the problem extends to Facebook Groups, where an ecosystem of pedophile accounts and groups operates, some with as many as 800,000 members.
In the wake of the investigation, Meta removed more than 37 million pieces of violating content on Facebook and Instagram in India in October.
“At Facebook, we’re using this technology to better find and address certain groups, pages and profiles,” the company said.
Meta said it is sending Instagram accounts that exhibit potentially suspicious behavior to its content reviewers; if enough of the more than 60 signals it tracks are observed, these accounts will be automatically disabled.
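Meta has not published the signals it uses, but the mechanism it describes resembles a weighted signal-threshold check. The sketch below is purely illustrative: the signal names, weights and cutoff are invented for demonstration and are not Meta’s actual criteria.

```python
# Hypothetical illustration of threshold-based account review.
# Meta says it tracks 60+ behavioral signals; everything named
# below (signals, weights, threshold) is invented for this sketch.

SIGNAL_WEIGHTS = {
    "follows_many_teen_accounts": 3,   # invented signal
    "repeatedly_blocked_by_teens": 5,  # invented signal
    "mass_unsolicited_messages": 4,    # invented signal
    "previously_flagged_content": 4,   # invented signal
}

DISABLE_THRESHOLD = 8  # invented cutoff

def review_account(observed_signals):
    """Sum the weights of the observed signals; disable the account
    automatically when the total crosses the threshold, otherwise
    route it to human reviewers."""
    score = sum(SIGNAL_WEIGHTS.get(s, 0) for s in observed_signals)
    return "auto_disable" if score >= DISABLE_THRESHOLD else "send_to_reviewers"

print(review_account(["follows_many_teen_accounts",
                      "repeatedly_blocked_by_teens"]))  # auto_disable
print(review_account(["mass_unsolicited_messages"]))    # send_to_reviewers
```

The design point is that no single signal decides the outcome; only the accumulation of several weak indicators triggers automatic action, which is consistent with Meta’s description of needing “enough” signals.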
The company said that it has taken action on more than 4 million reels per month on Facebook and Instagram worldwide for policy violations.