01/12/2021 / By Cassie B.
When Facebook released figures on the amount of harmful and abusive content it removed in a “transparency report,” the company wanted us to believe that it is a caring social media platform that protects its users.
However, not everyone was convinced. The children’s charity the National Society for the Prevention of Cruelty to Children (NSPCC) quickly accused the social network of using “selective numbers” that underplay the amount of abusive and harmful content and do not paint a full picture of the damage.
The charity’s head of child safety online, Andy Burrows, stated: “It is incredibly disappointing but unfortunately not surprising that Facebook has yet again used selective big numbers that don’t show us the full picture.”
He added: “The statistics we’ve heard on the number of self-harm images is likely to underplay the lived experience of a vulnerable young person on social media being served up these images by an algorithm.”
Facebook’s report claimed it detected and removed 2.5 million pieces of content linked to self-harm across a three-month period. It also reported that its technology proactively detected 99 percent of content relating to child exploitation, terrorism, self-harm and suicide.
Burrows believes, however, that social networks should not be allowed to self-report their efforts in content moderation. “The time for this self-selective reporting must end now and reinforces why we need an independent regulator in place who can call social media networks to account,” he said.
Of course, manipulating people is Facebook’s modus operandi, so this really shouldn’t come as much of a shock.
In December, Facebook removed a series of security features from its Messenger and Instagram apps in Europe that scan for child abuse content. The company claimed at the time that the move was a response to an update to EU privacy rules designed to stop companies like Facebook from mining message content for advertising purposes. However, other major companies, such as Google, LinkedIn and Microsoft, have said they plan to keep proactively scanning their own platforms for child abuse content despite the rules.
NSPCC Head of Policy Anna Edmundson said: “Tech firms’ ability to scan for child abuse images and signs of grooming is fundamental to protecting children and fighting against these crimes that transcend national borders. Until the EU finds a way forward, it’s vital children are not put at risk of exploitation and that offenders are still found, stopped and prosecuted.”
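For context, the image scanning these firms describe commonly works by hash matching: each uploaded image is reduced to a fingerprint and checked against a database of fingerprints of known abuse imagery maintained by child-protection organizations. Below is a minimal sketch of the idea in Python; the hash list and function name are hypothetical, and real systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the exact-match SHA-256 digests used here.

```python
import hashlib

# Hypothetical hash list of known abusive images. Real deployments match
# against perceptual hashes (e.g., PhotoDNA) curated by organizations
# such as NCMEC, not a plain set of cryptographic digests like this one.
KNOWN_BAD_HASHES = {
    # SHA-256 of empty input, included so the demo below matches something.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the hash list.

    SHA-256 only catches byte-identical copies; this is a deliberate
    simplification of how perceptual hash matching works in practice.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

if __name__ == "__main__":
    print(matches_known_image(b""))  # True: empty input's digest is in the demo list
```

The key design point is that matching happens against fingerprints of content the platform can read, which is why the technique breaks down entirely for content the platform cannot see, as the encryption debate below illustrates.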
Meanwhile, the British government is reportedly mulling an injunction against Facebook to stop it from rolling out end-to-end encryption on services like Messenger and Instagram. With this encryption, no one can see the content of sent messages, not even the platform’s owners, which can make it impossible to track down those who exploit and abuse children.
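To see why, here is a minimal sketch of the general technique using the third-party Python cryptography package (an illustrative assumption on our part; Facebook’s actual design is reportedly based on the far more elaborate Signal protocol). The two endpoints derive a shared key via an X25519 Diffie-Hellman exchange, so a server relaying their traffic only ever handles ciphertext.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each user generates a keypair on their own device; only the public
# halves ever cross the platform's servers.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Both sides compute the same shared secret from their own private key
# and the other side's public key; the relaying server never learns it.
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared

# Stretch the raw shared secret into a 32-byte symmetric message key.
key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"demo-e2e-message-key",  # hypothetical label for this sketch
).derive(alice_shared)

# Alice encrypts; the platform only ever stores and forwards ciphertext.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hello bob", None)

# Bob, holding the same derived key, is the only party who can decrypt.
assert ChaCha20Poly1305(key).decrypt(nonce, ciphertext, None) == b"hello bob"
```

Because the private keys never leave the users’ devices, the platform has no plaintext to scan and nothing to hand to investigators, which is precisely the property child-safety advocates object to.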
Experts have estimated that 70 percent of the reports Facebook submits to U.S. authorities each year could be lost if it moves forward with its plans for end-to-end encryption. That figure covers the thousands of reports about predators attempting to groom children on its platforms and the millions of reports on detected images and videos of child abuse.
In addition to the possibility of encountering predators, social media sites like Facebook are bad for young people’s mental health, with studies linking social media use to anxiety and depression. Research has shown that 9th-graders who spend six to nine hours per week on social media are 47 percent more likely to report unhappiness than peers who use it less. These platforms also deepen young people’s body image concerns, exacerbate bullying and increase loneliness.
Facebook can be harmful to young people in so many ways, and parents need to stay vigilant about what their children are doing online.