
This information is directly from Fairplay for Kids. See the website with full details here.

Another Meta whistleblower steps up

For years, Meta’s Instagram platform has exposed its youngest and most vulnerable users to sexual exploitation, harassment, pro-eating disorder content, and a whole host of other harms. In 2021, whistleblower Frances Haugen revealed that Meta executives were fully aware that their platform design was making teens feel worse about themselves and cultivating body image issues amongst teen girls. When faced with that disturbing evidence, Meta executives didn’t act on it – they ignored it.

Now, two years later, another Meta whistleblower has spoken out, and his revelations paint an even more damning picture of the company. According to former Meta engineer Arturo Béjar, Meta executives – including Mark Zuckerberg – are fully aware of how to make their platforms safer for kids, but they are choosing not to.

In his November 7 testimony before Congress, Béjar shared how the Well-Being team at Meta collected and presented top executives with research and data indicating that the company’s rules, automated systems, and metrics for determining children’s safety were simply inadequate and failing to keep kids safe. Mr. Béjar described the extent of the internal data Meta was privy to and the design solutions proposed to fix these problems. Béjar gave his testimony not only as a whistleblower, but also as a concerned father. His own teenage daughter experienced unwanted sexual advances from users on Instagram, an experience he described as deeply disturbing for both parent and child.

Most shockingly, Mr. Béjar revealed that Adam Mosseri – the head of Instagram himself – acknowledged and agreed with the child safety issues that the data pointed to. And yet, Meta ultimately did not resolve any of these problems and continued to allow these online harms to persist.

Read more at the Fairplay for Kids website.

Watch the Senate Subcommittee hearing on Social Media and the Teen Mental Health Crisis, featuring Arturo Bejar, former Director of Engineering for Protect and Care at Facebook. Download his testimony.

“My name is Arturo Bejar and I am a dad with firsthand experience of a child who experienced unwanted sexual advances on Instagram and an expert with 20 years of experience working as a senior leader, including leading online security, safety, and protection at Facebook.”

“Meta continues to publicly misrepresent the level and frequency of harm that users, especially children, experience on the platform. And they have yet to establish a goal for actually reducing those harms and protecting children. It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse.”

“…October 5, 2021, after having the analysis reviewed internally, I sent a detailed email to Mark Zuckerberg and the other senior leaders detailing what I had found. First, I pointed out how the reporting process grossly understated misconduct on the site. I explained that the number of people reporting to surveys that they had a negative experience on Instagram was 51% every week, but only 1% of those reported the offending content and only 2% of those succeeded in getting the offending content taken down. Thereafter, I detailed the staggering levels of abuse that teens aged 13-15 were experiencing every week. The initial data from the research team indicated that as many as 21.8% of 13-15 year olds said they were the target of bullying in the past seven days, 39.4% of 13-15 year old children said they had experienced negative comparison in the past seven days, and 24.4% of 13-15 year old respondents said they received unwanted advances, all in the prior seven days. Later, the research team revised the survey results to state that the likely number of 13-15 year old children receiving unwanted sexual advances in the past seven days was only 13 percent, still a shocking number. Obviously, an even higher percentage of these children are receiving unwanted sexual advances on a monthly basis. The reaction was not constructive. Sheryl Sandberg expressed empathy for my daughter but offered no concrete ideas or action. Adam Mosseri responded with a request for a follow-up meeting. Mark Zuckerberg never replied. That was unusual. It might have happened, but I don’t recall Mark ever not responding to me previously in numerous communications, either by email or by asking for an in-person meeting.”

“Today, most harm remains unaddressed. Most of the distress people experience online because of unwanted contact and content is not addressed today by Meta and other social media companies. I say that based on my extensive experience working to keep users safe, along with my direct knowledge of extensive research and data about what people experience on Meta’s services and elsewhere online. But it’s not just that the companies disregard people’s distress. The way they respond to problems often makes those problems worse, because it normalizes harmful behavior and encourages unwanted contact and content. In addition, the way companies talk about these problems to regulators, policy makers, and the general public is seriously misleading.”


Shocking revelations from Meta lawsuit

Last month, 42 attorneys general sued Meta alleging that Facebook and Instagram’s design and business practices harm young people. The Massachusetts Attorney General’s complaint says Meta leadership repeatedly declined to make changes and implement safety features that would protect kids and teens on Instagram. For example, internal research demonstrated that “plastic surgery” filters and like counts on posts harm young users. When staff asked Meta leadership to make changes to protect youth from these features, however, Mark Zuckerberg denied those requests, the complaint says. It also highlights the fact that Meta counts on young Instagram users to spend hours on the app every day. Massachusetts’ allegations reveal what we already know: when it comes to kids and teens, Meta will consistently put profits over wellbeing.

——————–

Watch Fairplay Executive Director Josh Golin discuss recent legal challenges to social media companies over claims that the platforms negatively impact youth mental health.

 
