Meta did not act to protect teenagers, second whistleblower testifies

Arturo Bejar, a former Facebook employee and consultant for Instagram, testifies before the Senate Judiciary Subcommittee on Privacy, Technology and the Law during a hearing to examine social media and the teen mental health crisis, Tuesday, November 7, 2023, on Capitol Hill in Washington.

Stephanie Scarbrough | AP

A second Meta whistleblower testified before a Senate subcommittee on Tuesday, this time describing what he called unsuccessful efforts to alert the company's senior leaders to the extent of the harmful effects Meta's platforms could have on teens.

Arturo Bejar, a director of engineering at Facebook from 2009 to 2015 who later worked as a consultant at Instagram from 2019 to 2021, testified before the Senate Judiciary Subcommittee on Privacy, Technology and the Law that top Meta officials had not done enough to stem the harm its youngest users experienced on the platforms.

Lawmakers on both sides have blamed the tech lobby for Congress's failure to pass laws protecting children online. Despite broad support in Senate committees for bills aimed at protecting children on the internet, those measures have sat dormant, awaiting a vote on the Senate floor or action in the House.

Bejar's appearance underscores the frustration of lawmakers who believe big tech companies operate with largely unchecked power.

Bejar's allegations

Bejar recently made allegations against the company in an interview with The Wall Street Journal. He follows in the footsteps of Frances Haugen, another former Meta employee who leaked internal documents and research to news organizations and the Senate to shed light on the company's safety issues.

Meta executives were aware of common harms to their youngest users but refused to take adequate steps to address them, Bejar told lawmakers Tuesday.

Sen. Richard Blumenthal, D-Conn., said that before the hearing, Bejar told him about a conversation with chief product officer Chris Cox. Bejar said that when he brought up research on the platform's harm to teens at that meeting, Cox acknowledged he was already aware of the statistics.

“When I came back in 2019, I thought they didn’t know,” Bejar testified. But after this meeting with Cox, he no longer believed it.

“I found it heartbreaking because it meant they knew and they weren’t acting on it,” Bejar said.

Part of the problem, according to Bejar, is that Meta directs its resources toward a “very narrow definition of harm.” He said it’s important to break down the prevalence of different harms on the platform by different user demographics in order to understand the true extent of harms caused to certain groups.

On October 5, 2021, the day Haugen, Facebook's first whistleblower, testified in the Senate, Bejar sent an email to top Meta executives, including CEO Mark Zuckerberg, then-COO Sheryl Sandberg and Instagram head Adam Mosseri.

Bejar, who shared the email with the committee as part of a wealth of documents, directed the message to Zuckerberg, saying he had already raised these issues with Sandberg, Mosseri and Cox.

In an email to Mosseri on October 14, 2021, outlining his arguments ahead of a meeting scheduled for the next day, Bejar highlighted a survey of 13- to 15-year-olds on Instagram.

According to the survey, 13% of respondents had received unwanted sexual advances on Instagram in the past seven days alone, 26% had witnessed discrimination against people on Instagram based on various identities, and 21% felt worse about themselves because of other people's posts on the platform.

Bejar wrote in the email to Zuckerberg that his teenage daughter had received unsolicited photos from male users since she was 14. His daughter said she would block the users who sent the photos.

“I asked her, why do boys keep doing that?” Bejar wrote in the email. “She said if the only thing that happened was they got blocked, why wouldn’t they do it?”

He advocated for funding and prioritizing efforts to understand what content fuels poor user experiences, what percentage of that content violates policy, and what product changes could improve the experience on the platform.

Bejar said he never received a response or met with Zuckerberg or Sandberg about the email.

“Every day, countless people inside and outside of Meta are working on how to keep young people safe online,” Meta spokesperson Andy Stone said in a statement. “The issues raised here regarding user perception surveys highlight one part of that effort, and surveys like these have led us to create features like anonymous notifications of potentially hurtful content and comment warnings. Working with parents and experts, we have also introduced over 30 tools to support teens and their families in having safe, positive experiences online. All of this work continues.”

Stone pointed to a tool called “Restrict,” developed based on feedback from teens. If a user restricts a second user, only that second user can see their own comments on the first user’s posts. He also highlighted the company’s 2021 Content Distribution Guidelines, designed to address what the company calls borderline content, which skirts the lines of its policies.

Blaming tech money for the lack of new laws

Subcommittee Chairman Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn., positioned their bill, the Kids Online Safety Act (KOSA), as a key solution to the harms Bejar described. KOSA aims to put more responsibility on technology companies to design their products safely for children.

“Now is the time for Congress to provide protective tools that parents and children can use to disconnect from these algorithms, from these black boxes that generate toxic content,” Blumenthal told reporters before the start of the hearing.

He addressed concerns from some progressive groups that the bill could negatively impact vulnerable children, including LGBTQ youth, saying the sponsors had made changes to the bill to reflect those concerns.

“This action is not about content or censorship. It’s about the design of the product that drives this toxic content to children,” Blumenthal said. “We’re not trying to come between kids and what they want to see, but simply allowing them to disconnect from the algorithms when they’re generating content they don’t want.”

While some fear that advancing narrow legislation will further delay broad privacy protections in Congress, Blumenthal said, “We have now reached a consensus that we must do what is possible rather than holding out for the ideal. I am very much in favor of a broader privacy bill, but let’s take it step by step, and the more bipartisan consensus we have on protecting children, the better positioned we’ll be to craft a wider privacy bill.”

“It’s an indictment of this body, to be honest with you, that we didn’t act,” subcommittee ranking member Josh Hawley, R-Mo., said. “And we all know the reason. Tech giants are the largest and most powerful lobby in the United States Congress… They have managed to defeat every significant piece of legislation.”

Judiciary Committee Chairman Dick Durbin, D-Ill., criticized the House's failure to pass bills aimed at protecting children's safety online after they cleared committee with overwhelming support.

Sen. Lindsey Graham, R-S.C., blamed Section 230, tech's legal liability shield, for enabling the industry's lobbying practices. “The other bills aren’t going to go anywhere until they believe they can be sued,” he said.
