Facebook doesn’t want to take responsibility

Texas News Today

Facebook whistleblower Frances Haugen speaks at a hearing of the Senate Commerce, Science and Transportation Subcommittee on Tuesday, October 5, 2021 in Washington, DC, USA.

Stephanie Reynolds | Bloomberg | Getty Images

Facebook whistleblower Frances Haugen told British lawmakers on Monday that the company refuses to take responsibility for its services or to encourage employees to speak up about problematic behavior, an approach she said has created the toxic situation that exists today.

“Facebook doesn’t want to accept that Facebook is responsible for anyone,” Haugen said Monday at a hearing in the British Parliament on a proposed law that aims to address harmful content online.

It was Haugen’s second public testimony since revealing herself as the source behind the internal documents that fueled The Wall Street Journal’s “Facebook Files” series. Haugen testified before the U.S. Congress earlier this month and has since begun sharing the documents with several media outlets.

Facebook’s leadership is focused on growth, fostering a culture that highlights the positive aspects of the company’s services at the expense of addressing the problems they create, Haugen said Monday.

“Facebook is full of very honest, kind and conscientious people,” she said. “Good people are embedded in a system with bad incentives that leads to bad behavior. It fosters a real pattern of people who are willing to look the other way rather than sound a warning.”

According to Haugen, Facebook offers no real avenue for employees to flag issues that management should address or that researchers could examine.

“Facebook has shown time and again that it not only doesn’t want to publish that data, but that when it does publish it, the data is often misleading,” she said.

This attitude is rooted in Facebook’s start-up culture and will not change unless regulation forces the company to change its incentives, Haugen said.

“When they see a conflict of interest between people and profit, they keep choosing profit,” Haugen said.

A Facebook spokesperson said in an emailed statement that the company agrees on the need for regulation “to prevent companies like us from making these decisions on their own.” The spokesperson also repeated Facebook’s response to the recent stories, saying the company has “spent $13 billion and hired 40,000 people to do one thing: keep people safe with our apps.”

Highlights of Monday’s inquiry:

Facebook Chairman and CEO Mark Zuckerberg.

Erin Scott | Reuters

Is Facebook evil?

MP John Nicholson asked Haugen whether Facebook was simply evil.

“Your evidence has shown us that Facebook is failing to prevent harm to children, failing to stop the spread of propaganda and failing to stop hateful speech,” Nicholson said. “It has the power to deal with these issues, but it chooses not to, which makes me wonder whether Facebook is just fundamentally evil. Is Facebook evil?”

Haugen said the word she would use is “reckless.”

“I believe there is a pattern of inadequacy, that Facebook is reluctant to acknowledge its own power,” she said. “They believe in flatness, and they won’t accept the consequences of their actions. So I think it’s reckless and it’s ignorant, but I can’t see into their hearts.”

Adam Mosseri, Facebook

Beck Diefenbach | Reuters

Concerns over Instagram for kids

In its series, the Journal highlighted Facebook research suggesting that its Instagram service is harmful to the mental health of teenagers.

Following public outcry over the reports, Facebook announced last month that it would pause development of a version of Instagram designed for children under the age of 13.

The matter came up again at Monday’s hearing.

Inside Facebook, Haugen said, dependence on the company’s products is called “problematic use.” According to Haugen, Facebook found that problematic use is much worse among young people than among older users.

To meet the criteria for problematic use, a person must be self-aware and honest enough to admit they have no control over their usage. According to Haugen, 5.8% to 8% of teens reported problematic use by the age of 14, after using Facebook’s products for a year.

“It’s a big problem,” she said. “If that many 14-year-olds are that self-aware and honest, the real numbers are probably 15% to 20%. I am deeply concerned about Facebook’s role in hurting the most vulnerable among us as it pursues growth.”

According to Haugen, Facebook’s own research reports that Instagram is not only dangerous for teens but more harmful than other forms of social media.

“When kids describe their usage of Instagram, Facebook’s own research describes it as an addict’s narrative. The kids say, ‘This makes me unhappy, I feel like I don’t have the ability to control my usage, and I feel like if I left I’d be ostracized,’” Haugen said. “I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old, and I sincerely doubt it is possible to make it safe for a 10-year-old.”

“A novel that is terrifying to read”

At the hearing, Haugen cited Journal reporting that armed groups used Facebook to incite violence in Ethiopia. According to the report, the company does not employ enough people who speak the relevant languages to monitor the situation on its services.

Haugen said similar situations risk playing out in other vulnerable countries around the world, and that this is one of the main reasons she came forward.

“I believe situations like Ethiopia’s are just the opening chapters of a novel that will be terrifying to read,” Haugen said.

Regulation can be good

Haugen praised the UK for considering regulation of social media services and said the rules could actually help Facebook.

“I think regulation could actually be good for Facebook’s long-term success, because it would force Facebook into a place that is more comfortable for users,” she said.

The Verge on Monday published a report based on the documents Haugen provided, which projected that U.S. teen users of the Facebook app would drop 13% from 2019 and as much as 45% over the next two years. Users aged 20 to 30 were expected to decline 4% over that period, according to the internal documents.

Haugen said the company could reverse this decline if regulation changed Facebook’s incentives, pushing it to make its apps more pleasant for users.

“If Facebook were safer and more pleasant, I think it would be a more profitable company 10 years from now, because the toxic version of Facebook is slowly losing users,” she said.

Watch: How can Facebook fix its trust issues?

