‘The Facebook Papers’ expose the ugly dark side of the largest social network on the planet

Tram Ho

Facebook has come under constant criticism over the years for its role in spreading fake news, especially around the 2016 US election. Over the past two months, the company has fallen into a new crisis, triggered by former employee Frances Haugen.

Haugen’s testimony exposed the dark corners inside Facebook. Beyond that, a consortium of 17 major US news organizations coordinated to review and report on the documents she provided.

Facebook denied Haugen’s accusations. CEO Mark Zuckerberg even wrote a roughly 1,300-word post arguing that the documents were selectively quoted to cast the social network in a misleading light.

Here are some highlights from the Facebook Papers:


Spreading fake news

“Facebook misled investors and the public about its role in perpetuating disinformation and violent extremism related to the 2020 US election and the January 6 riots,” Ms. Haugen said.

One of the documents details a June 2019 study titled “Carol’s Journey to QAnon”. For it, Facebook opened an account for Carol Smith, a fictitious 41-year-old conservative mother, to examine how its page and group recommendation algorithms behaved. After “Carol” followed several verified pages of conservative figures such as Fox News and former US President Donald Trump, within just two days the algorithm suggested she follow QAnon, a notorious conspiracy theory movement.

Another document presents research conducted after the January 6 Capitol Hill riot, arguing that Facebook could have done more to prevent it.

In response, a Facebook spokesperson asserted that responsibility for the riot lies with those who attacked the Capitol and those who incited them.

Lack of global support

According to documents provided by Haugen, Facebook’s capacity to prevent hate speech and disinformation in countries such as Myanmar, Afghanistan, India, Ethiopia and much of the Middle East lags far behind the rest of the world. These are places where many local languages are not yet supported by the company’s systems.

Although Facebook’s platform supports more than 100 languages, its global content moderation team consists of just 15,000 reviewers covering 70 languages at more than 20 locations around the world.

For example, in India, Facebook’s largest market by user count, the platform has no hate speech filter for Hindi or Bengali, even though the two languages have more than 600 million speakers combined. That means a great deal of content is never flagged or acted upon, Facebook researchers admitted.

However, a Facebook spokesperson said the company added a hate speech filter for Hindi in 2018 and for Bengali in 2020, and has recently added filters for Tamil and Urdu.

In addition, Facebook’s Director of Human Rights Policy, Miranda Sissons, said the company has a process for assessing and prioritizing the countries at highest risk of real-world violence and harm, and deploys country-specific support when needed.

Human trafficking

Facebook has been aware of human traffickers exploiting the platform since at least 2018, but has struggled to police related content, according to documents obtained by CNN.

Internal documents from September 2019 describe a Facebook investigation into human trafficking rings that used Facebook, Instagram, Pages, Messenger and WhatsApp.

Other documents describe how Facebook researchers flagged and deleted Instagram accounts used for human trafficking, and outline the measures taken to address the issue, including removing certain hashtags. Even so, CNN found a number of similar Instagram accounts still active last week, advertising human trafficking. After being contacted by CNN, Facebook confirmed that the accounts violated its policies and removed the accounts and posts.

A Facebook spokesperson said the company has been fighting human trafficking for many years and its goal is to prevent anyone who wants to exploit others from operating on the platform.

Inciting violence

Internal documents indicate that Facebook knows its current strategies are ineffective at preventing the spread of posts inciting violence in countries at risk of conflict, such as Ethiopia.

Facebook relies on third-party fact-checking organizations to identify, review and rate false information on the platform using internal tools. Ethiopia, which descended into civil war last year, is among the countries rated most at risk; Facebook’s own reports said armed groups there used the platform to incite violence against ethnic minorities.

This is not the first time Facebook has been criticized for its role in fostering hate speech and violence. After the United Nations criticized Facebook during the 2018 Myanmar crisis, the company admitted it had not done enough to prevent the abuse, and Zuckerberg promised to strengthen Facebook’s moderation efforts.

Still, Ms. Haugen said, “Myanmar and Ethiopia are just the opening chapters.”

The company has invested $13 billion and employs 40,000 people working on platform safety and security, and has more than 80 partners in its fact-checking program, a Facebook spokesperson said.

Impact on minors

According to the documents, Facebook has been actively growing its base of young users, even as internal research suggests its platforms, particularly Instagram, have negative effects on users’ mental health and well-being.

Facebook employs many strategies to get young people to choose it as their preferred platform for connecting with people and interests, including sweeping design and navigation changes to make the service more user-friendly and entertaining.

However, according to the Wall Street Journal, Facebook’s platforms “make body image problems worse for 1 in 3 girls”. Internal research also found that Instagram made girls think more about suicide, self-harm and skipping meals.

In response, Instagram’s Director of Public Policy, Karina Newton, acknowledged some of the findings, but insisted that the Wall Street Journal had focused on only a few parts of the report and cast them in a negative light.

An algorithm that fuels division

In 2018, Facebook tweaked its News Feed algorithm to prioritize “meaningful social interactions”. However, according to internal documents obtained by CNN, not long after the change, the platform was stoking online anger and division once again.

An analysis of 14 publishers on Facebook in late 2018 showed that the more negative comments a post drew, the more clicks it received. One Facebook employee wrote: “Our platform is not neutral.”


Source: Genk