Is it necessary to “label” extremist rhetoric on social platforms such as Facebook?

On the morning of May 29, Beijing time, media reports said that Facebook’s own recommendation algorithm drove the prevalence of extremist content on the platform, and that Facebook executives have known about this since 2016 but did not take strong action.

The Wall Street Journal published an article Tuesday reporting on Facebook’s efforts to combat radicalization since 2016. The report, based on internal Facebook documents and interviews with current and former employees, details how Facebook has used various means to curb the spread of divisive content on the platform. In many cases, however, Facebook rejected proposals from the employees working to address the issue.

More than 60 percent of the people who joined extremist groups on Facebook were prompted to do so by Facebook’s algorithmic recommendations, and Facebook executives, including senior vice president Joel Kaplan, have known about the problem since 2016 but have stifled efforts to address it, the article said.

The article notes that Facebook conducted a study in 2016 that found a worryingly high proportion of extremist content and groups on the platform. In a presentation at the time, Monica Lee, a Facebook researcher and sociologist, said that more than a third of Germany’s large political Facebook groups contained a large amount of extremist and racist content.

Lee also found that nearly two-thirds (64 percent) of users who joined extremist groups on Facebook did so after the site’s algorithm recommended the groups to them, meaning Facebook’s own recommendation system “exacerbates extremism” among its users.

“Facebook exploits the human brain’s attraction to divisiveness, and if left unchecked, it will serve users more and more divisive content in an effort to capture their attention and increase time spent on the platform,” the article reads.

Addressing this radicalization is difficult and requires Facebook to rethink some of its core products. Most notably, the effort forced Facebook to reconsider how heavily it prioritizes “user engagement,” a metric built from time spent, likes, shares and comments that has guided its ranking systems for many years.
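To make that metric concrete, here is a minimal, purely illustrative sketch of how an engagement signal of this kind could be computed. The PostStats fields, the weights and the engagement_score function are assumptions made for this article, not Facebook’s actual formula or code.

```python
# Purely illustrative: a toy "engagement" signal combining the ingredients
# named above. Field names and weights are hypothetical, not Facebook's.
from dataclasses import dataclass

@dataclass
class PostStats:
    seconds_viewed: float  # dwell time on the post
    likes: int
    shares: int
    comments: int

# Hypothetical weights: higher-effort reactions count more than passive viewing.
WEIGHTS = {"seconds_viewed": 0.01, "likes": 1.0, "shares": 3.0, "comments": 2.0}

def engagement_score(s: PostStats) -> float:
    """Collapse dwell time, likes, shares and comments into a single number."""
    return (WEIGHTS["seconds_viewed"] * s.seconds_viewed
            + WEIGHTS["likes"] * s.likes
            + WEIGHTS["shares"] * s.shares
            + WEIGHTS["comments"] * s.comments)

# Rank candidate posts by the score, highest first.
posts = [PostStats(120.0, 4, 0, 1), PostStats(30.0, 50, 20, 40)]
ranked = sorted(posts, key=engagement_score, reverse=True)
```

A feed ranked purely by a score like this surfaces whatever provokes the most reactions, which is the dynamic critics have in mind when they say engagement optimization can reward divisive content.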

The first thing to say about this account is that “polarization” can mean many things, which makes it difficult to discuss Facebook’s contribution to the problem. In a narrow sense, it can refer to the way partisan news feeds divide a country. It can also be used as an umbrella term for what Facebook and other social networks have recently grouped under “platform integrity” initiatives, such as removing hate speech or labeling misinformation.

The second thing to say about polarization is that, for all its negative effects, it is important to consider what alternative you are proposing. If you are the CEO of a technology company, the problem involves the challenge of “fighting” polarization: even if you see it as an enemy, it is not clear what standard you would use to rally your company against it.

In any case, Facebook was dismayed by the Wall Street Journal report. Guy Rosen, the Facebook executive in charge of the matter, published a blog post Wednesday listing some of the steps the company has taken since 2016 to combat radicalization.

These include changing the News Feed to show more posts from friends and family than from publishers, launching a fact-checking program, detecting hate speech and other harmful content more quickly through machine-learning systems and human content reviewers, and removing groups that violate Facebook’s policies from algorithmic recommendations.
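As a purely illustrative example of that last step, the sketch below filters policy-violating groups out of a pool of recommendation candidates. The Group model, the violates_policy flag and the recommendable function are hypothetical assumptions for this article, not Facebook’s actual data model or systems.

```python
# Purely illustrative: dropping policy-violating groups from a pool of
# recommendation candidates. The data model and flag are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Group:
    group_id: int
    name: str
    violates_policy: bool  # e.g. flagged by classifiers or human reviewers

def recommendable(candidates: List[Group]) -> List[Group]:
    """Keep only groups that are eligible to appear in recommendations."""
    return [g for g in candidates if not g.violates_policy]

candidates = [Group(1, "Local gardening", False), Group(2, "Banned group", True)]
suggestions = recommendable(candidates)  # only the first group remains
```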

In this blog post, Rosen said:

We have taken a number of important steps to reduce the amount of content that could drive polarization on our platform, sometimes at the expense of revenue. This work will never be complete, because in the end online discourse is an extension of society, and society is highly polarized. But our job is to reduce polarization’s impact on how people experience our products, and we are committed to doing that.

One of the reasons Facebook was dismayed by the report is that, according to a post on Facebook’s internal Workplace forum, the company had spent “months” discussing the findings with the Wall Street Journal reporter, who had both on-the-record and informal conversations with several of the company’s executives, including Kaplan, who is responsible for global public policy.

In any case, there are two things worth noting about this story and Facebook’s response. First, there is real internal tension over how Facebook perceives polarization. Second, asking Facebook to address polarization could divert its attention from other issues, including the viral spread of conspiracy theories, misinformation and hate speech.

On the one hand, Facebook’s efforts to combat radicalization, as described in the article, are real. Facebook has invested heavily in platform integrity over the past few years. And, as some Facebook employees have said, Facebook has good reason not to implement every recommendation the team makes.

For example, some proposals may be less effective than measures already in place, or they may have unintended negative consequences. Clearly, some employees on the team feel that most of their ideas were not adopted or were watered down. But that is true of many teams at many companies, and it does not mean their efforts were in vain.

Facebook executives, on the other hand, largely reject the idea that the platform is driving division at a societal level. They point out that polarization is declining in several large countries where Facebook usage is high, which helps explain why executives do not treat the issue as a top priority.

So this case suggests that Facebook may be “right” at the level of platform integrity, while the Wall Street Journal is right at a larger level: Facebook is designed to be a place for open discussion, human nature makes those discussions tend toward intensity and polarization, and Facebook has chosen a relatively hands-off way of managing that divide. That is because its executives believe the world benefits from loud, almost unrestricted discussion, and because they do not believe they are dividing the country. (Li Ming)