
Executive Summary: FACEBOOK’S NEW EXTREMISM FEATURE

Indirah Canzater, Beatrice Fratini, Ian Kemp, Sophie Provins, Kesa White, NORTHCOM Team

Week of Monday, July 12, 2021


Facebook logo[1]


Facebook has begun to implement a new feature allowing users to report other users they believe are at risk of engaging in extremist behavior. The feature is aimed at early detection of extremist messages that could further radicalize individuals on Facebook. Extremists frequently use social media to disseminate propaganda, radicalize, and recruit; the platforms offer an easy mechanism to organize and communicate, and they appeal to extremists because of the large pool of potential recruits within easy reach. Violent attacks have also exploited social media directly, such as the 2019 Christchurch shooting, which was live-streamed on Facebook. It is very likely that the new Facebook initiative will be beneficial. Despite this, conservative political parties and the far-right will almost certainly resist the feature, citing overreach by the government and social media companies.


Discussion


Facebook is trialing a new feature in the US aimed at preventing extremist content from spreading on its platform. It consists of warning users that they might have been exposed to extremist posts, as well as asking them whether they think someone they know is becoming radicalized. It is very likely that the attack in Christchurch, New Zealand on March 15, 2019, was a significant influence on Facebook’s decision to implement the new extremism feature. One individual conducted two consecutive mass shootings at two mosques, killing more than 50 people. The shooter, Brenton Tarrant, live-streamed the attack on Facebook; the stream was watched live by fewer than 200 viewers and viewed around 4,000 times in total before it was removed. The video was re-uploaded 1.5 million times within 24 hours of the attack.[2] Therefore, it is almost certain that the true number of viewers is significantly higher. Exactly one year after the attack, on March 15, 2020, an individual released a video of himself outside a mosque holding a pistol, with a Kurt Cobain song playing in the background. Cobain is popular amongst extremists who intend to conduct attacks in which they expect to lose their lives, and his lyrics often resonate with them.[3] The video was also recirculated on the Christchurch attacks’ 2021 anniversary, as were other memes related to Tarrant. Some of the viewers were probably inspired to conduct attacks and follow an extreme right-wing ideology similar to Tarrant’s. For example, in January 2021, a 16-year-old was arrested for planning a Christchurch-inspired knife attack on two mosques.[4] The sharing of videos depicting right-wing terrorist attacks on Facebook has therefore contributed to the radicalization of individuals across the globe.


Facebook and US intelligence agencies, such as the Federal Bureau of Investigation (FBI), should be concerned about the new Facebook test feature, as its shortcomings could hinder successful deterrence of domestic extremism. One weakness is that Facebook lacks a clear definition of extremism.[5] This is very likely to undermine the perceived transparency and legitimacy of Facebook’s authority and impartiality in deciding what does and does not constitute extremist content. As part of the new feature, Facebook has also partnered with the nonprofit organization Life After Hate, which specifically seeks to help people abandon right-wing extremist organizations.[6] It is highly probable that there will be claims that Facebook is only targeting right-wing extremism through its new warnings instead of also monitoring left-wing movements, such as Black Lives Matter (BLM) and Antifa. It has become clear that domestic violent extremism, specifically white supremacy, presents the most lethal threat to the security of the US homeland.[7] While Facebook’s focus on far-right extremism can therefore be justified, it is very likely that the new feature will perpetuate the idea that Facebook is targeting right-wing groups. This will likely further erode faith in efforts to curb extremism, making extremism more complicated to detect and defeat. Moreover, it is highly probable that this new feature will be heavily protested and that Facebook will not be able to apply it as a new and effective counter-extremism tool.


An additional limitation of Facebook’s test is the improbability of family members and friends reporting a loved one for alleged extremist views. The new feature is very unlikely to be used to report potential extremist content from friends and relatives, as friends and family likely sympathize with that content. Additionally, they are the most likely to see the content based on Facebook’s algorithms and are unlikely to take action against it. Those who are fundamentally opposed to the content in question are likely to report it, but because of the way Facebook’s algorithm works, they are the least likely to see the posts.[8] The Facebook algorithm analyzes and displays the content a user would find most favorable, often matching the user’s political preferences.[9] As most extremist content is shared within private Facebook groups rather than disseminated publicly, it is likely that few incidents will be reported.
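To illustrate this dynamic, the minimal sketch below shows how affinity-based feed ranking keeps opposed users from ever seeing the content they would be most inclined to report. It is purely illustrative and assumes a hypothetical one-dimensional stance score; it is not Facebook’s actual ranking system, whose signals and weights are proprietary.

# Minimal, hypothetical sketch of affinity-based feed ranking (illustrative only;
# not Facebook's actual system). Posts closer to the viewer's stance rank higher,
# so users opposed to extremist content rarely encounter it in their feeds.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    stance: float  # hypothetical ideology score, -1.0 (far left) to +1.0 (far right)

def rank_feed(posts: list, user_stance: float, top_k: int = 3) -> list:
    """Return the top_k posts whose stance is closest to the viewer's."""
    return sorted(posts, key=lambda p: abs(p.stance - user_stance))[:top_k]

posts = [Post("A", 0.9), Post("B", 0.8), Post("C", 0.1),
         Post("D", -0.2), Post("E", -0.9)]

# A far-right-leaning viewer is served the aligned posts A and B, while an
# opposed viewer, who would be most inclined to report them, never sees them.
print([p.author for p in rank_feed(posts, user_stance=0.85)])   # ['A', 'B', 'C']
print([p.author for p in rank_feed(posts, user_stance=-0.85)])  # ['E', 'D', 'C']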


Although Facebook’s new extremism reporting feature has its benefits in countering extremism, it may also facilitate the creation of new extremist content. Members of the far-right are galvanized by the labeling of their posts as “extremism,” likely viewing it as a personal attack on their core beliefs.[10] When far-right extremists feel singled out and discriminated against by big tech, which the far-right believes is run by the far-left, this alleged discrimination will likely incentivize them to create and spread extremist posts more frequently, leading to more far-right content on the platform. Many far-right-friendly platforms such as Parler have set the narrative that Facebook’s new feature is a meme, that it will not help reduce extremism, and that it is merely an effort to politically discriminate against the right. Content making fun of Facebook and criticizing it for silencing conservative beliefs is often shared on these sites. This likely serves as propaganda, providing a focal point for those on the far-right to rally around.


A meme concerning the new extremism feature on Facebook[11]


Opposition to the Facebook extremism feature comes primarily from right-wing extremists and conservatives. It is very likely that the feature will be considered an invasion of privacy by those it is trying to target. Many view it as an easy way for technology companies and the government to control the media and the public’s thinking. Without the proper approach and integration, enacting this new feature will likely push people with extremist views to other platforms that are more difficult to monitor, such as Parler.[12] This will likely limit what the government can learn about the groups and ideologies developing there. As platforms like Parler do not self-monitor for extremism, growing extremist narratives on those platforms likely go unnoticed. Government officials and law enforcement officers likely cannot monitor all communications within a platform without the assistance of the platform itself. For example, Facebook has a security team that cooperates with law enforcement and will alert authorities to content that Facebook believes poses an imminent security threat; Parler does not yet engage in such activity.


Those who subscribe to far-right ideologies feel alienated by big tech platforms like Facebook and Twitter and are flocking to sites like Parler and GETTR. GETTR is a new social media platform spearheaded by members of former President Trump’s team, with a stated mission of “fighting cancel culture, promoting common sense, defending free speech, challenging social media monopolies, and creating a true marketplace of ideas.”[13] The increased usage of sites like Parler and GETTR may have security implications, as these sites could become origin points for further extremist content and serve as recruitment spaces for extremist groups. Because these sites do not monitor for extremist activity, recruitment efforts there could be more successful than on popular, more heavily moderated sites like Twitter and Facebook.


To ensure that extremist content does not go undetected, the Counterterrorism Group’s (CTG) NORTHCOM and Extremism Teams will continue to monitor developments in Facebook’s test for tackling extremism on its platform. CTG will also observe the response the new feature sparks within the far-right community online, including on Parler and GETTR. If the feature is fully implemented, more data will become available, and CTG will investigate and analyze its success, producing further reports and recommendations on the issue for social media platforms and state legislators.

________________________________________________________________________

[2] Facebook says the Christchurch attack live stream was viewed by fewer than 200 people, The Verge, March 2019, https://www.theverge.com/2019/3/19/18272342/facebook-christchurch-terrorist-attack-views-report-takedown

[3] Suicide, Mental Illness, & Music 25 Years After Kurt Cobain’s Death, Stereogum, May 2019, https://www.stereogum.com/2041678/kurt-cobain-sucide-mental-illness-music/columns/sounding-board/

[4] Singapore boy held for Christchurch-inspired mosque attack plot, BBC News, January 2021, https://www.google.co.uk/amp/s/www.bbc.com/news/world-asia-55836774.amp

[5] Dangerous Individuals and Organizations, Facebook, n.d., https://www.facebook.com/communitystandards/dangerous_individuals_organizations

[6] Facebook tests prompts that ask users if they're worried a friend is 'becoming an extremist', CNN, July 2021, https://www.cnn.com/2021/07/01/tech/facebook-extremist-notification/index.html

[7] Homeland Threat Assessment, Department of Homeland Security (DHS), October 2020, https://www.dhs.gov/sites/default/files/publications/2020_10_06_homeland-threat-assessment.pdf

[8] Study: How Facebook pushes users, especially conservative users, into echo chambers, The University of Virginia, November 2020, https://news.virginia.edu/content/study-how-facebook-pushes-users-especially-conservative-users-echo-chambers

[9] Ibid.

[10] ‘Stop the Steal’ supporters, restrained by Facebook, turn to Parler to peddle false election claims, The Washington Post, November 2020, https://www.washingtonpost.com/technology/2020/11/10/facebook-parler-election-claims/

[11] I’m a God-fearing, Gun-toting conservative. I know who they’re talking about. #extremist #facebook, Twitter, July 1, 2021, https://twitter.com/dtwilson117/status/1410633373228412934

[12] Right-wing users flock to Parler as social media giants rein in misinformation, PBS, December 2020, https://www.pbs.org/newshour/nation/right-wing-users-flock-to-parler-as-social-media-giants-rein-in-misinformation

[13] Team Trump quietly launches new social media platform, Politico, July 2021, https://www.politico.com/news/2021/07/01/gettr-trump-social-media-platform-497606
