PolicyWatch 2577
Internet Security and Privacy in the Age of the Islamic State: The View from Facebook
Part of a series: Counterterrorism Lecture Series
Facebook's head of global policy management, a former federal prosecutor and legal advisor to the State Department, shares her insights into the evolving threats to personal privacy and security and the private sector's role in defending against them.
On February 26, Monika Bickert addressed a Policy Forum at The Washington Institute. The head of global policy management at Facebook, she previously served as assistant U.S. attorney in the Northern District of Illinois and as resident legal advisor at the U.S. embassy in Bangkok. The following is a rapporteur's summary of her remarks.
Facebook has long sought to ensure that its site is safe and that people are not exploiting it to promote terrorism. This is a challenge given the size of its community: currently 1.6 billion regular users, the vast majority of them outside the United States. To meet this challenge, Facebook established a set of "community standards" barring certain activities, and it enforces these standards through a content policy team based in five offices around the world. Team members have many different backgrounds (lawyers, NGO workers, etc.), but the company also realizes the necessity of consulting with outside experts. For example, it frequently reaches out to other organizations, including The Washington Institute, for their interpretation of terrorism-related events.
POLICY AND RESPONSE
Facebook does not allow any member of a terrorist group or other violent organization to have a presence on the site. This is a broad policy, meaning that any users who are found to be members of such groups are barred from the site, regardless of what they may be talking about on their accounts. Similarly, when Facebook becomes aware that an account is supporting terrorism, it removes that user and looks at associated content and accounts. Experts have repeatedly told the company that the best way to find terrorists is to find their friends.
Another Facebook policy is to remove content supporting or promoting violent groups or their actions even when there is not sufficient cause to close the poster's account. In such cases, the consequences for the user vary; first-time violators generally get a warning, but if Facebook becomes aware of a credible threat and believes that referring information to law enforcement is necessary to prevent harm, it will do so. The company also has a robust process in place for responding to requests from law enforcement, assuming authorities can provide the appropriate court order.
Facebook receives more than a million reports of potential site violations per day, related not only to terrorism but also to bullying, harassment, child exploitation, and other prohibited behavior. These reports are assessed by real people based around the world who review content in more than forty languages and can access external translation services as needed. The challenge is routing each report to a reviewer with the right subject-matter expertise. For example, Facebook has in-house experts who specialize in analyzing terrorism support, and they receive ongoing training from academics and researchers who come in to update the team on relevant terminology, iconography, and other information.
Of course, even when violators are shut down, they will inevitably try to come back. It is easy to create an account on Facebook, Twitter, YouTube, and other social media sites. That's by design -- companies want people to use these services, and they don't want to put too many barriers in their way. As a result, these sites necessarily get the bad with the good. Facebook has measures in place to prevent the bad apples from returning, but the system is not perfect.
PROMOTING COUNTER-SPEECH
Facebook is mindful of the fact that removing content alone will not fix the problem -- getting people to actually stand up and challenge terrorist ideology requires more. With that in mind, the company has been investing in counter-speech, which involves raising awareness, pushing back against certain kinds of speech, and encouraging people to question hateful or extremist ideology.
Users have already been creating this sort of content on the site for years. For instance, Facebook pages helped raise awareness of Boko Haram's kidnapping activities and popularize the hashtag "#BringBackOurGirls." And in the wake of the January 2015 Charlie Hebdo attack, more than seven million people used the site to express solidarity with the victims and stand against terrorism. So Facebook knew that counter-speech was happening -- what was needed was a more data-oriented approach to the issue, with the goal of empowering people to create more of this sort of speech.
Toward that end, the company began conducting research with the British think tank Demos nearly two years ago, looking at which types of Facebook campaigns succeed against violent extremism and why. Once the factors behind that success were identified, Facebook could impart them to others who want to share the same message in other regions.
Three such factors have stood out thus far. The first is format -- visual imagery is very important to a campaign's success, as is conciseness. The average viewing time that users devote to videos on social media is shockingly short, so a five-minute message is not the best way to reach an audience.
The second factor is tone. In France, for example, about a quarter of the content on pages espousing hateful ideology consists of comments standing up against that ideology. Yet much of this counter-speech is not especially constructive because it relies on an attacking approach. Through its research with Demos, Facebook has learned that positive, constructive messages are more successful in getting people to question certain ideologies. Humor and satire are particularly effective in that regard.
The third factor involves determining the most effective speaker for a given audience. For example, a government figure would probably not be very compelling to young people who are skeptical of authority -- they are more likely to respond to a celebrity, a young person, or someone who has otherwise walked in their shoes. The choice of spokesperson and audience also depends on a campaign's goal, whether it be to raise awareness or actively turn people away from violent ideologies.
In addition to its work with Demos, Facebook has been supporting efforts by other groups to promote different types of counter-speech. One such group is the Institute for Strategic Dialogue, which published research late last year on one-to-one intervention. As in other types of campaigns, tone is very important in one-to-one outreach -- the institute pointed out that such conversations need to begin in a casual, non-accusatory manner and remain so until the relationship is sufficiently established.
Facebook is also interested in gamification, making the act of creating counter-speech fun and perhaps a bit competitive, especially for young people. The company has worked with various external groups that excel in this area, including EdVenture Partners. In collaboration with the State Department and Facebook, EdVenture has been running a program called "Peer 2 Peer," where university students from around the world compete in a semester-long course to create campaigns against violent extremism.
COOPERATION WITH LAW ENFORCEMENT AND SOCIAL MEDIA
Over the past two years, Facebook has participated in closed-door roundtable discussions on countering terrorism and violent extremism. Because the companies involved in this forum are not publicly identified, they can be very open with one another about what they are seeing and how they are trying to tackle these challenges, without being judged by outsiders.
Facebook also participates in working groups with Interpol and the European Union. This dialogue has been very productive -- the company has learned about extremism-related trends in Europe and taken these lessons to heart when formulating its enforcement policies.
At the same time, Facebook has long been mindful of the need to protect the accounts of activists who stand against extremism and other problems. In addition to preventing hackers from compromising these accounts, the company has a very rigorous process for scrutinizing government requests for people's data, making sure they have gone through the proper channels.
CHALLENGES GOING FORWARD
The constantly evolving nature of social media means that new and perhaps unexpected challenges will often arise. For example, video sharing has exploded in the past few years as smartphones and networks have become better equipped to handle it. But consider this development from the perspective of Facebook team members tasked with reviewing such content for violations. What do they do when the average length of posted videos doubles or triples? There are tools that make it easier to review videos, separate them into more digestible segments, et cetera. But what about the audio? What if the video content is fine but the audio contains threatening speech? At present, there is no easy answer to these questions, and Facebook will have to remain mindful of such issues as the relevant technologies evolve.
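To make the review workload concrete, the following is a minimal sketch of the kind of segmentation tooling described above, written in Python around the open-source ffmpeg command-line tool. It is illustrative only: the function name, the sixty-second segment length, and the use of ffmpeg are assumptions made for the example, not a description of Facebook's actual systems.

```python
# Illustrative sketch: split a long video into fixed-length pieces so a
# human reviewer can work through it in digestible chunks. Assumes the
# ffmpeg command-line tool is installed and on the PATH.
import subprocess
from pathlib import Path

def split_for_review(video: Path, out_dir: Path, segment_seconds: int = 60) -> None:
    """Split `video` into numbered segments of roughly `segment_seconds` each."""
    out_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "ffmpeg",
            "-i", str(video),
            "-c", "copy",               # copy streams without re-encoding, so splitting is fast
            "-map", "0",                # keep all streams, including the audio track
            "-f", "segment",            # ffmpeg's segment muxer
            "-segment_time", str(segment_seconds),
            "-reset_timestamps", "1",   # each segment starts at time zero
            str(out_dir / "segment_%03d.mp4"),
        ],
        check=True,
    )

# Example (hypothetical file names):
# split_for_review(Path("reported_video.mp4"), Path("review_queue/1234"))
```

Note that copying the streams keeps the operation fast, and mapping all streams preserves the audio track, which, as the remarks above suggest, may need review in its own right even when the visual content is unobjectionable.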
This summary was prepared by A. J. Beloff.