Share if this makes your blood boil! Information representation and emotions in far-right visuals

By Ofra Klein (European University Institute)

An image of a blonde girl walking hand in hand with a woman in a burqa was shared on the Facebook page of the British National Party (BNP). The description stated: “OUTRAGE: The politicians, police chiefs and social services have handed thousands of young white children for torture, gang rape and sale to Muslim Rape Gangs. Now they’re handing our children over to Muslims for forced conversion to Islam.” The image was manipulated (albeit not by the BNP itself), as the burqa was photoshopped onto the woman. The same page had a banner image of violent-looking men and, in screaming red font, the words “mass immigration crisis with no end in sight”, “terrorists on our streets”, and “mass colonization”.

Visual forms of political propaganda are an age-old phenomenon. However, much has changed since printed posters and leaflets. Whereas images were once difficult to manipulate and hard to circulate, nowadays almost anyone can produce convincing fake images and rapidly disseminate them. Photoshop allows for falsifications that are sometimes impossible to distinguish from reality. On social media, manipulated memes can quickly gain virality and visibility. Visuals can effectively arouse emotions (Joffe, 2008), which are important in shaping political views and behaviour (Gross, 2008). My research investigates whether factual and manipulated representations of information evoke different emotions from Facebook users. I use the Facebook page of the extreme right British National Party (BNP) as a case study. The BNP has been the only electorally successful extreme right party in Britain (Trilling, 2012). It was the first party in Britain to use the internet (Copsey, 2003, p. 219), and has done so successfully, generating more support online than offline. In April 2019, after this study was carried out, the BNP’s page was removed from Facebook.

Visual manipulation, emotions and political behaviour

Images contain certain characteristics that make them more effective for propagandizing political views than texts. Images are less threatening and are understandable to a larger audience than the written word (Entman, 2004). They offer limited space for sharing detailed information or presenting counter-arguments, which can be especially persuasive if viewers have little knowledge of the topic portrayed (Baumgartner & Morris, 2008). Photographs are perceived as harder to falsify, provoking less scepticism among the audience. Images “come with an implicit guarantee of being closer to the truth than other forms of communication… [and] diminish the likelihood that viewers (…) question what they see” (Messaris & Abraham, 2001, p. 217). Visuals leave a stronger memory mark and tend to evoke stronger emotions than texts (Joffe, 2008, p. 85). Iyer and Oldmeadow (2006), for example, show how people who saw imagery of a kidnapping felt significantly more fear than people who only read about it.

Emotions can shape political views and behaviour. Brader (2005) showed how candidates could significantly alter the persuasive power of their ads by using music and images to prompt fear or enthusiasm. Emotions can have varying effects on political behaviour. Anger, for example, increases political mobilization, more so than anxiety or enthusiasm (Valentino et al., 2011). Anger arises “when threats are attributable to a particular source”, giving the individual a certain feeling of control over the situation, whereas with anxiety, the individual is less certain and thus less in control (Valentino et al., 2011, p. 159; Lerner & Keltner, 2001; Smith & Kirby, 2004; Tiedens & Linton, 2001). Consequently, anger triggers risk-seeking behaviour, whereas anxiety leads to risk-avoidance (Lerner & Keltner, 2001; Valentino et al., 2011).

Different representations of information evoke different emotional responses. Vosoughi, Roy & Aral (2018) show that false stories evoked more surprise than true stories, whereas true stories evoked more joy and trust. Emotions can therefore make misleading political messages more persuasive. Content can be misleading in several ways: an image can be an outright lie or hoax, but it can also be factual information presented in a partisan, sensationalist or satirical manner (Wardle & Derakhshan, 2017). Whereas propaganda and manipulation are highly deceiving, satire or humour is less so (Tandoc et al., 2018).

Visual manipulation by the British National Party

In my research, I analysed how information was represented in images of the British National Party, and whether factual or misleading forms of information evoked different emotional responses. I manually coded how information was represented in about 350 images posted on the Facebook page of the BNP. To do so, I relied on the typology of information manipulation developed by Wardle and Derakhshan (2017). They distinguish seven types of manipulation techniques applied in information representation: satire (when political criticism is delivered in a humorous manner), misleading content (when photos are cropped, or quotes or statistics are chosen selectively), false connection (when headlines, visuals or captions do not support the content), false context (when factual content is shared outside of its original context), imposter content (when genuine sources are impersonated), manipulated content (when images are photoshopped), and fabricated content (when content is a complete creation or lie). I also coded whether images were factual or not.
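To make the coding scheme concrete, the sketch below shows one simple way such manual codes could be stored and tallied. It is purely illustrative and not the instrument used in the study; the file name and column name are hypothetical, and the labels follow Wardle and Derakhshan’s (2017) typology plus a factual category.

```python
# Illustrative sketch only: tally manually assigned information-representation
# codes from a (hypothetical) spreadsheet with one row per coded image.
import csv
from collections import Counter

CATEGORIES = {
    "satire", "misleading_content", "false_connection", "false_context",
    "imposter_content", "manipulated_content", "fabricated_content", "factual",
}

def tally_codes(path: str) -> Counter:
    """Count how often each information-representation code was assigned."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):            # one row per coded image
            code = row["representation_code"]    # hypothetical column name
            if code not in CATEGORIES:
                raise ValueError(f"Unknown code: {code}")
            counts[code] += 1
    return counts

if __name__ == "__main__":
    print(tally_codes("bnp_images_coded.csv"))   # hypothetical file name
```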

I observed that almost half of the images on the Facebook page of the British National Party represented information in a misleading way. About ten percent of the images were satirical in nature. Close to forty percent presented information in an objective manner or as an opinion of the party (factual). Completely fabricated content, manipulated images, images posted in a false context, or images having a false connection were rare and made up less than five percent of the images on the page. Imposter content did not occur at all.

To assess users’ emotional responses, Facebook reactions were retrieved from the platform through Netvizz (Rieder, 2013). This tool gathers the links of images, the corresponding texts, the date of posting and how many Facebook users reacted to these images. Users can react, with the click of a mouse, to show whether an image surprised them, made them laugh, or made them feel angry, sad or enthusiastic. Users can click on only one of these reactions per image.
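As a rough illustration (not the exact pipeline used in the study), a Netvizz-style tab-separated export could be loaded and reduced to per-post reaction counts along the following lines. The column names (post_id, post_message, type, rea_ANGRY, and so on) and the file name are assumptions about the export format.

```python
# Sketch: load a Netvizz-style tab-separated page export and keep, for each
# image post, its identifier, text and per-reaction counts. Column names are
# assumed, not guaranteed to match the actual export.
import pandas as pd

REACTIONS = ["rea_LIKE", "rea_LOVE", "rea_HAHA", "rea_WOW", "rea_SAD", "rea_ANGRY"]

def load_image_reactions(path: str) -> pd.DataFrame:
    """Return one row per image post with its reaction counts."""
    df = pd.read_csv(path, sep="\t")
    df = df[df["type"] == "photo"]               # keep image posts only (assumed 'type' column)
    return df[["post_id", "post_message"] + REACTIONS]

# Example usage (hypothetical file name):
# reactions = load_image_reactions("bnp_page_fullstats.tab")
```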

To test whether different forms of information representation in imagery led to different emotional responses, I carried out ANOVAs to compare the mean values of Facebook reactions for images that portrayed information in a humorous, factual or manipulative manner. The analyses showed that satirical images evoked significantly happier responses than imagery containing either factual or misleading information. Similarly, users were more enthusiastic about images representing information factually than about visuals that contained misleading or satirical content. Images that were misleading aroused significantly more angry, surprised and sad responses than factual or satirical content. Misleading images also motivated users to write more comments than images that were factual or satirical. These findings suggest that false or misleading information may induce highly negative emotional responses such as anger and sadness, while also encouraging more users to engage with the content by writing comments on Facebook.
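For readers who want to reproduce this kind of comparison on their own data, a one-way ANOVA of reaction counts across representation types can be run with a few lines of code. The sketch below is a minimal example under stated assumptions, not the study’s actual analysis script: it assumes a DataFrame with a hypothetical 'representation' column (factual, satirical or misleading) and one column per reaction count.

```python
# Minimal sketch: one-way ANOVA testing whether the mean count of a given
# reaction differs across representation categories.
import pandas as pd
from scipy import stats

def anova_by_representation(df: pd.DataFrame, reaction: str):
    """Compare mean counts of one reaction across representation categories."""
    groups = [group[reaction].values
              for _, group in df.groupby("representation")]  # assumed column
    return stats.f_oneway(*groups)                           # returns (F, p)

# Example usage (hypothetical data and column names):
# f_stat, p_value = anova_by_representation(coded_posts, "rea_ANGRY")
```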

These findings are important, as anger, sadness and disbelief can affect political beliefs, and increase people’s online involvement in politics through sharing, liking or commenting on posts. Vosoughi, Roy & Aral (2018), for example, found that online content that evokes anger and disbelief is shared more often than content that leads to happiness or joy. In turn, this online involvement can also reinforce offline mobilisation (Vissers & Stolle, 2014). Moreover, platforms such as Facebook tend to favour content that engages many users (Gerbaudo, 2018, p. 751), even when it is not factually accurate.

The persuasive potential of images is important now that younger generations are shifting their attention to visual-based platforms, such as Instagram and Snapchat (Anderson & Jiang, 2018). These platforms are commonly used for spreading so-called computational propaganda (Howard et al., 2018). Concerns about the misleading power of visuals are not limited to still images. New technologies enable users to create convincing fake videos in which real people say and do things they never said or did, a phenomenon known as deep fakes (Chesney & Citron, 2019).

The ease with which internet users can access tools to manipulate reality convincingly is thus an alarming development in the field of political propaganda, and one that warrants more attention and research. While this study of an extreme right party may not be generalizable to populist right-wing parties, the findings are strong for a single case study and can form a starting point for further research on the role of memes and visuals in politics.


About the Author

Ofra Klein is a Ph.D. researcher at the European University Institute (Florence, Italy), where she works on online political mobilization and the radical right. Ofra previously worked as a research assistant at Vrije Universiteit Amsterdam and at Harvard University’s Berkman Klein Center for Internet and Society.

References

Anderson, M., & Jiang, J. (2018). Teens, Social Media & Technology 2018. Washington D.C.: Pew Research Center. Retrieved from: https://www.pewinternet.org/2018/05/31/teens-social-media-technology-2018/

Baumgartner, J. C., & Morris, J. S. (2008). One “nation,” under Stephen? The effects of the Colbert Report on American youth. Journal of Broadcasting & Electronic Media, 52(4), 622–643. doi:https://doi.org/10.1080/08838150802437487

Brader, T. (2005). Striking a responsive chord: How political ads motivate and persuade voters by appealing to emotions. American Journal of Political Science, 49(2), 388–405. doi:https://doi.org/10.1111/j.0092-5853.2005.00130.x

Chesney, R., & Citron, D. K. (2019, forthcoming). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. California Law Review. doi:https://doi.org/10.2139/ssrn.3213954

Copsey, N. (2003). Extremism on the Net: The extreme right and the value of the internet. In R. Gibson, P. Nixon, & S. Ward (Eds.), Political parties and the internet: Net gain? (pp. 218–233). London: Routledge.

Entman, R. M. (2004). Projections of power: Framing news, public opinion, and US foreign policy. Chicago: University of Chicago Press.

Gerbaudo, P. (2018). Social media and populism: an elective affinity? Media, Culture & Society, 40(5), 745–753. doi:https://doi.org/10.1177/0163443718772192

Gross, K. (2008). Framing persuasive appeals: Episodic and thematic framing, emotional response, and policy opinion. Political Psychology, 29(2), 169–192. doi:https://doi.org/10.1111/j.1467-9221.2008.00622.x

Howard, P.N., Ganesh, B., Liotsiou, D., Kelly, J., & François, C. (2018). The IRA, Social Media and Political Polarization in the United States, 2012–2018 (Working Paper 2018.2). Oxford: Project on Computational Propaganda.

Iyer, A., & Oldmeadow, J. (2006). Picture this: Emotional and political responses to photographs of the Kenneth Bigley kidnapping. European Journal of Social Psychology, 36(5), 635–647. doi:https://doi.org/10.1002/ejsp.316

Joffe, H. (2008). The power of visual material: Persuasion, emotion and identification. Diogenes, 55(1), 84–93. doi:https://doi.org/10.1177/0392192107087919

Lerner, J. S., & Keltner, D. (2001). Fear, anger, and risk. Journal of Personality and Social Psychology, 81(1), 146–159. doi:http://dx.doi.org/10.1037/0022-3514.81.1.146

Messaris, P., & Abraham, L. (2001). The role of images in framing news stories. In S. D. Reese, O. H. Gandy & A. E. Grant (Eds.), Framing public life (pp. 231–242). London: Routledge.

Rieder, B. (2013, May). Studying Facebook via data extraction: the Netvizz application. In Proceedings of the 5th annual ACM web science conference (pp. 346–355). ACM. doi:http://dx.doi.org/10.1145/2464464.2464475

Smith, C. A., & Kirby, L. D. (2004). Appraisal as a Pervasive Determinant of Anger. Emotion, 4(2), 133–138. doi:http://dx.doi.org/10.1037/1528-3542.4.2.133

Tandoc Jr, E. C., Lim, Z. W., & Ling, R. (2018). Defining “fake news”: A typology of scholarly definitions. Digital Journalism, 6(2), 137–153. doi:https://doi.org/10.1080/21670811.2017.1360143

Tiedens, L. Z., & Linton, S. (2001). Judgment under emotional certainty and uncertainty: The effects of specific emotions on information processing. Journal of Personality and Social Psychology, 81(6), 973–988. doi:http://dx.doi.org/10.1037/0022-3514.81.6.973

Trilling, D. (2012). Bloody Nasty People: The Rise of Britain’s Far Right. Brooklyn: Verso Books.

Valentino, N. A., Brader, T., Groenendyk, E. W., Gregorowicz, K., & Hutchings, V. L. (2011). Election night’s alright for fighting: The role of emotions in political participation. The Journal of Politics, 73(1), 156–170. doi:https://doi.org/10.1017/S0022381610000939

Vissers, S., & Stolle, D. (2014). Spill-over effects between Facebook and on/offline political participation? Evidence from a two-wave panel study. Journal of Information Technology & Politics, 11(3), 259–275. doi:https://doi.org/10.1080/19331681.2014.888383

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. doi:10.1126/science.aap9559

Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an interdisciplinary framework for research and policymaking (DGI:9). Brussels: Council of Europe. Retrieved from: https://www.coe.int/en/web/freedom-expression/information-disorder

