Covid-19: What are the Drivers of the Islamophobic Infodemic Communications on social media

Lead Research Organisation: Birmingham City University
Department Name: BLSS Sch of Social Sciences

Abstract

There is a clear gap in understanding how conspiracy theories and miscommunication on social media sites are being used to create a Covid-19 'infodemic'. This is particularly relevant in the context of Muslim communities, as members of the far-right are able to use irrational beliefs and fake news to peddle hate, with such narratives quickly penetrating the mainstream and becoming normalised. For example, one video shared on the social media site Telegram by the former leader of the English Defence League, Tommy Robinson, purports to show a group of Muslim men leaving a secret mosque in Birmingham to pray. West Midlands Police debunked this video as fake, since the mosque had already been closed down. A number of similar examples show the rising tension and fear that fake news creates, and the implications of such information, which risks alienating communities and can have a significant offline effect in which people become more insular. Understanding the drivers of such communication is critical to ensuring more effective and trustworthy media sources, where complex information can be used to aid policy-makers and the wider general public. This study will address this gap through rich empirical data that can be used to highlight what law enforcement should do when confronting online conspiracy theories and offline attacks. Such information can spread quickly, and our project will address the drivers of this spread and the perpetrators involved, which will be significant for social media companies, the police, policy-makers and other key stakeholders. The current climate of conspiracy theories and racist 'infodemic' miscommunication on Covid-19 can have significant consequences when social distancing measures are lifted. Due to the nature of social media, and the range of social media comments and behaviour gathered, this project will be able to focus on national issues as we identify trigger events. The detail provided by the social media comments, including in some cases location (stated explicitly in the comment, the user profile, or the comment metadata), will allow for a focus on certain regions within the UK, or on countries as a whole. This may facilitate the tracking of, and response to, localised issues linked to Covid-19, extreme content and miscommunication.
 
Title Covid-19 and Islamophobia on Twitter - A Thematic Analysis 
Description A short animation video that details our thematic analysis on Twitter and Islamophobia. 
Type Of Art Film/Video/Animation 
Year Produced 2022 
Impact None at this stage. 
URL https://www.bcu.ac.uk/social-sciences/research/security-and-extremism/research-projects/drivers-of-t...
 
Title Covid-19: What are the Drivers of the Islamophobic Infodemic Communications on social media? 
Description This is an animation video produced by the research team to go alongside the main research report which summarises the key findings. 
Type Of Art Film/Video/Animation 
Year Produced 2021 
Impact None at this stage. 
URL https://www.bcu.ac.uk/social-sciences/research/security-and-extremism/research-projects/drivers-of-t...
 
Description This research project consisted of four studies that examined language, sentiment, narratives, case studies and the relationship between online and offline Covid-19 misinformation and Islamophobia across Twitter and YouTube. The key findings from our main report are summarised below:

1. Online conditions on social media platforms such as Twitter and YouTube, including anonymity, have provided an environment that can drive and motivate Islamophobia and Covid-19 related misinformation.

2. Over 100,000 tweets and over 100,000 YouTube comments were collected and analysed. Highly identifiable accounts (those with information about the user's name, location, age, etc.) engaged in Islamophobia and the spreading of misinformation, indicating the normalisation of hate during the Covid-19 pandemic.

3. The YouTube comments we collected revealed differences in how different countries, nationalisms and media sources framed issues around Muslims and Covid-19. For example, more Eurocentric videos tended to refer to Covid-19 concepts around Black, Asian and Minority Ethnic (BAME) groups, Whiteness and perceived issues around the legitimacy of mosques and the spreading of the virus.

4. A number of videos attracted increased misinformation in their comments about Muslims supposedly spreading Covid-19 because of religious festivals such as Ramadan, alongside references to the India/Pakistan conflict.

5. Antisocial tweets indicated a level of enjoyment in the act of being Islamophobic or spreading related misinformation. Conversely, greater anger, fear and disgust were present in prosocial tweets, as these were aimed at those sharing Islamophobic content.

6. A range of comments expressed joy at the suffering, death or suggested inequalities experienced by Muslims due to Covid-19. Muslims were described as super-spreaders of the virus and as receiving special treatment, whilst being deemed unworthy of treatment themselves.

7. Nationalism was an important theme. Some narratives made false claims about the vaccine being part of a larger Muslim plot to rule the world.

8. Muslims were portrayed as poisonous. This depiction forms the basis of a general blame narrative in which Muslims are said to be poisoning society through the spread of Islam. During the pandemic, this portrayal developed to describe Muslims as poisonous through spreading the virus, evident in messages that refer to Muslims as poisonous creatures.

9. A link was made between Islam and Covid-19. This theme underpins ideas that suggest that Covid-19 originated from the Quran.
Exploitation Route Our plan for taking this research forward is based on dissemination and impact with the following key organisations:

1. Policy-makers - To discuss the research with key political institutions and policy-makers, present the evidence to the APPG on British Muslims, and send the research to the Minister for Housing, Communities and Local Government to confirm the impact of the work and explore collaborative discussion of the findings in relation to the Online Harms Bill.

2. SAGE Behavioural Expert Group - To work with SAGE, provide evidence of our key findings to the SAGE committee, and discuss the measures required for creating better regulation at the policy level.

3. Social media companies - To work closely with Twitter and Facebook on creating new regulations that address social media based conspiracy theories. This will include providing a workshop for both organisations that discusses the key findings and presents real-life case studies.

4. Community-based stakeholders - To work with local charities and organisations such as Victim Support, providing our key infographics for them to use alongside their own research and evidence base.

5. Education sector - To work with local schools and provide a summary pamphlet that highlights our key findings and also provides links and ways schools can manage social media based conspiracy theories.

6. Mosques - To work with local mosques and use the research to help local communities understand how they can report conspiracy theories and Islamophobic hate speech.
Sectors Education; Government, Democracy and Justice; Security and Diplomacy

URL https://www.bcu.ac.uk/social-sciences/research/security-and-extremism/research-projects/drivers-of-the-islamophobic-infodemic-communications-on-social-media
 
Title Corpus Linguistic Analysis 
Description Corpus Linguistic (CL) analysis is an existing technique that involves the creation of a body of text, or corpus, that can then be examined for statistical differences against other corpora. The analysis can also focus within a dataset to examine concordances and collocates, giving an indication of how language is being used. In this case we are applying CL analysis to our Twitter and YouTube comment datasets (see the illustrative sketch after this entry). Please see the original grant proposal for more background on CL. Ethical clearance has been applied for and granted for data collection. Currently the dataset consists of approximately 20k tweets from accounts identified via the use of Islamophobic and COVID misinformation hashtags and search terms. Approximately 13k YouTube comments in response to videos about Islamophobia, COVID and related misinformation have also been collected. It is anticipated that within the next two weeks this dataset will reach 50k tweets and 50k YouTube comments. 
Type Of Material Data analysis technique 
Provided To Others? No  
Impact N/A currently 
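To make the technique concrete, the following is a minimal sketch, in Python, of the kind of keyness and concordance analysis described above. It is illustrative only: the file names, the Dunning log-likelihood keyness score and the chosen node word are assumptions made for the example, not the project's actual pipeline or data.

import math
import re
from collections import Counter

def tokenise(text):
    """Very simple tokeniser: lowercase alphabetic tokens (incl. apostrophes)."""
    return re.findall(r"[a-z']+", text.lower())

def log_likelihood(freq_a, freq_b, size_a, size_b):
    """Dunning-style log-likelihood keyness score for one word across two corpora."""
    expected_a = size_a * (freq_a + freq_b) / (size_a + size_b)
    expected_b = size_b * (freq_a + freq_b) / (size_a + size_b)
    ll = 0.0
    if freq_a > 0:
        ll += freq_a * math.log(freq_a / expected_a)
    if freq_b > 0:
        ll += freq_b * math.log(freq_b / expected_b)
    return 2 * ll

def keywords(corpus_a, corpus_b, top_n=20):
    """Words statistically over-represented in corpus_a relative to corpus_b."""
    counts_a, counts_b = Counter(corpus_a), Counter(corpus_b)
    size_a, size_b = len(corpus_a), len(corpus_b)
    scores = {
        word: log_likelihood(counts_a[word], counts_b[word], size_a, size_b)
        for word in counts_a
        if counts_a[word] / size_a > counts_b[word] / size_b  # positive keyness only
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

def concordance(tokens, node, window=5):
    """Each occurrence of `node` with `window` tokens of left/right context."""
    return [
        (" ".join(tokens[max(0, i - window):i]), node, " ".join(tokens[i + 1:i + 1 + window]))
        for i, tok in enumerate(tokens)
        if tok == node
    ]

if __name__ == "__main__":
    # Hypothetical inputs: two flat text files, one comment per line.
    with open("target_comments.txt") as f:     # e.g. comments under the studied hashtags
        target = [t for line in f for t in tokenise(line)]
    with open("reference_comments.txt") as f:  # e.g. a general reference sample
        reference = [t for line in f for t in tokenise(line)]

    for word, score in keywords(target, reference):
        print(f"{word:<20} LL={score:.1f}")
    for left, node, right in concordance(target, "virus")[:10]:
        print(f"{left:>40} | {node} | {right}")

A typical use would be to compare the comments gathered under the studied hashtags against a general reference sample, reading off the most over-represented words and then inspecting their concordance lines to see how they are used in context.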
 
Title Sentiment Analysis 
Description Sentiment Analysis (SA) is an existing technique that involves the automated process of assigning comments and words a quantitative value that reflects the emotion or sentiment being expressed, for example a score of how happy, compassionate, prideful or sad a comment might be. This is achieved through the application of emotional or affective lexicons that have been pre-validated. This allows for the comparison of how different users or groups may express different sentiments, and how sentiment may differ in response to different stimuli. For our research purposes this allows us to determine how people are responding, emotionally, to COVID content, news, misinformation and Islamophobia on Twitter and YouTube (see the illustrative sketch after this entry). Please see the original grant proposal for more background on SA. Ethical clearance has been applied for and granted for data collection. Currently the dataset consists of approximately 20k tweets from accounts identified via the use of Islamophobic and COVID misinformation hashtags and search terms. Approximately 13k YouTube comments in response to videos about Islamophobia, COVID and related misinformation have also been collected. It is anticipated that within the next two weeks this dataset will reach 50k tweets and 50k YouTube comments. 
Type Of Material Data analysis technique 
Provided To Others? No  
Impact N/A as yet 
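As a concrete illustration, the following minimal Python sketch scores comments against an emotion lexicon and compares mean scores across two groups. The tiny hand-written lexicon, the emotion categories and the example comments are placeholders for the pre-validated affective lexicons actually used in the project; the sketch only shows the general lexicon-based scoring approach.

import re
from collections import Counter
from statistics import mean

# Toy illustrative lexicon only -- in practice a pre-validated affective lexicon
# (e.g. an NRC-style emotion lexicon) would be loaded from file.
EMOTION_LEXICON = {
    "hate":   {"anger", "disgust"},
    "fear":   {"fear"},
    "scared": {"fear"},
    "happy":  {"joy"},
    "joke":   {"joy"},
    "proud":  {"joy"},
    "sad":    {"sadness"},
    "blame":  {"anger"},
}
EMOTIONS = ("anger", "disgust", "fear", "joy", "sadness")

def tokenise(text):
    return re.findall(r"[a-z']+", text.lower())

def emotion_scores(comment):
    """Proportion of tokens in the comment associated with each emotion category."""
    tokens = tokenise(comment)
    counts = Counter(e for tok in tokens for e in EMOTION_LEXICON.get(tok, ()))
    n = max(len(tokens), 1)
    return {e: counts[e] / n for e in EMOTIONS}

def compare_groups(group_a, group_b):
    """Mean per-emotion score for two groups of comments (e.g. antisocial vs prosocial)."""
    return {
        e: (mean(emotion_scores(c)[e] for c in group_a),
            mean(emotion_scores(c)[e] for c in group_b))
        for e in EMOTIONS
    }

if __name__ == "__main__":
    # Hypothetical example comments, one list per group.
    antisocial = ["what a joke, they spread it and are proud of it"]
    prosocial = ["this hate makes me scared, stop the blame"]
    for emotion, (a, b) in compare_groups(antisocial, prosocial).items():
        print(f"{emotion:<8} antisocial={a:.3f}  prosocial={b:.3f}")

The same group-level comparison would, on the real data, support findings such as the contrast between enjoyment in antisocial tweets and anger, fear and disgust in prosocial tweets noted above.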
 
Description Working with Facebook on new hate speech policy 
Organisation Facebook
Department Facebook, UK
Country United Kingdom 
Sector Private 
PI Contribution We are currently working in close collaboration with Facebook regarding its new hate speech policy. Our work to date has involved presenting to Facebook staff via both a virtual stream and an in-person presentation.
Collaborator Contribution Facebook have had several meetings with us and are keen for us to help develop their new hate speech policy. At present, we are working with them to create a stakeholder group that will be involved in developing new hate speech policy.
Impact N/A
Start Year 2022
 
Description Covid-19: What are the Drivers of the Islamophobic Infodemic Communications on social media? 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact This was a conference aimed at the general public, held to disseminate our findings and begin a conversation about how these findings affect local communities. Over 50 people attended the virtual event via MS Teams, and our findings sparked a range of questions and discussion about how local communities could engage with social media companies to better regulate and tackle online Islamophobia.
Year(s) Of Engagement Activity 2021
URL https://www.bcu.ac.uk/social-sciences/research/security-and-extremism/research-projects/drivers-of-t...