It depends: Exploring the role of context in subjective experiences of social media-enabled harms
Lead Research Organisation: University of Oxford
Department Name: Computer Science
Abstract
Many people spend a significant portion of their daily lives communicating with one another online. Children and teenagers are far more likely than adults to engage socially online, and can communicate with anyone, anywhere from a very young age. These users may be unaware that their words and actions harm others, while those exposed to such antisocial behaviour may feel unable to remove themselves from situations detrimental to their mental wellbeing.
Research in psychology and the social sciences has shown that negative experiences with online contact can have deleterious effects on users' mental health and may in turn encourage certain types of antisocial behaviours.
While all users can and should be concerned about who they engage with online, this project focuses on groups of users who are likely to have specific concerns about exposure to online content and contact, such as parental figures, teenagers, and children. This project has the potential to both help users who exhibit antisocial conduct to engage in more prosocial behaviour, and to prevent others from being subjected to harmful behaviours.
At its core, this research seeks to develop tools that afford users greater control over the contact they experience in online spaces. It combines research from psychology, learning sciences, linguistics, and computer science in order to explore automated solutions that can identify specific threats within the context of online interactions in real time. Specifically, pattern analysis of word use in conversations will be used to assess whether a user's online conversation displays antisocial characteristics. These methods will be combined with machine learning to create tools that can be deployed as conversations unfold.
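The combination of word-use pattern analysis and machine learning described above can be illustrated with a minimal sketch. This is not the project's actual model: the training messages and labels below are invented toy examples, and the `flag_message` helper is hypothetical; a real system would be trained on labelled conversation corpora and richer linguistic features.

```python
# Sketch: word-pattern features (TF-IDF over word n-grams) feeding a simple
# linear classifier that can score each message as a conversation unfolds.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled examples (1 = antisocial, 0 = benign) -- invented for illustration.
train_texts = [
    "you are worthless and everyone hates you",
    "nobody wants you here, just leave",
    "shut up, you idiot",
    "great game last night, well played",
    "thanks for helping me with my homework",
    "see you at practice tomorrow",
]
train_labels = [1, 1, 1, 0, 0, 0]

# Word-level unigram/bigram patterns weighted by TF-IDF feed a logistic model.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(train_texts, train_labels)

def flag_message(message, threshold=0.5):
    """Return True if the predicted probability that a message is
    antisocial meets or exceeds the given threshold."""
    prob = model.predict_proba([message])[0][1]
    return prob >= threshold
```

Because the classifier scores one message at a time, it can in principle be applied to each turn of a conversation in real time, which is the deployment mode the project envisages.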
The ultimate purpose of this research is to provide two complementary measures to make online communication safer: on the one hand, it will produce tools that provide real-time educational interventions to individuals whose conversations display traits of specific types of conduct, such as cyberbullying. On the other, it will provide users seeking to protect themselves and those under their care (e.g. parents and children) with a notification system for when conversations escalate to levels that the users deem undesirable.
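The notification side of this design can be sketched as a rolling escalation monitor. Everything here is an illustrative assumption, not the project's implementation: per-message scores in [0, 1] are assumed to come from an upstream classifier, and the `EscalationMonitor` class, its `threshold`, and its `window` parameter are hypothetical names.

```python
from collections import deque

class EscalationMonitor:
    """Track recent per-message antisocial scores and signal when the
    conversation crosses a level the user has deemed undesirable."""

    def __init__(self, threshold=0.6, window=5):
        self.threshold = threshold          # user-configurable sensitivity
        self.scores = deque(maxlen=window)  # rolling window of recent turns

    def observe(self, score):
        """Record one message's score in [0, 1]; return True if the
        rolling average has escalated past the user's threshold."""
        self.scores.append(score)
        mean = sum(self.scores) / len(self.scores)
        return mean >= self.threshold
```

Averaging over a window rather than reacting to a single message reflects the project's framing: it is the escalation of a conversation, in context, that should trigger a notification, not one isolated remark.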
This project falls within the EPSRC Digital Economy research area, under the Trust, Identity, Privacy and Security theme listed at https://www.epsrc.ac.uk/research/ourportfolio/themes/.
Planned Impact
It is in the nature of Cyber Security - and a key reason for the urgency in developing new research approaches - that it is now a concern of every section of society, and so the successful CDT will have a very broad impact indeed. We will ensure impact for:
* The IT industry; vendors of hardware and software, and within this the IT Security industry;
* High value/high assurance sectors such as banking, bio-medical domains, and critical infrastructure, and more generally the CISO community across many industries;
* The mobile systems community, mobile service providers, handset and platform manufacturers, those developing the technologies of the internet of things, and smart cities;
* Defence sector, MoD/DSTL in particular, defence contractors, and the intelligence community;
* The public sector more generally, in its own activities and in increasingly important electronic engagement with the citizen;
* The not-for-profit sector, education, charities, and NGOs - many of whom work in highly contended contexts, but do not always have access to high-grade cyber defensive skills.
Impact in each of these will be achieved in fresh elaborations of threat and risk models; by developing new fundamental design approaches; through new methods of evaluation, incorporating usability criteria, privacy, and other societal concerns; and by developing prototype and proof-of-concept solutions exhibiting these characteristics. These impacts will retain focus through the way that the educational and research programme is structured - so that the academic and theoretical components are directed towards practical and anticipated problems motivated by the sectors listed here.
Organisations
People | ORCID iD
---|---
Lucas Kello (Primary Supervisor) |
Claudine Tinsman (Student) |
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name
---|---|---|---|---|---
EP/P00881X/1 | | | 30/09/2016 | 30/03/2023 |
2816047 | Studentship | EP/P00881X/1 | 30/09/2018 | 29/09/2022 | Claudine Tinsman