Supporting Security Policy with Effective Digital Intervention (SSPEDI)
Lead Research Organisation:
University of Aberdeen
Department Name: Computing Science
Abstract
The behaviour of people is known to be critical to the security of organizations across all sectors of the economy. As users of IT systems, their action, or inaction, can create cyber security vulnerabilities. For example, users can be tempted to give away their authentication credentials (through phishing), to install malicious software (malware), or to choose weak or inadequate passwords; they may also fail to install security patches, to scan computers for viruses, or to make secure backups of critical data.
Organizations design security policies which users are supposed to follow, for example, instructing them not to give away their authentication (login) credentials, or not to open certain kinds of attachments sent in unsolicited emails. However, in practice, managers find it very difficult to encourage users to follow policy.
This project will investigate effective ways to improve security communications with users, to enable them to understand security risks, and to persuade them to comply with policy. Our hypothesis is that to be most effective, communications and policy implementations must take into account individual personalities and motivations. Technological support is therefore required so that security communications and security persuasion can scale up to large organizations.
We propose to transfer ideas and knowledge from the existing academic field of persuasive technologies and digital behaviour interventions, and apply them to the user security compliance problem. We will build, and trial, real technologies that implement persuasive strategies in real user security scenarios. These scenarios will be selected in partnership with industrial security practitioners.
The project takes a broad, interdisciplinary view of the roots of the user compliance challenge, and draws additionally on expert knowledge from the fields of psychology, behavioural decision-making, security, sentiment analysis and argumentation in search of solutions.
Planned Impact
User compliance with IT security policy is a difficult challenge faced by managers of most organisations, across all sectors of the economy. Security risks could often be mitigated if users were to comprehend and follow policies. Even very technical attacks often require a failing on the human side of cyber defences. For example, phishing is often used to steal authentication (login) credentials of users, which can then be exploited in a complex attack, and users can also be tricked into authorising actions by malware sent as email attachments. Encouraging appropriate user behaviour within organisations is therefore key to securing organisations, and hence to building a resilient nation.
The proposed research aims to deepen knowledge of how to support the implementation of IT security policies using digital behaviour interventions. It will investigate strategies for security communication with users of IT in organisations, and trial effective, personalised computer technologies that can persuade users to comply with policy. Worldwide research in this area is in its infancy, although attempts to persuade, of one form or another, are used throughout security. This project will move the UK to a world leading position in digital behaviour intervention and persuasion in security.
The impact will be of three principal kinds. Firstly, it will generate empirical knowledge of which digital intervention strategies are likely to be successful for which kinds of security policies in which organisational contexts. This knowledge will be of interest to IT professionals and managers as well as those interested in the scientific outcomes. Secondly, this knowledge, and technologies similar to those trialled, will be applicable within organisations by IT security practitioners. The results may translate to broader IT security communication situations, for example, between governments and individuals (although this is not a goal for the project). Thirdly, the research will contribute to the nation's cyber-security capacity by building and strengthening relationships between academics and industry professionals, and by developing skills and human capital around security and persuasion.
The research will be conducted alongside industry experts to ensure that it tackles human factors security policy scenarios of the greatest relevance, and so that pathways to translate the research knowledge into practice are clearly identified.
Publications
Li X.
(2020)
Latent space factorisation and manipulation via matrix subspace projection
in 37th International Conference on Machine Learning, ICML 2020
Mahesar Q.
(2020)
Preference Elicitation in Assumption-Based Argumentation
Mahesar Q.
(2018)
Inferring Implicit Abstract Argument Preferences
Mao Q.
(2022)
Adaptive Pre-Training and Collaborative Fine-Tuning: A Win-Win Strategy to Improve Review Analysis Tasks
in IEEE/ACM Transactions on Audio, Speech, and Language Processing
Oren N.
(2018)
Detecting Preferences for Dialogue
Description | The project conducted several experiments into behaviour change techniques in cybersecurity. One experiment tested well-known persuasion techniques for encouraging user engagement with a cyber security programme. As other researchers had conjectured may be the case more generally, some techniques were more persuasive than others in that setting, in aggregate across populations, but the differences in effectiveness were in fact rather mild. Another study, whose data is currently under analysis, investigated the effect of the intensity of an additional task on user cyber security performance. That study also produced a game environment for studying user performance in cyber security tasks in the presence of other tasks more generally, and led to a significant ongoing collaboration with Monash University, Australia. Additional outcomes included (1) an understanding of how to measure susceptibility to persuasion under important social persuasion techniques that are already used in cyber security; (2) development of state-of-the-art machine learning techniques to automatically classify conversational utterances; (3) development of cybersecurity techniques for modelling attacks that are more broadly applicable than pre-existing techniques, through the use of subjective logic to model higher-order uncertainty about the modelled system. |
Exploitation Route | The outcomes contribute to the academic body of knowledge in cyber security. Follow-up studies in other domains would be required to establish the generality and limits of the behaviour change results before attempting to apply them in practice. We believe the game environment we have constructed with Monash for studying dual-task scenarios in cyber security could be a powerful framework for conducting further research. Our work on machine learning over natural language data could be applied to threat detection by those who have both appropriate data and the relevant expertise. Our work on attack modelling using subjective logic is likewise an applicable technique, but would also require experts to deploy it. |
Sectors | Aerospace Defence and Marine, Agriculture Food and Drink, Chemicals, Communities and Social Services/Policy, Construction, Creative Economy, Digital/Communication/Information Technologies (including Software), Education, Electronics, Energy, Environment, Financial Services and Management Consultancy, Healthcare, Leisure Activities including Sports Recreation and Tourism, Government Democracy and Justice, Manufacturing including Industrial Biotechnology, Culture Heritage Museums and Collections, Pharmaceu |
URL | http://www.sspedi.co.uk |
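The subjective logic mentioned in the key findings above represents uncertain propositions as opinions carrying an explicit uncertainty mass. A minimal illustrative sketch (not the project's actual implementation; the class and function names are my own): an opinion holds belief, disbelief and uncertainty summing to one, plus a base rate; a projected probability and the standard cumulative fusion of two non-dogmatic sources can then be computed.

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """A binomial subjective-logic opinion: belief + disbelief + uncertainty == 1."""
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float = 0.5

    def projected_probability(self) -> float:
        # Uncertainty mass is redistributed according to the base rate.
        return self.belief + self.base_rate * self.uncertainty

def cumulative_fuse(a: Opinion, b: Opinion) -> Opinion:
    """Cumulative fusion of two (non-dogmatic) opinions on the same proposition."""
    k = a.uncertainty + b.uncertainty - a.uncertainty * b.uncertainty
    return Opinion(
        belief=(a.belief * b.uncertainty + b.belief * a.uncertainty) / k,
        disbelief=(a.disbelief * b.uncertainty + b.disbelief * a.uncertainty) / k,
        uncertainty=(a.uncertainty * b.uncertainty) / k,
        base_rate=a.base_rate,
    )

# Two sources' opinions on a proposition such as "host X is compromised":
p = Opinion(0.6, 0.2, 0.2)
q = Opinion(0.4, 0.2, 0.4)
fused = cumulative_fuse(p, q)
```

Fusing two sources reduces the uncertainty mass below that of either source alone, which is what makes the higher-order uncertainty in the fused attack model meaningful.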
Title | DAH-CRF |
Description | This is the source code of our CoNLL 2019 paper (Li R., Lin C., Collinson M., Li X. and Chen G. A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification, The SIGNLL Conference on Computational Natural Language Learning (CoNLL), Hong Kong, 2019). |
Type Of Technology | Software |
Year Produced | 2019 |
Impact | This source code was published on GitHub and it has been starred once. |
Title | HR-VAE |
Description | This is the source code of our INLG 2019 paper (Li R., Li X., Lin C, Collinson M. and Mao R. A Stable Variational Autoencoder for Text Modelling, The 12th International Conference on Natural Language Generation (INLG), Tokyo, 2019). |
Type Of Technology | Software |
Year Produced | 2019 |
Impact | This source code was published on GitHub, and it has been starred 9 times and forked once. |
Description | BBC Radio Interview |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Radio interview and broadcast on July 18th 2017 covering investment in the project and its objectives. |
Year(s) Of Engagement Activity | 2017 |
URL | http://www.bbc.co.uk/news/uk-scotland-north-east-orkney-shetland-40643657 |
Description | BBC Website |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Media (as a channel to the public) |
Results and Impact | On 18th July 2017, there was coverage of investment in the project and its goals on the BBC news website. |
Year(s) Of Engagement Activity | 2017 |
URL | http://www.bbc.co.uk/news/uk-scotland-north-east-orkney-shetland-40643657 |
Description | Briefing to Home Office on use of machine learning on natural language data for security |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | We were invited to give a briefing on "Intention Detection for Security" to representatives from the Home Office (the ministerial department of Her Majesty's Government, responsible for immigration, security and law and order). Our research in this area focuses on identifying online security threats by analysing natural language data, for example email or social media, with a view to informing behavioural interventions. We use machine learning, deep learning and natural language processing, and adapt them to suit problems in the information security domain. Intention detection aims specifically to flag any suspicious ideas or comments shared online that pose a risk to information security. The briefing led to questions and an interesting discussion with experts from within UK government about the state-of-the-art in intention detection, and the potential for future technology. |
Year(s) Of Engagement Activity | 2018 |
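The intention detection described in the briefing above flags risky utterances by classifying natural language. As a purely illustrative sketch (the project used deep learning; the toy data and function names here are invented), a minimal bag-of-words Naive Bayes classifier shows the basic shape of the task:

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: iterable of (utterance, label) pairs. Returns an NB model."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in examples:
        class_counts[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def predict(model, text):
    """Most probable label under multinomial NB with add-one smoothing."""
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_logprob = None, float("-inf")
    for label, doc_count in class_counts.items():
        logprob = math.log(doc_count / total_docs)  # class prior
        n_words = sum(word_counts[label].values())
        for word in text.lower().split():
            logprob += math.log(
                (word_counts[label][word] + 1) / (n_words + len(vocab))
            )
        if logprob > best_logprob:
            best_label, best_logprob = label, logprob
    return best_label

model = train([
    ("send me your password", "suspicious"),
    ("share your login details", "suspicious"),
    ("meeting at noon tomorrow", "benign"),
    ("lunch at the canteen", "benign"),
])
```

Real intention detection replaces the bag-of-words model with learned representations, but the pipeline (labelled utterances in, risk label out) is the same.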
Description | Coverage in FutureScot |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Coverage of investment in the project and its objectives in July 2017.
Year(s) Of Engagement Activity | 2017 |
URL | http://futurescot.com/gone-phishing-new-1m-project-aims-guard-cyber-attacks/ |
Description | Coverage in Scotsman newspaper |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Coverage of project investment and objectives in July 2017.
Year(s) Of Engagement Activity | 2017 |
URL | http://www.scotsman.com/news/scientists-to-examine-how-ai-can-help-people-avoid-cyber-attacks-1-4506... |
Description | Coverage in The National newspaper |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Interview given to The National newspaper in July 2017 covering investment in the project and its objectives.
Year(s) Of Engagement Activity | 2017 |
URL | http://www.thenational.scot/news/15419068.Aberdeen_researchers_in___1_million_initiative_to_beat_hac... |
Description | Festival of Social Science |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | A talk and discussion on cybersecurity and human behaviour at the Festival of Social Science in Aberdeen.
Year(s) Of Engagement Activity | 2018 |
URL | https://www.abdn.ac.uk/engage/public/festival-of-social-science-205.php |
Description | Northsound Radio Interview |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Interview and broadcast on 18th July 2017 on Northsound Radio.
Year(s) Of Engagement Activity | 2017 |
URL | https://planetradio.co.uk/northsound/local/news/university-aberdeen-scientists-investigate-ai-can-he... |
Description | Original 106 Radio Interview |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Radio interview and broadcast on July 18th 2017 regarding investment in the project and its objectives. |
Year(s) Of Engagement Activity | 2017 |
Description | PechaKucha presentation on phishing |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | A "PechaKucha" rapid presentation and discussion. |
Year(s) Of Engagement Activity | 2018 |
URL | https://www.pechakucha.com/cities/aberdeen/events/5ad750163c70ef0cd479704d |
Description | Presentation to the Scotland Data Science and Technology Meetup public group |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Industry/Business |
Results and Impact | Talk: "Policies for Data Sharing", Prof. Wamberto Vasconcelos, University of Aberdeen.

When we download and start using an app, we often agree to share our data. Rarely do we have the time, the inclination or the competence to read the "terms and conditions" and assess the consequences of giving away our data. We present an approach to expressing and reasoning with data sharing policies: by using a machine-processable formalism with associated decision mechanisms, we can establish, for instance, whether the terms and conditions of an app (or the privacy/cookies provisions of a web site) are compatible with users' (or organisational) data sharing policies. Typical policies establish, for instance, that the app can access the GPS or the calendar, but not both; another example is within the health domain: the app can monitor my levels of exercise, but it cannot share this information.

We will cover (i) the information/knowledge necessary for representing policies and reasoning with them; (ii) the current standards and technologies for policy-based solutions; and (iii) how we can equip (cloud-based) workflows to operate subject to policies.

Wamberto Vasconcelos is a Professor of Computing Science at the University of Aberdeen. His research fields are distributed systems (including peer-to-peer and multi-agent systems), software engineering, and knowledge representation and reasoning.
Year(s) Of Engagement Activity | 2020 |
URL | https://www.meetup.com/Scotland-Data-Science-Technology-Meetup/events/266313325/ |
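The policy compatibility check described in the talk above can be illustrated with a minimal sketch (a hypothetical representation of my own devising, not the formalism presented in the talk): a policy as a set of forbidden permissions plus mutual-exclusion groups, checked against an app's requested permissions.

```python
def compatible(requested, forbidden=frozenset(), mutually_exclusive=()):
    """Return True if a requested permission set satisfies the policy.

    forbidden: permissions the policy bans outright (e.g. sharing health data).
    mutually_exclusive: groups from which at most one permission may be held,
    e.g. "the app can access the GPS or the calendar, but not both".
    """
    requested = set(requested)
    if requested & set(forbidden):
        return False
    for group in mutually_exclusive:
        if len(requested & set(group)) > 1:
            return False
    return True
```

A machine-processable policy like this lets the check run automatically when an app's terms and conditions are parsed, rather than relying on users to read them.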
Description | Television interview for STV |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Television interview and broadcast on July 18th 2017 covering investment in the project and its objectives. |
Year(s) Of Engagement Activity | 2017 |