ProTechThem: Building Awareness for Safer and Technology-Savvy Sharenting

Lead Research Organisation: University of Southampton
Department Name: Sch of Economic, Social & Political Sci

Abstract

Especially over the past 15 years, the creation of new "online identities" (the social identities we acquire in cyberspace) and the expanding uses of our "digital identities" (the digital storage of our attributed, biographical or even biological identities) have brought, alongside many advantages, new and emerging risks and crime vulnerabilities, as identity information can be misused in many ways and cause severe harm. Existing research in this context has so far focused on illegal access to personal information (e.g. through hacking or social engineering techniques) but has overlooked the risky behaviours of individuals willingly sharing identifying (and potentially sensitive) information online. An area of particular interest that has nonetheless been overlooked concerns the sharing of identifying and sensitive information about minors, who are often overexposed online in good faith by parents and guardians in so-called "sharenting" practices. Beyond the negative psychological repercussions of ignoring children's wishes about having (or not having) an online identity, there are concerns about grooming and child abuse, and about identity crimes (such as identity fraud and identity theft), especially given that today's children will, in a few years, be employing digital identities in many aspects of their lives and will need a clean and curated digital identity to participate fully in society.

The proposed project combines traditional and innovative cross-disciplinary approaches to further this emerging line of inquiry. It offers a better understanding of sharenting practices, their motivations, and the risks associated with them; of the technical and regulatory loopholes and gaps enabling potentially harmful sharenting practices; and of how parents and guardians (our "target population") perceive the problem. The project can therefore enable better-targeted awareness-raising activities and improve the tools we currently have to study, prevent and mitigate the negative impacts of sharenting practices.

The results of this research will be of significant importance for social media users (and specifically for those in our target population) by raising awareness and promoting sustained behavioural change to minimise cyber risks. The results will also be relevant to the work of law enforcement in better addressing crimes, such as grooming and identity crimes, potentially facilitated by certain sharenting practices.

More generally, the proposed approach will improve our understanding of criminogenic opportunities available in social media, supporting new avenues of investigation. By integrating insights and expertise from criminology and computer science, the proposed project also has important implications for demonstrating interdisciplinary methodological developments and promoting best practice for ethical online research.

The research project is structured around seven cumulative work packages, allowing the research team to build a solid body of original data (currently not available to researchers) and to promote engagement and effective communication with non-academic audiences (primarily law enforcement, and parents and guardians). Throughout the project, we will be supported by our Project Partners (UK Safer Internet Centre; Kidscape; Arma dei Carabinieri); together with Professor Dame Wendy Hall and other stakeholders, the Project Partners will also be part of our Advisory Board.

Publications

 
Description New knowledge generated:
The project is ongoing but has already produced the following knowledge and insights, which have been largely ignored by the extant literature's focus on the motives of sharenters:
• The various crimes that can be enabled by sharenting
• The techniques that sharenters adopt to justify their practices and neutralise blame
• The limitations of the cybersecurity measures that some sharenters adopt to try and safeguard the privacy of their children.
• The lack of adequate formal and self-regulation policies
• The role of the media and various moral entrepreneurs in producing a moral panic about sharenting while overlooking both its attractions and the role of social media platforms; this tends to divert attention away from potentially productive remedial measures, such as raising awareness of risks and developing tools for detecting risky social media content to protect children.
• The key moral entrepreneurs contributing to moral panics: cybersecurity specialists, health professionals, guardians such as teachers and parents, as well as governmental institutions and non-governmental organizations.
Exploitation Route New research questions opened up:
Based on the emerging findings, the project has developed two new research questions: what is the role of social media Artificial Intelligence (AI) systems in the production of sharenting risks and harms, and what remedial measures can be developed? These questions situate the study within the broader topical area of the societal impact of new and emerging data-driven AI systems.

New research capability or specialist skills:
The social scientists and computer scientists working on the project are developing new insights and skills in interdisciplinary research methods. This involves using insights from social science research practices (digital ethnography and qualitative analysis of social media posts) to run NLP experiments that will ultimately inform the design of new NLP tools for detecting targeted social media content. The specific content targeted is posts by sharenters containing sensitive and identifying information that can expose children to online and offline harms.
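To illustrate what a detection tool of this kind might look like at its simplest, the sketch below flags posts that disclose identifying details about a child. This is a hypothetical rule-based first pass only: the risk categories and patterns are assumptions for demonstration, not the project's actual tool (which is still under development).

```python
import re

# Illustrative sketch: a rule-based first pass for flagging posts that
# disclose identifying details about a child. All categories and patterns
# here are hypothetical examples, not the project's real detection tool.
PATTERNS = {
    "date_of_birth": re.compile(r"\b(born on|birthday is)\b", re.IGNORECASE),
    "school": re.compile(r"\b(first day at|starts at) [A-Z][\w']+ (Primary|School)\b"),
    "location": re.compile(r"\bwe live (in|at|on)\b", re.IGNORECASE),
}

def flag_post(text):
    """Return the list of risk categories a post triggers."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

post = "So proud! Ella's first day at Riverside Primary. Her birthday is next week!"
print(flag_post(post))  # ['date_of_birth', 'school']
```

In practice such surface patterns would only seed a classifier trained on annotated posts; the qualitative analysis described above is what would supply the annotation categories.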
Sectors Digital/Communication/Information Technologies (including Software)

Government

Democracy and Justice

 
Description The project team's collaboration with non-academic audiences has involved presentations of emerging findings to child safety organisations and government agencies in the UK and Italy, contributing to discussions about the need to raise awareness of sharenting risks and harms. The presentations have yielded the following outcomes: a report for the Carabinieri (Italian police), to be published this year, and follow-up email communications with the UK's Department for Digital, Culture, Media and Sport (DCMS). The Department has provided advice on how to recruit participants for the ongoing interviews and surveys of parents, and has accepted our request to co-design and test the open-source, accessible cybersecurity NLP tool we are developing for content moderators, parents, and other stakeholders. The tool can be used to detect potentially harmful online content, raise awareness to enhance online safety, and protect children affected by sharenting. The research project is making significant contributions to current understanding of the dynamics, risks, harms, and cybercrimes linked to new digital cultures of parenting, with a focus on sharenting. It is also contributing new insights on interdisciplinary methods that can be used to develop new remedial measures and create safer digital environments.
First Year Of Impact 2023
Sector Government, Democracy and Justice
Impact Types Policy & public services

 
Description Citation in UK Parliament Draft Online Safety Bill Joint Committee Written Evidence
Geographic Reach National 
Policy Influence Type Contribution to a national consultation/review
Impact The UK Parliament Draft Online Safety Bill Joint Committee cited my report in its published Written Evidence of 14 December 2021, specifically around recommendation #2. This is evidence of impact on government policy with regard to UK regulation of online harms.
URL https://committees.parliament.uk/committee/534/draft-online-safety-bill-joint-committee/publications...
 
Description Policy submission to the Consultation by Communications and Digital Committee, House of Lords, AI Large Language Models Inquiry.
Geographic Reach National 
Policy Influence Type Contribution to a national consultation/review
Impact Input to the evolving AI policy landscape around large language models (e.g. ChatGPT)
URL http://dx.doi.org/10.5258/SOTON/P1126
 
Title ProTechThem_ WP1_OpenAccess on Sharenting 
Description In order to identify relevant cases of sharenting leading to minors' victimisation, the content aggregator Nexis was used to collect newspapers, magazines, journals, blogs, and other web-based publications published in English over the last ten years (1.1.2011 to 31.12.2021). The following syntax (refined by the researchers after tentative keyword searches to minimise false negatives) was applied: '(minor OR child!) AND (parent OR mother OR father OR grandparent OR grandmother OR uncle OR aunt OR teacher) AND ((sharenting OR oversharing) OR (exposure near/5 information)) AND (crim! OR harm! OR danger!)'.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact N/A yet (dataset recently created) 
URL https://reshare.ukdataservice.ac.uk/cgi/users/home?screen=EPrint::Summary&eprintid=855558
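The Boolean syntax above can be approximated in code. The following is an illustrative sketch, not the actual Nexis engine, assuming '!' acts as a truncation wildcard (prefix match) and 'near/5' requires two terms within five words of each other:

```python
import re

def prefix_match(word, pattern):
    """'crim!' matches 'crime', 'criminal', ...; otherwise exact match."""
    if pattern.endswith("!"):
        return word.startswith(pattern[:-1])
    return word == pattern

def any_term(words, patterns):
    """True if any word matches any of the given patterns."""
    return any(prefix_match(w, p) for w in words for p in patterns)

def near(words, a, b, window=5):
    """True if a term matching `a` occurs within `window` words of one matching `b`."""
    idx_a = [i for i, w in enumerate(words) if prefix_match(w, a)]
    idx_b = [i for i, w in enumerate(words) if prefix_match(w, b)]
    return any(abs(i - j) <= window for i in idx_a for j in idx_b)

def matches(text):
    """Simplified approximation of the WP1 query applied to one document."""
    words = re.findall(r"[a-z]+", text.lower())
    return (
        any_term(words, ["minor", "child!"])
        and any_term(words, ["parent", "mother", "father", "grandparent",
                             "grandmother", "uncle", "aunt", "teacher"])
        and (any_term(words, ["sharenting", "oversharing"])
             or near(words, "exposure", "information"))
        and any_term(words, ["crim!", "harm!", "danger!"])
    )

print(matches("A mother was accused of oversharing photos of her child, causing harm."))  # → True
```

Note the simplification: exact terms such as 'parent' here do not match plurals, whereas a commercial search engine typically applies stemming; this sketch only demonstrates the query's logical structure.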
 
Title ProTechThem_ WP2_Relevant Self-Regulation data (social media platforms) 
Description To identify vulnerabilities in self-regulatory practices, we focused on the five social media platforms we expected to play a major role in sharenting practices, selected considering their demographic distribution among the population and following a discussion with our non-academic project partners: Facebook, TikTok, Instagram, YouTube, and Twitter. Textual data was retrieved from each platform's relevant self-regulatory documents (mostly terms and conditions and community standards, though some additional documents, for instance those specifically targeting parents, were also identified and considered for the analysis). This data is publicly available online; after an initial screening, all sections of the documents that could (broadly) be relevant to sharenting, digital harms and crimes, moderation practices, minors and their parents/guardians were selected for the analysis and collated in the file available here.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact Not available yet - dataset recently created 
URL https://reshare.ukdataservice.ac.uk/cgi/users/home?screen=EPrint::Summary&eprintid=855563
 
Title rafamestre/Multimodal-USElecDeb60To16: v1.0.0 
Description Dataset and codes for the Multimodal USElecDeb60To16 dataset, released in the paper "Augmenting pre-trained language models with audio feature embedding for argumentation mining in political debates", published at the Findings of the 17th conference on European chapter of the Association for Computational Linguistics (EACL) in 2023. 
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
Impact None so far 
URL https://zenodo.org/record/7628465
 
Description Exploring Fairness and Bias of Multimodal Natural Language Processing for Mental Health 
Organisation Northeastern University - Boston
Country United States 
Sector Academic/University 
PI Contribution Our collaboration is centred on ethical application, ensuring fairness and avoiding biases in the models. Our key activities will include sharing resources (datasets and models) between institutions, evaluating models for potential biases, proposing policy recommendations for AI in mental health, and broadening collaborations by incorporating stakeholders early in the process, ensuring that AI integration in mental health is innovative, ethically grounded, and responsive to real users' needs.
Collaborator Contribution Northeastern University is providing data and NLP models for our work, and working with us to develop the evaluation framework.
Impact Workshop bid submission (collaboration has just started)
Start Year 2024
 
Title Multimodal USElecDeb60To16 
Description Software and models that accompany USElecDeb60To16 dataset published at EACL 2023 
Type Of Technology Software 
Year Produced 2023 
Open Source License? Yes  
Impact None so far 
URL https://doi.org/10.5281/zenodo.5653503
 
Description Invited Seminar for Alan Turing Institute NLP Interest Group 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Postgraduate students
Results and Impact Invited seminar for the Alan Turing Institute NLP Interest Group, presenting my NLP research to academics and postgraduate students. This sparked follow-up questions, calls, and collaboration opportunities (including with a Turing Fellow, leading to work on a joint workshop proposal around human-in-the-loop NLP).
Year(s) Of Engagement Activity 2021
 
Description Invited Seminar for Centre for Machine Intelligence (CMI) and London Information Retrieval (IR) Meetup 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Invited seminar on "Large Language Models for Information Extraction and Information Retrieval". The seminar generated strong engagement, with many questions and requests for follow-up collaborations.
Year(s) Of Engagement Activity 2024
URL https://www.meetup.com/london-information-retrieval-meetup-group/
 
Description Invited Seminar for Lancaster University - SafeSpacesNLP: Exploring behaviour classification around online mental health conversations from a multi-disciplinary context - NLP, applied linguistics, social science and human-in-the-loop AI 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Postgraduate students
Results and Impact Virtual presentation to 50-100 postgraduates and researchers on SafeSpacesNLP project work, followed by formal discussions with the Lancaster University team around the SafeSpacesNLP and ProTechThem projects. This led to plans for future collaboration around datasets and NLP models in 2023.
Year(s) Of Engagement Activity 2022
URL https://ucrel.lancs.ac.uk/crs/presentation.php?id=261
 
Description Invited Seminar for UKRI Trustworthy Autonomous Systems (TAS) Hub Doctoral Training Network (DTN) 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact Invited Seminar for UKRI Trustworthy Autonomous Systems (TAS) Hub Doctoral Training Network (DTN)
Title: SafeSpacesNLP and ProTechThem - Socio-technical Natural Language Processing for Behaviour Classification
The seminar sparked questions and discussion, led to follow-on calls with researchers, and helped the audience understand issues around NLP and online behaviour classification.
Year(s) Of Engagement Activity 2021
URL https://www.tas.ac.uk/doctoral-training-network/
 
Description Invited Seminar for University of Sheffield Computer Science Dept 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Postgraduate students
Results and Impact Invited virtual seminar on my NLP research for the Computer Science Department (~50 attendees, reaching ~100 via on-demand viewing). Academics and postgraduate students attended.

Part of an ongoing engagement with academics from the University of Sheffield NLP research group, whom my team of PDRAs and PhD students visits about three times a year.
Year(s) Of Engagement Activity 2021
 
Description Invited Talk at Violence Against Women and Girls (VAWG) Symposium 2023 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Policymakers/politicians
Results and Impact Invited talk at the Violence Against Women and Girls (VAWG) Symposium 2023. Attendees included UK law enforcement (e.g. Hampshire & IoW Police) and NGOs; the impact took the form of Hampshire & IoW Police agreeing to future collaboration.
Year(s) Of Engagement Activity 2023
 
Description Invited talk to local secondary school Embley - Independent Day & Boarding School in Hampshire 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Schools
Results and Impact 50 pupils (with a good gender mix) attended a school seminar presenting my NLP research, which sparked questions and discussion afterwards. The event was very successful: students were inspired, and I was invited back to give a talk next year.
Year(s) Of Engagement Activity 2023
 
Description Organising Committee and Workshop Co-chair - RUSI and UKRI TAS Hub conference 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Organising Committee and Workshop Co-chair - RUSI and UKRI TAS Hub conference, Trusting Machines? Cross-sector Lessons from Healthcare and Security
Major inter-disciplinary event in defence, security and healthcare sector.
Panel session where I discussed my NLP research and issues around trust in AI.
Lots of discussions and led to several collaboration opportunities (commercial followon funding opportunities, joint paper accepted for CACM).
Year(s) Of Engagement Activity 2021
URL https://www.tas.ac.uk/eventslist/trusting-machines/trust-machines-conference-programme
 
Description Podcasts for UKRI Trustworthy Autonomous Systems (TAS) Hub 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact Two podcasts involving AI and NLP for the UKRI TAS Hub "Living with AI" podcast series. The impact is engagement with the general public around trust in AI (including trust in NLP research on online behaviour classification).
Year(s) Of Engagement Activity 2021,2022
URL https://www.tas.ac.uk/
 
Description RUSI TAS Workshop - Using AI in an Intelligence Context Future Scenario Workshop 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Policymakers/politicians
Results and Impact Workshop exploring the use of AI in an intelligence-context future scenario. The impact took the form of a follow-on RUSI report written up from the workshop, and input into future DSTL scenarios underpinning future funding calls (DSTL were the hosts).
Year(s) Of Engagement Activity 2024
 
Description Turing @ Southampton showcase event - expert talk on 'Human in the loop AI' 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact Turing showcase event hosted by the University of Southampton but attended by the UK-wide Turing community, reaching 50-100 people (in person and online). My expert talk led to about 10 follow-on discussions with postgraduates and early career researchers, and I was invited to give the talk again at another local early career researcher event in 2023.
Year(s) Of Engagement Activity 2022
 
Description Workshop - AI and Defence: Readiness, Resilience and Mental Health 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Organised and chaired an international workshop on the use of AI for mental health in the defence domain, with 125 participants from government (mostly defence, including the MOD and DSTL), academia, and commercial practice. Feedback from participants was that the event was very useful and has helped shape the UK conversation in this area. Work is underway on a RUSI Journal paper as a direct output from this event.
Year(s) Of Engagement Activity 2023
URL https://www.southampton.ac.uk/~sem03/AI-and-Defence-2023.html