Adaptive Robotic EQ for Well-being (ARoEQ)
Lead Research Organisation:
University of Cambridge
Department Name: Computer Science and Technology
Abstract
Social robots are envisioned to interact closely with people safely and efficiently, and to add value to people's lives by helping, caring, teaching and entertaining. However, there is currently a major gap between the public perception of humanoid/social robot capabilities and their actual capabilities: the cognitive and social capabilities of current humanoid robots are still very limited.
Although social robotics is an inherently multi-disciplinary field, there are no systematic efforts to develop novel sensing, perception and understanding capabilities for these robots grounded in the state of the art in the fields of affective computing, social signal processing, computer vision and machine learning. To avoid re-inventing the wheel, researchers in HRI often, and rightly, utilise available sensing/perception tools from other domains, creating their own in-house datasets and evaluations. However, these practices hinder progress in social robotics, leading to a major lack of novel, domain-specific tools, and a lack of benchmarking measures due to the absence of annotated, publicly available multimodal interaction datasets that are vital for comparative evaluation.
This Fellowship aims to address these major gaps in HRI and social robotics. Its vision is to:
(1) equip humanoid robots with novel socio-emotional intelligence and adaptation capabilities grounded in the state of the art in affective computing, social signal processing, computer vision and machine learning fields;
(2) investigate the deployment of humanoid robots as socio-emotionally smart embodied personal devices that can potentially revolutionise our ability to maintain healthier behaviours and working environments, leading to resilient communities.
Planned Impact
The line of research targeted in this funding application is highly interdisciplinary, and the research outcomes can benefit wider society in various ways. Relevant beneficiaries are identified based on the expected term of impact.
This Fellowship is likely to impact two major areas: robotics and healthcare. The most immediate application of the proposed research is in the areas of social robotics, telepresence robotics and human-robot interaction. In the short to medium term, the proposed research has the potential to become applicable and expandable to multiple domains including healthcare, social work, education, accommodation, food service, retail trade, and arts and entertainment. With advances in AI and physical robot design, robots are starting to take on increasingly complex social roles in homes, workplaces, and public spaces. However, there is a major gap between the public perception of humanoid/social robot capabilities and their actual capabilities. The proposed research will improve the capabilities of humanoid robots, bringing them closer to the expectations of the public. This is also expected to help with technology adoption, i.e., the adoption and acceptance of social Robotics and Autonomous Systems (RAS) as new products to be used for public good.
Health has been identified as one of the top priorities in the RCUK Strategic Priorities and Spending Plan (2016-2020). In the medium term, the proposed research will contribute to this priority both by developing a better technical understanding of building socio-emotionally intelligent and adaptive robots, and by developing multidisciplinary knowledge about how to deploy such robotic systems as personal and/or self-management technology (e.g., a coach) to strengthen the emotional and social well-being of the unaffected populations for prevention and resilience, both in the workplace and outside.
Academic, artistic and philosophical motivations for researching intelligent machines and robots are well represented by the Science Museum's 2017 exhibition on robots; robots have also started to be used in theatre plays (e.g., Spillikin) and dance performances (e.g., Robot, by the Blanca Li Dance Company). In the medium to long term, the robotic platform created as part of this project can potentially be used by performers and artists.
In the longer term, this project will contribute to the UK's competence in the social Robotics and Autonomous Systems (RAS) area, adding value to its economy. The project will ensure international visibility for the research team, which in turn will contribute to building the UK's capabilities in social robotics through support for talented people, and will attract further investment.
Organisations
- University of Cambridge (Fellow, Lead Research Organisation)
- Parthenope University of Naples (Collaboration)
- University of Augsburg (Collaboration)
- University of Leicester (Collaboration)
- University of Nottingham (Collaboration)
- Emteq Limited (Collaboration)
- Uppsala University (Collaboration)
- Massachusetts Institute of Technology (Collaboration)
- University of Grenoble (Collaboration)
- Universität Hamburg (Collaboration)
- Institute for Intelligent Systems and Robotics (Collaboration)
- Eindhoven University of Technology (Collaboration)
- New York University Abu Dhabi (Collaboration)
- Middle East Technical University (Collaboration)
- Cornell University (Collaboration)
- Yale University (Collaboration)
- Colorado School of Mines (Collaboration)
- Cambridge Consultants (Collaboration)
- Cambridge University Hospitals NHS Foundation Trust (Collaboration)
- Royal Institute of Technology (Collaboration)
- European Commission (Collaboration)
- University of Barcelona (Collaboration)
- Monash University (Collaboration)
People
- Hatice Gunes (Principal Investigator / Fellow)
Publications
- Abbasi N (2022) "Measuring mental wellbeing of children via human-robot interaction: Challenges and opportunities", in Interaction Studies. Social Behaviour and Communication in Biological and Artificial Systems
- Abbasi N (2023) "Measuring Mental Wellbeing of Children via Human-Robot Interaction: Challenges and Opportunities", in Interaction Studies
- Axelsson M (2023) "Robotic Coaches Delivering Group Mindfulness Practice at a Public Cafe"
Description | Key findings from the ARoEQ project as of March 2024 include:
- Both human and teleoperated robotic coaches elicit similarly positive participant responses, underscoring the potential of robotic coaches.
- Qualitative data from human coaches and users are crucial for developing meaningful AI models and robotic frameworks for delivering wellbeing practices.
- Careful assessment and mitigation of bias in AI model training are essential for equitable wellbeing analysis.
- Child-like robots, with their engaging presence, show promise for evaluating children's mental wellbeing, surpassing self-reports and parent-reports.
- Well-designed robotic coaches can enhance workplace mental wellbeing, offering accessible and cost-effective mental wellbeing solutions.

Major findings from our user studies with the robotic coaches (as of Feb 2023):
- Stakeholders are generally receptive to a robotic coach;
- Stakeholders want the robot's form to match its function;
- Participant personality affects their perception of the robotic coach over time;
- Robot form/appearance affects how participants perceive its behaviours and personality;
- People get used to robot behaviour over time; however, they expect the robotic coach to be more responsive.

Main findings from our work published in IEEE Trans. on Cybernetics 2020 [https://ieeexplore.ieee.org/xpl/tocresult.jsp?isnumber=6352949]:
(1) We do not always need sophisticated models, as humans rate machine-like speech to match machine-like movements better than the more natural movements generated;
(2) Even though the models are not trained with explicit information about subject personalities, subject-dependent learning generates movements that are assessed as more appropriate to the input audio for 'more conscientious' people than for 'less conscientious' people;
(3) This creates a stepping stone toward learning to synthesise motions for distinctive personality styles, rather than manually manipulating a robot's behaviours, which is still an open research problem.

Main findings from our work published at IEEE RO-MAN 2020 (https://ieeexplore.ieee.org/document/9223564):

WHAT? Acquire person-specific data.
WHY? Adapting learning models to individual preferences requires large amounts of data that can only be sourced through interactions with users.
HOW? (1) Conduct introductory HRI rounds to enable the robot to collect additional data about the user. (2) Leverage adversarial learning to train a generative model to simulate additional person-specific data.

WHAT? Obtain normative baselines.
WHY? The robot needs to know the behavioural norm for each user against which deviations can be observed. Deviations help identify shifts in user socio-emotional behaviours and infer changes in interaction context.
HOW? (1) Conduct interactions under contextually inert (neutral) situations during introduction rounds. (2) Use the (subtle) deviations from this baseline, given the interaction context, to analyse shifts.

WHAT? Extract semantic associations.
WHY? Adapting the learning for a large number of users is computationally intractable; learning models get saturated, becoming unable to remember previous information or learn with new individuals.
HOW? (1) Form user groupings, using person-specific attributes (Cu in Eq. 2-3), to learn group-based adaptations. (2) Use unsupervised data clustering to facilitate learning semantic groupings of users.

WHAT? Learn contextual affordances.
WHY? Interactions are driven by context, and humans switch between contexts without clear boundaries. Contextual attributions may not always be implicit and may need to be learnt separately.
HOW? (1) Learn context-aware embeddings to distinguish between task boundaries. (2) Use contextual affordances (e.g., Ti in Eq. 3) to facilitate smooth switching between affective HRI contexts.

WHAT? Balance memory with computation.
WHY? The memory-computation trade-off needs to be considered w.r.t. the application domain. Adding more memory facilitates rehearsal of past knowledge, while additional computational power improves adaptation to novel experiences.
HOW? (1) Use generative models for pseudo-rehearsal to reduce the model's memory footprint. (2) Offload part of the computation/memory load to RaaS-based solutions to balance old vs. novel learning.

WHAT? Allow controlled forgetting.
WHY? When learning is continuous, redundant information in the memory/model is not released, hindering the learning capacity of the model.
HOW? (1) Utilise forgetting mechanisms (inspired by biological organisms) on unused memory locations or parts of the model, to learn new knowledge.

WHAT? Use multiple performance metrics.
WHY? Benchmark evaluations from conventional ML and CL perspectives are needed for reproducibility and fairness guarantees, and to evaluate the model's robustness to dynamic shifts in data distributions.
HOW? (1) Report CL performance metrics (Section II-E), along with the classification metrics of F-measure and AUC-ROC scores, or reward-function dynamics for behaviour learning. |
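The "normative baseline" strategy above can be sketched in code: estimate a per-user baseline of a behavioural feature from contextually neutral introduction rounds, then score later observations by their deviation from that baseline. This is a minimal illustrative sketch; the feature values and the `baseline` / `deviation_score` helpers are hypothetical, not taken from the published system.

```python
from statistics import mean, stdev

def baseline(neutral_sessions):
    """Per-user normative baseline (mean, std) of a behavioural feature,
    estimated from contextually inert (neutral) introduction rounds."""
    return mean(neutral_sessions), stdev(neutral_sessions)

def deviation_score(observation, mu, sigma):
    """Z-score of a new observation against the user's baseline; large
    absolute values flag shifts in socio-emotional behaviour."""
    return (observation - mu) / sigma if sigma > 0 else 0.0

# Hypothetical feature values (e.g., smile intensity) from neutral rounds:
neutral = [0.42, 0.40, 0.45, 0.43, 0.41]
mu, sigma = baseline(neutral)
print(round(deviation_score(0.60, mu, sigma), 2))  # large positive deviation
```

In a deployed coach, such deviation scores would feed into the context-aware models described above, conditioned on the interaction context, rather than being thresholded directly.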
Exploitation Route | The World Health Organization (WHO) estimates that mental health disorders cost roughly US$1 trillion annually, accounting for approximately 2% of global healthcare spending. In pursuit of its goal to offer high-quality, accessible mental health care to an additional 100 million people, the WHO advocates for initiatives that expand mental health services to a wider population. This project serves as a proof-of-concept, demonstrating the feasibility of providing schools and workplaces with robotic coaches to assess and enhance mental wellbeing in real-world settings. This development holds promise for reducing the economic and healthcare burden associated with mental health by offering more affordable and accessible healthcare solutions for both children and adults. |
Sectors | Digital/Communication/Information Technologies (including Software), Education, Financial Services and Management Consultancy, Healthcare |
URL | https://cambridge-afar.github.io/aroeq.html |
Description | The ARoEQ project conducted groundbreaking interdisciplinary research at the crossroads of multimodal behaviour analysis, AI, and robotics, with the aim of creating adaptive robotic wellbeing coaches for mental wellbeing assessment and promotion. Collaborations were established with various stakeholders, including experts from the Department of Psychiatry, a mindfulness instructor from MindfulnessWithCompassion Inc., and a practising NHS therapist, for psychological and psychiatric rigour. Partnerships with Cambridge Edge Café, Cambridge Consultants Inc. and Cambridgeshire schools facilitated empirical longitudinal human-robot interaction studies, whose participants experienced how robots could be used as alternative means for the assessment and promotion of mental and emotional wellbeing. The project made significant technical advances in lifelong learning, adaptation and personalisation for long-term human-robot interaction. Technological advances in deep learning, continual learning, reinforcement learning, affective computing and affective robotics were incorporated into the deployment of the robotic coaches, both in the lab and in real-world settings. The project made significant advances in the scientific understanding of the design features, functional capabilities and ethical usage of social robots as effective coaches for improving health and wellbeing outcomes. It also made significant advances in understanding and improving the features of the QT, Misty and Pepper robot platforms, which contributed to improving usability and trust. By taking the robotic coaches into the real world, and through engagement with workplaces, the general public (via demos at festivals and open days) and the media, the project advanced public understanding of social and humanoid robots as novel tools for assessing, promoting and improving mental and emotional wellbeing, not only in adults but also in children.
The ARoEQ project had a significant impact on the scientific community, as evidenced by the awards we received: Finalist for the Robotics Societies of Korea and Japan Distinguished Interdisciplinary Research Award at IEEE RO-MAN'21, and the Best Paper Award in Responsible Affective Computing at IEEE ACII'23. We not only published over 70 peer-reviewed papers in top venues (e.g., IEEE Trans. on Affective Computing, ACM Trans. on HRI, ACM/IEEE HRI'23, ACM/IEEE HRI'24, IEEE RO-MAN'21-'23, IEEE ACII'22-'23, etc.), but also organised over 8 popular workshops in these venues, and guest-edited the 2022-23 Int'l Journal of Social Robotics Special Issue on Embodied Agents for Wellbeing, the 2021-22 Frontiers in Robotics and AI Special Issue on Lifelong Learning and Long-Term Human-Robot Interaction, and the 2020-21 IEEE Transactions on Affective Computing Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data. Cambridge Consultants hosted the robotic wellbeing coaching studies in their offices in 2022 and 2023. This positively influenced their internal culture, as employees witnessed the potential of novel technology for promoting workplace wellbeing. The robotised mental wellbeing assessment study with the child-like robot allowed children from Cambridgeshire schools to experience interactions with a humanoid robot for the first time. This experience had a positive impact on both the children and their parents, raising awareness of robots' potential for social good and serving as an implicit inspiration for young girls to pursue STEM careers. Studies stemming from the ARoEQ project garnered extensive global media attention, with over 1,200 articles and reports in Sep'22 and Mar'23, featured in renowned outlets such as BBC News, The Guardian, Daily Mail, The Telegraph, and ITV News. Live public demonstrations with robots also enjoyed significant popularity at festivals and open days over three consecutive years (2022-2024).
During Cambridge Festival'23, the Department of Computer Science and Technology had approximately 1,200 visitors. The ARoEQ research team not only conducted pre-arranged sessions but also accommodated numerous additional visitors, showcasing the robots, engaging in conversations, and inspiring young individuals about potential careers in computer science. The World Health Organization (WHO) estimates that mental health disorders cost roughly US$1 trillion annually, accounting for approximately 2% of global healthcare spending. In pursuit of its goal to offer high-quality, accessible mental health care to an additional 100 million people, the WHO advocates for initiatives that expand mental health services to a wider population. This project serves as a proof-of-concept, demonstrating the feasibility of providing schools and workplaces with robotic coaches to assess and enhance mental wellbeing in real-world settings. This development holds promise for reducing the economic and healthcare burden associated with mental health by offering more affordable and accessible healthcare solutions for both children and adults. The above-mentioned efforts of the ARoEQ researchers were recognized with a Runner-up for the Collaboration Award at the Cambridge Vice-Chancellor's Awards for Research Impact and Engagement 2024. Organised by the University of Cambridge Public Engagement and Impact team, these awards recognise outstanding achievement, innovation and creativity in devising and implementing ambitious engagement and impact plans which have the potential to create significant economic, social and cultural impact from, and engagement with, research. |
First Year Of Impact | 2022 |
Sector | Digital/Communication/Information Technologies (including Software), Education, Financial Services and Management Consultancy, Healthcare |
Impact Types | Cultural, Societal, Economic, Policy & public services |
Description | New Theme on Reproducibility at The 2020 ACM/IEEE International Conference on Human-Robot Interaction |
Geographic Reach | Multiple continents/international |
Policy Influence Type | Influenced training of practitioners or researchers |
Description | meeting with a Senior Policy Advisor |
Geographic Reach | National |
Policy Influence Type | Influenced training of practitioners or researchers |
URL | https://www.csap.cam.ac.uk/network/hatice-gunes/ |
Description | Alan Turing Faculty Fellowship |
Amount | £8,184 (GBP) |
Organisation | Alan Turing Institute |
Sector | Academic/University |
Country | United Kingdom |
Start | 03/2019 |
End | 04/2021 |
Description | Robot Coach for Promoting Mental Wellbeing in Workplaces |
Amount | £23,467 (GBP) |
Organisation | University of Cambridge |
Sector | Academic/University |
Country | United Kingdom |
Start | 03/2022 |
End | 07/2022 |
Description | W.D Armstrong Trust Fund PhD Studentship in the Application of Engineering in Medicine |
Amount | £134,709 (GBP) |
Organisation | University of Cambridge |
Sector | Academic/University |
Country | United Kingdom |
Start | 09/2020 |
End | 09/2023 |
Title | The MANNERS Dataset |
Description | The MANNERS Dataset constitutes simulated robot actions in visual domestic scenes with different social configurations. To control yet vary the configurations of the scenes and the social settings, MANNERS was created using a simulation environment, uniformly sampling relevant contextual attributes. The robot actions in each scene have been annotated by multiple humans with social appropriateness levels via an annotation window. The dataset is available for research purposes - access to all images, annotation labels and Unity files - upon filling in the Access Request Form provided on the dataset website. |
Type Of Material | Database/Collection of data |
Year Produced | 2022 |
Provided To Others? | Yes |
Impact | Not yet |
URL | https://github.com/jonastjoms/MANNERS-DB |
Title | VITA system |
Description | VITA is a novel multi-modal LLM-based system that allows robotic coaches to autonomously adapt to the coachee's multi-modal behaviours (facial valence and speech duration) and deliver coaching exercises in order to promote mental well-being in adults. |
Type Of Material | Computer model/algorithm |
Year Produced | 2024 |
Provided To Others? | Yes |
Impact | N/A |
URL | https://github.com/Cambridge-AFAR/VITA-system |
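As a rough illustration of the behaviour-driven adaptation the VITA description refers to, the sketch below maps a coachee's facial valence and speech duration to a next coaching move. The `adapt_pacing` function and its thresholds are hypothetical assumptions for illustration only, not the actual VITA logic (which is LLM-based).

```python
def adapt_pacing(facial_valence: float, speech_duration_s: float) -> str:
    """Hypothetical adaptation rule: choose the coach's next move from the
    coachee's multimodal behaviour. facial_valence is in [-1, 1];
    speech_duration_s is the length of the coachee's last utterance.
    All thresholds are illustrative, not taken from the VITA system."""
    if speech_duration_s < 2.0:
        return "ask an open follow-up question"  # coachee is quiet
    if facial_valence < -0.3:
        return "offer an empathic reflection"    # negative affect detected
    return "continue the current exercise"

print(adapt_pacing(0.1, 1.0))  # short utterance -> follow-up question
```

In the real system, an LLM would generate the coaching utterance itself; the point here is only that the adaptation is conditioned on the two behavioural signals named in the description.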
Description | Affect-Driven Learning of Robot Behaviour |
Organisation | University of Hamburg |
Country | Germany |
Sector | Academic/University |
PI Contribution | Collaborating on a journal paper entitled 'Affect-Driven Learning of Robot Behaviour for Collaborative Human-Robot Interactions'. |
Collaborator Contribution | Collaborating on a journal paper entitled 'Affect-Driven Learning of Robot Behaviour for Collaborative Human-Robot Interactions'. |
Impact | A journal article titled 'Affect-Driven Learning of Robot Behaviour for Collaborative Human-Robot Interactions' published in Frontiers Robotics AI 9: 717193 (2022). |
Start Year | 2019 |
Description | Cambridge / UCL Neuroscience collaboration |
Organisation | University of Cambridge |
Department | Cambridge Neuroscience |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Created a Facial Electromyography-based Adaptive Virtual Reality Gaming for Cognitive Training and tested it on older adults. |
Collaborator Contribution | Dr Dennis Chan from Cambridge Neuroscience (now with UCL Neuroscience) provided: - expertise for the design of the memory tasks to be used for cognitive training - assistance in recruiting the older adults for the intervention evaluation - assistance in administering a battery of standardised cognitive tests to the participants |
Impact | A conference paper presented: Lorcan Reidy, Dennis Chan, Charles Nduka, Hatice Gunes: Facial Electromyography-based Adaptive Virtual Reality Gaming for Cognitive Training. ICMI 2020: 174-183 |
Start Year | 2019 |
Description | Cambridge Consultants Inc. Company |
Organisation | Cambridge Consultants |
Country | United Kingdom |
Sector | Private |
PI Contribution | We were awarded an EPSRC IAA Early Researcher grant in 2022 for a project entitled "Robot Coach for Promoting Mental Wellbeing in Workplaces", through which we established a collaboration with Cambridge Consultants. This research project aims to help promote mental wellbeing via positive psychology interventions delivered by robot coaches in workplaces. With the funding, we purchased two humanoid robots (QT robot and Misty II) - to autonomously interact with employees in their offices - and the recording equipment (e.g., video camera) to collect a dataset of these interactions. |
Collaborator Contribution | The collaboration with this company allowed this research to go beyond the lab settings and investigate its impacts in the real world. |
Impact | Award: Runner-up for the Collaboration Award at the Cambridge Vice-Chancellor's Awards for Research Impact and Engagement. Publications: - Minja Axelsson, Micol Spitale, and Hatice Gunes. '"Oh, Sorry, I Think I Interrupted You": Designing Repair Strategies for Robotic Longitudinal Well-being Coaching.' Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 2024. - Micol Spitale, Minja Axelsson, and Hatice Gunes. "Robotic Mental Well-being Coaches for the Workplace: An In-the-Wild Study on Form." Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 2023. - Minja Axelsson, Micol Spitale, and Hatice Gunes. "Adaptive Robotic Mental Well-being Coaches." Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 2023. |
Start Year | 2022 |
Description | Cambridge Department of Psychiatry collaboration |
Organisation | University of Cambridge |
Department | Department of Psychiatry |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | We applied for the University of Cambridge W.D Armstrong Trust Fund PhD Studentship in the Application of Engineering in Medicine (£134,709; 2020-2023) together with Professor Peter B Jones, and this was awarded. Since Oct 2020, a PhD student and other team members have been collaborating on child human-robot interaction studies aiming to help improve and monitor mental health and wellbeing in children. |
Collaborator Contribution | Prof Peter B Jones and Prof Tamsin Ford and their team have been advising us on the design and data analysis aspects of the child human-robot interaction studies my team is undertaking in the Department of Computer Science and Technology. |
Impact | We are currently preparing conference papers as outputs of the study conducted to date. |
Start Year | 2020 |
Description | Developing trustworthy systems |
Organisation | European Commission |
Department | Joint Research Centre (JRC) |
Country | European Union (EU) |
Sector | Public |
PI Contribution | Collaborated on a scientific report titled 'The landscape of facial processing applications in the context of the European AI Act and the development of trustworthy systems'. |
Collaborator Contribution | Collaborated on a scientific report titled 'The landscape of facial processing applications in the context of the European AI Act and the development of trustworthy systems'. |
Impact | Published a paper in Scientific Reports (Nature Portfolio) - volume 12, Article number 10688 (2022) |
Start Year | 2021 |
Description | Emteq collaboration |
Organisation | Emteq Limited |
Country | United Kingdom |
Sector | Private |
PI Contribution | Created a Facial Electromyography-based Adaptive Virtual Reality Gaming for Cognitive Training and tested it on older adults. |
Collaborator Contribution | Emteq and Charles Nduka provided: - the new Faceteq with Facial Electromyography sensors attached used within a VR headset for sensing user facial upper muscle movements. - tech support for using the relevant libraries and code |
Impact | A conference paper presented: Lorcan Reidy, Dennis Chan, Charles Nduka, Hatice Gunes: Facial Electromyography-based Adaptive Virtual Reality Gaming for Cognitive Training. ICMI 2020: 174-183 |
Start Year | 2019 |
Description | Hosting Journal Special Issue on "Lifelong Learning and Long-Term Human-Robot Interaction" |
Organisation | Middle East Technical University |
Country | Turkey |
Sector | Academic/University |
PI Contribution | Co-initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Collaborator Contribution | Co-initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Impact | A Special Issue / Collection of articles on the topic of lifelong learning and long-term human-robot interaction, submitted by renowned research groups from across the world. |
Start Year | 2020 |
Description | Hosting Journal Special Issue on "Lifelong Learning and Long-Term Human-Robot Interaction" |
Organisation | Royal Institute of Technology |
Country | Sweden |
Sector | Academic/University |
PI Contribution | Co-initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Collaborator Contribution | Co-initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Impact | A Special Issue / Collection of articles on the topic of lifelong learning and long-term human-robot interaction, submitted by renowned research groups from across the world. |
Start Year | 2020 |
Description | Hosting Journal Special Issue on "Lifelong Learning and Long-Term Human-Robot Interaction" |
Organisation | University of Hamburg |
Country | Germany |
Sector | Academic/University |
PI Contribution | Co-initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Collaborator Contribution | Co-initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Impact | A Special Issue / Collection of articles on lifelong learning and long-term human-robot interaction, with submissions from renowned research groups across the world. |
Start Year | 2020 |
Description | Hosting Special Issue entitled "Embodied Agents for Well-being" |
Organisation | Eindhoven University of Technology |
Country | Netherlands |
Sector | Academic/University |
PI Contribution | We lead the hosting of the special issue entitled "Embodied Agents for Well-being" in the International Journal of Social Robotics. |
Collaborator Contribution | The partners collaborate with us and co-host the special issue (e.g., assigning reviewers and making decisions on the papers submitted to the special issue). |
Impact | The outcome will be manuscripts published in the special issue of the journal. |
Start Year | 2022 |
Description | Hosting Special Issue entitled "Embodied Agents for Well-being" |
Organisation | Uppsala University |
Country | Sweden |
Sector | Academic/University |
PI Contribution | We lead the hosting of the special issue entitled "Embodied Agents for Well-being" in the International Journal of Social Robotics. |
Collaborator Contribution | The partners collaborate with us and co-host the special issue (e.g., assigning reviewers and making decisions on the papers submitted to the special issue). |
Impact | The outcome will be manuscripts published in the special issue of the journal. |
Start Year | 2022 |
Description | Inference of Engagement in Human-Machine Interaction |
Organisation | Institute for Intelligent Systems and Robotics |
Country | France |
Sector | Public |
PI Contribution | A survey article titled "Automatic Context-Driven Inference of Engagement in Human-Machine Interaction", produced in collaboration with the Center of AI & Robotics (CAIR), SMART Lab, New York University Abu Dhabi; the Department of Engineering, King's College London, United Kingdom; and the Institute of Intelligent Systems and Robotics, Sorbonne University, Paris, France. |
Collaborator Contribution | A survey article titled "Automatic Context-Driven Inference of Engagement in Human-Machine Interaction", produced in collaboration with the Center of AI & Robotics (CAIR), SMART Lab, New York University Abu Dhabi; the Department of Engineering, King's College London, United Kingdom; and the Institute of Intelligent Systems and Robotics, Sorbonne University, Paris, France. |
Impact | A survey paper submitted to the IEEE Transactions on Affective Computing. |
Start Year | 2022 |
Description | Inference of Engagement in Human-Machine Interaction |
Organisation | New York University Abu Dhabi |
Country | United Arab Emirates |
Sector | Academic/University |
PI Contribution | A survey article titled "Automatic Context-Driven Inference of Engagement in Human-Machine Interaction", produced in collaboration with the Center of AI & Robotics (CAIR), SMART Lab, New York University Abu Dhabi; the Department of Engineering, King's College London, United Kingdom; and the Institute of Intelligent Systems and Robotics, Sorbonne University, Paris, France. |
Collaborator Contribution | A survey article titled "Automatic Context-Driven Inference of Engagement in Human-Machine Interaction", produced in collaboration with the Center of AI & Robotics (CAIR), SMART Lab, New York University Abu Dhabi; the Department of Engineering, King's College London, United Kingdom; and the Institute of Intelligent Systems and Robotics, Sorbonne University, Paris, France. |
Impact | A survey paper submitted to the IEEE Transactions on Affective Computing. |
Start Year | 2022 |
Description | LEAP-HRI Workshop |
Organisation | Massachusetts Institute of Technology |
Country | United States |
Sector | Academic/University |
PI Contribution | Co-organised the Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) Workshop for three consecutive years, in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction. Contributed to the organisation, workshop ideas and structure, paper reviews, and paper content and writing. |
Collaborator Contribution | Contributed to the organisation, workshop ideas and structure, paper reviews and paper content and writing. |
Impact | Papers: - Bahar Irfan, Aditi Ramachandran, Mariacarla Staffa, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI): Adaptivity for All. HRI (Companion) 2023: 929-931 - Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, German Ignacio Parisi, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI). HRI 2022: 1261-1264 - Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, Sinan Kalkan, German Ignacio Parisi, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI). HRI (Companion) 2021: 724-727 Workshop recordings: https://leap-hri.github.io/2021/ https://leap-hri.github.io/2022/ https://leap-hri.github.io/2023/ |
Start Year | 2021 |
Description | LEAP-HRI Workshop |
Organisation | Parthenope University of Naples |
Country | Italy |
Sector | Academic/University |
PI Contribution | Co-organised the Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) Workshop for three consecutive years, in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction. Contributed to the organisation, workshop ideas and structure, paper reviews, and paper content and writing. |
Collaborator Contribution | Contributed to the organisation, workshop ideas and structure, paper reviews and paper content and writing. |
Impact | Papers: - Bahar Irfan, Aditi Ramachandran, Mariacarla Staffa, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI): Adaptivity for All. HRI (Companion) 2023: 929-931 - Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, German Ignacio Parisi, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI). HRI 2022: 1261-1264 - Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, Sinan Kalkan, German Ignacio Parisi, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI). HRI (Companion) 2021: 724-727 Workshop recordings: https://leap-hri.github.io/2021/ https://leap-hri.github.io/2022/ https://leap-hri.github.io/2023/ |
Start Year | 2021 |
Description | LEAP-HRI Workshop |
Organisation | Royal Institute of Technology |
Country | Sweden |
Sector | Academic/University |
PI Contribution | Co-organised the Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) Workshop for three consecutive years, in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction. Contributed to the organisation, workshop ideas and structure, paper reviews, and paper content and writing. |
Collaborator Contribution | Contributed to the organisation, workshop ideas and structure, paper reviews and paper content and writing. |
Impact | Papers: - Bahar Irfan, Aditi Ramachandran, Mariacarla Staffa, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI): Adaptivity for All. HRI (Companion) 2023: 929-931 - Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, German Ignacio Parisi, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI). HRI 2022: 1261-1264 - Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, Sinan Kalkan, German Ignacio Parisi, Hatice Gunes: Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI). HRI (Companion) 2021: 724-727 Workshop recordings: https://leap-hri.github.io/2021/ https://leap-hri.github.io/2022/ https://leap-hri.github.io/2023/ |
Start Year | 2021 |
Description | METU Kovan Lab collaboration |
Organisation | Middle East Technical University |
Department | Department of Computer Engineering |
Country | Turkey |
Sector | Academic/University |
PI Contribution | Provided: - expertise in HRI, facial affect analysis, human nonverbal behaviour understanding - physical space for a visiting academic - PhD and MPhil students - support and expertise in research proposals |
Collaborator Contribution | Provided: - expertise in deep learning - time and effort for co-supervising two PhD and one MPhil students - support and expertise in research proposals |
Impact | This collaboration is multi-disciplinary, involving human-robot interaction and machine learning. Scientific papers: Nikhil Churamani, Sinan Kalkan, Hatice Gunes: Continual Learning for Affective Robotics: Why, What and How? RO-MAN 2020: 425-431 Tian Xu, Jennifer White, Sinan Kalkan, Hatice Gunes: Investigating Bias and Fairness in Facial Expression Recognition. ECCV Workshops (6) 2020: 506-523 Nikhil Churamani, Sinan Kalkan, Hatice Gunes: Spatio-Temporal Analysis of Facial Actions using Lifecycle-Aware Capsule Networks. CoRR abs/2011.08819 (2020) Media coverage: Researchers find evidence of bias in facial expression data sets https://venturebeat.com/2020/07/24/researchers-find-evidence-of-bias-in-facial-expression-data-sets/ Collaborative project funded by TUBITAK 1001 (Project no 120E269): KALFA project - New Methods for Assembly Scenarios with Collaborative Robots |
Start Year | 2020 |
Description | Modeling the Social and Normative Context of Human-Agent Interactions |
Organisation | Colorado School of Mines |
Country | United States |
Sector | Academic/University |
PI Contribution | Collaborated on the writing of a book chapter titled 'Modeling the Social and Normative Context of Human-Agent Interactions'. |
Collaborator Contribution | Collaborated on the writing of a book chapter titled 'Modeling the Social and Normative Context of Human-Agent Interactions'. |
Impact | A collaborative book chapter titled 'Modeling the Social and Normative Context of Human-Agent Interactions' that will be published as part of the Human-Centered Machine Learning Book. |
Start Year | 2021 |
Description | Modeling the Social and Normative Context of Human-Agent Interactions |
Organisation | Yale University |
Country | United States |
Sector | Academic/University |
PI Contribution | Collaborated on the writing of a book chapter titled 'Modeling the Social and Normative Context of Human-Agent Interactions'. |
Collaborator Contribution | Collaborated on the writing of a book chapter titled 'Modeling the Social and Normative Context of Human-Agent Interactions'. |
Impact | A collaborative book chapter titled 'Modeling the Social and Normative Context of Human-Agent Interactions' that will be published as part of the Human-Centered Machine Learning Book. |
Start Year | 2021 |
Description | NHS collaboration |
Organisation | Cambridge University Hospitals NHS Foundation Trust |
Country | United Kingdom |
Sector | Public |
PI Contribution | Established an in-lab human-robot interaction study setting with a humanoid robot for conducting longitudinal positive psychology therapy sessions. Exploring therapy sessions with a robot is a novel approach for NHS practitioners. Both the coach and the participants were recorded with multiple sensors over the course of 4 weeks. The sessions were repeated with another cohort of participants for another 4 weeks. We had planned to continue in January/February 2021, but this was not possible due to the COVID-19 lockdown. Iterative studies were conducted in the period of 2021-2023, both in-lab and in real-world settings. |
Collaborator Contribution | The NHS practitioner recruited as part of this project provided guidance on the structure of the one-to-one sessions, their content, and the measures and questionnaires to be used. In November-December 2020, the NHS practitioner administered coaching in one-to-one sessions, which were recorded with multiple sensors. This data was used to create several iterations of robotic well-being coaches, and the NHS practitioner provided guidance on the subsequent study designs of the robotic well-being coaches. |
Impact | Scientific publications |
Start Year | 2019 |
Description | REACT Challenge Organisation |
Organisation | Monash University |
Country | Australia |
Sector | Academic/University |
PI Contribution | Human behavioural responses are stimulated by their environment (or context): people inductively process a stimulus and modify their interactions to produce an appropriate response. The same stimulus can trigger different facial reactions not only across different subjects, but also within the same subject under different contexts. The Multimodal Multiple Appropriate Facial Reaction Generation Challenge (REACT 2023) was organised as a satellite event of the ACM Multimedia Conference 2023 (Ottawa, Canada, October 2023) and aimed to compare multimedia processing and machine learning methods for automatic human facial reaction generation under different dyadic interaction scenarios. The goal of the Challenge was to provide the first benchmark test set for multimodal information processing, and to bring together the audio, visual and audio-visual affective computing communities to compare the relative merits of approaches to automatic appropriate facial reaction generation under well-defined conditions. The postdoctoral researcher funded by this fellowship and the project leader co-organised the Challenge; the PDRA was one of the main organisers and was responsible for designing the challenge task, executing the challenge, registering the teams, and writing the challenge paper. |
Collaborator Contribution | The various partners provided data and labels, trained and evaluated baseline models, and contributed to the challenge paper writing. |
Impact | Conference Paper: Siyang Song, Micol Spitale, Cheng Luo, Germán Barquero, Cristina Palmero, Sergio Escalera, Michel F. Valstar, Tobias Baur, Fabien Ringeval, Elisabeth André, Hatice Gunes: REACT2023: The First Multiple Appropriate Facial Reaction Generation Challenge. ACM Multimedia 2023: 9620-9624 |
Start Year | 2023 |
Description | REACT Challenge Organisation |
Organisation | University of Augsburg |
Country | Germany |
Sector | Academic/University |
PI Contribution | Human behavioural responses are stimulated by their environment (or context): people inductively process a stimulus and modify their interactions to produce an appropriate response. The same stimulus can trigger different facial reactions not only across different subjects, but also within the same subject under different contexts. The Multimodal Multiple Appropriate Facial Reaction Generation Challenge (REACT 2023) was organised as a satellite event of the ACM Multimedia Conference 2023 (Ottawa, Canada, October 2023) and aimed to compare multimedia processing and machine learning methods for automatic human facial reaction generation under different dyadic interaction scenarios. The goal of the Challenge was to provide the first benchmark test set for multimodal information processing, and to bring together the audio, visual and audio-visual affective computing communities to compare the relative merits of approaches to automatic appropriate facial reaction generation under well-defined conditions. The postdoctoral researcher funded by this fellowship and the project leader co-organised the Challenge; the PDRA was one of the main organisers and was responsible for designing the challenge task, executing the challenge, registering the teams, and writing the challenge paper. |
Collaborator Contribution | The various partners provided data and labels, trained and evaluated baseline models, and contributed to the challenge paper writing. |
Impact | Conference Paper: Siyang Song, Micol Spitale, Cheng Luo, Germán Barquero, Cristina Palmero, Sergio Escalera, Michel F. Valstar, Tobias Baur, Fabien Ringeval, Elisabeth André, Hatice Gunes: REACT2023: The First Multiple Appropriate Facial Reaction Generation Challenge. ACM Multimedia 2023: 9620-9624 |
Start Year | 2023 |
Description | REACT Challenge Organisation |
Organisation | University of Barcelona |
Country | Spain |
Sector | Academic/University |
PI Contribution | Human behavioural responses are stimulated by their environment (or context): people inductively process a stimulus and modify their interactions to produce an appropriate response. The same stimulus can trigger different facial reactions not only across different subjects, but also within the same subject under different contexts. The Multimodal Multiple Appropriate Facial Reaction Generation Challenge (REACT 2023) was organised as a satellite event of the ACM Multimedia Conference 2023 (Ottawa, Canada, October 2023) and aimed to compare multimedia processing and machine learning methods for automatic human facial reaction generation under different dyadic interaction scenarios. The goal of the Challenge was to provide the first benchmark test set for multimodal information processing, and to bring together the audio, visual and audio-visual affective computing communities to compare the relative merits of approaches to automatic appropriate facial reaction generation under well-defined conditions. The postdoctoral researcher funded by this fellowship and the project leader co-organised the Challenge; the PDRA was one of the main organisers and was responsible for designing the challenge task, executing the challenge, registering the teams, and writing the challenge paper. |
Collaborator Contribution | The various partners provided data and labels, trained and evaluated baseline models, and contributed to the challenge paper writing. |
Impact | Conference Paper: Siyang Song, Micol Spitale, Cheng Luo, Germán Barquero, Cristina Palmero, Sergio Escalera, Michel F. Valstar, Tobias Baur, Fabien Ringeval, Elisabeth André, Hatice Gunes: REACT2023: The First Multiple Appropriate Facial Reaction Generation Challenge. ACM Multimedia 2023: 9620-9624 |
Start Year | 2023 |
Description | REACT Challenge Organisation |
Organisation | University of Grenoble |
Country | France |
Sector | Academic/University |
PI Contribution | Human behavioural responses are stimulated by their environment (or context): people inductively process a stimulus and modify their interactions to produce an appropriate response. The same stimulus can trigger different facial reactions not only across different subjects, but also within the same subject under different contexts. The Multimodal Multiple Appropriate Facial Reaction Generation Challenge (REACT 2023) was organised as a satellite event of the ACM Multimedia Conference 2023 (Ottawa, Canada, October 2023) and aimed to compare multimedia processing and machine learning methods for automatic human facial reaction generation under different dyadic interaction scenarios. The goal of the Challenge was to provide the first benchmark test set for multimodal information processing, and to bring together the audio, visual and audio-visual affective computing communities to compare the relative merits of approaches to automatic appropriate facial reaction generation under well-defined conditions. The postdoctoral researcher funded by this fellowship and the project leader co-organised the Challenge; the PDRA was one of the main organisers and was responsible for designing the challenge task, executing the challenge, registering the teams, and writing the challenge paper. |
Collaborator Contribution | The various partners provided data and labels, trained and evaluated baseline models, and contributed to the challenge paper writing. |
Impact | Conference Paper: Siyang Song, Micol Spitale, Cheng Luo, Germán Barquero, Cristina Palmero, Sergio Escalera, Michel F. Valstar, Tobias Baur, Fabien Ringeval, Elisabeth André, Hatice Gunes: REACT2023: The First Multiple Appropriate Facial Reaction Generation Challenge. ACM Multimedia 2023: 9620-9624 |
Start Year | 2023 |
Description | REACT Challenge Organisation |
Organisation | University of Leicester |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Human behavioural responses are stimulated by their environment (or context): people inductively process a stimulus and modify their interactions to produce an appropriate response. The same stimulus can trigger different facial reactions not only across different subjects, but also within the same subject under different contexts. The Multimodal Multiple Appropriate Facial Reaction Generation Challenge (REACT 2023) was organised as a satellite event of the ACM Multimedia Conference 2023 (Ottawa, Canada, October 2023) and aimed to compare multimedia processing and machine learning methods for automatic human facial reaction generation under different dyadic interaction scenarios. The goal of the Challenge was to provide the first benchmark test set for multimodal information processing, and to bring together the audio, visual and audio-visual affective computing communities to compare the relative merits of approaches to automatic appropriate facial reaction generation under well-defined conditions. The postdoctoral researcher funded by this fellowship and the project leader co-organised the Challenge; the PDRA was one of the main organisers and was responsible for designing the challenge task, executing the challenge, registering the teams, and writing the challenge paper. |
Collaborator Contribution | The various partners provided data and labels, trained and evaluated baseline models, and contributed to the challenge paper writing. |
Impact | Conference Paper: Siyang Song, Micol Spitale, Cheng Luo, Germán Barquero, Cristina Palmero, Sergio Escalera, Michel F. Valstar, Tobias Baur, Fabien Ringeval, Elisabeth André, Hatice Gunes: REACT2023: The First Multiple Appropriate Facial Reaction Generation Challenge. ACM Multimedia 2023: 9620-9624 |
Start Year | 2023 |
Description | REACT Challenge Organisation |
Organisation | University of Nottingham |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Human behavioural responses are stimulated by their environment (or context): people inductively process a stimulus and modify their interactions to produce an appropriate response. The same stimulus can trigger different facial reactions not only across different subjects, but also within the same subject under different contexts. The Multimodal Multiple Appropriate Facial Reaction Generation Challenge (REACT 2023) was organised as a satellite event of the ACM Multimedia Conference 2023 (Ottawa, Canada, October 2023) and aimed to compare multimedia processing and machine learning methods for automatic human facial reaction generation under different dyadic interaction scenarios. The goal of the Challenge was to provide the first benchmark test set for multimodal information processing, and to bring together the audio, visual and audio-visual affective computing communities to compare the relative merits of approaches to automatic appropriate facial reaction generation under well-defined conditions. The postdoctoral researcher funded by this fellowship and the project leader co-organised the Challenge; the PDRA was one of the main organisers and was responsible for designing the challenge task, executing the challenge, registering the teams, and writing the challenge paper. |
Collaborator Contribution | The various partners provided data and labels, trained and evaluated baseline models, and contributed to the challenge paper writing. |
Impact | Conference Paper: Siyang Song, Micol Spitale, Cheng Luo, Germán Barquero, Cristina Palmero, Sergio Escalera, Michel F. Valstar, Tobias Baur, Fabien Ringeval, Elisabeth André, Hatice Gunes: REACT2023: The First Multiple Appropriate Facial Reaction Generation Challenge. ACM Multimedia 2023: 9620-9624 |
Start Year | 2023 |
Description | Reproducibility in Child-Robot Interaction |
Organisation | Cornell University |
Country | United States |
Sector | Academic/University |
PI Contribution | Research reproducibility (i.e., rerunning analyses on the original data to replicate the results) is paramount for guaranteeing scientific validity. However, reproducibility is often very challenging, especially in research fields that involve multi-disciplinary teams, such as child-robot interaction (CRI). This paper presents a systematic review of the last three years (2020-2022) of research in CRI under the lens of reproducibility, analysing the field for transparency in reporting. Across a total of 325 studies, we found deficiencies in reporting demographics (e.g. age of participants), study design and implementation (e.g. length of interactions), and open data (e.g. maintaining an active code repository). From this analysis, we distil a set of guidelines and provide a checklist for systematically reporting CRI studies, to guide research towards improved reproducibility in CRI and beyond. The PDRA of this fellowship project (MC) initiated the collaboration and led the effort of undertaking and writing up this review paper. The PhD student (NA) contributed to the review and the paper writing. The project leader (HG) contributed to the paper writing. |
Collaborator Contribution | The collaborators contributed to undertaking the systematic review and writing the paper. |
Impact | Scientific paper currently on arXiv: Micol Spitale, Rebecca Stower, Elmira Yadollahi, Maria Teresa Parreira, Nida Itrat Abbasi, Iolanda Leite, Hatice Gunes: A Systematic Review on Reproducibility in Child-Robot Interaction. CoRR abs/2309.01822 (2023) |
Start Year | 2022 |
Description | Reproducibility in Child-Robot Interaction |
Organisation | Royal Institute of Technology |
Country | Sweden |
Sector | Academic/University |
PI Contribution | Research reproducibility (i.e., rerunning analyses on the original data to replicate the results) is paramount for guaranteeing scientific validity. However, reproducibility is often very challenging, especially in research fields that involve multi-disciplinary teams, such as child-robot interaction (CRI). This paper presents a systematic review of the last three years (2020-2022) of research in CRI under the lens of reproducibility, analysing the field for transparency in reporting. Across a total of 325 studies, we found deficiencies in reporting demographics (e.g. age of participants), study design and implementation (e.g. length of interactions), and open data (e.g. maintaining an active code repository). From this analysis, we distil a set of guidelines and provide a checklist for systematically reporting CRI studies, to guide research towards improved reproducibility in CRI and beyond. The PDRA of this fellowship project (MC) initiated the collaboration and led the effort of undertaking and writing up this review paper. The PhD student (NA) contributed to the review and the paper writing. The project leader (HG) contributed to the paper writing. |
Collaborator Contribution | The collaborators contributed to undertaking the systematic review and writing the paper. |
Impact | Scientific paper currently on arXiv: Micol Spitale, Rebecca Stower, Elmira Yadollahi, Maria Teresa Parreira, Nida Itrat Abbasi, Iolanda Leite, Hatice Gunes: A Systematic Review on Reproducibility in Child-Robot Interaction. CoRR abs/2309.01822 (2023) |
Start Year | 2022 |
Description | Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data |
Organisation | Massachusetts Institute of Technology |
Country | United States |
Sector | Academic/University |
PI Contribution | Contributed to the Special Issue proposal, advertised the special issue, and handled submitted papers |
Collaborator Contribution | Initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Impact | IEEE Transactions on Affective Computing Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data |
Start Year | 2019 |
Description | Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data |
Organisation | University of Hamburg |
Country | Germany |
Sector | Academic/University |
PI Contribution | Contributed to the Special Issue proposal, advertised the special issue, and handled submitted papers |
Collaborator Contribution | Initiated the collaboration and Special Issue proposal, advertised the special issue, and handled submitted papers |
Impact | IEEE Transactions on Affective Computing Special Issue on Automated Perception of Human Affect from Longitudinal Behavioural Data |
Start Year | 2019 |
Title | An Open-source Benchmark of Deep Learning Models for Audio-visual Apparent and Self-reported Personality Recognition |
Description | A GitHub code repository for An Open-source Benchmark of Deep Learning Models for Audio-visual Apparent and Self-reported Personality Recognition (https://www.computer.org/csdl/journal/ta/5555/01/10428080/1UmXkjfL4hW). Seven visual models, six audio models and five audio-visual models have been reproduced and evaluated. In addition, seven widely used visual deep learning models that had not previously been applied to video-based personality computing are also included in the benchmark. A detailed description can be found in the paper. All benchmarked models are evaluated on the ChaLearn First Impression dataset and the ChaLearn UDIVA self-reported personality dataset. |
Type Of Technology | Software |
Year Produced | 2022 |
Open Source License? | Yes |
Impact | International research groups have re-used the code and the benchmark, citing the scientific paper. The code repository contributes to open science and reproducibility efforts. |
URL | https://www.computer.org/csdl/journal/ta/5555/01/10428080/1UmXkjfL4hW |
Title | CLIFER |
Description | A Keras/TensorFlow implementation of the Continual Learning with Imagination for Facial Expression Recognition paper published at the 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020). The paper proposes a novel framework for Continual Learning (CL)-based personalised Facial Expression Recognition. |
Type Of Technology | Software |
Year Produced | 2023 |
Open Source License? | Yes |
Impact | The open source code contributes to open science and reproducibility efforts. The work is now well-known and cited by international research groups; and has been requested and used by other research groups around the world. |
URL | https://ieeexplore.ieee.org/document/9320226 |
Title | Domain-Incremental CL for Mitigating Bias |
Description | A PyTorch implementation of the Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition paper published in the IEEE Transactions on Affective Computing in 2023. Bias has been identified as a critical problem in facial analysis systems, and different methods have been proposed to mitigate it at both the data and algorithmic levels. The paper proposes the novel use of Continual Learning (CL), in particular Domain-Incremental Learning (Domain-IL) settings, as a potent bias mitigation method to enhance the fairness of Facial Expression Recognition (FER) systems, and presents benchmarking results comparing Continual Learning against state-of-the-art bias mitigation methods for facial affect analysis. Code for the individual benchmark experiments with the BP4D and RAF-DB datasets can be found in the respective folders. |
Type Of Technology | Software |
Year Produced | 2022 |
Impact | The open source code contributes to open science and reproducibility efforts. The work is now well-known and cited by international research groups. |
URL | https://ieeexplore.ieee.org/document/9792455 |
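The Domain-Incremental setting described above can be illustrated with a minimal, hypothetical Python sketch (not the released PyTorch code; class and method names are ours). The key property is that the label space stays fixed while training data arrives one demographic "domain" at a time, and the learner updates incrementally so earlier domains are not simply overwritten; a running-mean centroid classifier stands in for the deep model:

```python
class DomainIncrementalCentroids:
    """Toy domain-incremental learner: the label space is fixed, but
    data arrives one demographic 'domain' at a time. Class centroids
    are updated with running means, so evidence from earlier domains
    is retained rather than overwritten by the newest domain."""

    def __init__(self):
        self.centroids = {}  # label -> (running mean vector, sample count)

    def update(self, domain_data):
        # domain_data: list of (feature_vector, label) pairs for one domain
        for x, y in domain_data:
            if y not in self.centroids:
                self.centroids[y] = (list(x), 1)
            else:
                mean, n = self.centroids[y]
                n += 1
                # incremental (running) mean update, dimension by dimension
                mean = [m + (xi - m) / n for m, xi in zip(mean, x)]
                self.centroids[y] = (mean, n)

    def predict(self, x):
        # nearest-centroid classification (squared Euclidean distance)
        def dist(mean):
            return sum((xi - mi) ** 2 for xi, mi in zip(x, mean))
        return min(self.centroids, key=lambda y: dist(self.centroids[y][0]))
```

Evaluating such a learner per domain after each sequential update is how one would inspect the fairness gap the paper benchmarks.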
Title | LGR |
Description | The GitHub repository includes the code for the experiments presented in "Latent Generative Replay for Resource-Efficient Continual Learning of Facial Expressions" (https://www.repository.cam.ac.uk/handle/1810/342122), presented at FG 2023. The implementation was adapted from the original Continual Learning repository by van de Ven et al. (available on GitHub). Model performance was benchmarked by adapting implementations from the Avalanche project. |
Type Of Technology | Software |
Year Produced | 2022 |
Open Source License? | Yes |
Impact | The proposed novel Latent Generative Replay (LGR) for pseudo-rehearsal of low-dimensional latent features helps mitigate forgetting in a resource-efficient manner. We adapt popular CL strategies to use LGR instead of generating pseudo-samples, resulting in improved performance: LGR significantly reduces the memory and resource consumption of replay-based CL without compromising model performance. The open source code contributes to open science and reproducibility efforts. Other research groups have requested and used the code. |
URL | https://ieeexplore.ieee.org/document/10042642 |
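The LGR idea can be sketched in a few lines of hypothetical Python (ours, not the repository's code): instead of replaying full-resolution pseudo-images, each past task's low-dimensional latent features are modelled with a cheap generator (here, a diagonal Gaussian as a stand-in) and sampled during rehearsal:

```python
import random

class LatentGenerativeReplay:
    """Toy sketch of Latent Generative Replay: rehearse on generated
    low-dimensional latent features rather than full pseudo-images,
    which is far cheaper in memory and compute."""

    def __init__(self):
        self.generators = {}  # task_id -> (per-dim means, per-dim stds)

    def fit_generator(self, task_id, latents):
        # Model the task's latent distribution with a diagonal Gaussian.
        dims = len(latents[0])
        n = len(latents)
        means = [sum(z[d] for z in latents) / n for d in range(dims)]
        stds = [(sum((z[d] - means[d]) ** 2 for z in latents) / n) ** 0.5
                for d in range(dims)]
        self.generators[task_id] = (means, stds)

    def sample(self, task_id, n):
        # Generate pseudo-latents for rehearsal of a past task.
        means, stds = self.generators[task_id]
        return [[random.gauss(m, s) for m, s in zip(means, stds)]
                for _ in range(n)]

def rehearsal_batch(lgr, current_latents, past_tasks, n_replay):
    # Mix the current task's latents with pseudo-latents from past
    # tasks, so the classifier head keeps seeing earlier distributions.
    batch = list(current_latents)
    for task_id in past_tasks:
        batch.extend(lgr.sample(task_id, n_replay))
    return batch
```

The design point is that only the small generator parameters per task are stored, rather than raw samples or an image-space generative model.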
Title | VITA Open-source System |
Description | VITA is a novel multi-modal LLM-based system that allows robotic coaches to autonomously adapt to the coachee's multi-modal behaviours (facial valence and speech duration) and deliver coaching exercises in order to promote mental well-being in adults. |
Type Of Technology | Software |
Year Produced | 2023 |
Open Source License? | Yes |
Impact | The VITA system was used to deliver autonomous and adaptive robotic wellbeing coaching in the wild (in a workplace setting) over 4 weeks, and participants reported significant wellbeing improvements at the end of this study. The open source code contributes to open science and reproducibility efforts. The work is now well-known and cited by international research groups; and has been requested and used by other research groups around the world. The VITA system also contributed to the Project Leader (PI), the postdoctoral researcher and the collaborators being recognized as Runners-up for the Collaboration Award at the Cambridge University Vice-Chancellor's Awards for Research Impact and Engagement 2023. |
URL | https://arxiv.org/abs/2312.09740 |
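The adaptation loop described for VITA (adapting to facial valence and speech duration) can be illustrated with a minimal, hypothetical policy sketch; the function name and thresholds below are ours for illustration, not values from the VITA system:

```python
def choose_adaptation(valence, speech_seconds,
                      low_valence=-0.2, short_speech=5.0):
    """Toy adaptation policy (hypothetical thresholds): pick the robot
    coach's next move from the coachee's facial valence (in [-1, 1])
    and how long they spoke, loosely mirroring the two multi-modal
    behaviours that VITA adapts to."""
    if valence < low_valence:
        return "empathise"          # acknowledge negative affect first
    if speech_seconds < short_speech:
        return "ask_open_question"  # encourage the coachee to elaborate
    return "continue_exercise"      # engagement looks fine; proceed
```

In the real system, an LLM rather than fixed thresholds drives this decision, but the sketch shows the shape of the sensing-to-action mapping.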
Description | 2023 interview with the University of Cambridge Communications Team |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | The PI, the postdoctoral researcher and a PhD student were interviewed by the University of Cambridge Communications Team regarding our study and scientific paper entitled 'Robotic Mental Well-being Coaches for the Workplace: An In-the-Wild Study on Form', presented at the ACM/IEEE International Conference on Human-Robot Interaction (HRI'23) in Stockholm, Sweden on 15 March 2023. The interview resulted in a research blog titled 'Robots can help improve mental wellbeing at work - as long as they look right' on the University of Cambridge website, which received more than 700 pieces of media coverage worldwide in March 2023, including The Independent, Psychology Today, Health Tech World, ITVX, and more. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.cam.ac.uk/research/news/robots-can-help-improve-mental-wellbeing-at-work-as-long-as-they... |
Description | Academic panelist for ACM/IEEE HRI 2021 Pioneers Workshop |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I was an academic panelist for ACM/IEEE HRI 2021 Pioneers Workshop. This workshop is organised in conjunction with ACM/IEEE HRI Conference and it seeks to foster creativity and collaboration surrounding key challenges in human-robot interaction and empower students early in their academic careers. Each year, the workshop brings together a cohort of the world's top student researchers and provides the opportunity for students to present and discuss their work with distinguished student peers and senior scholars in the field. I was one of the 5 academic panelists for the pioneers to interact with and learn from. |
Year(s) Of Engagement Activity | 2021 |
URL | https://hripioneers.org/archives/hri21/program_speakers.html |
Description | BBC coverage |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | Our project on developing a robotic coach that can deliver mindfulness received media recognition: it was covered by BBC Look East, and the research team was interviewed for this purpose. |
Year(s) Of Engagement Activity | 2021 |
Description | Cambridge Computer Lab Healthcare Research Showcase |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Industry/Business |
Results and Impact | The postdoctoral researcher employed on the ARoEQ project was among the early-career researchers who were invited to give a talk about their healthcare- and wellbeing-related research. Dr Indu Bodala talked about Designing Robot Coaches for Mental Wellbeing. After the talk, we received an invitation from Mindfulness After Cam (a Cambridge University Alumni Group whose aims are to promote the well-being of University of Cambridge alumni and their friends and family through mindfulness meditation) to join a panel on mindfulness attended by both the alumni group and current students in the Cambridge University Mindfulness Society. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.cst.cam.ac.uk/news/showcasing-our-healthcare-research |
Description | Co-chairing ACII 2022 WORKSHOP ON AFFECTIVE HUMAN-ROBOT INTERACTION (AHRI'22) |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The postdoc funded by this grant (SS) co-chaired and co-organised the ACII 2022 Workshop on Affective HRI (AHRI'22). |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.a-hri.me/ |
Description | Co-organizing the Workshop "Moral Imagination in Affective Computing" at IEEE ACII 2023 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | The PI was a co-organizer of the inaugural Workshop on Moral Imagination in Affective Computing at ACII 2023, together with professional practitioners from Google as well as other international academics. The values and perspectives of affective technologists often shape what capabilities are built and how they are deployed. While most researchers want to see their work drive socially beneficial impacts, these technologies are being deployed in many complex social contexts; if our assumptions and intuitions go unchallenged, they can and do lead to unintended adverse effects. Realizing socially beneficial affective technologies requires going beyond the dominant norms of computer science and engineering culture to engage meaningfully and productively with ethics and social responsibility. Using a "Moral Imagination" methodology developed at Google, the workshop provided an interactive format in which the affective computing community engaged pragmatically and constructively with this critical part of affective computing research. The workshop was held on September 10th at MIT, Cambridge, MA, USA, in conjunction with the ACII23 conference. |
Year(s) Of Engagement Activity | 2023 |
URL | https://sites.google.com/view/MoralImagination-ACII2023 |
Description | Festival of Ideas 2019 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | This was a panel discussion on What Makes Us Human in the Age of AI. The focus was on whether automation will make us redundant, or whether there are qualities that are essentially human which will become more sought after. Other panellists included Allegre Hadida, University Senior Lecturer in Strategy at the Judge Business School; author and lecturer Laura Dietz; and Stephen Cave from the Leverhulme Centre for the Future of Intelligence. The panel was chaired by Julian Clover from Cambridge 105 Radio. |
Year(s) Of Engagement Activity | 2019 |
URL | https://www.festivalofideas.cam.ac.uk/events/what-makes-us-human-age-ai |
Description | Friday Discourse at the RI |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | The PI gave an invited discourse titled 'Creating Emotionally Intelligent Technology' at the Royal Institution (London, UK) as part of their Friday Discourse series. The talk took place in London and was also broadcast around the world via the Royal Institution Youtube Channel. The recording of the talk has already reached ~29,000 viewers worldwide. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.youtube.com/watch?v=ddv91MZyLPQ |
Description | Interview India Times |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Micol Spitale and Minja Axelsson were interviewed by India Times about their work published at HRI'23: Spitale, M., Axelsson, M., & Gunes, H. (2023, March). Robotic mental well-being coaches for the workplace: An in-the-wild study on form. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (pp. 301-310). |
Year(s) Of Engagement Activity | 2023 |
Description | Interview at BBC Radio Cambridgeshire |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | Micol Spitale and Minja Axelsson were interviewed by BBC Radio Cambridgeshire about their published work: Spitale, M., Axelsson, M., & Gunes, H. (2023, March). Robotic mental well-being coaches for the workplace: An in-the-wild study on form. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (pp. 301-310). |
Year(s) Of Engagement Activity | 2023 |
Description | Interview for Blog Post |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | A blog post about our research on creating robotic wellbeing coaches was published on the University of Cambridge website in May 2021. We were informed that it has reached over 45,000 readers. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.cam.ac.uk/stories/wellbeing-robot |
Description | Interview with Medical News Today (USA) |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | The PI was interviewed by Medical News Today (USA) on how robots can be used to assist with mental wellbeing assessment in children. The media report covering this was published on September 2nd, 2022 with the title "Children more likely to disclose mental health issues to a robot, study shows". |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.medicalnewstoday.com/articles/children-more-likely-to-disclose-mental-health-issues-to-a... |
Description | Interview with The Guardian |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | The PI was interviewed by The Guardian newspaper's science correspondent on how robots can be used to assist with mental wellbeing assessment in children. This was published in The Guardian on September 1st with the title "Children more candid about mental health when talking to robot, study finds". This media report was then picked up by other media channels and resulted in more than 1,000 pieces of coverage worldwide in September 2022, including BBC News, Daily Mail, Telegraph, Sky News, ITV News, Bloomberg, Independent, Evening Standard, Metro, Scotsman, Yahoo News, New York Post and more. It was also featured in the morning news bulletins on Radio 2 and Radio 5 Live on September 1, 2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.theguardian.com/technology/2022/sep/01/children-mental-health-talking-robot-study-cambri... |
Description | Invited Talk at the "Cambridge Language Sciences" Symposium |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | Micol Spitale gave a keynote talk entitled "Taking affective robots into the real world: challenges and opportunities" at the Cambridge Language Sciences Symposium on "Language and Wellbeing". The presentation addressed the current challenges identified in the literature for the deployment of affective robots in real-world scenarios. By sharing insights from various case studies, ranging from the use of robots in the treatment of language disorders in children to utilising robotic coaches for promoting mental well-being, the talk shed light on the obstacles faced in bringing affective robots out of the lab and into practical applications. Potential solutions and future directions for advancing the field were also discussed. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.cambridge.org/engage/coe/article-details/656db4ee29a13c4d479ea657 |
Description | Invited Talk at the "Cambridge Neuroscience Talks" |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I gave a talk entitled "Children-Agent Interaction For Assessment and Rehabilitation: From Linguistic Skills To Mental Well-being" at the Cambridge Neuroscience Talks series on 07/02/2023. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.youtube.com/watch?v=ii4wdadGudw |
Description | Invited Talk at the "Children and AI: risk, opportunities and the future" from University of Cambridge |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I gave an invited talk entitled "Conversational Agents For Children with Language Impairments: Design and Technology" at the Children and AI workshop organized by the University of Cambridge on 25/04/2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.crassh.cam.ac.uk/events/33399/ |
Description | Invited Talk at the "Talking Robotics" series |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I gave a talk entitled "Human-Robot Interaction For Promoting and Assessing Mental Well-being" at the Talking Robotics series on 08/02/2023. |
Year(s) Of Engagement Activity | 2023 |
Description | Invited Talk at the AI for Robotics Workshop |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I gave an invited talk entitled Data-driven Socio-emotional Intelligence for Human-Robot Interaction at the 2nd NAVER LABS Europe International Workshop on AI for Robotics (29th - 30th November 2021). |
Year(s) Of Engagement Activity | 2021 |
URL | https://europe.naverlabs.com/research/2nd-ai-for-robotics-international-workshop-by-naver-labs-europ... |
Description | Invited talk for ACM-W UK |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Postgraduate students |
Results and Impact | I gave a keynote talk titled Artificial Emotional Intelligence for Well-being for the ACM Women in Computing UK (ACM-W UK) organisation in June 2020 as part of their ACM-W UK Webinar Series "Computing for Social Good". In my talk, I covered the teleoperated robotic coach for mindfulness study we conducted as part of the EPSRC project ARoEQ, as well as the collaborative study we undertook with Emteq and Cambridge Neuroscience. The webinar was organised virtually; my talk attracted numerous questions as well as follow-up emails from some of the participants, and prompted further invitations for keynote talks in 2020. |
Year(s) Of Engagement Activity | 2020 |
URL | https://acmukwomen.acm.org/2020/08/24/computing-for-social-good-how-we-ran-a-successful-speaker-seri... |
Description | Invited talk for CVPR 2021 Workshop on Continual Learning in Computer Vision |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I gave an invited talk titled Continual Learning for Affective Robotics at the CVPR 2021 Workshop on Continual Learning in June 2021. In my talk, I covered the continual learning methodologies we have been studying in the context of affective and social robotics as part of the EPSRC project ARoEQ. The workshop was organised virtually and attracted numerous participants and viewers. The talk prompted further invitations for keynote talks in 2021. |
Year(s) Of Engagement Activity | 2021 |
URL | https://sites.google.com/view/clvision2021 |
Description | Invited talk for RO-MAN 2021 Workshop SCRITA |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I gave an invited talk titled 'Will Artificial Social Intelligence Lead to Trust and Acceptance in HRI?' at SCRITA, the 4th Workshop on Trust, Acceptance and Social Cues in Human-Robot Interaction organised in conjunction with IEEE RO-MAN 2021. |
Year(s) Of Engagement Activity | 2021 |
URL | https://scrita.herts.ac.uk/2021/ |
Description | Keynote at AHRI 2022 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The PI gave a keynote talk titled 'Affective Computing for Humanoid Service Robotics' at the International Workshop on Affective Human-Robot Interaction (AHRI 2022) that was organised in conjunction with ACII 2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.a-hri.me/program |
Description | Keynote at AMAR 2022 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The PI gave a keynote talk titled 'It Doesn't Work Like That! Lessons Learned in Situated Affective Computing' at the Third Workshop on Applied Multimodal Affect Recognition (AMAR'22) that was organised in conjunction with ICPR 2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://cse.usf.edu/~tjneal/AMAR2022/ |
Description | Keynote at BAILAR'22 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The PI gave a keynote talk titled 'Robotic Coaches for Mental Wellbeing: From User Requirements to AI-driven Adaptation' at the Workshop on Behavior Adaptation and Learning for Assistive Robotics organised in conjunction with IEEE RO-MAN 2022 conference in Naples, Italy. |
Year(s) Of Engagement Activity | 2022 |
URL | https://sites.google.com/view/bailar-2022/invited-speakers?authuser=0 |
Description | Keynote at HCSSL'22 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The PI gave a keynote talk titled 'Fairness, Explainability, and Facial Affect' at the Workshop on Human-Centric Self-Supervised Learning (HCSSL'22) organised in conjunction with AAAI 2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://hcssl.github.io/AAAI-22/pages/speakers.html |
Description | Keynote at ICPR 2022 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The PI gave a keynote talk titled 'Artificial Emotional Intelligence: Quo Vadis?' at the 26th International Conference on Pattern Recognition 2022 organised in Montreal, Canada. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.icpr2022.com/speakers/ |
Description | Keynote at SCIAR 2022 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | The PI gave a keynote talk titled 'Social Robots for Assessing and Promoting Mental Well-being' at the Workshop on Social and Cognitive Interactions for Assistive Robotics (SCIAR 2022) that was organised in conjunction with IROS 2022. The talk presented various studies and findings related to the ARoEQ project. |
Year(s) Of Engagement Activity | 2022 |
URL | https://sciar-workshop.github.io/sessions/ |
Description | Keynote talk at ACM ICMI 2021 Workshop GENEA |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | I gave a keynote talk on Data-driven Robot Social Intelligence at the ACM ICMI 2021 Workshop on Generation and Evaluation of Non-verbal Behaviour for Embodied Agents (GENEA). |
Year(s) Of Engagement Activity | 2021 |
URL | https://genea-workshop.github.io/2021/ |
Description | Magazine feature |
Form Of Engagement Activity | A magazine, newsletter or online publication |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Our project on developing a robotic coach that can deliver mindfulness received media recognition: the project and the postdoc were featured in the Raspberry Pi Foundation's Hello World magazine. |
Year(s) Of Engagement Activity | 2021 |
URL | https://www.mclibre.org/descargar/docs/revistas/hello-world/hello-world-17-en-202110.pdf |
Description | Organiser of LEAP-HRI 2021 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I was one of the organizers of the Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) that was organised in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2021). While most research in Human-Robot Interaction (HRI) focuses on short-term interactions, long-term interactions require bolder developments and a substantial amount of resources, especially if the robots are deployed in the wild. The robots need to incrementally learn new concepts or abilities in a lifelong fashion to adapt their behaviors within new situations and personalize their interactions with users to maintain their interest and engagement. The "Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI)" Workshop aimed to take a leap from the traditional HRI approaches towards addressing the developments and challenges in these areas and create a medium for researchers to share their work in progress, present preliminary results, learn from the experience of invited researchers and discuss relevant topics. It also hosted two keynote talks and a panel discussion with leading academic and industry experts from around the world. |
Year(s) Of Engagement Activity | 2021 |
URL | https://leap-hri.github.io/ |
Description | Organiser of LEAP-HRI 2022 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I was one of the organizers of the 2nd Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI 2022) that was organised in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2022). While most research in Human-Robot Interaction (HRI) studies one-off or short-term interactions in constrained laboratory settings, a growing body of research focuses on breaking through these boundaries and studying long-term interactions that arise through deployments of robots "in the wild". Under these conditions, robots need to incrementally learn new concepts or abilities (i.e., "lifelong learning") to adapt their behaviors within new situations and personalize their interactions with users to maintain their interest and engagement. The second edition of the "Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI)" workshop aimed to address the developments and challenges in these areas and create a medium for researchers to share their work in progress, present preliminary results, learn from the experience of invited researchers and discuss relevant topics. The workshop focused on studies on lifelong learning and adaptivity to users, context, environment, and tasks in long-term interactions in a variety of fields such as education, rehabilitation, elderly care, collaborative tasks, service, and companion robots. |
Year(s) Of Engagement Activity | 2022 |
URL | https://leap-hri.github.io/ |
Description | Organiser of LL4LHRI 2020 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | I was the main organiser of the International Workshop on Lifelong Learning for Long-term HRI (LL4LHRI) at the IEEE RO-MAN 2020 Conference. With three keynote speakers from the fields of neuroscience, cognitive robotics and machine learning, the workshop was very well received. It led to the HRI 2021 Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) as well as a Frontiers in Robotics and AI Special Issue on Lifelong Learning and Long-term HRI, for which I am the main Guest Editor. |
Year(s) Of Engagement Activity | 2020 |
URL | https://sites.google.com/view/ll4lhri2020/objectives-and-challenges?authuser=0 |
Description | Organizing the 2nd Edition of the Workshop "HRI4Wellbeing" at IEEE RO-MAN'23 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The second edition of the workshop entitled "HRI4Wellbeing" was organized at RO-MAN23, Busan, South Korea. The main topic of our workshop was "robotic applications for wellbeing in the real world", which is strongly in line with the RO-MAN 2023 theme of "Design a New Bridge for H-R-I", which seeks to address the challenges of developing intelligent robots for human health. Robots are becoming more prevalent in our society for task-oriented goals (e.g., cleaning the house, cooking a meal) and social-oriented interactions such as companionship, assistance, and coaching. We expect robots to share our daily lives in our homes, workplaces, and public spaces. |
Year(s) Of Engagement Activity | 2023 |
URL | https://hri4wellbeing.github.io/ |
Description | Organizing the Workshop "AR4W: Affective Robotics For Well-being" at ACII22 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | We organized the workshop entitled "AR4W: Affective Robotics For Well-being" at the 10th International Conference on Affective Computing and Intelligent Interaction (ACII) 2022, on the 17th October 2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://sites.google.com/rosielab.ca/ar4w2022/ |
Description | Organizing the Workshop "CRITTER" at HRI23 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | We co-organized the second edition of the workshop entitled "CRITTER: Child-Robot Interaction and Interdisciplinary Research" at the 18th Annual ACM/IEEE International Conference on Human Robot Interaction (HRI) 2023, on the 13th of March 2023. |
Year(s) Of Engagement Activity | 2023 |
URL | https://child-robot-interaction.github.io/ |
Description | Organizing the Workshop "Causal-HRI: Causal Learning for Human-Robot Interaction" at ACM/IEEE HRI'24 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | A half-day workshop that took place on March 11, 2024, with peer-reviewed paper presentations and three keynote talks. |
Year(s) Of Engagement Activity | 2024 |
URL | https://causal-hri.github.io/ |
Description | Organizing the Workshop "HRI4Wellbeing" at RO-MAN22 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | We organized a workshop entitled "HRI4Wellbeing" at the 31st IEEE International Conference on Robot & Human Interactive Communication (RO-MAN 2022) on the 2nd September. |
Year(s) Of Engagement Activity | 2022 |
URL | https://hri4wellbeing.github.io/ |
Description | Organizing the Workshop "SAI: Social Affective Intelligence" at IEEE ACII'23 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | We co-organized the inaugural Workshop on Social and Affective Intelligence (SAI) at ACII 2023, which invited researchers from emotion science, social cognition, affective computing, HMI, and HRI to discuss cross-disciplinary perspectives on social and affective intelligence in humans and machines. Social intelligence includes processes and competencies for perceiving, representing, reasoning about, and participating in social interactions. Affective intelligence includes processes and competencies for communicating and managing emotions, feelings, moods, and other affective phenomena. The workshop was held on September 10th at MIT, Cambridge, MA, USA, in conjunction with the ACII23 conference. |
Year(s) Of Engagement Activity | 2023 |
URL | https://sites.google.com/cam.ac.uk/sai2023/home |
Description | Organizing the Workshop "SS4HRI: Social Signal Modeling in Human-Robot Interaction" at ACM/IEEE HRI'24 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The workshop entitled "Social Signal Modeling in Human-Robot Interaction" focuses on the understanding and modeling of social signals to create human-aware HRI. The three fundamental themes are: understanding social signals (gaining insights into human internal states), modeling a human's mental state from social signals (translating social signals into actionable computational models), and operationalizing human models for human-aware applications (integrating these cognitive models into robotic systems to develop new human-aware capabilities). This workshop will be held on the 11th March in Boulder, Colorado (USA) at HRI24. |
Year(s) Of Engagement Activity | 2024 |
URL | https://sites.google.com/cam.ac.uk/ss4hri/home |
Description | Panel discussion at NeuroEthics event |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Postgraduate students |
Results and Impact | Micol Spitale was invited to participate in the panel discussion at the NeuroEthics event. The panel topic was the effects (positive and/or negative) of technologies on people's mental well-being. |
Year(s) Of Engagement Activity | 2023 |
Description | Panelist at ACM Multimedia 2021 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | I was a panelist in "Social Signals and Multimedia: Past, Present, Future", organised at ACM Multimedia 2021. We covered a wide range of topics related to social signals, including the potential of robots. |
Year(s) Of Engagement Activity | 2021 |
URL | https://dl.acm.org/doi/10.1145/3474085.3480024 |
Description | Physics at Work Outreach 2023 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Schools |
Results and Impact | In 2023, the Cavendish Laboratory in Cambridge hosted the 39th annual Physics at Work Exhibition. Physics at Work aims to show 14-16 year olds the variety of careers to which the study of physics can lead and the range of practical problems that physics can be used to solve. This event is open to all schools and is free. The PI's group members (two of whom work as part of the ARoEQ project) held live demonstration sessions with the school students, who were invited to interact with the Pepper robot undertaking various activities. A total of more than 450 school students attended the event. The demonstrations with Pepper were the second-highest rated event by students and faculty members and made it to the cover of the CavMag magazine. https://www.phy.cam.ac.uk/files/cavmag_30_screen_oct_2023.pdf |
Year(s) Of Engagement Activity | 2023 |
URL | https://outreach.phy.cam.ac.uk/programme/physicsatwork |
Description | Program Chair for ACM/IEEE International Conference on Human Robot Interaction |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | The 15th Annual ACM/IEEE International Conference on Human Robot Interaction was scheduled to be held at the Corn Exchange and Cambridge Guild Hall in Cambridge, UK from March 23-26, 2020. I was a Program Chair for this top-tier conference in the area of social robotics and human-robot interaction. Reproducibility is a major issue in the field of HRI: studies are conducted in-house and data are not shared with the community. I co-led the effort to create a new Theme on Reproducibility for Human-Robot Interaction at this conference, which was a major success. The theme is now running for a second year and is already changing the views of academics, students and practitioners working in the field of HRI. |
Year(s) Of Engagement Activity | 2020 |
URL | https://humanrobotinteraction.org/2020/ |
Description | Robot Interactions at the Cambridge Festival 2022 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Public/other audiences |
Results and Impact | We invited participants to "Meet Nao that Helps with Assessment of Mental Wellbeing in Children". Children could talk with a Nao robot and describe a picture displayed on a screen: they spent a few minutes talking with the robot while looking at the picture, imagining what happened before, during, and after the depicted scene. In the meantime, parents were presented with the potential (based on the results of our previous study) of having a robot as an aid to assess children's mental wellbeing. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.festival.cam.ac.uk/events/meet-pepper-and-nao-wellbeing-robots |
Description | Robot Interactions at the Cambridge Festival 2023 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Public/other audiences |
Results and Impact | We invited participants to engage in the following activity during the Cambridge Festival on 18 March 2023: "Which robot would you choose as a wellbeing coach?". Participants experienced a sample positive psychology session with a robot, choosing which robot they would like to deliver the exercise, the QTRobot or the Misty II robot. They spent a few minutes reflecting on recent events that made them feel grateful, and on the positive emotions these evoked. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.festival.cam.ac.uk/events/try-positive-psychology-session-robot |
Description | Trinity Hall Association's Panel Discussion |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | On 23 September 2023, Hatice Gunes was an invited panelist at the Trinity Hall Association's Panel Discussion on 'AI: Unravelling Potential'. |
Year(s) Of Engagement Activity | 2023 |
Description | Tutorial at the HRI Winterschool on Embodied AI 2022 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | The postdoc funded by this grant (NC) was invited as a speaker to the HRI Winterschool on Embodied AI 2022. He gave a tutorial on Continual Learning for Affective Robotics and demonstrated and shared code on how to use facial affect prediction with continual learning for human-robot interaction applications, in particular for creating adaptive robotic coaches. |
Year(s) Of Engagement Activity | 2022 |
URL | https://hriwinterschool.com/ |
Description | interview with the University of Cambridge Communications Team |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | The PI, the postdoctoral researcher and a PhD student were interviewed by the University of Cambridge Communications Team regarding their study and scientific paper entitled 'Can Robots Help in the Evaluation of Mental Wellbeing in Children? An Empirical Study' that was presented at the 31st IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) in Naples, Italy on 1 September 2022. This interview resulted in a research blog titled 'Robots can be better at detecting mental wellbeing issues in children than parent-reported or self-reported testing, a new study suggests' on the University of Cambridge website and has received more than 1,000 pieces of coverage worldwide in September 2022, including BBC News, Guardian, Daily Mail, Telegraph, Sky News, ITV News, Bloomberg, Independent, Evening Standard, Metro, Scotsman, Yahoo News, New York Post and more. It was also featured in the morning news bulletins on Radio 2 and Radio 5 Live on September 1, 2022. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.cam.ac.uk/research/news/robots-can-be-used-to-assess-childrens-mental-wellbeing-study-su... |
Description | keynote at IEEE FG 2019 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | I gave a keynote talk titled 'Creating Technology with Socio-emotional Intelligence' at IEEE FG 2019, the 14th IEEE Int'l Conference on Automatic Face and Gesture Recognition, held in Lille, France in May 2019. |
Year(s) Of Engagement Activity | 2019 |
URL | http://fg2019.org/invited-speakers/hatice-gunes/ |
Description | seminar at the Centre for Historical Analysis and Conflict Research |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | I gave an invited seminar titled 'Artificial Emotional Intelligence for Human-Robot Interaction' at the Centre for Historical Analysis and Conflict Research, in the Royal Military Academy Sandhurst, with an extensive Q&A session regarding the use of AI. |
Year(s) Of Engagement Activity | 2019 |