Adaptive Robotic EQ for Well-being (ARoEQ)

Lead Research Organisation: University of Cambridge
Department Name: Computer Laboratory

Abstract

Social robots are envisioned to interact closely with people safely and efficiently, and to add value to people's lives by helping, caring, teaching and entertaining. However, there is currently a major gap between the public perception of humanoid/social robot capabilities and their actual capabilities: the cognitive and social capabilities of current humanoid robots are still very limited.

Although social robotics is an inherently multi-disciplinary field, there have been no systematic efforts to develop novel sensing, perception and understanding capabilities for these robots grounded in the state of the art in affective computing, social signal processing, computer vision and machine learning. To avoid re-inventing the wheel, researchers in HRI often, and rightly, utilise available sensing/perception tools from other domains, creating their own in-house datasets and evaluations. However, these practices hinder progress in social robotics: they lead to a major lack of novel, domain-specific tools, and to a lack of benchmarking measures, because the annotated, publicly available multimodal interaction datasets that are vital for comparative evaluation are missing.

This Fellowship aims to address these major gaps in HRI and social robotics. Its vision is to:

(1) equip humanoid robots with novel socio-emotional intelligence and adaptation capabilities grounded in the state of the art in affective computing, social signal processing, computer vision and machine learning fields;

(2) investigate the deployment of humanoid robots as socio-emotionally smart embodied personal devices that can potentially revolutionise our ability to maintain healthier behaviours and working environments, leading to resilient communities.

Planned Impact

The line of research targeted in this funding application is highly interdisciplinary, and its outcomes can be disseminated to wider society in various ways. Relevant beneficiaries are identified based on the expected term of impact.

This Fellowship is likely to impact two major areas: robotics and healthcare. The most immediate application of the proposed research is in social robotics, telepresence robotics and human-robot interaction. In the short to medium term, the proposed research has the potential to become applicable and expandable to multiple domains including healthcare, social work, education, accommodation, food service, retail trade, and arts and entertainment. With advances in AI and physical robot design, robots are starting to take on increasingly complex social roles in homes, workplaces, and public spaces. However, there is a major gap between the public perception of humanoid/social robot capabilities and their actual capabilities. The proposed research will improve the capabilities of humanoid robots, bringing them closer to the expectations of the public. This is also expected to help with technology adoption, i.e., the adoption and acceptance of social Robotics and Autonomous Systems (RAS) as new products to be used for public good.

Health has been identified as one of the top priorities in the RCUK Strategic Priorities and Spending Plan (2016-2020). In the medium term, the proposed research will contribute to this priority both by developing a better technical understanding of building socio-emotionally intelligent and adaptive robots, and by developing multidisciplinary knowledge about how to deploy such robotic systems as personal and/or self-management technology (e.g., a coach) to strengthen the emotional and social well-being of the unaffected populations for prevention and resilience, both in the workplace and outside.

Academic, artistic and philosophical motivations for researching intelligent machines and robots are well represented by the Science Museum's 2017 exhibition on robots; robots have also started to be used in theatre plays (e.g., Spillikin) and dance performances (e.g., Robot, Blanca Li Dance Company). In the medium to long term, the robotic platform created as part of this project can potentially be used by performers and artists.

In the longer term, this project will contribute to the UK's competence in the social Robotics and Autonomous Systems (RAS) area, adding value to its economy. The project will ensure international visibility for the research team, which in turn will contribute to building the UK's capabilities in social robotics through support for talented people, and will attract further investment.
 
Description Main findings from our work published in IEEE Transactions on Cybernetics 2020 [https://ieeexplore.ieee.org/xpl/tocresult.jsp?isnumber=6352949]
(1) We do not always need sophisticated models: humans rate machine-like speech as matching the machine-like movements better than the more natural movements generated;
(2) Even though the models are not trained with explicit information about subject personalities, subject-dependent learning generates movements that are assessed as more appropriate to the input audio for 'more conscientious' people than for 'less conscientious' ones;
(3) This creates a stepping stone toward learning to synthesise motions for distinctive personality styles rather than manually manipulating a robot's behaviours, which is still an open research problem.

Main findings from our work published at IEEE RO-MAN 2020 - https://ieeexplore.ieee.org/document/9223564

WHAT? Acquire person-specific data
WHY? Adapting learning models to individual preferences requires large amounts of data that can only be sourced through interactions with users.
HOW? (1) Conduct introductory HRI rounds to enable the robot to collect additional data about the user. (2) Leverage adversarial learning to train a generative model to simulate additional person-specific data.

WHAT? Obtain normative baselines
WHY? The robot needs to know the behavioural norm for each user against which deviations can be observed. Deviations help identify shifts in user socio-emotional behaviours and infer changes in interaction context.
HOW? (1) Conduct interactions under contextually inert (neutral) situations during introduction rounds. (2) Use the (subtle) deviations from this baseline, given the interaction context, to analyse shifts.
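The baseline-and-deviation idea above can be sketched as a simple per-user z-score check. This is an illustrative sketch only: the feature names, the Gaussian assumption and the threshold of 2 standard deviations are our assumptions here, not the project's published method.

```python
import statistics

def fit_baseline(neutral_sessions):
    """Estimate a per-user behavioural norm (mean/std per feature)
    from feature vectors recorded in contextually inert sessions."""
    baseline = {}
    for f in neutral_sessions[0]:
        values = [s[f] for s in neutral_sessions]
        baseline[f] = (statistics.mean(values), statistics.stdev(values))
    return baseline

def deviations(observation, baseline, threshold=2.0):
    """Return features whose z-score against the user's own norm
    exceeds the threshold, signalling a behavioural shift."""
    flagged = {}
    for f, (mu, sigma) in baseline.items():
        z = (observation[f] - mu) / sigma if sigma > 0 else 0.0
        if abs(z) > threshold:
            flagged[f] = z
    return flagged

# Toy usage: gaze aversion spikes relative to this user's neutral norm.
sessions = [{"gaze_aversion": 0.10, "speech_rate": 1.0},
            {"gaze_aversion": 0.12, "speech_rate": 1.1},
            {"gaze_aversion": 0.11, "speech_rate": 0.9}]
norm = fit_baseline(sessions)
shift = deviations({"gaze_aversion": 0.45, "speech_rate": 1.0}, norm)
```

Because the baseline is fitted per user, the same absolute behaviour can be flagged for one user and not for another, which is exactly the point of normative baselines.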

WHAT? Extract semantic associations
WHY? Adapting the learning for a large number of users is computationally intractable. Learning models become saturated, unable to retain previous information or to learn with new individuals.
HOW? (1) Form user groupings, using person-specific attributes (Cu in Eq. 2-3) to learn group-based adaptations. (2) Use unsupervised data clustering to facilitate learning semantic groupings of users.
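One way to picture step (2) is clustering person-specific attribute vectors (the Cu of Eq. 2-3) so that adaptation is learnt per group rather than per individual. The minimal k-means below and the toy trait vectors are illustrative stand-ins, not the project's actual clustering pipeline.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means over person-specific attribute vectors,
    yielding semantic user groupings for group-based adaptation."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each user to the nearest centroid.
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = np.argmin(dists, axis=1)
        # Recompute each centroid from its assigned users.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Toy attribute vectors, e.g. (extraversion, conscientiousness):
users = np.array([[0.90, 0.80], [0.85, 0.75], [0.10, 0.20], [0.15, 0.10]])
labels, _ = kmeans(users, k=2)  # two clearly separated user groups
```

A model can then maintain one set of adaptation parameters per cluster, keeping the number of learnt adaptations bounded as the user population grows.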

WHAT? Learn contextual affordances
WHY? Interactions are driven by context, and humans switch between contexts without clear boundaries. Contextual attributions may not always be implicit and need to be learnt separately.
HOW? (1) Learn context-aware embeddings to distinguish between task boundaries. (2) Use contextual affordances (e.g. Ti in Eq. 3) to facilitate smooth switching between affective HRI contexts.

WHAT? Balance memory with computation
WHY? The memory-computation trade-off needs to be considered w.r.t. the application domain. Adding more memory facilitates rehearsal of past knowledge, while additional computation power improves adaptation to novel experiences.
HOW? (1) Use generative models for pseudo-rehearsal to reduce the model's memory footprint. (2) Offload part of the computation/memory load to RaaS-based solutions to balance old vs. novel learning.
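The pseudo-rehearsal idea in (1) can be sketched as follows: instead of storing past interaction data, keep a compact generative model of it and mix generated "pseudo-samples" into each new training batch. Here a per-feature Gaussian stands in for the generative model; the class names and the 50/50 mixing ratio are assumptions for illustration.

```python
import random

class GaussianMemory:
    """Stand-in generative model: stores only per-feature mean/std of past
    data rather than the raw samples, keeping the memory footprint small."""
    def fit(self, samples):
        n, dims = len(samples), len(samples[0])
        self.mu = [sum(s[d] for s in samples) / n for d in range(dims)]
        self.sigma = [(sum((s[d] - self.mu[d]) ** 2 for s in samples) / n) ** 0.5
                      for d in range(dims)]

    def sample(self, count, rng):
        return [[rng.gauss(m, s) for m, s in zip(self.mu, self.sigma)]
                for _ in range(count)]

def rehearsal_batch(new_samples, memory, rehearsal_ratio=0.5, seed=0):
    """Interleave new experiences with generated pseudo-samples of old
    ones, so a training step revisits past knowledge without raw storage."""
    rng = random.Random(seed)
    n_pseudo = int(len(new_samples) * rehearsal_ratio / (1 - rehearsal_ratio))
    batch = list(new_samples) + memory.sample(n_pseudo, rng)
    rng.shuffle(batch)
    return batch

old_data = [[0.0, 1.0], [0.2, 0.8], [0.1, 0.9]]
mem = GaussianMemory()
mem.fit(old_data)
batch = rehearsal_batch([[5.0, 5.0], [5.1, 4.9]], mem)  # 2 new + 2 pseudo
```

In the actual work a learned generative model (e.g. one trained adversarially) would replace the Gaussian, but the memory-computation trade-off it illustrates is the same.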

WHAT? Allow controlled forgetting
WHY? When learning is continuous, redundant information in the memory/model is not released, hindering the learning capacity of the model.
HOW? (1) Utilise forgetting mechanisms (inspired by biological organisms) on unused memory locations or parts of the model, to learn new knowledge.
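A minimal sketch of such a forgetting mechanism is usage-based decay: every entry's strength decays over time, access reinforces it, and when capacity is needed the weakest (least-used) entry is released. The class and its parameters are hypothetical illustrations of the principle, not the project's implementation.

```python
class ForgettingMemory:
    """Fixed-capacity memory with usage decay: rarely accessed entries
    are forgotten first to free capacity for new knowledge."""
    def __init__(self, capacity, decay=0.9):
        self.capacity = capacity
        self.decay = decay
        self.store = {}      # key -> stored knowledge
        self.strength = {}   # key -> usage strength

    def access(self, key):
        if key in self.store:
            self.strength[key] += 1.0  # reinforcement on use
            return self.store[key]
        return None

    def write(self, key, value):
        # Decay all strengths, then evict the weakest entry if full.
        for k in self.strength:
            self.strength[k] *= self.decay
        if key not in self.store and len(self.store) >= self.capacity:
            weakest = min(self.strength, key=self.strength.get)
            del self.store[weakest], self.strength[weakest]
        self.store[key] = value
        self.strength[key] = 1.0

mem = ForgettingMemory(capacity=2)
mem.write("greet", "wave")
mem.write("task", "handover")
mem.access("greet")          # reinforce 'greet'
mem.write("new", "point")    # 'task' is now the weakest, so it is forgotten
```

Eviction here is deterministic given the access pattern, which makes the forgetting "controlled" rather than catastrophic: what is released is exactly what has gone unused.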

WHAT? Use multiple performance metrics
WHY? Benchmark evaluations from conventional ML and CL perspectives are needed for reproducibility and fairness guarantees, and to evaluate the model's robustness to dynamic shifts in data distributions.
HOW? (1) Report CL performance metrics (Section II-E), along with the classification metrics of F-measure and AUC-ROC scores, or reward-function dynamics for behaviour learning.
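As an illustration of CL performance metrics, the snippet below computes two widely used ones, average accuracy and backward transfer (BWT), from a task-accuracy matrix; these particular definitions follow the common GEM-style formulation and are an assumption here, since the paper's Section II-E specifies its own metric set.

```python
def cl_metrics(R):
    """R[i][j] = accuracy on task j after training on task i (0-indexed,
    T tasks). Returns (average accuracy after the final task, BWT), where
    BWT = mean_j (R[T-1][j] - R[j][j]) over j < T-1; negative BWT
    indicates catastrophic forgetting of earlier tasks."""
    T = len(R)
    avg_acc = sum(R[T - 1]) / T
    bwt = sum(R[T - 1][j] - R[j][j] for j in range(T - 1)) / (T - 1)
    return avg_acc, bwt

# Toy 3-task run: accuracy on earlier tasks drops as new tasks are learnt.
R = [[0.90, 0.00, 0.00],
     [0.70, 0.85, 0.00],
     [0.60, 0.75, 0.80]]
avg_acc, bwt = cl_metrics(R)  # avg_acc ~= 0.717, bwt = -0.20
```

Reporting such CL metrics alongside F-measure and AUC-ROC is what lets dynamic-distribution robustness be compared across models, not just end-task accuracy.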
Exploitation Route Too early to say as the project started in mid-April 2019.
Sectors Digital/Communication/Information Technologies (including Software), Education, Healthcare

 
Description New Theme on Reproducibility at The 2020 ACM/IEEE International Conference on Human-Robot Interaction
Geographic Reach Multiple continents/international 
Policy Influence Type Influenced training of practitioners or researchers
 
Description Alan Turing Faculty Fellowship
Amount £8,184 (GBP)
Organisation Alan Turing Institute 
Sector Academic/University
Country United Kingdom
Start 03/2019 
End 04/2021
 
Description Robot Coach for Promoting Mental Wellbeing in Workplaces
Amount £23,467 (GBP)
Organisation University of Cambridge 
Sector Academic/University
Country United Kingdom
Start 03/2022 
End 07/2022
 
Description W.D Armstrong Trust Fund PhD Studentship in the Application of Engineering in Medicine
Amount £134,709 (GBP)
Organisation University of Cambridge 
Sector Academic/University
Country United Kingdom
Start 09/2020 
End 09/2023
 
Title The MANNERS Dataset 
Description The MANNERS Dataset consists of simulated robot actions in visual domestic scenes with different social configurations. To be able to control yet vary the configurations of the scenes and the social settings, MANNERS has been created using a simulation environment, uniformly sampling relevant contextual attributes. The robot actions in each scene have been annotated by multiple humans with social appropriateness levels via a window. The dataset is available for research purposes - access to all images, annotation labels and Unity files - upon filling in the Access Request Form provided on the dataset website. 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact Not yet 
URL https://github.com/jonastjoms/MANNERS-DB
 
Description Cambridge / UCL Neuroscience collaboration 
Organisation University of Cambridge
Department Cambridge Neuroscience
Country United Kingdom 
Sector Academic/University 
PI Contribution Created a facial electromyography-based adaptive virtual reality gaming system for cognitive training and tested it with older adults.
Collaborator Contribution Dr Dennis Chan from Cambridge Neuroscience (now with UCL Neuroscience) provided: - expertise for the design of the memory tasks to be used for cognitive training - assistance in recruiting the older adults for the intervention evaluation - assistance in administering a battery of standardised cognitive tests to the participants
Impact A conference paper presented: Lorcan Reidy, Dennis Chan, Charles Nduka, Hatice Gunes: Facial Electromyography-based Adaptive Virtual Reality Gaming for Cognitive Training. ICMI 2020: 174-183
Start Year 2019
 
Description Cambridge Department of Psychiatry collaboration 
Organisation University of Cambridge
Department Department of Psychiatry
Country United Kingdom 
Sector Academic/University 
PI Contribution We applied for the University of Cambridge - W.D Armstrong Trust Fund PhD Studentship in the Application of Engineering in Medicine (£134,709; 2020-2023) together with Professor Peter B Jones, and this was awarded. Since October 2020, a PhD student and other team members have been collaborating on child human-robot interaction studies that aim to help improve and monitor mental health and well-being in children.
Collaborator Contribution Prof Peter B Jones and Professor Tamsin Ford and their team have been advising us on the design and data analysis aspects of the child human-robot interaction studies my team is undertaking in the Department of Computer Science and Technology.
Impact We are currently preparing conference papers as outputs of the study conducted to date.
Start Year 2020
 
Description Emteq collaboration 
Organisation Emteq Limited
Country United Kingdom 
Sector Private 
PI Contribution Created a facial electromyography-based adaptive virtual reality gaming system for cognitive training and tested it with older adults.
Collaborator Contribution Emteq and Charles Nduka provided: - the new Faceteq with Facial Electromyography sensors attached used within a VR headset for sensing user facial upper muscle movements. - tech support for using the relevant libraries and code
Impact A conference paper presented: Lorcan Reidy, Dennis Chan, Charles Nduka, Hatice Gunes: Facial Electromyography-based Adaptive Virtual Reality Gaming for Cognitive Training. ICMI 2020: 174-183
Start Year 2019
 
Description METU Kovan Lab collaboration 
Organisation Middle East Technical University
Department Department of Computer Engineering
Country Turkey 
Sector Academic/University 
PI Contribution Provided: - expertise in HRI, facial affect analysis, human nonverbal behaviour understanding - physical space for a visiting academic - PhD and MPhil students - support and expertise in research proposals
Collaborator Contribution Provided: - expertise in deep learning - time and effort for co-supervising two PhD and one MPhil students - support and expertise in research proposals
Impact This collaboration is multi-disciplinary, involving human-robot interaction and machine learning. Scientific papers: Nikhil Churamani, Sinan Kalkan, Hatice Gunes: Continual Learning for Affective Robotics: Why, What and How? RO-MAN 2020: 425-431 Tian Xu, Jennifer White, Sinan Kalkan, Hatice Gunes: Investigating Bias and Fairness in Facial Expression Recognition. ECCV Workshops (6) 2020: 506-523 Nikhil Churamani, Sinan Kalkan, Hatice Gunes: Spatio-Temporal Analysis of Facial Actions using Lifecycle-Aware Capsule Networks. CoRR abs/2011.08819 (2020) Media coverage: Researchers find evidence of bias in facial expression data sets https://venturebeat.com/2020/07/24/researchers-find-evidence-of-bias-in-facial-expression-data-sets/ Collaborative project funded by TUBITAK 1001 (Project no 120E269): KALFA project - New Methods for Assembly Scenarios with Collaborative Robots
Start Year 2020
 
Description NHS collaboration 
Organisation Cambridge University Hospitals NHS Foundation Trust
Country United Kingdom 
Sector Public 
PI Contribution Established an in-lab human-robot interaction study setting with a humanoid robot for conducting longitudinal positive psychology therapy sessions. Exploring therapy sessions with a robot is a novel approach for NHS practitioners. Both the coach and the participants were recorded with multiple sensors in the course of 4 weeks. The sessions were repeated with another cohort of participants for another 4 weeks. We were planning to continue in January/February 2021, but this was not possible due to COVID-19 related lockdown.
Collaborator Contribution The NHS practitioner provided us guidance in terms of structure of one-to-one sessions, content, measures and questionnaires to be used. In the period of November-December 2020, the NHS practitioner administered coaching in one-to-one sessions. Their data was recorded with multiple sensors. This data will be used for creating a robotic well-being coach.
Impact The output is a multimodal dataset of dyadic wellbeing coaching interactions.
Start Year 2019
 
Description Academic panelist for ACM/IEEE HRI 2021 Pioneers Workshop 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I was an academic panelist for ACM/IEEE HRI 2021 Pioneers Workshop. This workshop is organised in conjunction with ACM/IEEE HRI Conference and it seeks to foster creativity and collaboration surrounding key challenges in human-robot interaction and empower students early in their academic careers. Each year, the workshop brings together a cohort of the world's top student researchers and provides the opportunity for students to present and discuss their work with distinguished student peers and senior scholars in the field.
I was one of the 5 academic panelists for the pioneers to interact with and learn from.
Year(s) Of Engagement Activity 2021
URL https://hripioneers.org/archives/hri21/program_speakers.html
 
Description BBC coverage 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact Our project on developing a robotic coach that can deliver mindfulness received media recognition: it was covered by BBC Look East, and the research team was interviewed for this purpose.
Year(s) Of Engagement Activity 2021
 
Description Cambridge Computer Lab Healthcare Research Showcase 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact The postdoctoral researcher employed on the ARoEQ project was among the early-career researchers who were invited to give a talk about their healthcare and wellbeing related research. Dr Indu Bodala talked about Designing Robot Coaches for Mental Wellbeing.
After the talk we received an invitation from the Mindfulness After Cam (a Cambridge University Alumni Group whose aims are to promote the well-being of University of Cambridge alumni and their friends and family through mindfulness meditation) to attend a panel on mindfulness that will be attended by both the alumni group and current students in the Cambridge University Mindfulness Society.
Year(s) Of Engagement Activity 2021
URL https://www.cst.cam.ac.uk/news/showcasing-our-healthcare-research
 
Description Festival of Ideas 2019 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact This was a panel discussion on What Makes Us Human in the Age of AI. The focus was on whether automation will make us redundant, or whether there are essentially human qualities that will become more sought after. Other panellists included Allegre Hadida, University Senior Lecturer in Strategy at Judge Business School, author and lecturer Laura Dietz, and Stephen Cave from the Leverhulme Centre for the Future of Intelligence. The panel was chaired by Julian Clover from Cambridge 105 Radio.
Year(s) Of Engagement Activity 2019
URL https://www.festivalofideas.cam.ac.uk/events/what-makes-us-human-age-ai
 
Description Interview for Blog Post 
Form Of Engagement Activity Engagement focused website, blog or social media channel
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact A blog post about our research on creating robotic wellbeing coaches was published on the University of Cambridge website in May 2021. We were informed that it has reached over 45,000 readers.
Year(s) Of Engagement Activity 2021
URL https://www.cam.ac.uk/stories/wellbeing-robot
 
Description Invited Talk at the AI for Robotics Workshop 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I gave an invited talk entitled Data-driven Socio-emotional Intelligence for Human-Robot Interaction at the 2nd NAVER LABS Europe International Workshop on AI for Robotics (29th - 30th November 2021).
Year(s) Of Engagement Activity 2021
URL https://europe.naverlabs.com/research/2nd-ai-for-robotics-international-workshop-by-naver-labs-europ...
 
Description Invited talk for ACM-W UK 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact I gave a keynote talk titled Artificial Emotional Intelligence for Well-being for the ACM Women in Computing UK (ACM-W UK) organisation in June 2020 as part of their ACM-W UK Webinar Series "Computing for Social Good".

In my talk, I covered the teleoperated robotic coach for mindfulness study we conducted as part of the EPSRC project ARoEQ, as well as the collaborative study we undertook with Emteq and Cambridge Neuroscience. The talk was delivered virtually and attracted numerous questions as well as follow-up emails from some of the participants. It prompted further invitations for keynote talks in 2020.
Year(s) Of Engagement Activity 2020
URL https://acmukwomen.acm.org/2020/08/24/computing-for-social-good-how-we-ran-a-successful-speaker-seri...
 
Description Invited talk for CVPR 2021 Workshop on Continual Learning in Computer Vision 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I gave an invited talk titled Continual Learning for Affective Robotics for the CVPR 2021 Workshop on Continual Learning in June 2021. In my talk, I covered the continual learning methodologies we have been studying in the context of affective and social robotics as part of the EPSRC project ARoEQ. The workshop was organised virtually and attracted numerous participants and viewers. The talk prompted further invitations for keynote talks in 2021.
Year(s) Of Engagement Activity 2021
URL https://sites.google.com/view/clvision2021
 
Description Invited talk for RO-MAN 2021 Workshop SCRITA 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I gave an invited talk titled 'Will Artificial Social Intelligence Lead to Trust and Acceptance in HRI?' at SCRITA, the 4th Workshop on Trust, Acceptance and Social Cues in Human-Robot Interaction organised in conjunction with IEEE RO-MAN 2021.
Year(s) Of Engagement Activity 2021
URL https://scrita.herts.ac.uk/2021/
 
Description Keynote talk at ACM ICMI 2021 Workshop GENEA 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact I gave a keynote talk on Data-driven Robot Social Intelligence at the ACM ICMI 2021 Workshop on Generation and Evaluation of Non-verbal Behaviour for Embodied Agents (GENEA).
Year(s) Of Engagement Activity 2021
URL https://genea-workshop.github.io/2021/
 
Description Magazine feature 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Media (as a channel to the public)
Results and Impact Our project on developing a robotic coach that can deliver mindfulness received media recognition: the project and the postdoc were featured by the Raspberry Pi Foundation's Hello World magazine.
Year(s) Of Engagement Activity 2021
URL https://www.mclibre.org/descargar/docs/revistas/hello-world/hello-world-17-en-202110.pdf
 
Description Organiser of LEAP-HRI 2021 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I was one of the organizers of the Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) that was organised in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2021).
While most research in Human-Robot Interaction (HRI) focuses on short-term interactions, long-term interactions require bolder developments and a substantial amount of resources, especially if the robots are deployed in the wild. The robots need to incrementally learn new concepts or abilities in a lifelong fashion to adapt their behaviours within new situations and personalize their interactions with users to maintain their interest and engagement. The "Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI)" Workshop aimed to take a leap from the traditional HRI approaches towards addressing the developments and challenges in these areas and create a medium for researchers to share their work in progress, present preliminary results, learn from the experience of invited researchers and discuss relevant topics. It also hosted two keynote talks and a panel discussion with leading academic and industry experts from around the world.
Year(s) Of Engagement Activity 2021
URL https://leap-hri.github.io/
 
Description Organiser of LEAP-HRI 2022 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I was one of the organizers of the 2nd Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI 2022) that was organised in conjunction with the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2022). While most research in Human-Robot Interaction (HRI) studies one-off or short-term interactions in constrained laboratory settings, a growing body of research focuses on breaking through these boundaries and studying long-term interactions that arise through deployments of robots "in the wild". Under these conditions, robots need to incrementally learn new concepts or abilities (i.e., "lifelong learning") to adapt their behaviors within new situations and personalize their interactions with users to maintain their interest and engagement. The second edition of the "Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI)" workshop aimed to address the developments and challenges in these areas and create a medium for researchers to share their work in progress, present preliminary results, learn from the experience of invited researchers and discuss relevant topics. The workshop focused on studies on lifelong learning and adaptivity to users, context, environment, and tasks in long-term interactions in a variety of fields such as education, rehabilitation, elderly care, collaborative tasks, service, and companion robots.
Year(s) Of Engagement Activity 2022
URL https://leap-hri.github.io/
 
Description Organiser of LL4LHRI 2020 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I was the main organiser of the International Workshop on Lifelong Learning for Long-term HRI (LL4LHRI) at IEEE RO-MAN 2020 Conference. With three keynote speakers from the fields of neuroscience, cognitive robotics and machine learning, the workshop was very well received.
It led to the HRI 2021 Workshop on Lifelong Learning and Personalization in Long-Term Human-Robot Interaction (LEAP-HRI) as well as a Frontiers in Robotics and AI Special Issue on Lifelong Learning and Long-term HRI, for which I am the main Guest Editor.
Year(s) Of Engagement Activity 2020
URL https://sites.google.com/view/ll4lhri2020/objectives-and-challenges?authuser=0
 
Description Panelist at ACM Multimedia 2021 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact I was a panelist in Social Signals and Multimedia: Past, Present, Future, which was organised at ACM Multimedia 2021. We covered a wide range of topics related to social signals, including the potential of robots.
Year(s) Of Engagement Activity 2021
URL https://dl.acm.org/doi/10.1145/3474085.3480024
 
Description Program Chair for ACM/IEEE International Conference on Human Robot Interaction 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact The 15th Annual ACM/IEEE International Conference on Human Robot Interaction will be held at the Corn Exchange and Cambridge Guild Hall in Cambridge, UK from March 23-26, 2020. I am a Program Chair for this top-tier conference in the area of social robotics and human-robot interaction.

Reproducibility is a major issue in the field of HRI: studies are conducted in an in-house manner and data is not shared in the community. I co-led the effort for creating a new Theme on Reproducibility for Human-Robot Interaction at this conference, which was a major success. The theme is now running for a second year and is already changing the views of academics, students and practitioners working in the field of HRI.
Year(s) Of Engagement Activity 2020
URL https://humanrobotinteraction.org/2020/
 
Description keynote at IEEE FG 2019 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact I gave a keynote talk titled 'Creating Technology with Socio-emotional Intelligence' at IEEE FG 2019, the 14th IEEE Int'l Conference on Automatic Face and Gesture Recognition, held in Lille, France in May 2019.
Year(s) Of Engagement Activity 2019
URL http://fg2019.org/invited-speakers/hatice-gunes/
 
Description seminar at the Centre for Historical Analysis and Conflict Research 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Other audiences
Results and Impact I gave an invited seminar titled 'Artificial Emotional Intelligence for Human-Robot Interaction' at the Centre for Historical Analysis and Conflict Research, in the Royal Military Academy Sandhurst, with an extensive Q&A session regarding the use of AI.
Year(s) Of Engagement Activity 2019