Lifelearn: Unbounded activity and context awareness

Lead Research Organisation: University of Sussex
Department Name: Sch of Engineering and Informatics

Abstract

The wearable computing market is expected to grow explosively, as evidenced in 2014 and early 2015 by a plethora of new products, primarily in the sports and fitness domain; Business Insider estimated in 2013 that 300 million units would be shipped by 2018. What makes wearables (and similarly mobile phones) unique is their contextual intelligence: they use sensors to infer users' context, such as location, activities or social interactions. This contextual intelligence allows a fitness tracker to detect by itself whether the user is running, walking or doing push-ups.

We are motivated by the vision of pervasive "wearable smart assistants" that provide situated contextual support in daily life. They may act as "memory reminders" for people with dementia, or encourage healthy behaviours through supportive prompts presented at the right time (e.g. to fight obesity, diabetes, cardiovascular diseases).

This project deals with the heart of any such assistive technology: the ability to recognise general human activities and context from sensors. Current methods can only recognise pre-defined, "closed" sets of activities and contexts. This is insufficient for the scenarios outlined above. In such applications, the set of relevant activities is not necessarily known at design time, as different users tend to have different routines, routines may change as users change interests, and activities may be performed differently, for instance after an injury. The set of relevant activities and contexts is therefore potentially unbounded and is said to be "open-ended".

The project investigates the methods required to recognise an "open-ended" set of activities and contexts from existing wearables, such as a smartwatch and a mobile phone, following lifelong learning principles. In other words, the system should discover that a user engages in a new activity, even if it was not initially programmed with the knowledge of that activity.

We develop new open-ended learning techniques that can model a changing number of classes at runtime. These methods run on a recognition infrastructure comprising software on the wearable devices and on a server; the infrastructure will be made open-source to benefit other projects. We develop methods that discover recurring wearable sensor patterns: repeating patterns may correspond to new activities or contexts, and are therefore modelled using the open-ended learning techniques. Finally, we develop methods to decide whether a discovered pattern is meaningful and what it represents. This is achieved by involving the user and occasionally asking them to provide information about their current activity. We compare different feedback options that minimise the number of interruptions and the complexity of the queries. Overall, the system is evaluated on existing data as well as on a new long-term dataset collected within this project.
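The paragraph above summarises the core mechanism: recurring sensor patterns are discovered at runtime, promoted to new classes, and only then labelled with help from the user. The following Python sketch illustrates this kind of open-ended discovery loop; it is not the project's actual algorithm, and the feature representation, distance threshold and minimum support are placeholder assumptions.

```python
import numpy as np

class OpenEndedDiscoverer:
    """Toy open-ended activity discovery: the set of classes grows at runtime.

    Each discovered class is represented by the running mean of the feature
    windows assigned to it. A window that is far from every known class is
    kept as a candidate; candidates that recur often enough are promoted to a
    new, initially unlabelled class, which the system can later ask the user
    to name.
    """

    def __init__(self, novelty_threshold=2.0, min_support=5):
        self.novelty_threshold = novelty_threshold  # distance beyond which a window counts as novel
        self.min_support = min_support              # recurrences needed before a new class is created
        self.centroids = []                         # one centroid per discovered class
        self.counts = []                            # number of windows assigned to each class
        self.candidates = []                        # novel windows not yet promoted

    def observe(self, features):
        """Process one window of features; return a class index, or None if still unknown."""
        x = np.asarray(features, dtype=float)
        if self.centroids:
            dists = [np.linalg.norm(x - c) for c in self.centroids]
            i = int(np.argmin(dists))
            if dists[i] < self.novelty_threshold:
                self.counts[i] += 1
                # running-mean update of the matched class
                self.centroids[i] += (x - self.centroids[i]) / self.counts[i]
                return i
        # Novel window: does it recur among earlier unexplained windows?
        close = [c for c in self.candidates
                 if np.linalg.norm(x - c) < self.novelty_threshold]
        self.candidates.append(x)
        if len(close) + 1 >= self.min_support:
            # Promote the recurring pattern to a new class (label to be queried from the user).
            centroid = np.mean(close + [x], axis=0)
            self.centroids.append(centroid)
            self.counts.append(len(close) + 1)
            self.candidates = [c for c in self.candidates
                               if np.linalg.norm(centroid - c) >= self.novelty_threshold]
            return len(self.centroids) - 1
        return None
```

In the project's setting, each window of features would be computed from wearable inertial sensors, and a newly promoted class would trigger an occasional, rate-limited query asking the user what they were doing.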


Our approach is novel and timely. Performance gains in activity recognition have been incremental, and the inability to deal with unknown activities is the most critical barrier to large-scale deployments in daily-life scenarios; this project addresses that fundamental limit. It is timely given the rising costs of healthcare and calls to rely on technology to address this issue. The project's outcomes on understanding daily human behaviour may lead to new smart assistants that help support independent living or support healthy behaviour change. The outcomes may also find use in psychology research and in the area of sustainable innovation, such as the assessment of consumer-product interaction and behaviour change initiatives. As such, the project has clear societal benefits.


This project is supported by our partners Unilever and Plessey Semiconductors, respectively interested in consumer behaviour research and new products in the healthcare domain.

Planned Impact

This project is motivated by the vision of pervasive "smart assistants" that provide situated contextual support in daily life, such as "memory reminders" for persons with dementia. At the heart of such technology lies the ability to recognise general human activities and context. This is the key challenge that this project addresses. In the longer term, the outcomes may become fundamental to address societal challenges in assisted living, health monitoring, and smart assistance.

*Economic and industrial impact

This project develops technologies spanning mobile and wearable computing, the Internet of Things and behavioural analytics. There are major UK technology companies in the field (e.g. ARM, Imagination Technologies) as well as US technology companies with UK branches (e.g. Qualcomm). Behavioural analytics is valuable for targeted advertising and consumer behaviour analysis. The UK has several major companies active in health, nutrition and care (e.g. GSK, Unilever, Body Shop) where behavioural analysis can be used to understand product usage and design new products.

Our project partners Unilever and Plessey Semiconductors illustrate the interest in the technology we develop, which they aim to use respectively for better consumer behaviour research and to create new microelectronics products.

We will reach out to industry through the UK Design Forum (a UK microelectronics forum where companies and academics meet) and the Knowledge Transfer Network of Innovate UK. The PI also has contacts with leading researchers active in companies (Intel, Google, Qualcomm, etc.).

*Societal and quality of life benefits

In the longer term, understanding human daily routines from sensors enables new interventions and services in the field of healthcare. It can support longer independent living; activity monitoring may be used to detect onset of depression; cues may be provided in relevant contextual situations to support a desired lifestyle change (e.g. to fight obesity, diabetes or cardiovascular disease).
The impacts can be major in terms of quality of life for the concerned users. They will also benefit public bodies involved in healthcare (UK NHS, charities) and can increase effectiveness of public services and support evidence-based policies.

*Wider public

There is growing interest among the general public in the "quantified self", i.e. keeping track of one's own routines. Within this project we will inform the general public about how these technologies operate, their future potential (e.g. with the outcomes of this project), and their possible downsides (e.g. regarding privacy). We will reach out to the wider public through talks at existing meetup events and university open days (see Pathways to Impact for details).

*Skilled workforce
The team (the postdoc funded by this project and students indirectly associated with it) will gain useful and transferable skills, especially analytical, presentation and organisational skills, that are applicable to several employment sectors:
- big data analytics
- machine learning
- mobile and wearable sensing and applications
- sensing technologies and internet of things
- client/server architectures
- behaviour analytics

*Impact timescales
The infrastructure we develop in the project will be available immediately at the end of the project and may have immediate impact for companies already active in wearable sensing and for our project partners.

Open-ended awareness is a novel paradigm that will take more time to be exploited. Based on previous experience we envision uptake in the scientific community within 3 years from the project start (through the workshops we will organise). Wider industry awareness and uptake may be 3-5 years after the project completion.

*Summary

We disseminate our work through high-impact publications and conferences and will reach out to selected industries directly to achieve impact both in the UK and internationally.
 
Description We aim to automatically discover human activities in sensor time series, such as those obtained from wearable sensors. We demonstrated for the first time that deep learning techniques are suitable for activity recognition from wearable sensor data, despite the comparatively smaller amount of available training data compared with other fields such as speech and image recognition. The suitability of our deep learning techniques is being further investigated with a view towards industrial applications in mobile computing and hardware-assisted dedicated learning circuits. The main challenges to overcome to achieve impact relate to the trade-off between the computational cost and the benefit of the method; further advances in recognition performance or reductions in computational cost will enable inclusion in novel context-aware consumer products.

In parallel to the deep learning techniques, we developed a new unsupervised clustering technique, which currently operates on traditional features computed on the time series. This is a very low-cost approach, computationally and memory-wise, and is well suited for implementation in a miniature wearable device. Finally, we developed a complex gesture encoding technique that transforms a sensor data time series into a compact string-like representation, which can be processed on miniature wearable devices (an illustrative sketch of this kind of encoding follows this record).

The outcomes of this project towards impact are as follows: 1) Hardware compression of neural weights is an important step towards including deep networks on embedded devices. We were approached by a semiconductor company with which we benchmarked our deep learning approach on an activity recognition task, with and without compression, with successful results. 2) During an invited talk at Google (7 June 2017) presenting the outcomes of this project, it emerged that upcoming hardware (e.g. the Google TPU) will be able to run deep networks with limited power use, which makes such approaches tractable on wearable devices. 3) During an invited talk at Bosch (6 June 2017), the ability to include gesture recognition algorithms at the hardware level attracted most attention, as the algorithm can sit alongside a motion sensing element on a recognition chip. This suggests that the low-complexity unsupervised clustering technique and the complex gesture encoding technique are the most promising approaches for highly constrained, silicon-level implementations.

The pathway to impact is therefore as follows: 1) On the one hand, future devices of "mobile" size will be powerful enough for deep learning methods, and hardware implementation of deep learning algorithms on embedded devices is the required next step. 2) On the other hand, future sensor technologies with AI techniques built in on silicon are better suited to the unsupervised clustering and complex gesture encoding methods; the required next step to achieve impact is an implementation closer to silicon (e.g. on a sensor chip or reconfigurable hardware).
Sector Digital/Communication/Information Technologies (including Software),Electronics
Impact Types Economic
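As a concrete illustration of the "compact string-like representation" of gestures mentioned in the description above, the sketch below shows a simple SAX-style encoding (piecewise aggregation followed by quantisation into a small alphabet). This is an illustrative assumption rather than the project's published encoder; the segment count, alphabet and clipping range are placeholders.

```python
import numpy as np

def encode_gesture(signal, n_segments=16, alphabet="abcdefgh"):
    """Encode a 1-D motion signal as a short string.

    The signal is z-normalised, averaged over equal-length segments
    (piecewise aggregate approximation), and each segment mean is mapped to a
    symbol by quantising it into len(alphabet) equal-width bins.
    """
    x = np.asarray(signal, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-8)             # z-normalise
    segments = np.array_split(x, n_segments)          # piecewise segments
    means = np.array([s.mean() for s in segments])    # aggregate each segment
    bins = np.linspace(-2.0, 2.0, len(alphabet) - 1)  # clipped, equal-width quantisation
    symbols = np.digitize(means, bins)                # indices 0 .. len(alphabet)-1
    return "".join(alphabet[i] for i in symbols)

# Two repetitions of a similar movement give similar strings,
# while a different movement gives a clearly different one.
t = np.linspace(0, 1, 200)
print(encode_gesture(np.sin(2 * np.pi * 2 * t)))
print(encode_gesture(np.sin(2 * np.pi * 2 * t) + 0.05 * np.random.randn(200)))
print(encode_gesture(np.abs(np.sin(2 * np.pi * 5 * t))))
```

Because the output is a short string, recurring gestures can be matched on a very constrained device with cheap string metrics (e.g. edit distance), which is what makes this family of encodings attractive for silicon-level implementations.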

 
Description EPSRC Associate College invitation
Geographic Reach National 
Policy Influence Type Participation in a guidance/advisory committee
 
Description Huawei Technologies (direct funding)
Amount £168,044 (GBP)
Organisation Huawei Technologies 
Sector Private
Country China
Start 01/2017 
End 12/2017
 
Description Industrial CASE fellowship
Amount £126,000 (GBP)
Organisation Unilever 
Department Unilever UK R&D Centre Port Sunlight
Sector Private
Country United Kingdom
Start 05/2018 
End 04/2022
 
Title Activity discovery reference implementation 
Description We released the reference implementation of the activity discovery algorithm developed within the EPSRC Lifelearn project. This allows other researchers to compare their methods against ours, thereby supporting reproducible research.
Type Of Material Improvements to research infrastructure 
Year Produced 2017 
Provided To Others? Yes  
Impact The code has been released too recently to be able to verify the impact. 
URL https://github.com/sussexwearlab/OpenEnded
 
Title Deep learning for activity recognition reference implementation 
Description We released the reference implementation of the deep learning framework used in the Sensors and ISWC articles to the scientific community, including pre-trained models. This has made our implementation of deep learning for activity recognition in wearable computing a reference implementation that allows other scientists to compare their systems against this baseline (an illustrative sketch of this family of architectures follows this record).
Type Of Material Improvements to research infrastructure 
Year Produced 2016 
Provided To Others? Yes  
Impact The following are notable impacts: - a large number of researchers contacted us with further queries about the code (67 individual queries); - two companies contacted us for follow-up collaborations, where work is in progress; - 36 citations to the Sensors article after one year.
URL https://github.com/sussexwearlab/DeepConvLSTM
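For readers unfamiliar with the architecture family behind this release, the following is a minimal PyTorch sketch of a convolutional-recurrent classifier for windows of wearable sensor data. It is illustrative only: the released DeepConvLSTM code uses its own framework, layer configuration and preprocessing, and the channel counts, kernel sizes and class count here are placeholder assumptions.

```python
import torch
import torch.nn as nn

class ConvLSTMClassifier(nn.Module):
    """Illustrative convolutional-recurrent classifier for wearable sensor windows.

    Input: a batch of sensor windows shaped (batch, channels, time), where the
    channels are e.g. accelerometer/gyroscope axes. Convolutions extract
    short-term motion features, an LSTM models their temporal evolution, and
    the last hidden state is mapped to activity classes.
    """

    def __init__(self, n_channels=6, n_classes=5, n_filters=64, hidden=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, n_filters, kernel_size=5), nn.ReLU(),
            nn.Conv1d(n_filters, n_filters, kernel_size=5), nn.ReLU(),
        )
        self.lstm = nn.LSTM(n_filters, hidden, num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, x):                  # x: (batch, channels, time)
        f = self.conv(x)                   # (batch, filters, time')
        f = f.transpose(1, 2)              # (batch, time', filters) for the LSTM
        _, (h, _) = self.lstm(f)
        return self.out(h[-1])             # logits over activity classes

# Example: a batch of 8 windows, 6 sensor channels, 128 samples each.
model = ConvLSTMClassifier()
logits = model(torch.randn(8, 6, 128))
print(logits.shape)  # torch.Size([8, 5])
```

Convolutions capture short-term motion features within a window, while the recurrent layers model how those features evolve over time; the released implementation follows the same general idea with its own specific configuration.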
 
Title Activity discovery reference implementation 
Description We released the reference implementation of the activity discovery algorithm developed within the EPSRC Lifelearn project. This allows other researchers to compare their methods against ours, thereby supporting reproducible research.
Type Of Material Computer model/algorithm 
Year Produced 2017 
Provided To Others? Yes  
Impact The code has been released too recently to be able to verify the impact. 
URL https://github.com/sussexwearlab/OpenEnded
 
Title Deep learning for activity recognition reference implementation 
Description We released the reference implementation of the deep learning framework used in the Sensors and ISWC articles to the scientific community, including pre-trained models. This has made our implementation of deep learning for activity recognition in wearable computing a reference implementation that allows other scientists to compare their systems against this baseline.
Type Of Material Data analysis technique 
Year Produced 2016 
Provided To Others? Yes  
Impact The following are notable impacts: - a large number of researchers contacted us with further queries about the code (67 individual queries); - two companies contacted us for follow-up collaborations, where work is in progress; - 36 citations to the Sensors article after one year.
URL https://github.com/sussexwearlab/DeepConvLSTM
 
Description Bosch 
Organisation Robert Bosch LLC
Country United States 
Sector Private 
PI Contribution Activity discovery algorithms.
Collaborator Contribution Business case for activity discovery algorithms, and hardware-implemented algorithms for activity recognition and discovery.
Impact Support for a Centre for Doctoral Training in Computational Behavioural Science.
Start Year 2017
 
Description Consumer behaviour analytics 
Organisation Unilever
Department Unilever UK R&D Centre Port Sunlight
Country United Kingdom 
Sector Private 
PI Contribution Behaviour analytics algorithms for consumer product usage.
Collaborator Contribution The partner provides datasets to evaluate our methods on. They provide a business case motivating further research in the area.
Impact Industrial CASE fellowship 2018-2022.
Start Year 2017
 
Description Huawei 
Organisation Huawei Technologies
Country China 
Sector Private 
PI Contribution We contribute an activity recognition challenge at Ubicomp/ISWC 2018, a highly visible venue.
Collaborator Contribution The partner will help with visibility and provide prizes for the winners of the activity recognition challenge.
Impact Competition for best activity recognition algorithms at Ubicomp/ISWC based on the SHL dataset.
Start Year 2017
 
Description ST 
Organisation ST Microelectronics
Country Switzerland 
Sector Private 
PI Contribution Our algorithms applied to ST datasets.
Collaborator Contribution Hardware implementation of deep learning.
Impact Ongoing Y3 student project on algorithm implementation on FPGA.
Start Year 2017
 
Title Activity discovery reference implementation 
Description We released the reference implementation of the activity discovery algorithm developed within the EPSRC Lifelearn project. This allows other researchers to compare their methods against ours, thereby supporting reproducible research.
Type Of Technology Software 
Year Produced 2017 
Open Source License? Yes  
Impact The code has been released too recently to be able to verify the impact. 
URL https://github.com/sussexwearlab/OpenEnded
 
Title Deep learning for activity recognition reference implementation 
Description We released the reference implementation of the deep learning framework used in the Sensors and ISWC articles to the scientific community, including pre-trained models. This has made our implementation of deep learning for activity recognition in wearable computing a reference implementation that allows other scientists to compare their systems against this baseline.
Type Of Technology Software 
Year Produced 2016 
Open Source License? Yes  
Impact The following are notable impacts: - a large number of researchers contacted us with further queries about the code (67 individual queries); - two companies contacted us for follow-up collaborations, where work is in progress; - 36 citations to the Sensors article after one year.
URL https://github.com/sussexwearlab/DeepConvLSTM
 
Description 4th workshop on human activity sensing corpus and applications: towards open-ended context awareness, Ubicomp 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact This workshop was designed to promote the notion of "unbounded"/"open-ended"/"lifelong learning" in human activity/context awareness in the wearable, mobile and ubicomp community. It was organised as a special call for topics within the Human Activity Sensing Corpus and Applications (HASCA) workshop at Ubicomp 2016. Papers were peer reviewed for inclusion in the adjunct proceedings of HASCA.
The workshop itself comprised a keynote talk from Dr Mario Fritz of the Max Planck Institute on "scalable learning and perception", followed by presentations of the accepted peer-reviewed contributions.
A key outcome of this workshop was the crystallisation of the notion of "unbounded"/"open-ended"/"lifelong learning" in human activity/context awareness in the wearable, mobile and ubicomp communities, as illustrated by several papers tackling this in various ways. This is a marked change from the 3rd edition of HASCA (2015), which did not have a special call for contributions on unbounded activity/context recognition and welcomed primarily dataset-related contributions.
Year(s) Of Engagement Activity 2016
URL http://hasca2016.hasc.jp/
 
Description 5th workshop on human activity sensing corpus and applications: towards open-ended context awareness, Ubicomp 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact This workshop was designed to promote the notion of "unbounded"/"open-ended"/"lifelong learning" in human activity/context awareness in the wearable, mobile and ubicomp community. It was organised as a special call for topics within the Human Activity Sensing Corpus and Applications (HASCA) workshop at Ubicomp 2017, following the workshop I organised the previous year at Ubicomp 2016. Papers were peer reviewed for inclusion in the adjunct proceedings of HASCA. A key outcome of this workshop was the crystallisation of the notion of "unbounded"/"open-ended"/"lifelong learning" in human activity/context awareness in the wearable, mobile and ubicomp communities, as illustrated by several papers tackling this in various ways.
Year(s) Of Engagement Activity 2017
URL http://hasca2017.hasc.jp/
 
Description British Science Festival 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact I organised a three-day presence at the British Science Festival 2017 in Brighton. This included: 1) an evening event in the city on Tuesday 5 September 2017 with demonstrations of wearable activity recognition, including our activity discovery system funded by the EPSRC Lifelearn project, as well as other wearable technologies (audience: 200); 2) a talk on wearable technologies on Wednesday 6 September 2017 (audience: 50); 3) another public event on Saturday 9 September 2017 with demonstrations of motion sensing (audience: 200).
Year(s) Of Engagement Activity 2017
URL https://www.britishsciencefestival.org/event/science-in-south-lanes/
 
Description Demonstration of gesture encoding at EWSN 2018 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact We presented a live demonstration of a complex gesture encoding system which can be used for activity discovery in the EPSRC Lifelearn project. This took place at EWSN 2018 (International Conference on Embedded Wireless Systems and Networks) on 14 February 2018.
Year(s) Of Engagement Activity 2018
URL https://ewsn2018.networks.imdea.org/posters-demos.html
 
Description Google, Mountain View, USA 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact I gave a talk on June 6 at Google, Mountain View, USA, on "Towards unbounded activity & context awareness in wearables and ubicomp", presenting the results of the project. The audience primarily included members of Google ATAP (Advanced Technology and Projects). The main interest arose around deep learning techniques, hardware implementation of such techniques, and the ability to encode prior context information in deep networks.
Year(s) Of Engagement Activity 2017
 
Description Invited talk, Acrossing Initial Training Network, De Montfort University, UK 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact I gave a talk on June 21 at De Montfort University on "Towards unbounded activity & context awareness in wearables and ubicomp" at the Acrossing Initial Training Network summer school. The audience primarily comprised PhD students of the ITN and their supervisors.
Year(s) Of Engagement Activity 2017
 
Description Invited talk, Bosch, Palo Alto, USA 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact I gave a talk on June 7 at Bosch, Palo Alto, USA on "Towards unbounded activity & context awareness in wearables and ubicomp" presenting the results of the project. The audience included primarily members from the sensing team of Bosch. The main interest arose around template-based approaches to activity recognition and their implementation in hardware.
Year(s) Of Engagement Activity 2017
 
Description Invited talk, SPHERE project Seminar Series, Bristol University, UK 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact I gave a talk on June 22 at the University of Bristol on "Towards unbounded activity & context awareness in wearables and ubicomp" as part of the SPHERE seminar series. The audience primarily comprised SPHERE members, PhD students and undergraduates.
Year(s) Of Engagement Activity 2017
 
Description Media coverage 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact As part of the British Science Festival, our work in the EPSRC Lifelearn project was heavily publicised, as a related publication appeared at the same time as the festival. We received press coverage of this EPSRC project in the Telegraph (http://www.telegraph.co.uk/news/2017/09/04/smartwatch-able-learn-every-move/), the Sun (https://www.thesun.co.uk/fabulous/4384708/fitness-tracker-accuracy-calorie-count-washing-teeth-brushing), EurekAlert (https://eurekalert.org/pub_releases/2017-09/uos-aus090117.php), iNews (https://inews.co.uk/essentials/news/technology/smart-watches-will-soon-able-detect-every-move/), and others.
Year(s) Of Engagement Activity 2017
URL https://eurekalert.org/pub_releases/2017-09/uos-aus090117.php
 
Description Radio interview BBC Sussex 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact As a follow-up to the press visibility arising from the British Science Festival and the work on activity discovery conducted in the EPSRC-funded Lifelearn project, the postdoctoral researcher gave an interview about the research project on BBC Sussex on 4 September at 7:25.
Year(s) Of Engagement Activity 2017