IRC Next Steps Plus : OPERA - Opportunistic Passive Radar for Non-Cooperative Contextual Sensing

Lead Research Organisation: University of Bristol
Department Name: Electrical and Electronic Engineering

Abstract

Physical activity and behaviour form a very large component in an array of long-term chronic health conditions such as diabetes, dementia, depression, COPD, arthritis and asthma, and the UK currently spends 70% of its entire health and social care budget on these types of conditions. All aspects of self-care, new therapies and condition management require novel non-intrusive technologies able to capture salient data on causes and symptoms over long periods of time.
The OPERA Project - Opportunistic Passive Radar for Non-Cooperative Contextual Sensing - will investigate a new unobtrusive sensing technology for CONTEXTUAL SENSING - defined as concurrent physical activity recognition and indoor localisation - to facilitate new applications in e-Healthcare and Ambient Assisted Living (AAL). The OPERA platform will be integrated into the "SPHERE long term behavioural sensing machine" to gather information alongside various other sensors around the home so as to monitor and track the signature movements of people.
The OPERA system will be built around passive sensing technology: a receiver-only radar network that detects the reflections of ambient radio-frequency signals from people - in this case, principally, the WiFi signals in residential environments. These opportunistic signals are transmitted from common household WiFi access points, but also other wireless enabled devices which are becoming part of the Internet of Things (IoT) home ecosystem.
The project will make use of cutting-edge hardware synchronisation techniques and recent advances in direction finding to enable accurate device-free (non-cooperative) localisation of people. It will also employ the latest ideas in micro-Doppler radar signal processing, bio-mechanical modelling and machine/deep learning for automatic recognition of everyday activities, e.g. tidying and washing-up, as well as events which require urgent attention, such as falls. OPERA is expected to overcome some of the key barriers associated with state-of-the-art contextual sensing technologies - most notably non-compliance with wearable devices, especially amongst the elderly, and the invasion of privacy brought about by the intrusive nature of video-based technologies.

Planned Impact

The UK faces substantial challenges dealing with chronic health conditions such as diabetes, dementia, depression, COPD, arthritis and asthma. Whether seeking to understand the mechanisms, trying to avoid onset, creating new therapies, or supporting self-care of these conditions, we need non-intrusive technologies able to capture salient data on causes and to understand symptoms. At the most fundamental level, this project aims to understand the potential of opportunistic sensing technology to address health problems. The proposed research programme is wide-ranging, covering advanced radar and wireless synchronisation and detection methods, AI / deep learning and statistical signal processing, modelling and activity simulation. There is also a significant systems engineering and embedded processing aspect involved in the device synchronisation work and in developing the in-home activity recognition demonstrator. Outside the health domain, the technology is also expected to find a range of other applications, from everyday pattern-of-life monitoring to enable intelligent and predictive smart homes and cities, right through to applications in retail shopper tracking, safety and security.

Publications

Helfenstein J (2022) An approach for comparing agricultural development to societal visions in Agronomy for Sustainable Development

Vishwakarma S (2022) SimHumalator: An Open-Source End-to-End Radar Simulator for Human Activity Recognition in IEEE Aerospace and Electronic Systems Magazine

Liu A (2022) A Survey on Fundamental Limits of Integrated Sensing and Communication in IEEE Communications Surveys & Tutorials

Colone F (2022) On the Use of Reciprocal Filter Against WiFi Packets for Passive Radar in IEEE Transactions on Aerospace and Electronic Systems

Bocus M (2023) Streamlining Multimodal Data Fusion in Wireless Communication and Sensor Networks in IEEE Transactions on Cognitive Communications and Networking

Li W (2021) Passive WiFi Radar for Human Sensing Using a Stand-Alone Access Point in IEEE Transactions on Geoscience and Remote Sensing

Tang C (2022) FMNet: Latent Feature-Wise Mapping Network for Cleaning Up Noisy Micro-Doppler Spectrogram in IEEE Transactions on Geoscience and Remote Sensing

 
Description The OPERA project has investigated and built passive sensing technology: a receiver-only radar network that detects the reflections of ambient radio-frequency signals from people - principally the WiFi and UWB signals in residential environments. These opportunistic signals are transmitted from common household WiFi / IoT access points, but also from other wireless-enabled devices which are becoming part of the smart home ecosystem.
The project has focused on signal processing development: micro-Doppler radar signal processing, bio-mechanical modelling and machine/deep learning for automatic recognition of everyday activities, and sensor fusion. OPERA has produced several advancements in contextual sensing (joint activity recognition and localisation). The project has shown that passive sensing is a viable alternative to (or can complement) systems based on bespoke platforms, such as those based on active radars, video, and wearable sensors.
• The OPERA project has shown that advanced AI algorithms based on self-supervised learning paradigms are essential for passive sensing. Self-supervision is a key ingredient for reliable performance of classification algorithms that generalise well to new environments and deployments.
• The OPERA project has developed and quantified sensor and data fusion strategies. The main application area, as set out at the beginning of the project, was eHealth and Ambient Assisted Living (AAL). These new paradigms are contingent on gathering information from various sensors around the home to monitor and track the movement signatures of people. The aim is to build a long-term behavioural sensing machine which also affords privacy. Such platforms rely on an array of environmental and wearable sensors, with sensor fusion being one of the key challenges. However, the findings extend to automated driving, which is arguably the most challenging industrial domain. Automated vehicles use a plethora of sensors - Lidar, mmWave radar, video and ultrasonic - and attempt to perform some form of sensor fusion for environmental perception and precise localisation. A high-quality final fusion estimate is a prerequisite for safe driving.
• Simulation environments: Generating and labelling large volumes of high-quality, diverse radar datasets is an onerous task. Furthermore, novel self-supervised deep learning-based techniques require enormous amounts of data. Therefore, the project set out to develop an open-source motion capture data-driven simulation tool, SimHumalator, that can generate large volumes of human micro-Doppler radar data in passive WiFi scenarios. The simulated signatures have been validated against experimental data gathered using an in-house-built hardware prototype. The project has provided case studies on the feasibility of using simulated micro-Doppler spectrograms for data augmentation and pretraining tasks.
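The micro-Doppler processing described above - short-time spectral analysis of the radar return - can be sketched with a toy signal. This is a minimal illustration using a synthetic oscillating-phase return; all parameters (sampling rate, Doppler frequencies, window sizes) are assumptions for illustration, not OPERA code or data.

```python
import numpy as np
from scipy.signal import stft

# Synthetic slow-time radar return: a scatterer whose phase carries a bulk
# Doppler shift plus a sinusoidal modulation, loosely mimicking limb
# micro-Doppler (illustrative only; parameters are assumed).
fs = 500.0                       # slow-time sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)  # 4 s observation window
f_body = 40.0                    # bulk Doppler component, Hz
phase = 2 * np.pi * f_body * t + 3.0 * np.sin(2 * np.pi * 1.5 * t)
x = np.exp(1j * phase)           # complex baseband return

# Micro-Doppler spectrogram: STFT magnitude over sliding windows.
f, tau, Z = stft(x, fs=fs, nperseg=128, noverlap=96, return_onesided=False)
spectrogram_db = 20 * np.log10(np.abs(Z) + 1e-12)
```

The resulting time-frequency map (`spectrogram_db`) is the kind of input that the classification and data-augmentation work above operates on.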
Exploitation Route Opportunistic use of WiFi signals is already being recognised by industry as a means to achieve value-added services. The first domestic mesh WiFi routers with "presence" security functions are about to enter the market. Such systems will offer relatively simple binary decisions on possible physical activity in a house (presence detected / not detected). Such a system can be viewed as offering the same functionality as a bespoke security sensor network based on PIR sensing (but without the additional infrastructure/installation costs, calibration, maintenance etc.). OPERA will show how this concept can be lifted to another level: from binary presence sensing to physical activity recognition.
Sectors Digital/Communication/Information Technologies (including Software),Healthcare,Transport

 
Description The future wireless landscape is expected to have communications and radio-based sensing integrated as a key feature. The confluence, referred to as Joint Communications and Sensing (JCAS), is expected to optimise spectrum usage, improve the quality of service (QoS) provided by mobile network operators, and facilitate 'Sensing as a Service' - a new industry which is anticipated to drive the development of smart cities, autonomous vehicles, e-healthcare and intelligent environments. Wireless communication standards are ever evolving. A clear recent trend is the drive to disaggregate software from the hardware platforms. Such systems can perform most network functions in containerised apps running on commercial-off-the-shelf (COTS) hardware. This enables market democratisation and lowers the cost of future mobile data networks. The trend has recently been extended to the Radio Access Network (RAN), with the O-RAN Alliance spearheading the international standardisation efforts. The OPERA project has shown that passive sensing is a viable option in future 6G O-RAN and WiFi systems. Passive sensing applications can be added, as and when required, even to a live network. Clear specification of APIs within O-RAN is poised to disrupt the supply-chain market within the telco industry. Concurrently, the IEEE WiFi standardisation committee has acknowledged the last several years of research efforts into passive WiFi radar systems, including outcomes of the OPERA project. As a result, a new Task Group (TGbf) has been established with the aim of setting out a road map to a new WiFi standard, 802.11bf, which will offer sensing functionality. Sensing as a Service: The OPERA project has contributed new ideas and techniques to support the development of key JCAS features for 6G wireless systems. In essence, the research has transcended the traditional radar-centric and communication-centric approaches, i.e.
enabling sensing systems to communicate or communication systems to sense, and has looked at designing optimised systems and new JCAS strategies from the ground up that are not constrained by existing systems and their underlying architectures and signals. The project has developed an open-source motion capture data-driven simulation tool, "SimHumalator", that can generate large volumes of human micro-Doppler radar data at multiple IEEE WiFi standards (IEEE 802.11g, ax, and ad). SimHumalator can simulate human micro-Doppler radar returns as a function of a diverse set of target parameters, radar parameters, and radar signal processing parameters. The simulation tool is being used extensively by the scientific community.
First Year Of Impact 2023
Sector Digital/Communication/Information Technologies (including Software),Electronics
Impact Types Policy & public services

 
Title A Comprehensive Ultra-Wideband (UWB) Dataset for Non-Cooperative Contextual Sensing 
Description This Ultra-Wideband (UWB) dataset is intended for passive localization and Human Activity Recognition (HAR). The experiment was performed in an actual residential environment across several furnished rooms. The participant performed three main activities, namely, walking, sitting, and standing for an approximate experiment duration of 1.6 hours, including steady state (i.e., target not moving). Two UWB systems were used during the experiment. The first system (Qorvo's MDEK1001 kit) was used to obtain the ground truth position of the target while he/she carried a tag and moved within the monitoring area. Ten fixed anchor nodes were used for this purpose. The ground truth 2D XY coordinates reported by the tag were logged at an update rate of 10 Hz using Qorvo's DRTLS android app. The passive UWB system (Qorvo's EVK1000 kit) was solely used to capture Channel Impulse Response (CIR) data from the signals reflected off the target while he/she performed the different activities. It consisted of seven fixed modules installed in a multi-static configuration and operating as transceivers. The modules were programmed with a custom firmware so as to record CIR data on all modules. Each node acted as a transceiver, resulting in 21 bidirectional communication links. The average sampling rate was around 24 Hz between each transmit-receive pair (bidirectional) and therefore considering all 21 bidirectional links, the sampling rate of the passive UWB system amounted to approximately 504 Hz. Video ground truth (with milliseconds timestamps) was used to manually label the activities' data (sitting, standing, walking and no activity). 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact Nowadays, an increasing amount of attention is being devoted to passive and non-intrusive sensing methods. Prime examples are healthcare applications, where on-body sensors are not always an option, and applications which require the detection and tracking of unauthorized (non-cooperative) targets within a given environment. Therefore, we present a dataset consisting of measurements obtained from Radio-Frequency (RF) devices. Essentially, the dataset consists of Ultra-Wideband (UWB) data in the form of Channel Impulse Response (CIR), acquired via Commercial Off-the-Shelf (COTS) UWB equipment. Approximately 1.6 hours of annotated measurements are provided, collected in a residential environment. This dataset can be used to passively track a target's location in an indoor environment. Additionally, it can also be used to advance UWB-based Human Activity Recognition (HAR), since three basic human activities were recorded, namely, sitting, standing and walking. We anticipate that such datasets may be utilized to develop novel algorithms and methodologies for healthcare, smart homes and security applications. 
URL https://figshare.com/collections/A_Comprehensive_Ultra-Wideband_UWB_Dataset_for_Non-Cooperative_Cont...
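As a small sanity check on the topology described above, the 21 bidirectional links and the roughly 504 Hz aggregate sampling rate follow directly from the seven-node multi-static configuration; the node names below are placeholders:

```python
from itertools import combinations

nodes = [f"node{i}" for i in range(1, 8)]   # 7 fixed UWB transceivers
links = list(combinations(nodes, 2))        # each unordered pair = one bidirectional link
per_link_rate_hz = 24                       # average CIR rate per pair (from the description)
aggregate_rate_hz = per_link_rate_hz * len(links)
print(len(links), aggregate_rate_hz)        # 21 links, 504 Hz aggregate
```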
 
Title OPERAnet, a multimodal activity recognition dataset acquired from radio frequency and vision-based sensors 
Description This collection presents the dataset accompanying the paper "OPERAnet: A Multimodal Activity Recognition Dataset Acquired from Radio Frequency and Vision-based Sensors". Approximately 8 hours of annotated measurements are provided, which are collected across two different rooms from 6 participants performing 6 activities, namely, sitting down on a chair, standing from sit, lying down on the ground, standing from the floor, walking and body rotating. The dataset has been acquired from four synchronized modalities for the purpose of passive Human Activity Recognition (HAR) as well as localization and crowd counting. The four modalities are based on Radio Frequency (RF) devices and vision/infrared sensors: (i) A WiFi Channel State Information (CSI) system implemented using Intel 5300 WiFi Network Interface Cards (NIC) with a 3x3 MIMO antenna configuration at the transmitter and receiver. The system operated in the 5GHz WiFi band. The CSI data was collected on 2 receivers (NUC1 and NUC2) placed in two different positions with respect to the transmitter (NUC3). (ii) A Passive WiFi Radar (PWR) system implemented using the USRP-2945 platform with four synchronized channels, including 1 reference channel and 3 surveillance channels. Each receiving channel was equipped with a 6-dB directional antenna and was placed at different positions. The PWR system shared the same WiFi transmitter (packet injector) as the CSI system. The PWR system collected data in the form of Doppler spectrograms. (iii) Ultra-Wideband (UWB) systems implemented using commercial off-the-shelf hardware, namely, Decawave MDEK1001 and EVK1000 UWB kits. One UWB system (active) was used to record the ground truth XY coordinates of the target (carrying at most 2 tags) within the monitoring area. 11 fixed UWB anchors were installed across the rooms for this purpose. 
Two other UWB systems (passive), UWB system 1 and UWB system 2, were deployed within the monitoring area, consisting of 4 and 5 nodes, respectively. For each of these 2 systems, all the nodes were fixed in a multi-static configuration and they exchanged Channel Impulse Response (CIR) data among themselves. (iv) A vision/Infrared system implemented using two Microsoft Kinect v2 sensors. The two Kinects were used to track three-dimensional time-varying skeletal information of the human body, including 25 joints such as head center location, knee joints, elbow joints, and shoulder joints from two different directions. This three-dimensional joint information can further be used for simulating the corresponding radar scatterings, mimicking a typical PWR sensing system. The collected Kinect data consist of three-dimensional skeleton information captured from each of the two Kinects. This collection also contains Matlab and Python scripts to load and analyze the data from each modality. 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact 1. Multimodal data collection intended for human activity recognition and passive localization, i.e., the targets are oblivious to these processes (non-collaborative) and they only reflect or scatter the signals from the transmitter to receivers. Most datasets consider only one particular modality, such as UWB, WiFi CSI, PWR or Kinect, independently. In this work, we consider multiple synchronized modalities. Experiments span two environments, which can be used for investigating sensing in complex or untrained environments. 2. Approximately 8 hours of measurements are annotated with high-resolution location and activity labels, capturing the participant's movement and natural behaviour within the monitoring area, as would be the case in a real-world environment. The dataset is comprehensive insofar as it contains over 1 million annotated data points. 3. The presented data can be exploited to advance human activity recognition technology in different ways, for example, using various pattern recognition and deep learning algorithms to accurately recognize human activities. For this purpose, users can apply different signal processing pipelines to analyze the recorded WiFi CSI, PWR, UWB and Kinect data and extract salient features that can be used to recognize human activities and/or concurrently track the target's position within an indoor environment. 4. This is the first dataset collected with the explicit aim of accelerating the development of self-supervised learning techniques. Such techniques are extremely data hungry, requiring orders of magnitude larger datasets compared to more traditional supervised learning. 5. This open-source dataset is intended for both HAR and non-cooperative localization, which are areas of growing interest to research communities including but not limited to radar, wireless sensing, IoT and computer vision. 
To ensure that the dataset aligns to the FAIR (Findable, Accessible, Interoperable, Reusable) Data principles of Open Science, we have (i) made it publicly available for download via the figshare portal, (ii) provided an in-depth and clear description of the dataset for each modality, (iii) formatted our dataset using standard filetypes and encoding, and (iv) provided example scripts/codes that will allow the user to load and analyze the data from each modality. 
URL https://figshare.com/s/bedf98519882f04bda7a
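Fusing the four modalities above requires aligning streams sampled at different rates; one common, simple approach is nearest-timestamp matching. The sketch below uses hypothetical timestamp arrays (an illustrative 10 Hz tag stream vs a ~24 Hz CIR stream), not the actual OPERAnet file format:

```python
import numpy as np

def nearest_timestamp_align(t_ref, t_other, tol=0.05):
    """For each reference timestamp, return the index of the closest sample
    in the other modality, or -1 if nothing lies within `tol` seconds."""
    idx = np.searchsorted(t_other, t_ref)
    idx = np.clip(idx, 1, len(t_other) - 1)
    left, right = t_other[idx - 1], t_other[idx]
    choose_left = (t_ref - left) < (right - t_ref)
    best = np.where(choose_left, idx - 1, idx)
    ok = np.abs(t_other[best] - t_ref) <= tol
    return np.where(ok, best, -1)

# Illustrative rates only: 10 Hz ground-truth tag vs ~24 Hz UWB CIR.
t_tag = np.arange(0.0, 2.0, 0.1)
t_cir = np.arange(0.0, 2.0, 1.0 / 24)
matches = nearest_timestamp_align(t_tag, t_cir)
```

Each ground-truth sample is then paired with the CIR sample closest in time, which is the basic prerequisite for any of the fusion experiments the dataset enables.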
 
Title SimHumalator 
Description SimHumalator is an open-source end-to-end simulation tool for generating human micro-Doppler radar data. The radar scatter is simulated by integrating the animation data of humans with primitive shapes based on electromagnetic modelling. SimHumalator uses human animation data captured with a marker-based motion capture system called PhaseSpace. The PhaseSpace system captures three-dimensional time-varying positions of several markers placed on a person's body (head, torso, arms, legs etc.) using eight infrared cameras distributed in a VR lab. A key advantage of this marker-based technology is that it can capture accurate, realistic and complex human motions. 
Type Of Material Computer model/algorithm 
Year Produced 2021 
Provided To Others? Yes  
Impact As of March 2021, SimHumalator has been downloaded over 50 times by researchers globally, which has led to two research publications.
 
Title WiFi CSI dataset for device-to-device localization 
Description This dataset contains WiFi Channel State Information (CSI) data intended for device-to-device localisation. The experimental setup consists of a single transmitter and a single receiver, each equipped with an Intel 5300 network interface card. The raw complex CSI data was collected at the receiver over 3 receiving antennas and 30 subcarriers and the packet rate was fixed at 2500 Hz. The CSI data was collected over the 5 GHz WiFi band (Channel 64 with 40 MHz channel bandwidth). The transmitter was placed at two different angles with respect to the receiver: 30 degrees and -60 degrees and at 5 different distances: 1m, 2m, 3m, 4m, 5m. 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact This dataset is intended to evaluate device-to-device localisation using Channel State Information (CSI) data extracted from a commercial off-the-shelf WiFi network interface card. 
URL https://doi.org/10.6084/m9.figshare.20943706.v1
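Localisation pipelines built on such CSI data typically start from per-packet amplitude and phase. The sketch below assumes a hypothetical complex array shaped (packets x 3 antennas x 30 subcarriers), matching the configuration above but filled with random data rather than the Intel 5300 file format:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pkts, n_rx, n_sub = 100, 3, 30          # 3 antennas x 30 subcarriers, as above
csi = rng.standard_normal((n_pkts, n_rx, n_sub)) \
    + 1j * rng.standard_normal((n_pkts, n_rx, n_sub))

amplitude = np.abs(csi)
phase = np.unwrap(np.angle(csi), axis=-1)  # unwrap across subcarriers

# Removing the linear phase trend across subcarriers is a common CSI
# sanitisation step (it absorbs timing offsets between transmitter and receiver).
k = np.arange(n_sub)
slope = (phase[..., -1] - phase[..., 0]) / (n_sub - 1)
phase_clean = phase - slope[..., None] * k
```

The cleaned phase and the amplitude are then the usual feature inputs for angle or distance estimation.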
 
Title Wireless sensing dataset for human activity recognition (HAR) 
Description This dataset consists of WiFi Channel State Information (CSI) and Ultra-Wideband (UWB) Channel Impulse Response (CIR) data collected for human activity recognition (wireless sensing of human activities). 
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
Impact This dataset consists of WiFi Channel State Information (CSI) and Ultra-Wideband (UWB) Channel Impulse Response (CIR) data collected for human activity recognition (wireless sensing of human activities). 
URL https://figshare.com/articles/dataset/Wireless_sensing_dataset_for_human_activity_recognition_HAR_/2...
 
Title Activity Labelling App 
Description This Activity Labelling App was designed using the Matlab R2022a App Designer (MathWorks). Usage Notes: It was designed for providing ground truth labels for Human Activity Recognition (HAR) measurements. This app allows a user to insert the desired labels for the human activities. Then, during the HAR experiment, an observer looks at the participants performing a given activity and clicks on the appropriate button in the app to record the start and stop times of the activity. Therefore, the measurements are labelled on the fly during the experiment itself. All labels are stored as a .txt file along with their timestamps (with millisecond precision). This privacy-friendly app can be used as an alternative to video ground truth (where the experiments need to be video-recorded and then manually labelled afterwards). Directory file details: (1) "Activity_Labelling_App.mlappinstall" (MATLAB App Installer) (2) "ActivityLabellingAppInstaller_mcr.exe" (3) "video_instructions.mp4" - video instructions to install the Activity Labelling App. (4) "ALA_Snapshot.jpg" - Snapshot of the Activity Labelling App window. ****************************** **Installation instructions:** ****************************** Operating software required: WINDOWS (1) Users who have MATLAB software already installed on their device (WINDOWS machine) can install the "Activity_Labelling_App.mlappinstall" directly by double-clicking on it and then choosing "Install". The newly installed app will be available under the "APPS" tab in MATLAB. (2) Users who do not have MATLAB software installed on their device can run the standalone "ActivityLabellingAppInstaller_mcr.exe" installer and it will install all dependencies to run the app. ********************************************** **Activity Labelling App usage instructions:** ********************************************** (1) Launch the ActivityLabellingApp software. 
(2) Choose the number of activities that require labelling through the drop-down list in the field named "No. of activities" (a maximum of 9 activities can be specified). (3) Depending on the number of activities chosen in the previous step, the user will now be able to specify the desired activity labels in the fields denoted by "Label 1", "Label 2", etc. (4) After filling in the desired activity labels (without spacing between words; underscores can be used), press the button marked "Assign labels to buttons". (5) The user will now see the chosen labels assigned to the coloured buttons in the middle of the window, depending on how many labels were specified. (6) Choose the desired folder where the log file needs to be saved by clicking on the "Choose save folder" button. (7) Specify a filename without any extension (the file will be saved as a .txt file at the chosen save path location). (8) Click on the "Activity Labelling Buttons" and the table on the right of the window will be populated with the "Date", "Timestamp" and "Label" information. (9) The log file will automatically be saved in the chosen path and will be appended on the fly; if a new filename is specified, the information will be saved in the new .txt log file instead. NOTES: 1. Please note that when an Activity Button is pressed for the first time, nothing will appear in the table at first; however, the data will still be saved in the log file at the appropriate timestamp the button was pressed. When an Activity Button is pressed a second time, the previous data will appear normally, as well as any other subsequent data. 2. When the "RESET" button is pressed and the user chooses "Yes", the app will be restarted with all fields cleared. 3. Clicking on the "EXIT" button will prompt the user to click "Yes" to close the app or "No" to continue using the app. 4. 
The user is required to specify a folder and filename for saving the log file before clicking on the Activity Buttons, else Matlab message dialog boxes will pop up on the screen, instructing the user what needs to be done. 5. Depending on the number of activities chosen, the user is required to fill in all the label fields and then click "Assign labels to buttons", else a Matlab message dialog box will pop up on the screen, instructing the user what needs to be done. 
Type Of Technology Software 
Year Produced 2022 
Open Source License? Yes  
Impact This app was designed for providing ground truth labels for Human Activity Recognition (HAR) measurements. It allows a user to insert the desired labels for the human activities. Then, during the HAR experiment, an observer looks at the participants performing a given activity and clicks on the appropriate button in the app to record the start and stop times of the activity. Therefore, the measurements are labelled on the fly during the experiment itself. All labels are stored as a .txt file along with their timestamps (with millisecond precision). This privacy-friendly app can be used as an alternative to video ground truth (where the experiments need to be video-recorded and then manually labelled afterwards). 
URL https://figshare.com/articles/software/Activity_Labelling_App/20444076/1
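The on-the-fly labelling workflow above (timestamped labels appended to a .txt log whenever a button is pressed) can be mimicked in a few lines; the file name and the tab-separated line format here are assumptions for illustration, not the app's exact output format:

```python
from datetime import datetime
from pathlib import Path

LOG = Path("har_labels.txt")  # hypothetical log file, analogous to the app's .txt output

def record_label(label: str, log_path: Path = LOG) -> str:
    """Append one timestamped label line (millisecond precision), as the
    app does when an activity button is pressed."""
    now = datetime.now()
    line = f"{now:%Y-%m-%d}\t{now:%H:%M:%S}.{now.microsecond // 1000:03d}\t{label}"
    with log_path.open("a", encoding="utf-8") as f:
        f.write(line + "\n")
    return line

record_label("walking")   # press once: activity starts
record_label("walking")   # press again: activity stops
```

Pressing the same button twice thus brackets the activity with start and stop timestamps, matching the labelling convention described above.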
 
Description ACM HotMobile 2020 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Schools
Results and Impact Presented ongoing work related to the MOA project.
Year(s) Of Engagement Activity 2020
URL http://www.hotmobile.org/2020/
 
Description Invitation to organise Special Session at the IEEE 2021 Radar Conference - Flagship Conference on Radar systems 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Invitation to organise Special Session at the IEEE 2021 Radar Conference - Flagship Conference on Radar systems
Year(s) Of Engagement Activity 2021
URL https://ewh.ieee.org/conf/radar/2021/special-sessions.html
 
Description MLSys 2020 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Schools
Results and Impact We presented our latest results for MOA to the MLSys audience.
Year(s) Of Engagement Activity 2020
URL https://mlsys.org/
 
Description Presentation of work at post-accelerator for start-up tech businesses 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Industry/Business
Results and Impact Presentation at IDEALondon's Machine Learning academy. IDEALondon is a workspace and innovation centre in the heart of Shoreditch. It gives startups the space and support they need to grow - whether that is access to funding, access to talent, or expert advice from its partners. The presentation led to follow-up networking events which resulted in discussions with Angel Investors and Venture Capitalists who advised on strategies for commercialising the WiFi passive radar technology being developed as part of the research.
Year(s) Of Engagement Activity 2019