The Internet of Silicon Retinas (IOSIRE): Machine-to-machine communications for neuromorphic vision sensing data
Lead Research Organisation:
Kingston University
Department Name: Faculty of Science, Engineering & Computing
Abstract
Abstracts are not currently available in GtR for all funded research. This is normally because the abstract was not required at the time of proposal submission, but may be because it included sensitive information such as personal details.
People
Maria Martini (Principal Investigator)
Publications
Khan N (2019) Bandwidth Modeling of Silicon Retinas for Next Generation Visual Sensor Networks, in Sensors
Nasralla M (2018) Content-aware downlink scheduling for LTE wireless systems: A survey and performance comparison of key approaches, in Computer Communications
Pezzulli S (2021) Estimation of Quality Scores From Subjective Tests-Beyond Subjects' MOS, in IEEE Transactions on Multimedia
Khan N (2020) Lossless Compression of Data From Static and Mobile Dynamic Vision Sensors-Performance and Trade-Offs, in IEEE Access
Martini M (2022) Lossless Compression of Neuromorphic Vision Sensor Data Based on Point Cloud Representation, in IEEE Access
Adhuran J (2024) Lossless Encoding of Time-Aggregated Neuromorphic Vision Sensor Data Based on Point-Cloud Compression, in Sensors
Rashed S (2020) Power Allocation for D2D Communications Using Max-Min Message-Passing Algorithm, in IEEE Transactions on Vehicular Technology
Khan N (2021) Time-Aggregation-Based Lossless Video Encoding for Neuromorphic Vision Sensor Data, in IEEE Internet of Things Journal
Description | A model of the data rate output by neuromorphic vision sensors has been developed: based on features of the captured scene and on the motion of the camera/sensor, the model estimates the output data rate. This is particularly useful for designing the methods and technologies used to transmit such data. A statistical analysis of the traffic output by neuromorphic vision sensors has also been performed; this information can further inform the transmission strategy for such data. A comparison of compression methodologies for neuromorphic vision sensor data has been carried out, highlighting the level of compression achievable without loss of information ("lossless compression"). A comparison of scheduling strategies for visual data has been performed, highlighting which are most suitable for scenarios with stringent delay requirements. A new model has been developed for assessing perceived quality based on statistical analysis of subjective tests. Two novel methods for compressing DVS (dynamic vision sensor) data have been proposed, achieving higher compression ratios than existing methods with complexity comparable to or better than state-of-the-art methods. One of them is completely lossless (the data reconstructed from the compressed version are identical to the original), while the other involves only a small reduction in temporal resolution. |
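The time-aggregation idea behind the second family of methods can be illustrated with a minimal sketch. This is not the project's actual encoder: the event stream is synthetic, the sensor resolution, window length, and the use of `zlib` as the entropy coder are all illustrative assumptions. Events are binned into fixed time windows as per-polarity count frames (which is where the small loss of temporal resolution occurs); the frames themselves are then compressed and reconstructed bit-exactly.

```python
import zlib
import numpy as np

# Hypothetical DVS event stream: columns are (timestamp_us, x, y, polarity).
rng = np.random.default_rng(0)
events = np.stack([
    np.sort(rng.integers(0, 1_000_000, 500)),  # timestamps in microseconds
    rng.integers(0, 128, 500),                 # x coordinate
    rng.integers(0, 128, 500),                 # y coordinate
    rng.integers(0, 2, 500),                   # polarity (0 = OFF, 1 = ON)
], axis=1).astype(np.int32)

WINDOW_US = 10_000  # aggregation window length (illustrative value)

def aggregate(events, window_us, width=128, height=128):
    """Bin events into per-window, per-polarity count frames.

    Exact timestamps within a window are discarded, which is the
    'small reduction in temporal resolution' of this representation.
    """
    n_windows = int(events[:, 0].max() // window_us) + 1
    frames = np.zeros((n_windows, 2, height, width), dtype=np.uint16)
    for t, x, y, p in events:
        frames[t // window_us, p, y, x] += 1
    return frames

frames = aggregate(events, WINDOW_US)

# Lossless coding of the aggregated frames (zlib as a stand-in entropy coder).
compressed = zlib.compress(frames.tobytes(), level=9)
restored = np.frombuffer(zlib.decompress(compressed),
                         dtype=np.uint16).reshape(frames.shape)

assert np.array_equal(frames, restored)  # bit-exact frame reconstruction
print(f"compression ratio: {frames.nbytes / len(compressed):.1f}x")
```

Because event frames are extremely sparse, even a generic byte-level compressor achieves a large ratio here; the project's published methods instead exploit the spatio-temporal structure of the events (e.g. via point-cloud representations) for stronger, fully lossless coding.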
Exploitation Route | The model and the statistical analysis of the output traffic will enable the identification of appropriate transmission technologies for data acquired via neuromorphic sensors. The compression technologies proposed will enable lower data rates and energy savings in data transmission. For instance, we plan to apply the results in a smart-city environment with a partner. |
Sectors | Aerospace, Defence and Marine; Agriculture, Food and Drink; Communities and Social Services/Policy; Construction; Digital/Communication/Information Technologies (including Software); Energy; Environment; Healthcare; Culture, Heritage, Museums and Collections; Security and Diplomacy; Transport |
URL | https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9136663&casa_token=e3WmYhMud9YAAAAA:5KMzV3mzV-mYyAXWP9sHaLW6IfpKmwUBufuT947AKScrhCCh1bzdLfxhBGXYzd3XLQKd8-Ovypg&tag=1 |
Description | 1) New JPEG activity - Following our publications on the compression of data from event-based visual sensors, the JPEG Committee started a new exploration activity on event-based imaging, named JPEG XE (announced on 1 March 2023). References: Martini, M., Adhuran, J., & Khan, N. (2022). Lossless compression of neuromorphic vision sensor data based on point cloud representation. IEEE Access, 10, 121352-121364. Khan, N., Iqbal, K., & Martini, M. G. (2020). Time-aggregation-based lossless video encoding for neuromorphic vision sensor data. IEEE Internet of Things Journal, 8(1), 596-609. Khan, N., Iqbal, K., & Martini, M. G. (2020). Lossless compression of data from static and mobile dynamic vision sensors-performance and trade-offs. IEEE Access, 8, 103149-103163. 2) Smart cities testbed - A collaboration started with a partner in Italy to exploit and test the findings of this project in a FIWARE-based testbed for smart buildings and infrastructures. 3) Discussions are ongoing with an industrial partner to apply some of the outcomes of this project in a specific use case. |
First Year Of Impact | 2022 |
Sector | Construction; Digital/Communication/Information Technologies (including Software); Energy; Environment |
Impact Types | Cultural, Societal |
Description | JPEG new activity |
Geographic Reach | Multiple continents/international |
Policy Influence Type | Contribution to new or improved professional practice |
Impact | A new exploration activity on the topic has started in JPEG (1 March 2023) |
Description | Postgraduate courses |
Geographic Reach | Local/Municipal/Regional |
Policy Influence Type | Influenced training of practitioners or researchers |
Impact | Contributed to content of a new module in the MSc on Mobile Networks and Media Streaming; supported final MSc projects and PhD training in the area. |
Description | Collaboration with industries involved in the project (e.g. Inilabs, Thales, Samsung) |
Organisation | Samsung |
Country | Korea, Republic of |
Sector | Private |
PI Contribution | Presentation of the plan for the project and preliminary results |
Collaborator Contribution | Feedback on the preliminary results and discussion of further sources of data from the company. Presentation of previous work from the company. |
Impact | No specific joint output yet, although the model developed took into account some discussions with the company. |
Start Year | 2017 |
Description | Collaboration with industries involved in the project (e.g. Inilabs, Thales, Samsung) |
Organisation | Thales Group |
Department | Thales UK Limited |
Country | United Kingdom |
Sector | Private |
PI Contribution | Presentation of the plan for the project and preliminary results |
Collaborator Contribution | Feedback on the preliminary results and discussion of further sources of data from the company. Presentation of previous work from the company. |
Impact | No specific joint output yet, although the model developed took into account some discussions with the company. |
Start Year | 2017 |
Description | Collaboration with industries involved in the project (e.g. Inilabs, Thales, Samsung) |
Organisation | iniLabs Ltd |
Country | Switzerland |
Sector | Private |
PI Contribution | Presentation of the plan for the project and preliminary results |
Collaborator Contribution | Feedback on the preliminary results and discussion of further sources of data from the company. Presentation of previous work from the company. |
Impact | No specific joint output yet, although the model developed took into account some discussions with the company. |
Start Year | 2017 |
Description | Iosire project description - press release |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | An introduction to the project and an interview with the KU PI (M. Martini) were reported in Kingston University news: http://www.kingston.ac.uk/news/article/1825/27-apr-2017-kingston-university-to-play-leading-role-in-study-examining-how-stateoftheart-camera-that-mimics-human/ and also distributed via Kingston University social media (Twitter/Facebook/LinkedIn). The press release was reported in a number of magazines targeting scientific and general audiences. Examples include The Engineer: https://www.theengineer.co.uk/uk-team-to-lead-research-into-artificial-eye-technology/, optics.org, imv.europe.org, Pioneer magazine, E & T magazine (IET), and Next Nature. |
Year(s) Of Engagement Activity | 2017 |
URL | https://www.theengineer.co.uk/uk-team-to-lead-research-into-artificial-eye-technology/ |
Description | New Scientist Live - London Excel |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | Kingston University - represented by Dr Nabeel Khan and Prof Maria Martini - participated in the "New Scientist Live" event at London ExCeL, 20-23 September 2018, presenting the results and ongoing activity of the IoSiRe project to the general public visiting the Kingston University stand. A demo showed how neuromorphic sensors capture a scene and how such information is stored and processed. "New Scientist Live" is a large science festival with more than 120 speakers and 100 exhibitors contributing thought-provoking talks, ground-breaking discoveries, interactive experiences, workshops and performances; thousands of attendees visited the stands (https://live.newscientist.com). https://twitter.com/kingstonuni/status/1043111356193497088?lang=bg |
Year(s) Of Engagement Activity | 2018 |
URL | https://live.newscientist.com |