Adaptive Heap Analysis
Lead Research Organisation:
Queen Mary University of London
Department Name: Computer Science
Abstract
Software is a key element of modern life. It is pervasive and controls many fundamental services and systems our society depends upon. Therefore high-quality software is absolutely vital. Unfortunately, however, software nearly always contains mistakes - called bugs - and therefore misbehaves. These bugs are due to the size and complexity of the code: modern code is so large and complicated that today's technology cannot detect many of them.
One of the major sources of complexity in real-life code is the intensive use of dynamically allocated memory (called the heap) for storing important data structures. But programming errors in the use of the heap are notoriously difficult to avoid and even to detect. Heap analyses (often called shape analyses) are program analyses meant to give accurate results for programs using the heap. Having to reason about pointers, heap analysis is one of the hardest kinds of program analysis, and is considered a major open problem.
Many program analyses (e.g. those addressing numerical properties) are both fully automatic and general, therefore providing meaningful results for most programs. This is unfortunately not true in the case of heap analysis. Currently, most heap analyses are ad hoc and work only for a handful of specific (hard-wired) data structures. If a program happens not to use one of these structures, imprecise (useless) results will be delivered. This is a problem because the variety of data structures used in real software is great. It makes heap analysis non-automatic in practice: from program to program, new tailored techniques need to be designed. In turn, this makes its cost of application prohibitive. Thus, poor generality and automation are major factors hindering the application of heap analysis to the verification of real programs.
This project aims to address these issues, which are on the critical path to making heap analysis applicable outside academia.
It proposes to develop methodologies that make an analysis adapt on-the-fly to the data structures of the analysed program. Adaptation can provide the necessary flexibility and generality to make heap analysis automatic. Success in this project will provide significant steps towards the application of automatic analysis techniques to the verification of substantial, real-world code.
People |
ORCID iD |
Dino Distefano (Principal Investigator) |
Publications
Brotherston J
(2011)
Automated Deduction - CADE-23
Calcagno C
(2011)
NASA Formal Methods
Calcagno C
(2009)
Compositional shape analysis by means of bi-abduction
in ACM SIGPLAN Notices
Calcagno C
(2011)
Compositional Shape Analysis by Means of Bi-Abduction
in Journal of the ACM
Distefano D
(2009)
Formal Methods for Industrial Critical Systems
Distefano D
(2010)
Fundamental Approaches to Software Engineering
Description | Real-life code powering the computers used every day by millions of people (such as open source code) makes intensive use of dynamically allocated data structures. But programming errors in the use of these structures are notoriously difficult to avoid and even to detect. Deep-heap analyses (often called shape analyses) are program analyses meant to give accurate results for programs using dynamically allocated data structures. Having to reason about pointers, deep-heap analysis is one of the hardest kinds of program analysis, and is considered a major open problem. Many program analyses (e.g. those addressing numerical properties) are both fully automatic and general, therefore providing meaningful results for most programs. This is unfortunately not true of deep-heap analyses, which mostly work only for a handful of specific, hard-wired data structures. If a program happens not to use one of these structures, useless results will be delivered. This project aimed to address these issues, which are on the critical path to making shape analysis applicable to real-world programs. We developed methodologies to make shape analysis scale to industrial-size software and to enable an analysis to adapt to the data structures of the analysed program. Adaptation provides the necessary flexibility and generality to make shape analysis more automatic. |
Exploitation Route | The findings are currently used in industry. |
Sectors | Digital/Communication/Information Technologies (including Software) |
Description | The results of the research contributed to the main product developed by a start-up (Monoidics Ltd) that my collaborators and I founded. The start-up was then acquired by Facebook, and the research developed in this grant is now used daily at Facebook and other companies. |
Sector | Digital/Communication/Information Technologies (including Software) |
Impact Types | Economic |
Company Name | Monoidics Ltd |
Description | |
Year Established | 2009 |
Impact | The company had several industrial clients using its product. The company was then acquired by Facebook, which now uses Monoidics' software daily.