PARTIAL Analysis of Relations in Tasks of Inversion for Algorithmic Leverage

Lead Research Organisation: University of Liverpool
Department Name: Mathematical Sciences

Abstract

The solution of inverse problems forms a crucial part of many modern technologies, such as medical and astronomical imaging devices. We want to obtain, from possibly very limited and corrupted measurements, a good description of the phenomenon under study. Often this description takes the form of an image, as we humans like to process visual information, but it can also take many other forms, such as the location of an earthquake, the source of a sound, or the detection of a planet.

As physical technologies improve and become more widespread, we are faced with increased amounts of data, as well as the need to process them into higher-quality results, such as higher-resolution images. We would also like to produce the results with as little energy as possible, and ideally in real time, instead of waiting for computers to crunch numbers for hours after the measurement was taken. For this we need really good optimisation algorithms: instructions for a computer to find the point where a mathematical function reaches its smallest value, akin to the lowest part of a valley among mountains. This is because the solutions we are looking for are precisely such minimising points of suitable functions modelling our inverse problem.
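As an illustration (a generic textbook formulation, not specific to this project), many imaging inverse problems are modelled by minimising the sum of a data-fit term and a regularisation term,

    \min_{u} \tfrac{1}{2}\|Au - b\|^2 + \alpha R(u),

where b denotes the measured data, A models the measurement device, R is a regulariser such as the total variation, and the parameter \alpha > 0 balances fidelity to the data against regularity of the reconstruction.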

The objective of the proposed project is to develop efficient optimisation algorithms for the solution of large-scale inverse problems. This will be based on a study to gain a better understanding of the stability of the solutions to these problems under perturbations of the measurements and of the parameters of the model. Besides inverse problems, our work will be applicable to a diverse range of disciplines from financial planning to machine learning.
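One generic way to express such stability (an illustrative statement, not a result claimed here) is an estimate of the form

    \|u(b) - u(b')\| \le C \|b - b'\|,

where u(b) denotes a solution computed from measurements b: small perturbations of the data should lead to correspondingly small changes in the reconstruction, and analogous estimates can be formulated for perturbations of the model parameters.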

Planned Impact

The solution of large-scale optimisation problems forms a crucial part of many modern technologies, including financial planning, autonomous and intelligent devices based on machine learning, as well as modern imaging devices from medicine and industry to astronomy and security. In imaging, optimisation arises from the need to solve an inverse problem: the image has to be reconstructed from limited or corrupted measurements taken in a non-image measurement domain.

As the physical technologies involved improve (e.g., devices for magnetic resonance imaging, electrical impedance tomography, computed tomography, and electron and optical microscopes) and their use becomes more widespread, we are faced with increased amounts of data, as well as the need to process the data into higher-quality results, such as higher-resolution images, or an otherwise better understanding of the world. We would also like to produce the results with as little energy as possible, and fast, ideally in real time, instead of waiting for computers to crunch numbers for hours after the measurement was taken. For this we need really good optimisation algorithms.

The objective of the proposed project is to develop such efficient optimisation algorithms, with a focus on the solution of large-scale and PDE-constrained inverse problems. The eventual beneficiaries of the project will include everyone who uses technologies that incorporate the solution of such problems. From the enhancement of digital pictures taken in poor lighting conditions, to the processing and automated understanding of images coming from devices for medical, industrial, and security imaging, as well as satellites and space exploration probes, these technologies are ever more widespread. The beneficiaries therefore range from consumers to doctors, from scientists to patients, and from travellers to industry. The outcomes of the proposed project will therefore ultimately foster the health and quality of life of the population, and improve industrial processes.

Secondly, this project will benefit the UK's initiative on Big Data. While our focus is on optimisation methods for inverse problems, our work will be beneficial to the solution of a wider range of large-scale optimisation problems, from financial planning to machine learning, the latter having applications from personal assistants to automated medicine. Our work will benefit a plethora of areas of modern society. Moreover, randomised algorithms form the very basis of quantum computing. As quantum computers become practical in the coming years, the stochastic aspects of our research are likely to reach their full potential. In this way the project will help create new technologies, companies, and wealth in both the near and the more distant future.

Publications

Mazurenko S (2020) Primal-dual block-proximal splitting for a class of non-convex problems in ETNA - Electronic Transactions on Numerical Analysis

Valkonen T (2020) Inertial, Corrected, Primal-Dual Proximal Splitting in SIAM Journal on Optimization

 
Description * We have developed a much simplified convergence proof for the method that was the starting point of the project, and have been able to establish better convergence behaviour than previously known.
* Based on this, we have developed new methods for even more difficult problems through modelling advances.
* The project's goals are achieved in the final publication ("Primal-dual block-proximal splitting for a class of non-convex problems").
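To illustrate the class of methods involved, the following is a minimal Julia sketch of the classical primal-dual proximal splitting (PDHG, Chambolle-Pock) iteration for problems of the form min_x G(x) + F(Kx). It is a generic illustration of the starting point only, not the block-adapted, inertial, or non-convex variants developed in the project's publications; the function name pdhg, the concrete choices G(x) = 0.5*||x - b||^2 and F = λ*||·||_1, and the usage example are assumptions made for this sketch.

using LinearAlgebra

function pdhg(K, b, λ; τ=0.01, σ=0.01, iters=1000)
    # Classical primal-dual proximal splitting for min_x 0.5*||x - b||^2 + λ*||K*x||_1.
    # The step sizes should satisfy τ*σ*opnorm(K)^2 < 1.
    x = zero(b); x̄ = copy(x)          # primal iterate and its extrapolation
    y = zeros(size(K, 1))             # dual iterate
    for _ in 1:iters
        y .= clamp.(y .+ σ .* (K * x̄), -λ, λ)            # prox of σF*: projection onto the λ-ball
        x_old = copy(x)
        x .= (x .- τ .* (K' * y) .+ τ .* b) ./ (1 + τ)    # prox of τG: quadratic data term
        x̄ .= 2 .* x .- x_old                              # over-relaxation / extrapolation
    end
    return x
end

# Hypothetical usage: denoise a noisy one-dimensional step signal,
# with K a forward-difference operator (a rough 1D analogue of total variation).
n = 100
K = diagm(0 => -ones(n), 1 => ones(n - 1))[1:n-1, :]
b = [i <= n ÷ 2 ? 0.0 : 1.0 for i in 1:n] .+ 0.05 .* randn(n)
x = pdhg(K, b, 0.1)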
Exploitation Route The algorithms developed are available online on Zenodo.
Sectors Aerospace, Defence and Marine; Healthcare

 
Description While the impact is still largely academic, citation data from Google Scholar and Scopus indicate that, besides influencing several theoretical works, the algorithms developed in the project are starting to be used by more applied researchers, creating pathways to eventual industrial application. As of February 2023, applied publications citing this work can be found in fields such as computed tomography, magnetic resonance imaging, and vehicle energy management.
First Year Of Impact 2023
Sector Energy; Healthcare; Transport
 
Description Decoupling preconditioners for non-smooth optimisation and inverse problems
Amount € 648,755 (EUR)
Funding ID 314701 & 320022 
Organisation Academy of Finland 
Sector Public
Country Finland
Start 09/2018 
End 08/2023
 
Description Online optimisation for dynamic inversion
Amount € 487,817 (EUR)
Funding ID 338614 
Organisation Academy of Finland 
Sector Public
Country Finland
Start 09/2021 
End 08/2025
 
Title Diffusion tensor imaging codes from "Primal-dual block-proximal splitting for a class of non-convex problems" 
Description These are the Julia codes for the diffusion tensor imaging experiments of the manuscript "Primal-dual block-proximal splitting for a class of non-convex problems" by S. Mazurenko, J. Jauhiainen, and T. Valkonen (arXiv:1911.06284). The codes were written by T. Valkonen. Prerequisites: These codes were written for Julia 1.1 but are known to work with Julia 1.2. The Julia package prerequisites are from May 2019, when our experiments were run, and have not been updated, in order to maintain the same environment that we used for the experiments in the manuscript. You may get Julia from https://julialang.org/. To generate the Helix test data (the data we have generated are included), the Teem toolkit is needed. You may download it from https://sourceforge.net/projects/teem/. Please note that the version included in Homebrew is missing the emap binary that is required for the visualisation; indeed, as of May 2019, the building of this binary is disabled in Teem. You will therefore need to build Teem manually from source, applying the included teem-1.11.0-emap_build_fix.patch. In your unix shell, in the top-level directory of the Teem toolkit source codes, run:
$ patch -p1 < path_to_these_codes/teem-1.11.0-emap_build_fix.patch 
Then build and install Teem according to its instructions. The visualisation further requires the open -g command to open the generated PDF file in a PDF viewer that will correctly refresh the file on further launches. This is the case on macOS with Preview or Skim. Usage: Navigate your unix shell to the directory containing this README.md and then run:
$ julia --project=BlockPDPS 
Afterwards in the Julia shell, type:
> using BlockPDPS 
This may take a while as Julia builds any missing dependencies. Then, to run the default experiments, run:
> test_dti_helix(visualise=true) 
This will use any cached data in the data/ subdirectory of the current working directory. If no cached data is found, new data will be generated using Teem. If you use the provided data and set visualise=false, the Teem dependency is not needed. The results are saved under the img/ subdirectory of the current working directory. The test_dti_helix function has further parameters; please see the source code for details. 
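In summary, assuming the provided data are used (so that Teem is not needed), a typical session might look as follows; the directory path is hypothetical:
$ cd path_to_these_codes
$ julia --project=BlockPDPS
> using BlockPDPS
> test_dti_helix(visualise=false)
The results are then written under the img/ subdirectory of the current working directory.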
Type Of Technology Software 
Year Produced 2019 
Open Source License? Yes  
URL https://zenodo.org/record/3528136
 
Title Julia codes for "Primal-dual proximal splitting and generalized conjugation in non-smooth non-convex optimization" 
Description GPDPS is a Julia package that provides the reference implementation of the generalized primal-dual proximal splitting approach described in the paper "Primal-dual proximal splitting and generalized conjugation in non-smooth non-convex optimization" by Christian Clason, Stanislav Mazurenko, and Tuomo Valkonen. Usage: To use the provided test codes (compatible with Julia versions 1.0-1.2, tested on macOS and Linux x86_64): start Julia from this directory with julia --project=. (or with the relative path to the GPDPS.jl directory from elsewhere); do ]instantiate to download all dependencies (only required the first time; make sure to return to the standard Julia prompt with backspace afterwards); highly recommended: using Revise, so that you can make changes in the code without having to restart Julia (if not installed, start julia without --project and do ]add Revise); then load the package with using GPDPS. To run the example for the elliptic Nash equilibrium problem, do test_enep(N), where N controls the discretization (number of nodes per coordinate, default N=128). To run the example for the Huber-Potts segmentation model, do test_potts(alpha,gamma,keyword=value), where keyword is one or more of the following (comma separated, order insensitive), with the default value used if omitted: image: test image; the default is "blobs" (size 256x254); other images in .tif format can be specified by file name if placed in the img folder. isotropic: use the isotropic (value true, the default) or anisotropic (value false) Potts model. maxit: maximum number of iterations (default 500000). If alpha or gamma are not specified, the default values 1 and 1e-3 are used. Possible issues: on macOS, it may also be required to ]add QuartzImageIO if it is not included in the default environment. Development: This code can also be found at https://github.com/clason/GPDPS.jl. Reference: If you find this code useful, you can cite the paper as
@article{GPDPS,
  author     = {Clason, Christian and Mazurenko, Stanislav and Valkonen, Tuomo},
  title      = {Primal-dual proximal splitting and generalized conjugation in non-smooth non-convex optimization},
  year       = {2019},
  eprinttype = {arxiv},
  eprint     = {1901.02746},
}
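To summarise the steps above as a single session (the GPDPS.jl directory name and prompt layout are taken from the description; the parameter values shown are the documented defaults):
$ cd GPDPS.jl
$ julia --project=.
> ]instantiate
(only needed the first time; press backspace to return to the Julia prompt)
> using GPDPS
> test_enep(128)
> test_potts(1, 1e-3, image="blobs")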
 
Type Of Technology Software 
Year Produced 2020 
Open Source License? Yes  
URL https://zenodo.org/record/3647614
 
Title Source codes for "Inertial, corrected, primal-dual proximal splitting" 
Description These are the Matlab source codes for the aforementioned manuscript by Tuomo Valkonen, available at https://arxiv.org/abs/1804.08736. The top-level batchrun.m and batchreport.m files are used to generate the results in the manuscript. Individual tests can be run with the test_tv_*.m and test_tgv2_*.m files, likewise at the top level. The parameters can either be edited in the files or passed as arguments, in the way the batch run files do. 
Type Of Technology Software 
Year Produced 2019 
Open Source License? Yes  
URL https://zenodo.org/record/3531935
 