Optimal Newton-Type Algorithms for Large-Scale Nonlinear Optimization

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Mathematics

Abstract

The research of the numerical optimization community, and hence our work as well, develops tools, underlying theory and techniques for solving large-scale optimization problems as they occur in engineering and science. Real-life applications that can benefit from our work abound: manufacturers seek maximum efficiency in the design of their production processes; investors aim to create portfolios that avoid high risk while yielding a good return; traffic planners need to decide on the level and routing of traffic to minimize congestion. Finding the `best' solution for such processes commonly involves constructing a mathematical model of the problem. The resulting models are usually complex and large-scale, depending on a large number of parameters; models with millions of variables and restrictions are not uncommon. It is therefore imperative to implement the model on a computer and to use computer algorithms to solve it. Our methods and codes aim to solve the given problem efficiently and robustly, allowing the software to be employed in a variety of contexts and run on a diverse range of computers. Since computers cannot solve mathematical problems exactly, only approximately, one of our priorities is to ensure that the solution obtained by applying our algorithms to the models is highly accurate, close to the `true' solution of the problem. Providing theoretical guarantees that the algorithm will successfully solve the user's problem is also crucial. A useful such `safety net' is, for example, the provision of global convergence of the algorithm, namely showing that the algorithm will converge to a solution of the problem irrespective of the initial guess used to start it.
Establishing the speed at which the method approaches the solution is important not only to the practitioner, who is keen to have a fast algorithm, but also computationally: a fast method better prevents the accumulation of numerical errors by the simple fact of taking less time to solve the problem. The direct relation between rate of convergence and time spent solving the problem implies that this investigation also indicates how long the algorithm will take to solve the problem; we often refer to this as an efficiency or complexity question. We have been involved in developing a new class of methods, Adaptive Regularization with Cubics (ARC), and related software for nonlinear optimization problems, for which efficiency estimates can be given that are better than for other standard methods such as the classical Newton and steepest-descent methods. This is a first in the context of the problems that interest us, namely functions that `wiggle' a lot, or in standard terms, that may have many local solutions, not just one global optimum. It turns out that these methods are interesting not only theoretically but also numerically: we implemented our method on the computer and found that it outperforms existing methods for this class when tested on some standard test problems. We are excited by these developments and now plan a `proper' computer implementation and extensive testing to see how well the ARC method does. We also want to see whether the ARC efficiency estimate is the best possible; if so, no other method could ever perform better than ARC on nonlinear problems. We are also interested in extending ARC so that it can be guaranteed to compute not just a local solution but the global one, which is a much more challenging task.
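To illustrate the flavour of the ARC iteration described above, the following is a minimal sketch, not the project's actual implementation: at each iterate it approximately minimizes a cubic-regularized model of the objective and adapts the regularization weight from the achieved decrease. The inner subproblem is solved here with a generic solver (`scipy.optimize.minimize`) for simplicity; the real ARC method uses tailored subproblem techniques, and the step-acceptance thresholds below are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

def arc_minimize(f, grad, hess, x0, sigma0=1.0, tol=1e-6, max_iter=100):
    """Simplified ARC sketch: minimize the cubic model
    m(s) = g's + 0.5 s'Hs + (sigma/3)||s||^3 at each iterate,
    then adapt sigma based on actual vs predicted decrease."""
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) <= tol:
            break
        def model(s):
            return (g @ s + 0.5 * s @ H @ s
                    + (sigma / 3.0) * np.linalg.norm(s) ** 3)
        # Inner subproblem solved approximately with a generic solver here.
        s = minimize(model, -g / max(np.linalg.norm(H, 2), 1.0)).x
        # Ratio of actual to predicted reduction drives the update.
        rho = (f(x) - f(x + s)) / max(-model(s), 1e-16)
        if rho > 0.1:   # successful step: accept; very successful: relax sigma
            x = x + s
            if rho > 0.9:
                sigma = max(0.5 * sigma, 1e-8)
        else:           # unsuccessful: reject the step and regularize more
            sigma *= 2.0
    return x
```

On a smooth convex test function this sketch already exhibits the accept/reject and sigma-adaptation behaviour that distinguishes ARC-type methods from a plain Newton iteration.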
Finally, we care about the situation in which the problem unknowns are restricted to belong to certain, possibly complicated, sets; we also intend to develop ARC to take these restrictions into account. If our work is successful, the ARC software will become part of GALAHAD, an optimization library that is freely available to the research community and that has been used to solve many applications.

Planned Impact

Beneficiaries: Nonlinear optimization problems are ubiquitous throughout computational science, engineering and operations research. Thus efficient general-purpose methods for solving such problems are essential to the work of many scientists and researchers worldwide, both in academia and in industry. A successful outcome of this project would mean an improvement to state-of-the-art algorithms for nonlinear optimization, with strong theoretical guarantees of convergence and efficiency. These results would have a significant impact on the optimization community, prompting a reassessment of the methods used for nonlinear optimization. They would also profoundly impact the users of optimization by providing improved algorithms and software for their applications.

Impact Pathways: A feature of interest to users in many applications is the computational effort required by the chosen algorithm. One way to measure this is the number of function evaluations required to solve the problem; this is particularly pertinent where function-evaluation time dominates all other aspects of an iteration, as in climate modelling and multibody simulations. The focus of this project is a method, Adaptive Regularization with Cubics (ARC), whose overall effort in terms of function-evaluation count we have shown to be better in worst-case situations than classical state-of-the-art choices such as steepest descent and Newton's method. Furthermore, in the course of this research we aim to prove that ARC is in fact the best possible in such situations amongst successful Newton-type methods. Generally, most users are concerned with the average computational effort required by methods, rather than the worst case.
Hence, we plan to develop an optimized ARC implementation suitable for large-scale problems and to determine numerically, on standard test problems, whether it is better than the state of the art, a behaviour already observed in our preliminary experiments. If successful, this would improve the state-of-the-art software for nonlinear optimization, of interest to all users of such codes. Furthermore, we also intend to extend the ARC approach to find the best solutions of a given problem, namely the global ones, which is one of the most challenging optimization problem classes. For this we will make use of parallel computation and also plan to develop an efficient implementation. These implementations, which appear as research Modules in the Case for Support, can be considered pathways to impact for our theoretical algorithm development.

Impact Actions: To ensure ease of access and availability of the resulting software (for local and global optimization), it is planned that it will be included in GALAHAD, a principal optimization software library whose main authors are Visiting Researchers on this project. It will also be made available through the Numerical Analysis and Intelligent Software (NAIS) Centre, which co-sponsors this project. Through both these venues it will be available for free to the worldwide academic community; GALAHAD is also available for a fee to commercial users. As the frequent downloads of the GALAHAD library indicate, along with the leading role NAIS is gaining in the UK numerical analysis and high-performance computing landscape, our software will impact users of optimization in diverse areas of research and applications.

Publications

Description We have improved methods, and their implementation and parallelisation, for global optimisation, the most challenging class of optimisation problems.
Exploitation Route Our findings can be used through our software, which is freely available on COIN-OR.
Sectors Chemicals, Construction, Creative Economy, Digital/Communication/Information Technologies (including Software), Electronics, Energy, Environment, Pharmaceuticals and Medical Biotechnology

 
Title oBB (Overlapping Branch and Bound) software 
Description We have implemented, tested, documented and posted online a new global optimization branch-and-bound solver that uses the cubic regularization techniques advocated in the grant. Currently, oBB can be downloaded freely from the dedicated website: https://pypi.python.org/pypi/oBB. oBB has also been refereed and accepted into COIN-OR, the principal open-source software repository; please see the dedicated COIN-OR page for details and downloads: https://projects.coin-or.org/oBB 
Type Of Technology Software 
Year Produced 2013 
Open Source License? Yes  
Impact Not known yet. 
URL http://www.coin-or.org/projects/oBB.xml
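As background to the record above, the branch-and-bound strategy that global solvers such as oBB build on can be sketched in a simplified one-dimensional form. This is an illustrative sketch only, using a plain Lipschitz lower bound on subintervals; it is not oBB's actual overlapping-ball algorithm, nor does it use the cubic regularization bounds mentioned in the description.

```python
import heapq

def branch_and_bound(f, lo, hi, lipschitz, tol=1e-6):
    """Minimal 1-D global-minimization branch and bound: lower-bound f on
    each interval via its Lipschitz constant, keep a min-heap of intervals
    ordered by that bound, and prune intervals that cannot improve on the
    best value found so far."""
    mid = 0.5 * (lo + hi)
    best_x, best_f = mid, f(mid)
    # Each heap entry: (lower bound on f over [a, b], a, b).
    heap = [(best_f - lipschitz * 0.5 * (hi - lo), lo, hi)]
    while heap:
        bound, a, b = heapq.heappop(heap)
        if bound > best_f - tol:   # smallest bound cannot improve: done
            break
        m = 0.5 * (a + b)
        for a2, b2 in ((a, m), (m, b)):   # split and re-bound each half
            c = 0.5 * (a2 + b2)
            fc = f(c)
            if fc < best_f:
                best_x, best_f = c, fc
            lb = fc - lipschitz * 0.5 * (b2 - a2)
            if lb < best_f - tol:  # only keep halves that might improve
                heapq.heappush(heap, (lb, a2, b2))
    return best_x, best_f
```

The appeal of the approach, as in oBB, is the deterministic guarantee: the returned value is within the prescribed tolerance of the global minimum, not merely a local one, and the independent subintervals make the search naturally parallelisable.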