Synthesis of Differentiable Functional Programs for Lifelong Learning
Lead Research Organisation:
University of Edinburgh
Department Name: Sch of Informatics
Abstract
This project focuses on lifelong machine learning: the ability of a model to reuse knowledge across a sequence of tasks. Currently, neural networks are limited by the large amount of data they require to generalise well. Advances in lifelong learning have the potential to reduce this requirement, making neural networks applicable to more real-world problems.
Artificial neural network architectures can be represented as functional programs: combinations of differentiable functions (neural networks) and higher-order functions (e.g. map, fold) [1]. The project's novelty is framing knowledge transfer to a new machine learning task as a symbolic program synthesis problem. For a new task, we use symbolic program synthesis methods to generate candidate neural network architectures in which previously learned neural networks can be reused. Our approach, detailed in our NIPS 2018 paper [2], outperforms competing methods while requiring far fewer training points for a new task.
This project's aim is to investigate the benefits and limitations of the approach described above. Our first objective was to demonstrate that the approach can outperform competing methods. Our second is to make the method scale to longer sequences of tasks. Our third is to extend the method to solve more complex tasks.
[1] Christopher Olah. Neural networks, types, and functional programming, 2015. http://colah.github.io/posts/2015-09-NN-Types-FP/.
[2] Valkov, L., Chaudhari, D., Srivastava, A., Sutton, C. and Chaudhuri, S., 2018. Synthesis of Differentiable Functional Programs for Lifelong Learning. arXiv preprint arXiv:1804.00218.
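As a loose illustration (not the paper's implementation), the idea of composing reusable neural modules with higher-order functions can be sketched in plain Python, with trained networks stubbed out as ordinary functions. All names here are hypothetical:

```python
# Sketch: a "differentiable functional program" composes neural modules
# with higher-order functions such as map and fold. In the real system
# the modules are trained networks; here they are stubbed as plain
# functions so the example runs standalone.
from functools import reduce

def compose(f, g):
    """Sequential composition: apply g, then f."""
    return lambda x: f(g(x))

def fmap(f):
    """Lift a module to operate elementwise on a list of inputs."""
    return lambda xs: [f(x) for x in xs]

def fold(f, init):
    """Reduce a list with a binary module, starting from init."""
    return lambda xs: reduce(f, xs, init)

# Stub "neural module" assumed to have been learned on an earlier task.
def recognise_digit(img):
    return img["label"]  # pretend classifier on a toy "image"

def add(acc, x):
    return acc + x

# A program synthesised for a new task, "sum the digits in a list of
# images", reusing the previously learned recognise_digit module:
sum_digits = compose(fold(add, 0), fmap(recognise_digit))

images = [{"label": 3}, {"label": 5}, {"label": 1}]
print(sum_digits(images))  # prints 9
```

In the actual method, the synthesiser searches over such programs while gradient descent trains any newly introduced modules; the sketch only shows the structural reuse.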
Organisations
People | ORCID iD |
---|---|
Charles Sutton (Primary Supervisor) | |
Lazar Valkov (Student) | |
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name |
---|---|---|---|---|---|
EP/N509644/1 | | | 30/09/2016 | 29/09/2021 | |
1796468 | Studentship | EP/N509644/1 | 30/09/2016 | 31/12/2020 | Lazar Valkov |
Title | Implementation for HOUDINI: Lifelong learning as program synthesis |
Description | A Python implementation of the method described in our paper "HOUDINI: Lifelong learning as program synthesis". |
Type Of Technology | Software |
Year Produced | 2019 |
Open Source License? | Yes |
Impact | Releasing this allows other researchers to evaluate and further develop our method. |