Meta Learning Conditional Densities

Lead Research Organisation: University of Oxford
Department Name: Statistical Science CDT


In recent years, a great deal of research has focused on the emerging field of Meta-Learning. The idea of Meta-Learning can be traced back to Donald B. Maudsley's work in 1979, who referred to it as "the process by which learners become aware of and increasingly in control of habits of perception, inquiry, learning, and growth that they have internalized". The goal of Meta-Learning is therefore to understand how one can grasp ideas from data and transfer these concepts to similar tasks efficiently. Many machine learning (ML) models, deep learning models in particular, require large amounts of data and training time to achieve good performance. Meta-Learning aims to find the structure shared across tasks, so that, given a new task, the model can adapt quickly. Understanding this would be a step towards Artificial General Intelligence.

There has been great interest in Meta-Learning in recent years, as it allows for efficient learning from multiple datasets and tasks in a joint manner. Among the most popular algorithms is "Model-Agnostic Meta-Learning" (MAML), an optimization-based method. Another popular model is the Neural Process, which has received a lot of attention recently as it models distributions over functions rather than over parameters.
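To illustrate the optimization-based flavour of meta-learning, the following is a minimal first-order MAML-style sketch on a toy linear-regression task family. It is not the method proposed in this project; the task distribution, model, and step sizes are all illustrative choices, and the meta-update uses the first-order approximation (it ignores second-order terms of full MAML).

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_and_grad(w, x, y):
    # Squared-error loss and its gradient for the linear model y_hat = w * x.
    err = w * x - y
    return np.mean(err**2), np.mean(2 * err * x)

def sample_task():
    # A task is regression onto a task-specific slope (illustrative choice).
    slope = rng.uniform(-2.0, 2.0)
    def draw(n=10):
        x = rng.uniform(-1.0, 1.0, size=n)
        return x, slope * x
    return draw

w = 0.0                    # meta-parameter shared across tasks
alpha, beta = 0.4, 0.05    # inner (adaptation) and outer (meta) step sizes

for _ in range(500):
    meta_grad = 0.0
    for _ in range(4):                       # batch of tasks
        draw = sample_task()
        x_s, y_s = draw()                    # support set, used for adaptation
        x_q, y_q = draw()                    # query set, used for the meta-update
        _, g_s = mse_and_grad(w, x_s, y_s)
        w_adapted = w - alpha * g_s          # one inner gradient step per task
        _, g_q = mse_and_grad(w_adapted, x_q, y_q)
        meta_grad += g_q                     # first-order MAML approximation
    w -= beta * meta_grad / 4
```

After meta-training, a single inner gradient step on a new task's support set already reduces the query loss, which is the "adapt quickly" behaviour mentioned above.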

In this project, we aim to use the Meta-Learning framework to model distributions over the expectation of conditional distributions. Density estimation is an important problem in many machine learning tasks, as it allows us, for example, to detect outliers or to better understand the underlying data distribution itself. Being able to compute the log-likelihood of new samples enables us to discard or further investigate specific data points, which is useful in a wide variety of scenarios, such as medicine.
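As a concrete (and deliberately simple) instance of using log-likelihoods for outlier detection, the sketch below fits a Gaussian kernel density estimate by hand and scores a typical point against a far-away one; the bandwidth and data are illustrative, not part of the project's methodology.

```python
import numpy as np

def kde_logpdf(x_query, data, bandwidth=0.3):
    # Gaussian kernel density estimate, evaluated in log space for stability.
    diffs = (x_query[:, None] - data[None, :]) / bandwidth
    log_kernels = -0.5 * diffs**2 - 0.5 * np.log(2 * np.pi) - np.log(bandwidth)
    # log-mean-exp over the training points
    return np.logaddexp.reduce(log_kernels, axis=1) - np.log(len(data))

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)

queries = np.array([0.0, 5.0])   # a typical point vs. a far-away outlier
logp = kde_logpdf(queries, data)
```

The outlier at x = 5 receives a much lower log-likelihood than the typical point, which is exactly the signal one would use to flag it for removal or further investigation.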

In addition, unimodal distributions are often used for prediction tasks, as they allow for simple computations such as the popular Kullback-Leibler (KL) divergence, which measures the discrepancy between distributions. However, unimodality is a strong assumption that rarely holds in real-world scenarios. Hence, we propose, firstly, to model the expectation of the conditional distribution without the unimodality constraint in a non-parametric fashion and, secondly, to learn representations of distributions that can be transferred across different density estimation tasks.
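For reference, the KL divergence between two discrete distributions p and q is KL(p || q) = sum_i p_i log(p_i / q_i). The small sketch below (with made-up distributions) computes it and also shows why it is a divergence rather than a true distance: it is asymmetric in its arguments.

```python
import numpy as np

def kl(p, q):
    # KL(p || q) for discrete distributions on the same support
    # (assumes strictly positive probabilities for simplicity).
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.2, 0.5, 0.3])

d_pq = kl(p, q)   # KL(p || q)
d_qp = kl(q, p)   # KL(q || p) -- generally a different value
```

KL(p || q) is zero exactly when p = q and positive otherwise, but since KL(p || q) differs from KL(q || p) it does not satisfy the axioms of a metric.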

The novel methodology we want to apply to this task falls under the umbrella of kernel methods. We make use of the well-established methodology of kernel mean embeddings, which allows us to encode a distribution in a functional space. This space can then be learned to be optimal for transferring structure between datasets. A novel way of tackling this learning problem is to use "Noise Contrastive Estimation (NCE)", a recently developed method that learns representations encoding the underlying structure in a signal while ignoring low-level information such as noise. In essence, the idea consists of comparing the data with noise and thereby learning a suitable function space that is able to distinguish them. This method has shown impressive results in many different areas, such as speech, images, text and reinforcement learning. Due to its computational scalability and representational power, we aim to use NCE to learn the function space of conditional distributions and thus learn useful representations for density estimation across different datasets.
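To make the kernel mean embedding idea concrete: embedding a sample into an RKHS via its mean feature map lets one compare whole distributions, e.g. through the (squared) maximum mean discrepancy, which is the RKHS distance between two mean embeddings. The sketch below uses a fixed RBF kernel on 1-D toy data; in the project the function space itself would be learned, so kernel and bandwidth here are illustrative only.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # RBF kernel k(x, y) = exp(-gamma * (x - y)^2) on 1-D inputs.
    d = x[:, None] - y[None, :]
    return np.exp(-gamma * d**2)

def mmd2(x, y, gamma=1.0):
    # Squared RKHS distance between the kernel mean embeddings
    # of the empirical distributions of x and y (biased estimate).
    return (rbf(x, x, gamma).mean()
            + rbf(y, y, gamma).mean()
            - 2 * rbf(x, y, gamma).mean())

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=200)
b = rng.normal(0.0, 1.0, size=200)   # same distribution as a
c = rng.normal(3.0, 1.0, size=200)   # shifted distribution
```

Samples from the same distribution yield a near-zero discrepancy, while the shifted sample is clearly separated, which is what makes mean embeddings useful for comparing and transferring distributional structure.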

Hence, the primary aim of this project is representation learning of conditional distributions that captures the underlying structure of the densities, using the recently developed and promising technique of NCE.
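The core NCE mechanic, learning a density by training a classifier to tell data from noise, can be sketched on a toy 1-D problem. Here the unnormalised log-model is a quadratic in x (so its optimum recovers a Gaussian), and the noise is a standard normal with known density; all of these modelling choices are illustrative, not the project's actual conditional setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data from N(1, 0.5^2); noise from the standard normal (known density).
x_data = rng.normal(1.0, 0.5, size=2000)
x_noise = rng.normal(0.0, 1.0, size=2000)

def log_noise(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def features(x):
    # Unnormalised model: log f(x) = theta[0]*x + theta[1]*x^2 + theta[2];
    # the constant lets NCE estimate the normaliser itself.
    return np.stack([x, x**2, np.ones_like(x)], axis=1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

theta = np.zeros(3)

for _ in range(2000):
    # Classifier logit: log-density ratio between model and noise.
    g_data = features(x_data) @ theta - log_noise(x_data)
    g_noise = features(x_noise) @ theta - log_noise(x_noise)
    # Gradient ascent on the NCE logistic-classification objective.
    grad = (features(x_data) * (1 - sigmoid(g_data))[:, None]).mean(axis=0) \
         - (features(x_noise) * sigmoid(g_noise)[:, None]).mean(axis=0)
    theta += 0.1 * grad
```

At the optimum, the learned logit g(x) equals the log-ratio of the data and noise densities, so the model's quadratic coefficient turns negative (a proper Gaussian shape) and the classifier assigns higher scores to data than to noise; this density-ratio view is what the project exploits for representation learning.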

This project falls within the EPSRC Information and Communication Technologies (ICT) and Engineering research areas. I currently do not have any companies or collaborators involved in my project.



Studentship Projects

Project Reference | Relationship | Related To | Start      | End        | Student Name
EP/R512333/1      |              |            | 01/10/2017 | 30/09/2021 |
1930458           | Studentship  | EP/R512333/1 | 01/10/2017 | 30/09/2021 | Jean-Francois Ton