Publications

Below, * denotes equal contribution.

Hyperparameter Learning via Distributional Transfer

H. Law, P. Zhao, J. Huang, D. Sejdinovic

Advances in Neural Information Processing Systems (NeurIPS), 2019

Bayesian optimisation is a popular technique for hyperparameter learning, but it typically requires initial exploration even when similar prior tasks have already been solved. We propose to transfer information across tasks using kernel embeddings of the distributions of the training datasets used in those tasks. The resulting method converges faster than existing baselines, in some cases requiring only a few evaluations of the target objective.
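To illustrate the core idea, here is a minimal sketch (my assumed choices, not the paper's implementation): each task's training dataset is summarised by its empirical kernel mean embedding, and closeness of embeddings, measured here by the squared MMD with an assumed Gaussian RBF kernel, indicates how much information should transfer. The function names and lengthscale are hypothetical.

import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Pairwise Gaussian RBF kernel between rows of X and rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def task_similarity(X_source, X_target, lengthscale=1.0):
    # Squared MMD between the empirical mean embeddings of two training
    # datasets; small values suggest tasks whose hyperparameter objectives
    # may be similar, so past evaluations can inform the new search.
    k_ss = rbf_kernel(X_source, X_source, lengthscale).mean()
    k_tt = rbf_kernel(X_target, X_target, lengthscale).mean()
    k_st = rbf_kernel(X_source, X_target, lengthscale).mean()
    return k_ss + k_tt - 2.0 * k_st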

Paper

Variational Learning on Aggregate Outputs with Gaussian Processes

H. Law, D. Sejdinovic, E. Cameron, T. C. D. Lucas, S. Flaxman, K. Battle, K. Fukumizu

Advances in Neural Information Processing Systems (NeurIPS), 2018

We construct an approach to learning from aggregated outputs based on variational learning with Gaussian processes. In particular, we propose new bounds and tractable approximations that improve prediction accuracy and scale to large datasets, while explicitly accounting for uncertainty. We apply our framework to a challenging and important problem, the fine-scale spatial modelling of malaria incidence, with over 1 million observations.
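To make the aggregation setting concrete, a minimal sketch under my own assumptions (not the paper's code): a latent log-rate is modelled at fine-scale inputs, but the likelihood only ever sees bag-level Poisson counts, as with region-level malaria case data.

import numpy as np

def aggregate_poisson_loglik(f_latent, bag_index, y_bags):
    # f_latent: latent function values at individual (e.g. pixel) inputs,
    #           e.g. drawn from a Gaussian process posterior.
    # bag_index: bag membership (0..B-1) for each individual input.
    # y_bags: one observed count per bag (the aggregate outputs).
    rates = np.exp(f_latent)                         # positive rate per input
    bag_rates = np.bincount(bag_index, weights=rates)
    # Poisson log-likelihood of the bag totals, up to an additive constant.
    return np.sum(y_bags * np.log(bag_rates) - bag_rates)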

Paper Software Video

A Differentially Private Kernel Two-Sample Test

A. Raj*, H. Law*, D. Sejdinovic, M. Park

Preprint, 2018

Kernel two-sample testing is a useful statistical tool for determining whether data samples arise from different distributions, without imposing any parametric assumptions on those distributions. However, raw data samples can expose sensitive information about individuals who participate in scientific studies, which makes existing tests vulnerable to privacy breaches. We therefore design a new framework for kernel two-sample testing that conforms to differential privacy constraints, guaranteeing the privacy of subjects in the data.
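A minimal, illustrative sketch of the general recipe (not the authors' exact mechanism or constants): compute a kernel two-sample statistic such as the squared MMD with a bounded kernel, then add Laplace noise calibrated to a crude sensitivity bound before releasing it.

import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Gaussian RBF kernel; values lie in (0, 1], which bounds sensitivity.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def private_mmd2(X, Y, epsilon, lengthscale=1.0, rng=None):
    # Biased squared-MMD estimate, released with Laplace noise scaled to a
    # loose sensitivity bound so the output satisfies epsilon-DP; the paper
    # derives tighter constants and the calibration for a valid test.
    if rng is None:
        rng = np.random.default_rng()
    n, m = len(X), len(Y)
    mmd2 = (rbf_kernel(X, X, lengthscale).mean()
            + rbf_kernel(Y, Y, lengthscale).mean()
            - 2.0 * rbf_kernel(X, Y, lengthscale).mean())
    sensitivity = 4.0 / min(n, m)
    return mmd2 + rng.laplace(scale=sensitivity / epsilon)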

Paper

Bayesian Approaches to Distribution Regression

H. Law*, D. Sutherland*, D. Sejdinovic, S. Flaxman

Artificial Intelligence and Statistics (AISTATS), 2018

We construct a Bayesian distribution regression formalism that accounts for bag size uncertainty, improving the robustness and performance of existing models. The proposed models can be framed in a neural-network style, and we demonstrate their performance on the IMDb-WIKI image dataset for celebrity age classification.
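A minimal sketch of the distribution regression pipeline, with my own illustrative choices (random Fourier features and a conjugate Gaussian model, not the paper's exact construction): each bag is summarised by an empirical feature mean, and a Bayesian linear model maps bag means to labels; smaller bags yield noisier means, which is precisely the uncertainty a Bayesian treatment can propagate.

import numpy as np

def random_fourier_features(X, W, b):
    # Random Fourier features approximating an RBF kernel; W: (d, D), b: (D,).
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def bag_means(bags, W, b):
    # One empirical feature-mean vector per bag; bags may differ in size,
    # so small bags give noisy estimates of the true mean embedding.
    return np.stack([random_fourier_features(B, W, b).mean(axis=0) for B in bags])

def weight_posterior(Phi, y, alpha=1.0, noise=0.1):
    # Conjugate Gaussian posterior over linear regression weights.
    A = alpha * np.eye(Phi.shape[1]) + Phi.T @ Phi / noise ** 2
    mean = np.linalg.solve(A, Phi.T @ y / noise ** 2)
    return mean, np.linalg.inv(A)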

Paper Software

Testing and Learning on Distributions with Symmetric Noise Invariance

H. Law, C. Yau, D. Sejdinovic

Advances in Neural Information Processing Systems (NeurIPS), 2017

We construct invariant features of distributions, leading to testing and learning algorithms that are robust to corruption of the input distributions by symmetric additive noise. These features lend themselves to a straightforward neural network formulation, and can easily be incorporated into many existing algorithms.
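The key invariance admits a short sketch (illustrative, with assumed names and a fixed set of frequencies): the characteristic function of symmetric additive noise is real-valued, so normalising the empirical characteristic function of a sample to unit modulus leaves "phase" features that such noise cannot perturb, exactly so when the noise characteristic function is nonnegative, as for Gaussian noise.

import numpy as np

def phase_features(X, freqs):
    # X: (n, d) sample from a distribution; freqs: (D, d) frequencies.
    # Empirical characteristic function at each frequency, normalised to
    # unit modulus; additive noise with a real, nonnegative characteristic
    # function (e.g. Gaussian) cancels out of the normalised values.
    ecf = np.exp(1j * X @ freqs.T).mean(axis=0)       # shape (D,), complex
    phase = ecf / (np.abs(ecf) + 1e-12)               # unit-modulus values
    return np.concatenate([phase.real, phase.imag])   # real feature vector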

Paper Software Video