I am a PhD student in Machine Learning at the School of Informatics of the University of Edinburgh, advised by Michael Gutmann.
My research is currently focused on developing efficient methods for (deep) statistical model estimation from incomplete data. My broader interests include deep probabilistic modelling, probabilistic inference (variational inference, MCMC, importance sampling), deep learning, and software for machine learning.
I obtained an MSc in Artificial Intelligence from the University of Edinburgh and a BEng in Software Engineering from the University of Southampton.
We show that missing data increases the complexity of the posterior distribution over the latent variables in VAEs. To mitigate the increased posterior complexity, we introduce two strategies based on (i) finite and (ii) imputation-based variational mixtures.
We link a structured latent space in VAEs, a commonly desired property, to poor conditional sampling performance of Metropolis-within-Gibbs (MWG). To mitigate these issues, we introduce two original methods for conditional sampling in VAEs: AC-MWG and LAIR.
We propose a new method for statistical model estimation from incomplete data, called variational Gibbs inference (VGI). Whilst being general-purpose, the proposed method outperforms existing VAE- and normalising-flow-specific methods.
We propose an unsupervised representation learning method for a job-title similarity model using noisy skill labels. We show that it is highly effective for tasks such as text ranking and job normalisation.