Vaidotas Šimkus

I am a PhD student in Machine Learning at the School of Informatics of the University of Edinburgh, advised by Michael Gutmann.

My research is currently focused on developing efficient methods for (deep) statistical model estimation from incomplete data. My broader interests include deep probabilistic modelling, probabilistic inference (variational inference, MCMC, importance sampling), deep learning, and software for machine learning.

I obtained an MSc in Artificial Intelligence from the University of Edinburgh and a BEng in Software Engineering from the University of Southampton.

Email  /  GitHub  /  Twitter  /  Mastodon  /  LinkedIn


Publications


Improving Variational Autoencoder Estimation from Incomplete Data with Mixture Variational Families


Vaidotas Šimkus, Michael Gutmann
arXiv preprint, 2024
bib / arxiv

We show that missing data increases the complexity of the posterior distribution of the latent variables in VAEs. To mitigate this increased posterior complexity, we introduce two strategies based on (i) finite and (ii) imputation-based variational mixtures.


Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling


Vaidotas Šimkus, Michael Gutmann
Transactions on Machine Learning Research (TMLR), 2023
url / bib / arxiv / code

We link structured latent spaces in VAEs, a commonly desired property, to the poor conditional sampling performance of Metropolis-within-Gibbs (MWG). To mitigate these issues, we introduce two novel methods for conditional sampling of VAEs: AC-MWG and LAIR.


Variational Gibbs Inference for Statistical Model Estimation from Incomplete Data


Vaidotas Šimkus, Ben Rhodes, Michael Gutmann
Journal of Machine Learning Research (JMLR), 2023
url / bib / arxiv / code / poster / slides / neurips / demo

We propose a new method for statistical model estimation from incomplete data, called variational Gibbs inference (VGI). Whilst being general-purpose, the proposed method outperforms existing VAE- and normalising-flow-specific methods.


Learning Job Titles Similarity from Noisy Skill Labels


Rabih Zbib, Lucas Alvarez Lacasa, Federico Retyk, Rus Poves, Juan Aizpuru, Hermenegildo Fabregat, Vaidotas Šimkus, Emilia García-Casademont
FEAST workshop at ECML-PKDD, 2022
url / bib / arxiv / dataset

We propose an unsupervised representation learning method for a job title similarity model using noisy skill labels. We show that it is highly effective for tasks such as text ranking and job normalization.