Variational Inference for Latent Variables and Uncertain Inputs in Gaussian Processes
Authors: Andreas C. Damianou, Michalis K. Titsias, Neil D. Lawrence
JMLR 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate our method on synthetic data and standard machine learning benchmarks, as well as challenging real world datasets, including high resolution video data. |
| Researcher Affiliation | Academia | Andreas C. Damianou EMAIL Dept. of Computer Science and Sheffield Institute for Translational Neuroscience University of Sheffield UK; Michalis K. Titsias EMAIL Department of Informatics Athens University of Economics and Business Greece; Neil D. Lawrence EMAIL Dept. of Computer Science and Sheffield Institute for Translational Neuroscience University of Sheffield UK |
| Pseudocode | Yes | Algorithm 1 GP Regression with Missing Inputs Model: Training and predictions |
| Open Source Code | Yes | Matlab source code for repeating the following experiments is available on-line from: http://git.io/A3TN and supplementary videos from: http://git.io/A3t5. |
| Open Datasets | Yes | We illustrate the method in the multi-phase oil flow data (Bishop and James, 1993)... We followed Taylor et al. (2007); Lawrence (2007) in considering motion capture data of walks and runs taken from subject 35 in the CMU motion capture database... We consider the well known USPS digits dataset. |
| Dataset Splits | Yes | This dataset consists of 16 × 16 images for all 10 digits and it is divided into 7291 training examples and 2007 test examples. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used for the experiments. |
| Software Dependencies | No | Matlab source code for repeating the experiments is available on-line from: http://git.io/A3TN and supplementary videos from: http://git.io/A3t5. The paper also mentions using 'scikit-learn (Pedregosa et al., 2011)' and 'GPy (authors, 2014)' for comparisons, but does not provide specific version numbers for its own implementation or for the GPy library. |
| Experiment Setup | Yes | In the experiments, a latent space variational distribution is required as initialisation. We use PCA to initialise the q dimensional means. The variances are initialised to values around 0.5... Inducing points are initialised as a random subset of the initial latent space. ARD inverse lengthscales are initialised based on a heuristic... the model is trained by optimising jointly all (hyper)parameters using the scaled conjugate gradients method. The optimisation is stopped when the change in the objective (variational lower bound) becomes very small. For the USPS database, we used 10 latent dimensions and 50 inducing variables for each model. |
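The initialisation heuristics quoted above (PCA means, variances around 0.5, inducing points drawn from the initial latent space) can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' Matlab implementation; the function name `init_bgplvm` and the spread-based inverse-lengthscale heuristic are hypothetical stand-ins for the heuristic the paper leaves unspecified.

```python
import numpy as np

def init_bgplvm(Y, q=10, num_inducing=50, seed=0):
    """Sketch of the paper's described initialisation for a Bayesian GP-LVM.

    Y            : (n, d) observed data matrix
    q            : latent dimensionality (10 for the USPS experiment)
    num_inducing : number of inducing variables (50 for the USPS experiment)
    """
    rng = np.random.default_rng(seed)

    # PCA (via SVD of the centred data) initialises the q-dimensional
    # variational means of the latent points.
    Yc = Y - Y.mean(axis=0)
    U, S, _ = np.linalg.svd(Yc, full_matrices=False)
    X_mean = U[:, :q] * S[:q]

    # Variational variances are initialised to values around 0.5.
    X_var = 0.5 * np.ones_like(X_mean)

    # Inducing points: a random subset of the initial latent means.
    idx = rng.choice(Y.shape[0], size=num_inducing, replace=False)
    Z = X_mean[idx].copy()

    # ARD inverse lengthscales: an assumed heuristic based on the spread
    # of each latent dimension (the paper does not spell out its exact form).
    inv_lengthscales = 1.0 / np.maximum(X_mean.std(axis=0), 1e-6)

    return X_mean, X_var, Z, inv_lengthscales
```

After this initialisation, all (hyper)parameters would be optimised jointly, e.g. with a scaled-conjugate-gradients routine, until the variational lower bound stops improving.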