Amortized Variational Inference: A Systematic Review
Authors: Ankush Ganguly, Sanjana Jain, Ukrit Watchareeruetai
JAIR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we review the mathematical foundations of various VI techniques to form the basis for understanding amortized VI. Additionally, we provide an overview of the recent trends that address several issues of amortized VI, such as the amortization gap, generalization issues, inconsistent representation learning, and posterior collapse. Finally, we analyze alternate divergence measures that improve VI optimization. |
| Researcher Affiliation | Industry | Ankush Ganguly, Sanjana Jain, Ukrit Watchareeruetai — Sertis Vision Lab, Sukhumvit Road, Watthana, Bangkok 10110, Thailand |
| Pseudocode | Yes | Algorithm 1: CAVI for the traditional VI optimization process... Algorithm 2: The SVI optimization process based on Hoffman et al. (2013)... Algorithm 3: The amortized VI optimization process using stochastic gradient ascent |
| Open Source Code | No | The paper is a systematic review of existing techniques and does not present new methodologies requiring custom source code. Therefore, it does not contain a statement about open-sourcing code for its own work or provide a link to a code repository. |
| Open Datasets | No | The paper is a review and does not present new experimental results using specific datasets. It mentions datasets such as MNIST and CelebA in the context of discussing other research, but it does not provide access information for data used by the authors of this paper for their own analysis. |
| Dataset Splits | No | The paper is a review and does not describe new experimental work conducted by its authors. Therefore, it does not provide information about dataset splits for reproducibility. |
| Hardware Specification | No | The paper is a review and does not describe new experimental work conducted by its authors. Therefore, it does not specify any hardware used for running experiments. |
| Software Dependencies | No | The paper is a review and does not describe new experimental work conducted by its authors. Therefore, it does not list any specific software dependencies or version numbers. |
| Experiment Setup | No | The paper is a review and does not describe new experimental work conducted by its authors. Therefore, it does not provide details about experimental setup, hyperparameters, or training settings. |
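The pseudocode row above refers to the paper's Algorithm 3, amortized VI optimized by stochastic gradient ascent on the ELBO: a single set of inference parameters is shared across all data points instead of fitting per-datapoint variational parameters. The sketch below illustrates that idea on a deliberately tiny toy model (the model, the parameter names `a`, `b`, `c`, and the closed-form gradients are illustrative assumptions, not taken from the paper): prior z ~ N(0, 1), likelihood x|z ~ N(z, 1), and an amortized posterior q(z|x) = N(a·x + b, exp(c)) whose ELBO and gradients are available in closed form, so plain gradient ascent suffices.

```python
import numpy as np

# Toy model (assumed for illustration): prior z ~ N(0, 1), likelihood x|z ~ N(z, 1).
# Amortized posterior q(z|x) = N(a*x + b, exp(c)): the parameters (a, b, c) are
# shared across all data points, which is the defining feature of amortized VI.

def elbo_and_grads(x, a, b, c):
    """Closed-form ELBO for the toy model, averaged over a batch x,
    plus its gradients with respect to (a, b, c)."""
    mu = a * x + b          # amortized posterior mean for each x
    s2 = np.exp(c)          # shared posterior variance (log-parameterized)
    # ELBO = E_q[log p(x|z)] - KL(q(z|x) || p(z)), both terms in closed form.
    elbo = (-0.5 * np.log(2 * np.pi)
            - 0.5 * ((x - mu) ** 2 + s2)            # E_q[log p(x|z)]
            - 0.5 * (mu ** 2 + s2 - 1.0 - c))       # -KL(q || p) for Gaussians
    d_mu = (x - mu) - mu                            # dELBO/dmu per data point
    ga = (d_mu * x).mean()                          # chain rule: dmu/da = x
    gb = d_mu.mean()                                # chain rule: dmu/db = 1
    gc = 0.5 - s2                                   # dELBO/dc (s2 is scalar)
    return elbo.mean(), ga, gb, gc

def fit(x, steps=2000, lr=0.1):
    """Gradient-ascent loop in the spirit of the paper's Algorithm 3."""
    a, b, c = 0.0, 0.0, 0.0
    for _ in range(steps):
        _, ga, gb, gc = elbo_and_grads(x, a, b, c)
        a, b, c = a + lr * ga, b + lr * gb, c + lr * gc  # ascend the ELBO
    return a, b, c
```

For this toy model the exact posterior is p(z|x) = N(x/2, 1/2), so a correct run should drive `a` toward 0.5, `b` toward 0, and `exp(c)` toward 0.5; because the inference map is expressive enough here, the amortization gap discussed in the paper vanishes, which is generally not the case for neural inference networks.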