Log-Likelihood Ratio Minimizing Flows: Towards Robust and Quantifiable Neural Distribution Alignment
Authors: Ben Usman, Avneesh Sud, Nick Dufour, Kate Saenko
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we present experiments that verify that minimizing the proposed LRMF objective (3) with Gaussian, Real NVP, and FFJORD density estimators does indeed result in dataset alignment. |
| Researcher Affiliation | Collaboration | Ben Usman (1,2), Avneesh Sud (2), Nick Dufour (2), Kate Saenko (1,3) — 1: Boston University, 2: Google AI, 3: MIT-IBM Watson AI Lab |
| Pseudocode | No | The paper presents mathematical definitions and derivations but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | We provide Jupyter notebooks with code in JAX [6] and TensorFlow Probability (TFP) [7]. |
| Open Datasets | Yes | We also trained a Real NVP LRMF to map latent codes of USPS digits to latent codes of MNIST. |
| Dataset Splits | No | This enables automatic model validation and hyperparameter tuning on the held-out set. (The quoted sentence describes a capability of the method; concrete train/validation/test split specifications are not reported.) |
| Hardware Specification | No | The paper does not specify any particular hardware (e.g., GPU models, CPU types, or cloud instance specifications) used for running the experiments. |
| Software Dependencies | No | We provide Jupyter notebooks with code in JAX [6] and TensorFlow Probability (TFP) [7]. (The libraries are named, but no version numbers are specified.) |
| Experiment Setup | No | We used original hyperparameters and network architectures from Real NVP [8] and FFJORD [11], the exact values are given in the supplementary. |
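To make the quoted LRMF objective concrete, here is a minimal, heavily simplified sketch in JAX (the paper's own framework). It is not the authors' implementation: it assumes a hypothetical 1-D affine map `T(x) = scale * x + shift` in place of the paper's Real NVP / FFJORD flows, uses closed-form Gaussian maximum-likelihood fits as the density estimators, and runs on toy data. The core idea it illustrates is the log-likelihood ratio: the negative log-likelihood of a single Gaussian fitted to the pooled data A ∪ T(B), minus the weighted negative log-likelihoods of Gaussians fitted to A and T(B) separately. This ratio is non-negative and reaches zero when T aligns the two samples' fitted moments, so minimizing it over T drives alignment.

```python
import jax
import jax.numpy as jnp

def gauss_nll(x):
    # Average negative log-likelihood of samples x under the 1-D
    # Gaussian MLE fitted to x itself; in closed form this depends
    # only on the sample variance.
    var = jnp.var(x) + 1e-6
    return 0.5 * (jnp.log(2.0 * jnp.pi * var) + 1.0)

def lrmf_objective(params, a, b):
    # Hypothetical 1-D affine "flow" T(x) = scale * x + shift,
    # standing in for the paper's Real NVP / FFJORD transports.
    scale, shift = params
    tb = scale * b + shift
    n_a, n_b = a.shape[0], tb.shape[0]
    w_a, w_b = n_a / (n_a + n_b), n_b / (n_a + n_b)
    joint = jnp.concatenate([a, tb])
    # Per-sample log-likelihood ratio: a single shared Gaussian fit can
    # never beat the two separate fits, so this is >= 0, and it equals
    # zero exactly when T(B) matches A's fitted mean and variance.
    return gauss_nll(joint) - w_a * gauss_nll(a) - w_b * gauss_nll(tb)

# Toy data: A ~ N(2, 0.5^2), B ~ N(-1, 2^2).
a = 2.0 + 0.5 * jax.random.normal(jax.random.PRNGKey(0), (512,))
b = -1.0 + 2.0 * jax.random.normal(jax.random.PRNGKey(1), (512,))

# Plain gradient descent on the two flow parameters.
params = jnp.array([1.0, 0.0])
grad_fn = jax.jit(jax.grad(lrmf_objective))
for _ in range(2000):
    params = params - 0.1 * grad_fn(params, a, b)
```

After training, `scale * b + shift` has roughly the same fitted mean and variance as `a`, and the objective is driven toward zero. The quantifiability claim in the paper's title corresponds to this property: the objective's value at convergence directly measures residual misalignment, rather than being an uninterpretable adversarial loss.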