On the Complexity of Approximating Multimarginal Optimal Transport

Authors: Tianyi Lin, Nhat Ho, Marco Cuturi, Michael I. Jordan

JMLR 2022

Reproducibility Variable Result LLM Response
Research Type: Experimental. "Preliminary results on synthetic data and real images demonstrate the effectiveness and efficiency of our algorithms." ... "6. Experiments. In this section, we evaluate our new algorithms on both synthetic data and real images."
Researcher Affiliation: Collaboration. Tianyi Lin, Department of Electrical Engineering and Computer Science, University of California, Berkeley, CA 94720-1776, USA; Nhat Ho, Department of Statistics and Data Sciences, University of Texas at Austin, TX 78712-1823, USA; Marco Cuturi, Google Brain and Department of Statistics, CREST ENSAE; Michael I. Jordan, Department of Electrical Engineering and Computer Science and Department of Statistics, University of California, Berkeley, CA 94720-1776, USA. There is a mix of academic (University of California, Berkeley; University of Texas at Austin; CREST ENSAE) and industry (Google Brain) affiliations.
Pseudocode: Yes. Algorithm 1, Multi Sinkhorn(C, η, {r_k}_{k∈[m]}, ε′); Algorithm 2, Round(X, {r_k}_{k∈[m]}); Algorithm 3, Approximating MOT by Algorithms 1 and 2; Algorithm 4, Accelerated Multi Sinkhorn(C, η, {r_k}_{k∈[m]}, ε′); Algorithm 5, Approximating MOT by Algorithms 2 and 4.
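The multimarginal Sinkhorn scheme named in these algorithms can be sketched as follows. This is a minimal illustration for m = 3 marginals with a dense cost tensor and a fixed iteration budget rather than the paper's ε-dependent stopping rule; the function and variable names are ours, not the paper's.

```python
import numpy as np

def scaled_tensor(K, betas):
    # X = K elementwise-scaled by exp(beta_1) x exp(beta_2) x exp(beta_3)
    return (K * np.exp(betas[0])[:, None, None]
              * np.exp(betas[1])[None, :, None]
              * np.exp(betas[2])[None, None, :])

def multi_sinkhorn(C, eta, marginals, n_iter=300):
    """Hedged sketch of a multimarginal Sinkhorn loop for m = 3.

    C is a cost tensor of shape (n1, n2, n3); marginals is a list of
    three probability vectors r_k. Illustrative only, not Algorithm 1
    verbatim.
    """
    K = np.exp(-C / eta)  # entropic (Gibbs) kernel
    betas = [np.zeros(C.shape[k]) for k in range(3)]
    for _ in range(n_iter):
        for k in range(3):
            X = scaled_tensor(K, betas)
            # sum out every axis except k to get the k-th marginal of X,
            # then correct beta_k toward r_k in the log domain
            axes = tuple(i for i in range(3) if i != k)
            betas[k] += np.log(marginals[k]) - np.log(X.sum(axis=axes))
    return scaled_tensor(K, betas)
```

After the loop, the last-updated marginal is matched exactly and the others approximately; a rounding step in the spirit of Algorithm 2 would then project the near-feasible tensor onto the transportation polytope.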
Open Source Code: No. The paper discusses various algorithms and mentions existing tools such as MATLAB and the POT package, but it contains no statement that source code for the described methodology is publicly available, and it includes no link to a code repository.
Open Datasets: Yes. "We conduct the experiment with the same setup and MNIST dataset. The MNIST dataset consists of 60,000 images of handwritten digits of size 28 by 28 pixels."
Dataset Splits: No. The paper uses the MNIST dataset and synthetic data but does not specify how they were partitioned into training, validation, or test sets for the experiments.
Hardware Specification: Yes. "All the experiments are conducted in MATLAB R2020a on a workstation with an Intel Core i5-9400F (6 cores and 6 threads) and 32GB memory, equipped with Ubuntu 18.04."
Software Dependencies: Yes. "All the experiments are conducted in MATLAB R2020a on a workstation with an Intel Core i5-9400F (6 cores and 6 threads) and 32GB memory, equipped with Ubuntu 18.04." The named software environment is MATLAB R2020a on Ubuntu 18.04.
Experiment Setup: No. The paper describes generating synthetic images with certain characteristics (background pixels uniform on [0, 1], foreground pixels uniform on [0, 50], a foreground square covering 10% of the image), regularization parameters η ∈ {1, 0.2, 0.1} for synthetic data and η ∈ {1, 0.05, 0.02} for MNIST, and a maximum of 10 iterations. However, it does not provide the comprehensive hyperparameter values or system-level settings typically found in a full experimental-setup section.
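For concreteness, the synthetic-image generation described in this row can be sketched as follows. The image size and the placement of the foreground square are assumptions on our part, since the row notes the paper does not fully specify the setup.

```python
import numpy as np

def synthetic_image(size=20, rng=None):
    """Sketch of the synthetic images described above: a uniform [0, 1]
    background with a brighter uniform [0, 50] foreground square whose
    area is roughly 10% of the image. The square's position is random;
    size and placement are our assumptions, not the paper's spec."""
    rng = np.random.default_rng(rng)
    img = rng.uniform(0.0, 1.0, size=(size, size))
    side = max(1, int(round(size * np.sqrt(0.10))))  # ~10% of total area
    r = rng.integers(0, size - side + 1)
    c = rng.integers(0, size - side + 1)
    img[r:r + side, c:c + side] = rng.uniform(0.0, 50.0, size=(side, side))
    return img / img.sum()  # normalize into a probability marginal
```

Normalizing each image to sum to one turns it into a marginal distribution suitable as an input r_k to the MOT algorithms.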