AutoKeras: An AutoML Library for Deep Learning
Authors: Haifeng Jin, François Chollet, Qingquan Song, Xia Hu
JMLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The experimental results are published on the AutoKeras official website (autokeras.com). |
| Researcher Affiliation | Collaboration | Haifeng Jin (1,2), François Chollet (1), Qingquan Song (2), Xia Hu (3). Affiliations: 1 Google LLC, Mountain View, CA 94043, USA; 2 Texas A&M University, College Station, TX 77843, USA; 3 Rice University, Houston, TX 77005, USA |
| Pseudocode | Yes | Algorithm 1 (the search algorithm of AutoKeras): for i = 1 to t, where t is the total number of evaluations in the search: if i <= m, where m is the number of predefined configs, then eval(i-th pre-defined hp), i.e., evaluate the pre-defined configurations; else eval(mutate(get_best_hp())), i.e., mutate the current best configuration for evaluation. |
| Open Source Code | Yes | Acknowledgments We thank the reviewers for their helpful comments, and we thank all the contributors from our open-source community for their work. |
| Open Datasets | No | The paper describes the capabilities of the AutoKeras library for different data types (images, texts, structured data) and uses a hypothetical example of an 'image of a house with attributes', but it does not specify any particular datasets used for evaluation or provide access information for any. |
| Dataset Splits | No | The paper does not provide specific dataset split information for any experiments. It describes how AutoKeras handles training data and supports multi-modal data, but no explicit splits are mentioned. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. It mentions the TensorFlow ecosystem, which is compatible with various hardware, but no concrete specifications are given. |
| Software Dependencies | No | The paper states that "AutoKeras is built on top of KerasTuner (O'Malley et al., 2019), Keras (Chollet et al., 2015), and TensorFlow (Abadi et al., 2016)." While these are key software components, specific version numbers (e.g., Keras 2.x, TensorFlow 2.x) are not provided, only the year of their respective papers. |
| Experiment Setup | No | The paper describes how AutoKeras automates hyperparameter tuning and model selection, mentioning elements like "optimizer, learning rate, and weight decay" as being tuned. However, it does not provide specific hyperparameter values or training configurations used in any experimental setup within the paper itself. |
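The search strategy extracted in the Pseudocode row above (evaluate m pre-defined configurations, then repeatedly mutate the best configuration found so far) can be sketched as a short greedy loop. This is a minimal illustration, not the AutoKeras implementation: the `objective` and `mutate` functions below are toy stand-ins for training a model and perturbing its hyperparameters, and the single `lr` hyperparameter is assumed for demonstration only.

```python
import random

def objective(hp):
    # Toy stand-in for training and validating a model with
    # hyperparameters hp; higher is better. Peaks at lr = 0.01.
    return -(hp["lr"] - 0.01) ** 2

def mutate(hp, rng):
    # Perturb one hyperparameter of a configuration.
    new_hp = dict(hp)
    new_hp["lr"] *= rng.choice([0.5, 2.0])
    return new_hp

def search(predefined, t, rng=None):
    # t total evaluations; the first len(predefined) are the
    # pre-defined configs, the rest mutate the current best.
    rng = rng or random.Random(0)
    history = []  # (score, hp) pairs
    for i in range(t):
        if i < len(predefined):            # "if i <= m" in the pseudocode
            hp = predefined[i]             # evaluate pre-defined config
        else:
            best_hp = max(history, key=lambda s: s[0])[1]  # get_best_hp()
            hp = mutate(best_hp, rng)      # mutate the current best
        history.append((objective(hp), hp))
    return max(history, key=lambda s: s[0])[1]

best = search([{"lr": 0.1}, {"lr": 0.001}], t=20)
print(best)
```

Because the returned configuration is the maximum over the whole history, including the pre-defined seeds, the result can never score worse than the best seed configuration.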