When Hardness of Approximation Meets Hardness of Learning
Authors: Eran Malach, Shai Shalev-Shwartz
JMLR 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper states: "In this work, we show a single hardness property that implies both hardness of approximation using linear classes and shallow networks, and hardness of learning using correlation queries and gradient-descent. This allows us to obtain new results on hardness of approximation and learnability of parity functions, DNF formulas and AC0 circuits." The body consists of theorems and proofs (e.g., "We rely on a very simple observation, that will be the key of the analysis in this section."). |
| Researcher Affiliation | Academia | Eran Malach (EMAIL), School of Computer Science, The Hebrew University, Jerusalem, Israel; Shai Shalev-Shwartz (EMAIL), School of Computer Science, The Hebrew University, Jerusalem, Israel |
| Pseudocode | No | The paper describes mathematical frameworks, theorems, and proofs. It does not include any explicitly labeled pseudocode or algorithm blocks. The gradient descent steps are presented as mathematical equations rather than structured algorithmic pseudocode. |
| Open Source Code | No | The paper presents theoretical results and mathematical proofs. There is no mention of code availability, supplementary materials containing code, or links to any code repositories. |
| Open Datasets | No | The paper analyzes theoretical concepts such as 'distribution families', 'parities', 'DNF formulas', and 'AC0 circuits'. It does not mention or provide access information for any specific publicly available datasets. |
| Dataset Splits | No | The paper is theoretical and does not describe experiments on specific datasets; therefore, there is no mention of dataset splits such as training, validation, or test sets. |
| Hardware Specification | No | The paper focuses on theoretical analysis and mathematical proofs. It does not describe any experimental setup or specify hardware used for computations. |
| Software Dependencies | No | The paper is a theoretical work presenting mathematical proofs and analyses. It does not describe any computational experiments or implementations, and therefore does not list software dependencies with specific version numbers. |
| Experiment Setup | No | This is a theoretical paper presenting mathematical theorems and proofs, not an experimental one. Therefore, it does not contain details about experimental setup, hyperparameters, or training configurations. |
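Although the paper itself contains no code, the hardness claim quoted above (that parity functions are hard to learn via correlation queries) rests on a standard Fourier-analytic fact that is easy to verify computationally: over the uniform distribution on the Boolean cube, a parity on more than one coordinate has exactly zero correlation with any single input coordinate. The sketch below is our own illustration, not the authors' code; the parity set `S` and coordinate indices are arbitrary hypothetical choices.

```python
import itertools

def parity(x, S):
    """chi_S(x) = product of x[i] for i in S, with x in {-1, +1}^n."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def correlation(n, S, j):
    """Exact correlation E[chi_S(x) * x_j] under the uniform
    distribution on {-1, +1}^n, computed by full enumeration."""
    total = 0
    for x in itertools.product([-1, 1], repeat=n):
        total += parity(x, S) * x[j]
    return total / 2 ** n

n = 6
S = (0, 2, 3)  # a parity on three coordinates (hypothetical choice)
print(correlation(n, S, 1))  # coordinate outside S -> 0.0
print(correlation(n, S, 0))  # coordinate inside S: still 0.0, since |S| > 1
```

Both correlations vanish because chi_S(x) * x_j is itself a parity on a nonempty set of coordinates, which has mean zero under the uniform distribution; this is the "no signal" phenomenon behind correlation-query and gradient-based hardness results for parities.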