Iterated Learning in Dynamic Social Networks
Authors: Bernard Chazelle, Chu Wang
JMLR 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | "We show our main results in each section, followed by details of the proofs and discussions." (Section 1, Introduction). "We prove that it is indeed a metric in the Appendix and also explain its name." (Section 3.1). "To establish Theorem 1, we estimate the probability P that each learner ends up picking h1." (Section 3.3). "Theorem 3: For any 0 < ε < 1, the following sample size sequence makes chained iterated learning strongly ε-self-sustaining..." (Section 3.5). "Theorem 4: Given any small enough δ, ε > 0, the following sample size sequence for iterated Bayesian linear regression ensures that ‖E µt − µ0‖₂ ≤ δ with probability greater than 1 − ε: ..." (Section 3.6). "Theorem 6: Under the truth-hearing assumption, the system reaches truthful consensus with a convergence rate bounded by O(t^(−γ/(2η))), where η is the maximum outdegree over all the networks." (Section 5.2). |
| Researcher Affiliation | Collaboration | Bernard Chazelle (EMAIL), Department of Computer Science, Princeton University, 35 Olden Street, Princeton, NJ 08544; Chu Wang (EMAIL), Amazon Inc., 500 Boren Avenue, Seattle, WA 98109 |
| Pseudocode | No | The paper describes mathematical models, theorems, and proofs using equations and prose, but does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement regarding the availability of source code or a link to a code repository. |
| Open Datasets | No | The paper primarily presents theoretical work and does not use or provide access to any specific datasets for empirical evaluation. It references previous work in language evolution models and iterated Bayesian linear regression but does not use their datasets for its own analysis. |
| Dataset Splits | No | This paper is theoretical and does not perform experiments requiring dataset splits. |
| Hardware Specification | No | The paper is a theoretical work and does not describe experimental hardware specifications. |
| Software Dependencies | No | The paper is a theoretical work and does not describe software dependencies with version numbers. |
| Experiment Setup | No | The paper focuses on theoretical analysis and does not describe any experimental setup details or hyperparameters. |
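Since the paper is purely theoretical, no reference implementation exists; the chained iterated-learning model quoted above (each learner draws samples from its predecessor's hypothesis and adopts the posterior-favored one) can nonetheless be sketched in a few lines. The following is a hypothetical illustration, not code from the paper: it uses two assumed Bernoulli hypotheses h1 and h2 and a uniform prior, and the parameter names (`p1`, `p2`, `n`, `generations`) are our own.

```python
import random

def iterated_learning(p1=0.7, p2=0.3, prior=0.5, generations=50, n=20, seed=0):
    """Toy chained iterated learning over two Bernoulli hypotheses.

    h1 claims heads-probability p1, h2 claims p2. Each learner observes
    n coin flips generated from the previous learner's hypothesis,
    computes the posterior over {h1, h2}, and adopts the MAP hypothesis.
    Returns a list of booleans: True where a learner picked h1.
    """
    rng = random.Random(seed)
    current = p1  # generation 0 teaches using h1
    history = []
    for _ in range(generations):
        heads = sum(rng.random() < current for _ in range(n))
        tails = n - heads
        like1 = (p1 ** heads) * ((1 - p1) ** tails)
        like2 = (p2 ** heads) * ((1 - p2) ** tails)
        post1 = prior * like1 / (prior * like1 + (1 - prior) * like2)
        current = p1 if post1 >= 0.5 else p2
        history.append(current == p1)
    return history
```

Varying `n` in this toy chain mirrors the role of the sample-size sequences in Theorems 3 and 4: larger samples per generation make the chain more likely to keep transmitting h1 (i.e. to be self-sustaining), while small samples let the chain drift.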