Ask, and it shall be given: On the Turing completeness of prompting

Authors: Ruizhong Qiu, Zhe Xu, Wenxuan Bao, Hanghang Tong

ICLR 2025

Reproducibility Variable Result LLM Response
Research Type | Theoretical | Here, we present the first theoretical study of the LLM prompting paradigm to the best of our knowledge. In this work, we show that prompting is in fact Turing-complete: there exists a finite-size Transformer such that for any computable function, there exists a corresponding prompt following which the Transformer computes the function.
Researcher Affiliation | Academia | Ruizhong Qiu, Zhe Xu, Wenxuan Bao, and Hanghang Tong, University of Illinois Urbana-Champaign.
Pseudocode | Yes | Algorithm 1 (autoregressive generation, generate_Γ(x)). Input: decoder-only Transformer Γ : Σ⁺ → Σ; nonempty input x ∈ Σ⁺; stop token $ ∈ Σ. Output: generated output in Σ⁺ ∪ Σ^ω.
Open Source Code | Yes | Our code and demo results are publicly available at https://github.com/q-rz/ICLR25-prompting-theory.
Open Datasets | No | The paper is theoretical and does not involve experiments that use specific datasets.
Dataset Splits | No | The paper is theoretical and does not involve empirical experiments with datasets that would require splitting.
Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. The "Experiments" section mentions a code implementation and demo results but no hardware specifications.
Software Dependencies | No | The paper mentions a "Python implementation" for its demonstration but does not provide specific software dependencies or version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or training configurations.
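The autoregressive generation procedure named in the pseudocode row (Algorithm 1, generate_Γ(x)) can be sketched in Python. This is a hedged illustration, not the paper's construction: `next_token` is a hypothetical stand-in for the Transformer Γ : Σ⁺ → Σ, and `max_steps` truncates the non-terminating case (an output in Σ^ω) that the algorithm's output type permits.

```python
def generate(next_token, x, stop="$", max_steps=1000):
    """Sketch of Algorithm 1: repeatedly append Γ(sequence) to the
    sequence until the stop token $ is emitted.

    next_token -- stand-in for the decoder-only Transformer Γ : Σ⁺ → Σ
    x          -- nonempty input, a list of tokens (x ∈ Σ⁺)
    Returns the generated suffix (tokens produced after the input).
    Without max_steps, a never-stopping Γ would yield an infinite
    output in Σ^ω; the cap keeps this sketch finite.
    """
    seq = list(x)   # working sequence fed back into Γ each step
    out = []        # generated output so far
    for _ in range(max_steps):
        t = next_token(seq)
        out.append(t)
        seq.append(t)
        if t == stop:
            break
    return out

# Toy Γ: emit the first input token once, then the stop token.
def toy_next_token(seq):
    return "$" if len(seq) > 1 and seq[-1] == seq[0] else seq[0]

print(generate(toy_next_token, ["a", "b"]))  # ['a', '$']
```

The feedback loop (appending each emitted token back onto the input) is what makes the process autoregressive; the stop token distinguishes the finite outputs in Σ⁺ from the infinite ones in Σ^ω.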