Preprint

Is In-Context Universality Enough? MLPs are Also Universal In-Context

Abstract

The success of transformers is often linked to their ability to perform in-context learning. Recent work shows that transformers are universal in

Authors

Kratsios A; Furuya T

Publication date

February 5, 2025

DOI

10.48550/arXiv.2502.03327

Preprint server

arXiv