Preprint
Is In-Context Universality Enough? MLPs are Also Universal In-Context
Abstract
The success of transformers is often linked to their ability to perform
in-context learning. Recent work shows that transformers are universal in-context […]
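For orientation, here is a minimal LaTeX sketch of the generic shape such an in-context universality claim takes. It is not quoted from the paper: the symbols T (the in-context model), f (the target), X (the input domain), \mathcal{P}(X) (probability measures over X, modeling the context), and the uniform-error quantification are all assumptions made for illustration; the paper's exact hypothesis class, topology on measures, and regularity conditions are not visible in this truncated abstract.

% Generic in-context universal approximation statement (assumed form,
% for illustration only): a model T consumes a context, modeled as a
% probability measure \mu on a set X, plus a query point x, and
% uniformly approximates any continuous real-valued target f.
\[
  \forall f \in C\!\left(\mathcal{P}(X) \times X,\ \mathbb{R}\right),\;
  \forall \varepsilon > 0,\;
  \exists T \;\text{such that}\;
  \sup_{\mu \in \mathcal{P}(X),\, x \in X}
    \left| T(\mu, x) - f(\mu, x) \right| < \varepsilon .
\]

Under a statement of this shape, the title's question amounts to asking whether satisfying it is specific to transformers; the paper's answer is that MLPs satisfy an in-context universality property as well.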
Authors
Kratsios, A.; Furuya, T.
Publication date
February 5, 2025
DOI
10.48550/arXiv.2502.03327
Preprint server
arXiv