Preprint

Filtered not Mixed: Stochastic Filtering-Based Online Gating for Mixture of Large Language Models

Abstract

We propose MoE-F, a formalized mechanism for combining $N$ pre-trained Large Language Models (LLMs) for online time-series prediction by adaptively …

Authors

Saqur R; Kratsios A; Krach F; Limmer Y; Tian J-J; Willes J; Horvath B; Rudzicz F

Publication date

June 5, 2024

DOI

10.48550/arXiv.2406.02969

Preprint server

arXiv