Journal article

Parsimonious mixture‐of‐experts based on mean mixture of multivariate normal distributions

Abstract

The mixture‐of‐experts (MoE) paradigm attempts to learn complex models by combining several “experts” via probabilistic mixture models. Each expert in the MoE model handles a small region of the data space, with a gating function controlling the data‐to‐expert assignment. The MoE framework has been used extensively in designing non‐linear models in machine learning and statistics to model the heterogeneity in data for the purpose of regression, …
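
As a hedged illustration of the general MoE form described in the abstract (not the paper's specific mean mixture of multivariate normal experts), the conditional density with a softmax gating function can be written as

f(\mathbf{y} \mid \mathbf{x}) = \sum_{k=1}^{K} \pi_k(\mathbf{x}) \, f_k(\mathbf{y} \mid \mathbf{x}; \boldsymbol{\theta}_k),
\qquad
\pi_k(\mathbf{x}) = \frac{\exp(\mathbf{x}^{\top} \boldsymbol{\alpha}_k)}{\sum_{j=1}^{K} \exp(\mathbf{x}^{\top} \boldsymbol{\alpha}_j)},

where K is the number of experts, f_k is the k-th expert's density with parameters \boldsymbol{\theta}_k, and \pi_k(\mathbf{x}) is the gating function that assigns data points to experts; the symbols \boldsymbol{\alpha}_k are assumed gating parameters used here only for illustration.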

Authors

Sepahdar A; Madadi M; Balakrishnan N; Jamalizadeh A

Journal

Stat, Vol. 11, No. 1

Publisher

Wiley

Publication Date

December 2022

DOI

10.1002/sta4.421

ISSN

2049-1573