
Proceedings Paper

Use of localized gating in mixture of experts networks
Author(s): Viswanath Ramamurti; Joydeep Ghosh

Paper Abstract

The mixture-of-experts (MOE) is a popular architecture for function approximation. In the standard architecture, each expert is gated via a softmax function, so its domain of application is not very localized. This paper summarizes several recent results showing the advantages of using localized gating instead. These include a natural framework for model selection/adaptation by growing and shrinking the number of experts, modeling of non-stationary environments, improved generalization performance, and confidence intervals for network outputs. These results substantially increase the scope and power of MOE networks. Several simulation results are presented to support the theoretical arguments.
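As a minimal sketch (not the authors' implementation), the contrast between the standard softmax gate and a localized gate can be illustrated as follows. The function names, linear experts, and normalized-Gaussian form of the localized gate are illustrative assumptions.

import numpy as np

def softmax_gate(x, V):
    """Standard MOE gate: softmax over linear scores v_j^T x.
    Each gate's influence extends over an unbounded half-space,
    so experts are not very localized."""
    scores = V @ x                      # shape: (n_experts,)
    scores -= scores.max()              # numerical stability
    e = np.exp(scores)
    return e / e.sum()

def localized_gate(x, centers, sigmas):
    """Localized gate (assumed normalized-Gaussian form): each expert's
    influence decays with distance from its center, so its domain of
    application is localized."""
    d2 = ((centers - x) ** 2).sum(axis=1)        # squared distances to centers
    k = np.exp(-d2 / (2.0 * sigmas ** 2))        # Gaussian kernel activations
    return k / k.sum()

def moe_output(x, gates, expert_weights):
    """Mixture output: gate-weighted sum of (linear) expert outputs."""
    expert_outs = expert_weights @ np.append(x, 1.0)   # each expert: w_j^T [x; 1]
    return gates @ expert_outs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_experts, dim = 3, 2
    x = rng.normal(size=dim)

    V = rng.normal(size=(n_experts, dim))        # softmax-gate parameters
    centers = rng.normal(size=(n_experts, dim))  # localized-gate centers
    sigmas = np.ones(n_experts)                  # localized-gate widths
    W = rng.normal(size=(n_experts, dim + 1))    # linear expert weights

    print("softmax gate:  ", np.round(softmax_gate(x, V), 3))
    print("localized gate:", np.round(localized_gate(x, centers, sigmas), 3))
    print("MOE output (localized gating):", moe_output(x, localized_gate(x, centers, sigmas), W))

Because the localized gate assigns each expert a bounded region of influence, adding or removing an expert changes the model only locally, which is what makes growing/shrinking the number of experts and tracking non-stationary data natural in this setting.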

Paper Details

Date Published: 25 March 1998
PDF: 12 pages
Proc. SPIE 3390, Applications and Science of Computational Intelligence, (25 March 1998); doi: 10.1117/12.304812
Author Affiliations:
Viswanath Ramamurti, SBC Technology Resources, Inc. (United States)
Joydeep Ghosh, Univ. of Texas/Austin (United States)


Published in SPIE Proceedings Vol. 3390:
Applications and Science of Computational Intelligence
Steven K. Rogers; David B. Fogel; James C. Bezdek; Bruno Bosacchi, Editor(s)
