
Proceedings Paper

Heuristic hyperparameter optimization for multilayer perceptron with one hidden layer
Author(s): Łukasz Neumann; Robert M. Nowak

Paper Abstract

One of the crucial steps in preparing a neural network model is tuning its hyperparameters. This process can be time-consuming and difficult to do properly by hand. Well-tuned hyperparameters yield high classification accuracy as well as fast training. In this paper we explore the use of selected heuristic algorithms based on the evolutionary approach: Covariance Matrix Adaptation Evolution Strategy (CMA-ES), Differential Evolution Strategy (DES) and jSO for the hyperparameter tuning task. Results of hyperparameter optimization of a Multilayer Perceptron (MLP) on a real-life dataset are presented. An improvement in model performance is observed when the presented approach is used.
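To illustrate the kind of tuning loop the abstract describes, the sketch below optimizes the hyperparameters of a one-hidden-layer MLP with CMA-ES. It is a minimal illustration only, assuming the third-party `cma` package and scikit-learn's `MLPClassifier`; the dataset, the hyperparameter encoding, and all search settings are assumptions made for demonstration and are not the setup used in the paper.

```python
# Illustrative sketch: CMA-ES tuning of an MLP with one hidden layer.
# Assumptions (not from the paper): the `cma` package, scikit-learn,
# a stand-in dataset, and a log10 encoding of three hyperparameters.
import cma
from sklearn.datasets import load_breast_cancer      # stand-in "real-life" dataset
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)

def decode(x):
    """Map a continuous CMA-ES genome (log10 scale) to MLP hyperparameters."""
    hidden_units = int(round(10 ** x[0]))             # 10 .. 1000 neurons
    learning_rate = 10 ** x[1]                        # 1e-4 .. 1e-1
    alpha = 10 ** x[2]                                # 1e-6 .. 1e-1 (L2 penalty)
    return hidden_units, learning_rate, alpha

def fitness(x):
    """Negative cross-validated accuracy, since CMA-ES minimizes."""
    hidden_units, learning_rate, alpha = decode(x)
    clf = MLPClassifier(hidden_layer_sizes=(hidden_units,),
                        learning_rate_init=learning_rate,
                        alpha=alpha,
                        max_iter=300)
    return -cross_val_score(clf, X, y, cv=3).mean()

# Search in log10 space: [log10(hidden units), log10(learning rate), log10(alpha)]
x0 = [1.5, -2.0, -3.0]
es = cma.CMAEvolutionStrategy(
    x0, 0.5,
    {'bounds': [[1.0, -4.0, -6.0], [3.0, -1.0, -1.0]],
     'popsize': 8, 'maxiter': 20})

while not es.stop():
    candidates = es.ask()                             # sample a new population
    es.tell(candidates, [fitness(c) for c in candidates])

best_hidden, best_lr, best_alpha = decode(es.result.xbest)
print(f"hidden={best_hidden}, lr={best_lr:.2e}, alpha={best_alpha:.2e}, "
      f"cv accuracy={-es.result.fbest:.3f}")
```

DES or jSO could be substituted for CMA-ES in the same ask/evaluate/tell pattern; only the fitness function (cross-validated accuracy of the candidate MLP) is specific to the tuning task.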

Paper Details

Date Published: 1 October 2018
PDF: 8 pages
Proc. SPIE 10808, Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2018, 108082A (1 October 2018); doi: 10.1117/12.2501569
Author Affiliations:
Łukasz Neumann, Warsaw Univ. of Technology (Poland)
Robert M. Nowak, Warsaw Univ. of Technology (Poland)


Published in SPIE Proceedings Vol. 10808:
Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2018
Ryszard S. Romaniuk; Maciej Linczuk, Editor(s)

© SPIE.