Incremental Multiple Hidden Layers Regularized Extreme Learning Machine Based on Forced Positive-Definite Cholesky Factorization

Joint Authors

Le, Ba Tuan
Liu, Jingyi

Source

Mathematical Problems in Engineering

Issue

Vol. 2019, Issue 2019 (31 Dec. 2019), pp.1-15, 15 p.

Publisher

Hindawi Publishing Corporation

Publication Date

2019-04-24

Country of Publication

Egypt

No. of Pages

15

Main Subjects

Civil Engineering

Abstract EN

The theory and implementation of the extreme learning machine (ELM) show that it is a simple, efficient, and accurate machine learning method.

Compared with other single hidden layer feedforward neural network algorithms, ELM is characterized by simpler parameter selection rules, faster convergence speed, and less human intervention.

The multiple hidden layer regularized extreme learning machine (MRELM) inherits these advantages of ELM and has higher prediction accuracy.

In the MRELM model, the number of hidden layers is set arbitrarily in advance and then fixed, with no iterative tuning process.

However, the number of hidden layers is a key factor determining the generalization ability of MRELM.

Given this, it is clearly unreasonable to determine this number by trial and error or by arbitrary initialization.

In this paper, an incremental MRELM training algorithm (FC-IMRELM) based on forced positive-definite Cholesky factorization is put forward to solve the network structure design problem of MRELM.

First, an MRELM-based prediction model with one hidden layer is constructed, and then a new hidden layer is added to the prediction model in each training step until the generalization performance of the prediction model reaches its peak value.

Thus, the optimal network structure of the prediction model is determined.
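The layer-by-layer growth described above can be sketched as follows. This is a minimal numpy illustration under stated assumptions, not the paper's implementation: the sigmoid activation, layer width, validation-RMSE stopping rule, and helper names such as `train_incremental` are all assumptions; the output weights are computed via a regularized Cholesky solve in the spirit of the method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ridge_weights(H, T, C=1e3):
    # Regularized least-squares output weights: (H^T H + I/C) beta = H^T T.
    A = H.T @ H + np.eye(H.shape[1]) / C  # the I/C term keeps A positive-definite
    L = np.linalg.cholesky(A)             # Cholesky factorization A = L L^T
    return np.linalg.solve(L.T, np.linalg.solve(L, H.T @ T))

def train_incremental(X_tr, T_tr, X_val, T_val, n_hidden=20, max_layers=8,
                      C=1e3, seed=0):
    """Grow hidden layers one at a time; stop when validation RMSE worsens."""
    rng = np.random.default_rng(seed)
    H_tr, H_val = X_tr, X_val
    layers, best = [], None
    for _ in range(max_layers):
        # Random hidden-layer parameters, as in ELM-style networks.
        W = rng.standard_normal((H_tr.shape[1], n_hidden)) / np.sqrt(H_tr.shape[1])
        b = rng.standard_normal(n_hidden)
        H_tr_new = sigmoid(H_tr @ W + b)
        H_val_new = sigmoid(H_val @ W + b)  # same random params on validation data
        beta = ridge_weights(H_tr_new, T_tr, C)
        rmse = np.sqrt(np.mean((H_val_new @ beta - T_val) ** 2))
        if best is not None and rmse >= best["rmse"]:
            break                           # generalization peaked: stop adding layers
        layers.append((W, b))
        best = {"rmse": rmse, "layers": list(layers), "beta": beta}
        H_tr, H_val = H_tr_new, H_val_new
    return best

# Toy regression problem (synthetic data, for illustration only).
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 5))
T = np.sin(X.sum(axis=1, keepdims=True))
model = train_incremental(X[:150], T[:150], X[150:], T[150:])
```

The stopping rule mirrors the abstract: a new layer is kept only while it improves held-out performance, so the returned structure is the one at the generalization peak.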

In the training procedure, forced positive-definite Cholesky factorization is used to calculate the output weights of MRELM, which avoids computing the matrix inverse and the Moore-Penrose generalized inverse otherwise involved in training the hidden-layer parameters.

Therefore, the FC-IMRELM prediction model effectively reduces the computational cost incurred as the number of hidden layers grows.
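Concretely, for a regularized solution of the form β = (HᵀH + I/C)⁻¹HᵀT (standard regularized-ELM notation; the paper's exact factorization routine may differ), the coefficient matrix is symmetric positive-definite by construction, so one Cholesky factorization plus two triangular solves replaces any explicit inverse or pseudoinverse. A minimal numpy sketch, with H, T, and C as toy placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((50, 10))   # hidden-layer output matrix (toy data)
T = rng.standard_normal((50, 2))    # target matrix (toy data)
C = 100.0                           # regularization parameter

# The I/C term forces A to be symmetric positive-definite, so Cholesky cannot fail.
A = H.T @ H + np.eye(H.shape[1]) / C
L = np.linalg.cholesky(A)           # A = L @ L.T, with L lower-triangular

# Two triangular solves instead of inverting A or computing a pseudoinverse.
beta = np.linalg.solve(L.T, np.linalg.solve(L, H.T @ T))

# Agrees with the direct solution of A @ beta = H^T T, without forming A^{-1}.
beta_direct = np.linalg.solve(A, H.T @ T)
print(np.allclose(beta, beta_direct))  # True
```

Factoring once and back-substituting is both cheaper and numerically safer than forming an explicit inverse, which is the computational saving the abstract refers to.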

Experiments on classification and regression problems indicate that the algorithm can effectively determine the optimal network structure of MRELM, and that the prediction model trained by the algorithm performs excellently in both prediction accuracy and computational cost.

American Psychological Association (APA)

Liu, Jingyi & Le, Ba Tuan. 2019. Incremental Multiple Hidden Layers Regularized Extreme Learning Machine Based on Forced Positive-Definite Cholesky Factorization. Mathematical Problems in Engineering, Vol. 2019, no. 2019, pp. 1-15.
https://search.emarefa.net/detail/BIM-1196679

Modern Language Association (MLA)

Liu, Jingyi & Le, Ba Tuan. Incremental Multiple Hidden Layers Regularized Extreme Learning Machine Based on Forced Positive-Definite Cholesky Factorization. Mathematical Problems in Engineering, No. 2019 (2019), pp. 1-15.
https://search.emarefa.net/detail/BIM-1196679

American Medical Association (AMA)

Liu, Jingyi & Le, Ba Tuan. Incremental Multiple Hidden Layers Regularized Extreme Learning Machine Based on Forced Positive-Definite Cholesky Factorization. Mathematical Problems in Engineering. 2019. Vol. 2019, no. 2019, pp. 1-15.
https://search.emarefa.net/detail/BIM-1196679

Data Type

Journal Articles

Language

English

Notes

Includes bibliographical references

Record ID

BIM-1196679