- Title:
- Convergence Analysis of An Improved Extreme Learning Machine Based on Gradient Descent Method
- Authors:
-
Yusong, L.
Zhixun, S.
Bingjie, Y.
Xiaoling, G.
Zhaoyang, S.
- Relations:
- https://bibliotekanauki.pl/articles/972914.pdf
- Publication date:
- 2016
- Publisher:
- Społeczna Akademia Nauk w Łodzi
- Subjects:
-
neural networks
monotonicity
weak convergence
strong convergence
USUA
MNIST
- Description:
- Extreme learning machine (ELM) is an efficient algorithm, but it requires more hidden nodes than BP algorithms to reach comparable performance. Recently, an efficient learning algorithm, the upper-layer-solution-unaware algorithm (USUA), was proposed for the single-hidden-layer feed-forward neural network; it needs fewer hidden nodes and less testing time than ELM. In this paper we give a theoretical analysis of USUA. The results show that the error function decreases monotonically during training, that the gradient of the error function with respect to the weights tends to zero (weak convergence), and that the weight sequence converges to a fixed point (strong convergence) as the number of iterations approaches infinity. A simulation on the MNIST database of handwritten digits verifies the theoretical results. (A minimal sketch of such a training loop appears after this record.)
- Source:
-
Journal of Applied Computer Science Methods; 2016, Vol. 8, No. 1; pp. 5-15
1689-9636
- Appears in:
- Journal of Applied Computer Science Methods
- Content provider:
- Biblioteka Nauki
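- Illustrative sketch:
- The abstract describes USUA as gradient-descent training of a single-hidden-layer feed-forward network, contrasted with ELM. The Python/NumPy sketch below is one plausible, hypothetical reading, not the paper's actual implementation: the output weights U are solved by least squares given the hidden outputs (as in ELM), and the hidden-layer weights W take a gradient step with U held fixed ("solution-unaware"). All data, dimensions, and the learning rate are illustrative assumptions; monitoring the loss and the gradient norm mirrors the monotonicity and weak-convergence claims.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy random data standing in for MNIST-style inputs; random data keeps the
# sketch self-contained and runnable (the paper's experiment uses MNIST).
n_samples, n_in, n_hidden, n_out = 200, 64, 30, 10
X = rng.standard_normal((n_samples, n_in))
T = rng.standard_normal((n_samples, n_out))  # targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W = 0.1 * rng.standard_normal((n_in, n_hidden))  # hidden-layer weights, trained by gradient descent
eta = 0.5                                        # illustrative learning rate (an assumption)

for it in range(101):
    H = sigmoid(X @ W)                        # hidden-layer outputs
    U = np.linalg.lstsq(H, T, rcond=None)[0]  # output weights by least squares, as in ELM
    E = H @ U - T
    loss = 0.5 * np.sum(E ** 2)
    # Gradient w.r.t. W with U held fixed ("upper-layer-solution-unaware"):
    dH = (E @ U.T) * H * (1.0 - H)            # backprop through the sigmoid
    grad = X.T @ dH / n_samples
    W -= eta * grad
    if it % 20 == 0:
        # A decreasing loss and a shrinking gradient norm correspond to the
        # monotonicity and weak-convergence claims (proved in the paper under
        # conditions on the learning rate; not guaranteed for this toy setup).
        print(f"iter {it:3d}  loss {loss:8.3f}  ||grad|| {np.linalg.norm(grad):.5f}")
```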