- Title:
- Effects of Sparse Initialization in Deep Belief Networks
- Authors:
-
Grzegorczyk, K.
Kurdziel, M.
Wójcik, P. I.
- Relations:
- https://bibliotekanauki.pl/articles/305264.pdf
- Publication date:
- 2015
- Publisher:
- Akademia Górniczo-Hutnicza im. Stanisława Staszica w Krakowie. Wydawnictwo AGH
- Subjects:
-
sparse initialization
Deep Belief Networks
Noisy Rectified Linear Units
- Description:
- Deep neural networks are often trained in two phases: first, the hidden layers are pretrained in an unsupervised manner, and then the network is fine-tuned with error backpropagation. Pretraining is often carried out using Deep Belief Networks (DBNs), with initial weights set to small random values. However, recent results have established that well-designed initialization schemes, e.g., Sparse Initialization (SI), can greatly improve the performance of networks that do not use pretraining. An interesting question arising from these results is whether such initialization techniques could also improve pretrained networks. To shed light on this question, in this work we evaluate SI in DBNs that are used to pretrain discriminative networks. The motivation behind this research is our observation that SI affects the features learned by a DBN during pretraining. Our results demonstrate that this improves network performance: when pretraining starts from sparsely initialized weight matrices, networks achieve lower classification errors after fine-tuning.
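- Note: The Sparse Initialization referred to in the abstract is commonly attributed to Martens (2010), where each hidden unit receives a small fixed number of nonzero incoming weights drawn from a Gaussian, with all remaining weights set to zero. The sketch below illustrates that scheme under this assumption; the function name, parameter names, and default values (e.g., 15 nonzero connections per unit) are illustrative choices, not taken from the paper.

```python
import numpy as np

def sparse_init(n_in, n_out, num_nonzero=15, scale=1.0, rng=None):
    """Sparse initialization in the style of Martens (2010):
    each of the n_out units gets exactly `num_nonzero` nonzero
    incoming weights drawn from N(0, scale^2); the rest stay zero.
    All names and defaults here are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    W = np.zeros((n_in, n_out))
    k = min(num_nonzero, n_in)
    for j in range(n_out):
        # Pick k distinct input connections for unit j and give
        # them Gaussian weights; all other entries remain zero.
        idx = rng.choice(n_in, size=k, replace=False)
        W[idx, j] = rng.normal(0.0, scale, size=k)
    return W

# Example: a sparsely initialized 784 -> 500 weight matrix, as might
# be used for the first RBM layer when pretraining a DBN on MNIST.
W = sparse_init(784, 500)
```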
- Source:
-
Computer Science; 2015, 16 (4); 313-327
1508-2806
2300-7036
- Appears in:
- Computer Science
- Content provider:
- Biblioteka Nauki