This paper proposes a new computational method for potential learning to
improve generalization and interpretation. Potential learning has been proposed to simplify
the computational procedures of information maximization and to specify which
neurons should be fired. However, potential learning often
absorbs too much information on the input patterns in the early stage of learning,
which tends to degrade generalization performance. This can be solved by making potential
learning as slow as possible. Accordingly, we propose a procedure called
“self-assimilation,” in which connection weights are accentuated according to their
characteristics observed at a specific learning step. This makes it possible to predict future connection
weights in the early stage of learning. Thus, it is possible to improve generalization by
slow learning and at the same time to improve the interpretation of connection weights
via the enhanced characteristics of the connection weights. The method was applied to
an artificial data set, as well as a real data set of counter services at a local government
office in the Tokyo metropolitan area. The results show that making learning
as slow as possible improved generalization. In addition, self-assimilation reduced the number of
strong connection weights, making them easier to interpret.
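The accentuation step can be illustrated with a minimal sketch. The abstract does not specify the accentuation rule, so the function below is a hypothetical reading: each weight's magnitude is raised to an assumed exponent `r` and then rescaled, so that relatively strong weights are preserved while weak ones are pushed toward zero, leaving fewer strong connection weights to interpret.

```python
import numpy as np

def self_assimilate(w, r=2.0):
    """Hypothetical sketch of weight accentuation.

    w : array of connection weights
    r : assumed accentuation exponent (not from the source);
        r > 1 sharpens the weight distribution.
    """
    mag = np.abs(w)
    # Raise magnitudes to the power r, keeping each weight's sign.
    acc = np.sign(w) * mag ** r
    # Rescale so the largest magnitude is unchanged; only the
    # relative sizes of the weaker weights shrink.
    return acc * (mag.max() / np.abs(acc).max())

w = np.array([0.9, 0.5, 0.1, -0.8, -0.05])
w_acc = self_assimilate(w, r=2.0)

# Count "strong" weights before and after (0.3 is an arbitrary
# threshold for illustration only).
strong_before = int(np.sum(np.abs(w) > 0.3))
strong_after = int(np.sum(np.abs(w_acc) > 0.3))
```

Here `strong_before` is 3 and `strong_after` is 2: accentuation concentrates the representation in fewer strong weights, which is the interpretability effect the abstract describes.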