Efficient Construction and Training of Multilayer Perceptrons by Incremental Pattern Selection


The Transactions of the Korea Information Processing Society (1994 ~ 2000), Vol. 3, No. 3, pp. 429-438, Apr. 1996
DOI: 10.3745/KIPSTE.1996.3.3.429

Abstract

An incremental learning algorithm is presented that constructs a multilayer perceptron whose size is optimal for solving a given problem. Unlike conventional algorithms, in which a fixed-size training set is processed repeatedly, the method uses an increasing number of critical examples to find a necessary and sufficient number of hidden units for learning the entire data. Experimental results in hand-written digit recognition show that network size optimization combined with incremental pattern selection generalizes significantly better and converges faster than conventional methods.
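To illustrate the idea sketched in the abstract, the following is a minimal, hypothetical Python sketch of incremental pattern selection combined with network growing: the network is trained only on a small subset of "critical" patterns, hidden units are added when the current network cannot fit that subset, and the subset is then extended with the remaining patterns that produce the largest error. All class, function, and parameter names are illustrative assumptions, not the paper's actual algorithm or code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GrowingMLP:
    """Two-layer perceptron whose hidden layer can be grown on demand."""
    def __init__(self, n_in, n_out, n_hidden=2, lr=0.1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.lr = lr
        self.W1 = self.rng.normal(0, 0.5, (n_in, n_hidden))
        self.W2 = self.rng.normal(0, 0.5, (n_hidden, n_out))

    def forward(self, X):
        self.h = sigmoid(X @ self.W1)
        return sigmoid(self.h @ self.W2)

    def train_epoch(self, X, Y):
        # one epoch of plain batch backpropagation on the selected subset
        out = self.forward(X)
        d2 = (out - Y) * out * (1 - out)
        d1 = (d2 @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ d2
        self.W1 -= self.lr * X.T @ d1
        return np.mean((out - Y) ** 2)

    def add_hidden_unit(self):
        # grow the hidden layer by one randomly initialized unit
        self.W1 = np.hstack([self.W1, self.rng.normal(0, 0.5, (self.W1.shape[0], 1))])
        self.W2 = np.vstack([self.W2, self.rng.normal(0, 0.5, (1, self.W2.shape[1]))])

def incremental_training(X, Y, n_start=10, n_add=10, tol=0.05, epochs=200):
    """Alternate between training on a growing subset of critical patterns
    and growing the hidden layer when that subset cannot be learned."""
    net = GrowingMLP(X.shape[1], Y.shape[1])
    idx = list(range(n_start))                  # initial training subset
    while True:
        for _ in range(epochs):
            err = net.train_epoch(X[idx], Y[idx])
        if err > tol:
            net.add_hidden_unit()               # subset not learnable: grow the net
            continue
        rest = [i for i in range(len(X)) if i not in idx]
        if not rest:
            return net                          # all patterns learned
        # add the unseen patterns with the largest error ("critical" examples)
        errs = np.sum((net.forward(X[rest]) - Y[rest]) ** 2, axis=1)
        idx += [rest[i] for i in np.argsort(errs)[::-1][:n_add]]
```

Because backpropagation runs only on the currently selected subset, each training sweep is cheaper than processing the full, fixed training set, which is the source of the faster convergence claimed in the abstract.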



Cite this article
[IEEE Style]
B. T. Zhang, "Efficient Construction and Training of Multilayer Perceptrons by Incremental Pattern Selection," The Transactions of the Korea Information Processing Society (1994 ~ 2000), vol. 3, no. 3, pp. 429-438, 1996. DOI: 10.3745/KIPSTE.1996.3.3.429.

[ACM Style]
Byoung-Tak Zhang. 1996. Efficient Construction and Training of Multilayer Perceptrons by Incremental Pattern Selection. The Transactions of the Korea Information Processing Society (1994 ~ 2000), 3, 3, (1996), 429-438. DOI: 10.3745/KIPSTE.1996.3.3.429.