Please use this identifier to cite or link to this item: https://hdl.handle.net/10316/27727
Title: Eigenvalue decay: a new method for neural network regularization
Authors: Ludwig, Oswaldo 
Nunes, Urbano 
Araujo, Rui 
Keywords: Transduction; Regularization; Genetic algorithm; Classification margin; Neural network
Issue Date: 26-Jan-2014
Publisher: Elsevier
Citation: LUDWIG, Oswaldo; NUNES, Urbano; ARAUJO, Rui - Eigenvalue decay: a new method for neural network regularization. "Neurocomputing". ISSN 0925-2312. Vol. 124 (2014) p. 33–42
Serial title, monograph or event: Neurocomputing
Volume: 124
Abstract: This paper proposes two new training algorithms for multilayer perceptrons based on evolutionary computation, regularization, and transduction. Regularization is a commonly used technique for preventing the learning algorithm from overfitting the training data. In this context, this work introduces and analyzes a novel regularization scheme for neural networks (NNs), named eigenvalue decay, which aims at improving the classification margin. The introduction of eigenvalue decay led to the development of a new training method based on the same principles as the SVM, hence named Support Vector NN (SVNN). Finally, by analogy with the transductive SVM (TSVM), a transductive NN (TNN) is proposed, which exploits the SVNN to address transductive learning. The effectiveness of the proposed algorithms is evaluated on seven benchmark datasets.
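Note: the exact eigenvalue-decay regularizer is defined in the full text (see the DOI below); the Python sketch that follows is only a rough illustration of an eigenvalue-based weight penalty. It assumes, as a working hypothesis, that a multiple of the dominant eigenvalue of W·Wᵀ for a hidden-layer weight matrix W is added to the training loss, with the eigenvalue estimated by power iteration. The function names and the coefficient kappa are illustrative and are not taken from the paper.

    # Hypothetical sketch of an eigenvalue-based weight penalty (not the paper's
    # exact formulation): penalize the dominant eigenvalue of W @ W.T for a
    # hidden-layer weight matrix W, estimated with power iteration.
    import numpy as np

    def dominant_eigenvalue(W, n_iter=50):
        """Estimate the largest eigenvalue of W @ W.T by power iteration."""
        A = W @ W.T
        v = np.random.default_rng(0).normal(size=A.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            v = A @ v
            v /= np.linalg.norm(v)
        return float(v @ A @ v)  # Rayleigh quotient at the converged vector

    def regularized_loss(data_loss, W_hidden, kappa=1e-3):
        """Add the eigenvalue penalty, weighted by kappa, to an ordinary data loss."""
        return data_loss + kappa * dominant_eigenvalue(W_hidden)

    # Usage: W_hidden is a (hidden_units x inputs) weight matrix of an MLP.
    W_hidden = np.random.default_rng(1).normal(size=(10, 4))
    print(regularized_loss(data_loss=0.42, W_hidden=W_hidden))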
URI: https://hdl.handle.net/10316/27727
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2013.08.005
Rights: openAccess
Appears in Collections: I&D ISR - Artigos em Revistas Internacionais

Files in This Item:
File: Eigenvalue decay.pdf (718.75 kB, Adobe PDF)

SCOPUS citations: 66 (checked on May 6, 2024)
Web of Science citations: 57 (checked on May 2, 2024)
Page view(s): 412 (checked on May 7, 2024)
Download(s): 1,216 (checked on May 7, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.