An advanced pruning method in the architecture of extreme learning machines using L1-regularization and bootstrapping.

dc.contributor.author: Souza, Paulo Vitor de Campos
dc.contributor.author: Torres, Luiz Carlos Bambirra
dc.contributor.author: Silva, Gustavo Rodrigues Lacerda
dc.contributor.author: Braga, Antônio de Pádua
dc.contributor.author: Lughofer, Edwin
dc.date.accessioned: 2022-09-15T17:51:48Z
dc.date.available: 2022-09-15T17:51:48Z
dc.date.issued: 2020
dc.description.abstract: Extreme learning machines (ELMs) are efficient for classification, regression, and time series prediction, as well as being a clear alternative to backpropagation for determining the weights of the intermediate layers of the learning model. One problem an ELM may face is an excessive number of neurons in the hidden layer, which makes the model overly specialized to a specific data set. With a large number of neurons in the hidden layer, overfitting is more likely and thus unnecessary information can deteriorate the performance of the neural network. To solve this problem, a pruning method is proposed, called Pruning ELM Using Bootstrapped Lasso (BR-ELM), which is based on regularization and resampling techniques to select the most representative neurons for the model response. This method is based on an ensembled variant of Lasso (achieved through bootstrap replications) and aims to shrink as many of the neurons' output weight parameters to 0 as much as possible. From the subset of candidate regressors with significant coefficient values (greater than 0), it is possible to select the best neurons in the hidden layer of the ELM. Finally, pattern classification tests and benchmark regression tests on complex real-world problems are performed, comparing the proposed approach to other pruning models for ELMs. Statistically, BR-ELM outperforms several related state-of-the-art methods in terms of classification accuracy and model error (while performing on par with Pruning-ELM, P-ELM), and does so with a significantly reduced number of finally selected neurons.
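The abstract outlines the core of the method: build an ELM with randomly fixed hidden-layer weights, run Lasso over bootstrap resamples of the hidden-layer outputs, and keep only the neurons whose coefficients are consistently non-zero before refitting the output weights. Below is a minimal Python sketch of that idea, not the paper's implementation; it assumes scikit-learn's Lasso, a sigmoid hidden layer, and illustrative parameter values (n_hidden, n_boot, alpha, keep_freq are all hypothetical choices).

import numpy as np
from sklearn.linear_model import Lasso

def elm_hidden(X, W, b):
    # Sigmoid activations of the random (fixed) hidden layer.
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def bolasso_prune_elm(X, y, n_hidden=100, n_boot=32, alpha=0.01,
                      keep_freq=0.9, seed=None):
    # Sketch of bootstrapped-Lasso pruning for an ELM (illustrative only):
    # keep neurons whose Lasso coefficient is non-zero in at least
    # keep_freq of the bootstrap replications, then refit output weights
    # by least squares on the surviving neurons.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_hidden))   # random input weights, never trained
    b = rng.standard_normal(n_hidden)        # random biases
    H = elm_hidden(X, W, b)                  # hidden-layer design matrix

    selected = np.zeros(n_hidden)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)     # bootstrap resample of the data
        lasso = Lasso(alpha=alpha, max_iter=10000)
        lasso.fit(H[idx], y[idx])
        selected += (np.abs(lasso.coef_) > 0).astype(float)

    keep = selected / n_boot >= keep_freq    # consensus over replications
    beta = np.linalg.pinv(H[:, keep]) @ y    # ELM output weights on the pruned layer
    return W[:, keep], b[keep], beta

# Usage on synthetic regression data:
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
W, b, beta = bolasso_prune_elm(X, y, n_hidden=80, seed=0)
y_hat = elm_hidden(X, W, b) @ beta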
dc.identifier.citation: SOUZA, P. V. de C. et al. An advanced pruning method in the architecture of extreme learning machines using L1-regularization and bootstrapping. Electronics, v. 9, n. 5, 2020. Available at: <https://www.mdpi.com/2079-9292/9/5/811>. Accessed on: 29 Apr. 2022.
dc.identifier.doi: https://doi.org/10.3390/electronics9050811
dc.identifier.issn: 2079-9292
dc.identifier.uri: http://www.repositorio.ufop.br/jspui/handle/123456789/15294
dc.language.iso: en_US
dc.rights: open
dc.rights.license: This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). Source: the article's PDF.
dc.subject: Lasso with bootstrapping
dc.subject: Pruning of neurons
dc.subject: Least angle regression
dc.title: An advanced pruning method in the architecture of extreme learning machines using L1-regularization and bootstrapping.
dc.type: Article published in a journal
Files
Original Bundle
Name: ARTIGO_AdvancedPruningMethod.pdf
Size: 1.68 MB
Format: Adobe Portable Document Format
License Bundle
Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission