**Title:** Meta-learning to optimize the number of hidden nodes of MLP networks trained by Extreme Learning Machine algorithm

**Authors:** Lucas, Tarcísio; Prudêncio, Ricardo; Ludermir, Teresa; Soares, Carlos

**Abstract:** Optimizing Artificial Neural Networks (ANNs) is an important task for the success of these models in real-world applications. The solutions commonly adopted for this task are expensive, involving trial-and-error procedures or expert knowledge that is not always available. In this work, we investigate the use of meta-learning to choose the number of hidden nodes in MLP networks trained by the Extreme Learning Machine algorithm. Meta-learning is a research field that aims to automatically acquire knowledge relating features of learning problems to the performance of learning algorithms. Meta-learning techniques were originally proposed and evaluated for the algorithm selection problem. In recent years, meta-learning has been investigated for parameter optimization, specifically for Support Vector Machines. However, meta-learning can be adopted as a more general strategy to optimize ANN parameters, which motivates new efforts in this research direction. Meta-learning treats parameter selection as just another supervised learning task. In our work, we generated a base of meta-examples associated with 93 regression problems. Each meta-example was generated from a regression problem and stores 16 features describing the problem (e.g., number of attributes and correlation among the problem's attributes) together with the best number of hidden nodes for that problem, empirically chosen from a range of possible values. This set of meta-examples was given as input to a meta-learner, which was then able to predict the best number of nodes for new problems based on their features. The experiments performed in this case study revealed satisfactory results.
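The pipeline the abstract describes can be sketched in a few lines: compute descriptive meta-features for each regression problem, pair them with the empirically best node count to form a meta-example, and let a meta-learner predict the node count for a new problem from its features. The sketch below is a minimal illustration with only two hypothetical meta-features (the paper uses 16, over 93 problems) and a simple k-nearest-neighbor meta-learner; all names and the choice of k-NN are assumptions for illustration, not the paper's actual setup.

```python
import math

def meta_features(X, y):
    """Describe a regression problem with two illustrative meta-features:
    the number of attributes and the mean absolute Pearson correlation
    of each attribute with the target (the paper uses 16 features)."""
    n_attrs = len(X[0])

    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (z - mb) for x, z in zip(a, b))
        sa = math.sqrt(sum((x - ma) ** 2 for x in a))
        sb = math.sqrt(sum((z - mb) ** 2 for z in b))
        return cov / (sa * sb) if sa and sb else 0.0

    mean_corr = sum(abs(corr([row[j] for row in X], y))
                    for j in range(n_attrs)) / n_attrs
    return (n_attrs, mean_corr)

def predict_best_nodes(meta_base, features, k=3):
    """k-NN meta-learner (an assumed choice): average the best node
    counts of the k problems whose meta-features are most similar."""
    nearest = sorted(meta_base, key=lambda ex: math.dist(ex[0], features))[:k]
    return round(sum(nodes for _, nodes in nearest) / k)

# Toy meta-example base: (meta-features, empirically best node count).
meta_base = [((5, 0.30), 10), ((20, 0.80), 40), ((6, 0.35), 12)]

# Predict a node count for a new problem from its meta-features alone.
print(predict_best_nodes(meta_base, (5, 0.32), k=2))
```

In the paper's setting, each entry of the meta-example base would come from actually training ELM networks over a range of node counts on one of the 93 problems and recording the best-performing count; the meta-learner then replaces that expensive search for unseen problems.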

**Keywords:** Meta-learning; parameter selection; neural network optimization; Multilayer Perceptron; Extreme Learning Machine

**Pages:** 8

**DOI:** 10.21528/CBIC2011-32.6

**Article PDF:** st_32.6.pdf

**BibTeX file:** st_32.6.bib