Abstract:
|
Feature selection techniques try to select the most suitable subset from a set of attributes, some of which may be irrelevant or too noisy. Since analyzing all possible subsets has exponential cost, the search is usually performed heuristically. One of these heuristics is Forward Selection, which starts with an empty subset and, at every step, adds the most salient element while keeping all previously selected elements in the current subset. Neural networks, in turn, are a useful tool for pattern recognition and function approximation. Many types of algorithms exist for supervised neural network training. Among them, growing algorithms start from a network without hidden neurons and add neurons one at a time until the network reaches a suitable performance. Most growing algorithms implicitly perform a Forward Selection search with the objective of finding a good subset of hidden neurons. This link between the two fields (feature selection and neuron selection) suggests using other search algorithms from the feature selection field to create new neural network training algorithms. This work proposes and analyzes several algorithms for the construction of neural networks based on existing feature selection methods and on the weight selection performed by SAOCIF (Sequential Approximation with Optimal Coefficients and Interacting Frequencies). SAOCIF has been shown to be a competitive growing scheme for the construction of neural networks. Experimental results show that, with the same number of neurons, a better approximation than that of SAOCIF is indeed achieved, though at the expense of a higher computational cost.