Abstract:
|
Feed-forward Neural Networks (FNNs) and Support Vector Machines
(SVMs) are two machine learning frameworks developed from very
different starting points. The solutions obtained by the two
frameworks may therefore be very different. In this work, a new
learning model for FNNs is proposed that, in the linearly separable
case, tends to obtain the same solution as SVMs. The key idea of the
model is a weighting of the sum-of-squares error function, inspired
by the AdaBoost algorithm. The model depends on a parameter that
controls the hardness of the margin, as in SVMs, so it can also be
used in the non-linearly separable case. In addition, it handles
multiclass and multilabel problems in a natural way (as FNNs usually
do), and it is not restricted to the use of kernel functions.
Finally, it is independent of the particular training algorithm
used. Both theoretical and experimental results are presented to
support these ideas.