Abstract:
|
Random sampling is an efficient method for dealing with constrained optimization problems. In computational geometry, this method has been successfully applied, through Clarkson's algorithm (Clarkson 1996), to solve a general class of problems called violator spaces. In machine learning, the Transductive Support Vector Machine (TSVM) is a learning method used when only a small fraction of the data is labeled, and it requires solving a nonconvex optimization problem.
Several approximation methods have been proposed to solve this problem, but they usually find suboptimal solutions. A globally optimal solution can be obtained with exact techniques, but at the cost of a time complexity that is exponential in the number of instances. In this article, an interpretation of TSVM in terms of violator spaces is given. A randomized method is presented that extends the applicability of exact methods, reducing the time complexity from exponential in the number of instances to exponential in the number of support vectors of the optimal solution.