Support Vector Machines (SVMs) are today widely used in Machine Learning. Originally designed for classification problems, their success relies on two main ingredients: linear classification, which leads to simple constrained quadratic optimization problems that can be solved by standard, powerful algorithms; and the "kernel trick", which allows the programmer to solve the quadratic problem in a very high-dimensional feature space while computing only in the low-dimensional original space. Beyond classification, several other Machine Learning problems can be addressed by imposing other constraints on the resulting quadratic problem, including rank-regression. Surrogate models are mandatory for any optimization method tackling expensive objective functions, all the more so for Evolutionary Algorithms, which are well known for their appetite for CPU cycles. Regression-SVMs are one of the many methods that have been used to build approximations of the objective function in the evolutionary framework. However, the flexibility of SVMs in addressing the other ML paradigms mentioned above can lead to original surrogate models. Along these lines, in the framework of continuous evolutionary optimization, this talk presents a rank-SVM-based surrogate model leading to a comparison-based surrogate-assisted CMA-ES.
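The "kernel trick" mentioned above can be illustrated with a minimal sketch (not part of the original abstract): for the degree-2 polynomial kernel k(x, z) = (x . z)^2 on 2-D inputs, the explicit feature map phi sends a point into a 3-D feature space, and the inner product computed there equals the kernel evaluated directly in the original 2-D space. The function names `phi` and `poly_kernel` are illustrative choices, not from the source.

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for a 2-D input:
    # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so that
    # phi(x) . phi(z) = (x . z)^2.
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def poly_kernel(x, z):
    # Same inner product, computed entirely in the original space.
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# Both routes give the same value (here: (3 + 1)^2 = 16),
# but the kernel never constructs the feature space explicitly.
assert np.isclose(np.dot(phi(x), phi(z)), poly_kernel(x, z))
```

With richer kernels (e.g. Gaussian) the implicit feature space is infinite-dimensional, which is why the trick matters: the quadratic SVM problem only ever touches kernel values, never the features themselves.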
Stochastic continuous optimization, Surrogate models, Rank Support Vector Machines
Loshchilov, I., Schoenauer, M., and Sebag, M. (2012).
Self-Adaptive Surrogate-Assisted Covariance Matrix Adaptation Evolution Strategy.
In Proc. ACM-GECCO.