
MixSVM: Mixed-Precision Support Vector Machine Classification on FPGAs


Support vector machines (SVMs) are supervised machine learning methods for constructing maximum-margin classifiers. SVMs have proven highly effective in a wide range of applications.
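For context, classifying an input x with a trained SVM amounts to evaluating the kernelized decision function f(x) = sign(Σᵢ αᵢ yᵢ K(sᵢ, x) + b) over the stored support vectors sᵢ. A minimal reference sketch in Python (not part of the project; the RBF kernel and all parameter values are illustrative):

```python
import numpy as np

def rbf_kernel(s, x, gamma=0.5):
    """Gaussian (RBF) kernel between a support vector s and an input x."""
    return np.exp(-gamma * np.sum((s - x) ** 2))

def svm_classify(x, support_vectors, alphas, labels, bias, kernel=rbf_kernel):
    """Evaluate f(x) = sign(sum_i alpha_i * y_i * K(s_i, x) + b),
    i.e. the SVM classification (not training) computation."""
    score = sum(a * y * kernel(s, x)
                for a, y, s in zip(alphas, labels, support_vectors))
    return 1 if score + bias >= 0 else -1
```

Note that classification only requires kernel evaluations against a fixed, known set of support vectors, which is what makes a hardware-specialized implementation attractive.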

The MixSVM project deals with the efficient implementation of the SVM classification process (in contrast to the SVM training phase) on FPGAs. A common approach is to quantize the support vectors to minimize implementation cost. Quantization is, however, a highly non-linear operation, which hinders the development of learning techniques that include hardware implementation cost in their objective function.
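To illustrate the cost/accuracy trade-off that quantization introduces, a uniform signed fixed-point quantizer for support-vector coefficients might look as follows (a hedged sketch; the function name and the max-magnitude scaling scheme are assumptions, not the project's method):

```python
import numpy as np

def quantize(values, bits):
    """Uniformly quantize an array to signed fixed point with `bits` bits.
    The scale is derived from the largest magnitude, so the quantization
    error grows with the dynamic range of the coefficients.
    Returns the dequantized values for easy error comparison."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(values)) / qmax
    return np.round(values / scale) * scale
```

Fewer bits mean cheaper arithmetic on the FPGA but larger deviation from the trained support vectors, which is why quantization interacts non-linearly with classification accuracy.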

In this project, the goal is to provide a near-optimal implementation of SVM classification for a wide range of SVM variants (number of support vectors, value range, kernel) with minimal support vector quantization. We believe this can be achieved by exploiting structure in the support vectors, by varying precision levels, and by the fact that in many situations the support vectors remain constant and are known up-front. Rather than limiting all support vectors to a single bitwidth, we group them into classes with similar precision requirements and provide specialized processing elements for each class. The result of the project will be a custom-tailored mixed-precision implementation of SVM classification for a given set of support vectors.
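The grouping idea can be sketched as a simple heuristic: determine, per support vector, the smallest bitwidth that meets an error budget under uniform fixed-point quantization, then partition the vectors by that bitwidth. All names and the error criterion below are hypothetical illustrations, not the project's actual algorithm:

```python
import numpy as np

def bits_needed(sv, max_err, max_bits=16):
    """Smallest signed fixed-point bitwidth that represents every
    coefficient of `sv` within `max_err` (scale fixed by the
    vector's largest magnitude)."""
    for bits in range(2, max_bits + 1):
        qmax = 2 ** (bits - 1) - 1
        scale = np.max(np.abs(sv)) / qmax
        err = np.max(np.abs(np.round(sv / scale) * scale - sv))
        if err <= max_err:
            return bits
    return max_bits

def group_by_precision(support_vectors, max_err=1e-2):
    """Partition support vectors into classes keyed by required bitwidth,
    so each class can be mapped to a specialized processing element."""
    groups = {}
    for i, sv in enumerate(support_vectors):
        groups.setdefault(bits_needed(sv, max_err), []).append(i)
    return groups
```

Each resulting class could then be served by a processing element sized to exactly that bitwidth, rather than over-provisioning every multiplier for the worst-case precision.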

  • Research Lab Computational Technologies and Applications
  • Research Group Theory and Applications of Algorithms
  • R&D projects, public and industry funding
  • 2009 - 2012
Contact us
Faculty of Computer Science
University of Vienna

Währinger Straße 29
A-1090 Vienna