
MixSVM: Mixed-Precision Support Vector Machine Classification on FPGAs


Further details
Type: R&D projects, public and industry funding
Duration: 2009 - 2012
Executive: Research Group Theory and Applications of Algorithms
Links: http://rlcta.univie.ac.at/mixsvm
Abstract

Support vector machines (SVMs) are supervised machine learning methods for the construction of maximum-margin classifiers. SVMs have proven to be highly effective in a wide range of applications.
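For reference, such a classifier evaluates, for an input x, the decision function f(x) = sign( sum_i alpha_i * y_i * K(s_i, x) + b ) over the support vectors s_i obtained during training. The following minimal Python sketch (with an RBF kernel and illustrative parameter names, not project code) shows this computation, which is the part that MixSVM maps to hardware:

    import numpy as np

    def rbf_kernel(s, x, gamma=0.5):
        # Gaussian (RBF) kernel K(s, x) = exp(-gamma * ||s - x||^2)
        return np.exp(-gamma * np.sum((s - x) ** 2))

    def svm_classify(x, support_vectors, dual_coefs, bias):
        # dual_coefs[i] stands for alpha_i * y_i of support vector i;
        # all values are assumed to come from a previously trained SVM model.
        score = sum(c * rbf_kernel(s, x) for s, c in zip(support_vectors, dual_coefs))
        return 1 if score + bias >= 0 else -1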

The MixSVM project deals with the efficient implementation of the SVM classification process (as opposed to the SVM training phase) on FPGAs. One approach is to quantize the support vectors in order to minimize implementation cost. Quantization, however, is a highly non-linear operation and hinders the development of learning techniques that incorporate hardware implementation cost into their objective function.
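To illustrate the point, the sketch below (with assumed fixed-point parameters, not the project's method) shows uniform quantization of a support vector: the rounding to a discrete grid makes the resulting classification score a step function of the chosen bitwidth, so it cannot simply be differentiated inside a training objective.

    import numpy as np

    def quantize(values, bits, value_range=1.0):
        # Uniform fixed-point quantization with 'bits' bits over [-value_range, value_range];
        # the rounding step is what makes the operation non-linear and non-differentiable.
        step = (2.0 * value_range) / (2 ** bits)
        return np.clip(np.round(values / step) * step, -value_range, value_range)

    # Example: the same support vector at 8 and at 4 bits
    sv = np.array([0.731, -0.205, 0.049])
    print(quantize(sv, 8), quantize(sv, 4))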

In this project, the goal is to provide a near-optimal implementation of SVM classification for a wide range of SVM variants (number of support vectors, value range, kernel) with minimal support vector quantization. We believe that this can be achieved by exploiting structure in the support vectors, varying precision levels, and the fact that in many situations the support vectors remain constant and are known up-front. Rather than limiting the support vectors to a single bitwidth, we group them into classes with similar requirements and provide specialized processing elements. The result of the project will be a custom-tailored mixed-precision implementation of SVM classification, given a set of support vectors.
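A hypothetical software sketch of the grouping step might read as follows; the error criterion, bitwidth range, and function names are assumptions chosen for illustration, and in the actual design each resulting class would be served by its own specialized processing element:

    import numpy as np

    def required_bits(vector, max_error, value_range=1.0, max_bits=16):
        # Smallest bitwidth at which uniform quantization of this support vector
        # stays within the given absolute error bound.
        for bits in range(2, max_bits + 1):
            step = (2.0 * value_range) / (2 ** bits)
            quantized = np.clip(np.round(vector / step) * step, -value_range, value_range)
            if np.max(np.abs(quantized - vector)) <= max_error:
                return bits
        return max_bits

    def group_by_precision(support_vectors, max_error=1e-2):
        # Map each bitwidth class to the indices of the support vectors it covers.
        groups = {}
        for i, sv in enumerate(support_vectors):
            groups.setdefault(required_bits(np.asarray(sv), max_error), []).append(i)
        return groups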

Associates
  • Research Lab Computational Technologies and Applications

Theory and Applications of Algorithms

Faculty of Computer Science
University of Vienna
Währinger Straße 29
A-1090 Vienna

F +43-1-4277-9 783
