Libsvm is a simple, easy-to-use, and efficient software for SVM
classification and regression. It solves C-SVM classification, nu-SVM
classification, one-class-SVM, epsilon-SVM regression, and nu-SVM
regression. It also provides an automatic model selection tool for
C-SVM classification. This document explains the use of libsvm.

Libsvm is available at
http://www.csie.ntu.edu.tw/~cjlin/libsvm
Please read the COPYRIGHT file before using libsvm.

Table of Contents
=================

- Quick Start
- Installation and Data Format
- `svm-train' Usage
- `svm-predict' Usage
- `svm-scale' Usage
- Tips on Practical Use
- Examples
- Precomputed Kernels
- Library Usage
- Java Version
- Building Windows Binaries
- Additional Tools: Sub-sampling, Parameter Selection, Format checking, etc.
- MATLAB/OCTAVE Interface
- Python Interface
- Additional Information

Quick Start
===========

If you are new to SVM and your data set is not large, please go to the
`tools' directory and use easy.py after installation. It does
everything automatically -- from data scaling to parameter selection.

Usage: easy.py training_file [testing_file]

More information about parameter selection can be found in
`tools/README'.

Installation and Data Format
============================

On Unix systems, type `make' to build the `svm-train', `svm-predict',
and `svm-scale' programs. Run them without arguments to show their
usage.

On other systems, consult `Makefile' to build them (e.g., see
'Building Windows binaries' in this file) or use the pre-built
binaries (Windows binaries are in the directory `windows').

The format of training and testing data files is:

<label> <index1>:<value1> <index2>:<value2> ...
.
.
.

Each line contains an instance and is ended by a '\n' character. For
<label> in the training set, we have the following cases.

* For classification, <label> is an integer indicating the class label
  (multi-class classification is supported).

* For regression, <label> is the target value, which can be any real
  number.

* For one-class SVM, <label> is not used and can be any number.

In the test set, <label> is used only to calculate accuracy or
errors. If it's unknown, any number is fine. For one-class SVM, if
non-outliers/outliers are known, their labels in the test file must be
+1/-1 for evaluation.
The pair <index>:<value> gives a feature (attribute) value: <index> is
an integer starting from 1 and <value> is a real number. The only
exception is the precomputed kernel, where <index> starts from 0; see
the section on precomputed kernels. Indices must be in ASCENDING
order.

A sample classification data set included in this package is
`heart_scale'. To check whether your data are in the correct format, use
`tools/checkdata.py' (details in `tools/README').

Type `svm-train heart_scale', and the program will read the training
data and output the model file `heart_scale.model'. If you have a test
set called heart_scale.t, then type `svm-predict heart_scale.t
heart_scale.model output' to see the prediction accuracy. The `output'
file contains the predicted class labels.

For classification, if the training data are in only one class (i.e., all
labels are the same), then `svm-train' issues a warning message:
`Warning: training data in only one class. See README for details,'
which means the training data are very unbalanced. The label in the
training data is directly returned when testing.

There are some other useful programs in this package.

svm-scale:

	This is a tool for scaling input data files.

svm-toy:

	This is a simple graphical interface which shows how SVM
	separates data in a plane. You can click in the window to
	draw data points. Use the "change" button to choose class
	1, 2, or 3 (i.e., up to three classes are supported), the "load"
	button to load data from a file, the "save" button to save data to
	a file, the "run" button to obtain an SVM model, and the "clear"
	button to clear the window.

	You can enter options at the bottom of the window; the syntax of
	the options is the same as for `svm-train'.

	Note that "load" and "save" use a dense data format in both the
	classification and the regression cases. For classification,
	each data point has one label (the color), which must be 1, 2,
	or 3, and two attributes (x-axis and y-axis values) in
	[0,1). For regression, each data point has one target value
	(y-axis) and one attribute (x-axis value) in [0, 1).

	Type `make' in the respective directories to build them.

	You need the Qt library to build the Qt version.
	(available from http://www.trolltech.com)

	You need the GTK+ library to build the GTK version.
	(available from http://www.gtk.org)

	The pre-built Windows binaries are in the `windows'
	directory. We use Visual C++ on a 64-bit machine.

`svm-train' Usage
=================

Usage: svm-train [options] training_set_file [model_file]
options:
-s svm_type : set type of SVM (default 0)
	0 -- C-SVC		(multi-class classification)
	1 -- nu-SVC		(multi-class classification)
	2 -- one-class SVM
	3 -- epsilon-SVR	(regression)
	4 -- nu-SVR		(regression)
-t kernel_type : set type of kernel function (default 2)
	0 -- linear: u'*v
	1 -- polynomial: (gamma*u'*v + coef0)^degree
	2 -- radial basis function: exp(-gamma*|u-v|^2)
	3 -- sigmoid: tanh(gamma*u'*v + coef0)
	4 -- precomputed kernel (kernel values in training_set_file)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/num_features)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in the loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 100)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
-b probability_estimates : whether to train an SVC or SVR model for probability estimates, 0 or 1 (default 0)
-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)
-v n : n-fold cross validation mode
-q : quiet mode (no outputs)

The -v option randomly splits the data into n parts and calculates cross
validation accuracy/mean squared error on them.

See the libsvm FAQ for the meaning of the outputs.

`svm-predict' Usage
===================

Usage: svm-predict [options] test_file model_file output_file
options:
-b probability_estimates : whether to predict probability estimates, 0 or 1 (default 0); for one-class SVM only 0 is supported

model_file is the model file generated by svm-train.
test_file is the test data you want to predict.
svm-predict will produce output in the output_file.

`svm-scale' Usage
=================

Usage: svm-scale [options] data_filename
options:
-l lower : x scaling lower limit (default -1)
-u upper : x scaling upper limit (default +1)
-y y_lower y_upper : y scaling limits (default: no y scaling)
-s save_filename : save scaling parameters to save_filename
-r restore_filename : restore scaling parameters from restore_filename

See 'Examples' in this file for examples.

Tips on Practical Use
=====================

* Scale your data. For example, scale each attribute to [0,1] or [-1,+1].
* For C-SVC, consider using the model selection tool in the tools directory.
* nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of training
  errors and support vectors.
* If data for classification are unbalanced (e.g., many positive and
  few negative instances), try different penalty parameters C via -wi (see
  examples below).
* Specify a larger cache size (i.e., a larger -m) for huge problems.

Examples
========

> svm-scale -l -1 -u 1 -s range train > train.scale
> svm-scale -r range test > test.scale

Scale each feature of the training data to be in [-1,1]. The scaling
factors are stored in the file range and then used for scaling the
test data.

> svm-train -s 0 -c 5 -t 2 -g 0.5 -e 0.1 data_file

Train a classifier with RBF kernel exp(-0.5|u-v|^2), C=5, and
stopping tolerance 0.1.

> svm-train -s 3 -p 0.1 -t 0 data_file

Solve SVM regression with linear kernel u'v and epsilon=0.1
in the loss function.

> svm-train -c 10 -w1 1 -w-2 5 -w4 2 data_file

Train a classifier with penalty 10 = 1 * 10 for class 1, penalty 50 =
5 * 10 for class -2, and penalty 20 = 2 * 10 for class 4.

> svm-train -s 0 -c 100 -g 0.1 -v 5 data_file

Do five-fold cross validation for the classifier using
the parameters C = 100 and gamma = 0.1.

> svm-train -s 0 -b 1 data_file
> svm-predict -b 1 test_file data_file.model output_file

Obtain a model with probability information and predict test data with
probability estimates.

Precomputed Kernels
===================

Users may precompute kernel values and input them as training and
testing files. Then libsvm does not need the original
training/testing sets.

Assume there are L training instances x1, ..., xL.
Let K(x, y) be the kernel value of two instances x and y. The input
formats are:

New training instance for xi:

<label> 0:i 1:K(xi,x1) ... L:K(xi,xL)

New testing instance for any x:

<label> 0:? 1:K(x,x1) ... L:K(x,xL)

That is, in the training file the first column must be the "ID" of
xi. In testing, ? can be any value.

All kernel values including ZEROs must be explicitly provided. Any
permutation or random subsets of the training/testing files are also
valid (see examples below).

Note: the format is slightly different from that of the precomputed
kernel package released in libsvmtools earlier.

Examples:

	Assume the original training data has three four-feature
	instances and the testing data has one instance:

	15  1:1 2:1 3:1 4:1
	45      2:3     4:3
	25          3:1

	15  1:1     3:1

	If the linear kernel is used, we have the following new
	training/testing sets:

	15  0:1 1:4 2:6  3:1
	45  0:2 1:6 2:18 3:0
	25  0:3 1:1 2:0  3:1

	15  0:? 1:2 2:0  3:1

	? can be any value.

	Any subset of the above training file is also valid. For example,

	25  0:3 1:1 2:0  3:1
	45  0:2 1:6 2:18 3:0

	implies that the kernel matrix is

		[K(2,2) K(2,3)] = [18 0]
		[K(3,2) K(3,3)] = [0  1]

Library Usage
=============

These functions and structures are declared in the header file
`svm.h'. You need to #include "svm.h" in your C/C++ source files and
link your program with `svm.cpp'. You can see `svm-train.c' and
`svm-predict.c' for examples showing how to use them. We define
LIBSVM_VERSION and declare `extern int libsvm_version;' in svm.h, so
you can check the version number.

Before you classify test data, you need to construct an SVM model
(`svm_model') using training data. A model can also be saved in
a file for later use. Once an SVM model is available, you can use it
to classify new data.

- Function: struct svm_model *svm_train(const struct svm_problem *prob,
				const struct svm_parameter *param);

    This function constructs and returns an SVM model according to
    the given training data and parameters.

    struct svm_problem describes the problem:

	struct svm_problem
	{
		int l;
		double *y;
		struct svm_node **x;
	};

    where `l' is the number of training instances, and `y' is an array
    containing their target values (integers in classification, real
    numbers in regression). `x' is an array of pointers, each of which
    points to a sparse representation (an array of svm_node) of one
    training vector.

    For example, if we have the following training data:

    LABEL    ATTR1    ATTR2    ATTR3    ATTR4    ATTR5
    -----    -----    -----    -----    -----    -----
      1        0      0.1      0.2       0        0
      2        0      0.1      0.3     -1.2       0
      1      0.4       0        0        0        0
      2        0      0.1       0       1.4      0.5
      3     -0.1     -0.2      0.1      1.1      0.1

    then the components of svm_problem are:

    l = 5

    y -> 1 2 1 2 3

    x -> [ ] -> (2,0.1) (3,0.2) (-1,?)
         [ ] -> (2,0.1) (3,0.3) (4,-1.2) (-1,?)
         [ ] -> (1,0.4) (-1,?)
         [ ] -> (2,0.1) (4,1.4) (5,0.5) (-1,?)
         [ ] -> (1,-0.1) (2,-0.2) (3,0.1) (4,1.1) (5,0.1) (-1,?)

    where (index,value) is stored in the structure `svm_node':

	struct svm_node
	{
		int index;
		double value;
	};

    index = -1 indicates the end of one vector. Note that indices must
    be in ASCENDING order.

    struct svm_parameter describes the parameters of an SVM model:

	struct svm_parameter
	{
		int svm_type;
		int kernel_type;
		int degree;	/* for poly */
		double gamma;	/* for poly/rbf/sigmoid */
		double coef0;	/* for poly/sigmoid */

		/* these are for training only */
		double cache_size;	/* in MB */
		double eps;		/* stopping criteria */
		double C;		/* for C_SVC, EPSILON_SVR, and NU_SVR */
		int nr_weight;		/* for C_SVC */
		int *weight_label;	/* for C_SVC */
		double *weight;		/* for C_SVC */
		double nu;	/* for NU_SVC, ONE_CLASS, and NU_SVR */
		double p;	/* for EPSILON_SVR */
		int shrinking;	/* use the shrinking heuristics */
		int probability; /* do probability estimates */
	};

    svm_type can be one of C_SVC, NU_SVC, ONE_CLASS, EPSILON_SVR, NU_SVR.

	C_SVC:		C-SVM classification
	NU_SVC:		nu-SVM classification
	ONE_CLASS:	one-class-SVM
	EPSILON_SVR:	epsilon-SVM regression
	NU_SVR:		nu-SVM regression

    kernel_type can be one of LINEAR, POLY, RBF, SIGMOID, PRECOMPUTED.

	LINEAR:		u'*v
	POLY:		(gamma*u'*v + coef0)^degree
	RBF:		exp(-gamma*|u-v|^2)
	SIGMOID:	tanh(gamma*u'*v + coef0)
	PRECOMPUTED:	kernel values in training_set_file

    cache_size is the size of the kernel cache, specified in megabytes.
    C is the cost of constraint violation.
    eps is the stopping criterion (we usually use 0.00001 in nu-SVC,
    0.001 in the others). nu is the parameter in nu-SVC, nu-SVR, and
    one-class-SVM. p is the epsilon in the epsilon-insensitive loss
    function of epsilon-SVM regression. shrinking = 1 means shrinking
    is conducted; = 0 otherwise. probability = 1 means a model with
    probability information is obtained; = 0 otherwise.

    nr_weight, weight_label, and weight are used to change the penalty
    for some classes (if the weight for a class is not changed, it is
    set to 1). This is useful for training a classifier on unbalanced
    input data or with asymmetric misclassification costs.

    nr_weight is the number of elements in the arrays weight_label and
    weight. Each weight[i] corresponds to weight_label[i], meaning that
    the penalty of class weight_label[i] is scaled by a factor of weight[i].

    If you do not want to change the penalty for any of the classes,
    just set nr_weight to 0.

    *NOTE* Because svm_model contains pointers to svm_problem, you
    cannot free the memory used by svm_problem if you are still using
    the svm_model produced by svm_train().

    *NOTE* To avoid passing wrong parameters, svm_check_parameter()
    should be called before svm_train().

    struct svm_model stores the model obtained from the training procedure.
    It is not recommended to directly access entries in this structure.
    Programmers should use the interface functions to get the values.

	struct svm_model
	{
		struct svm_parameter param;	/* parameter */
		int nr_class;		/* number of classes, = 2 in regression/one class svm */
		int l;			/* total #SV */
		struct svm_node **SV;	/* SVs (SV[l]) */
		double **sv_coef;	/* coefficients for SVs in decision functions (sv_coef[k-1][l]) */
		double *rho;		/* constants in decision functions (rho[k*(k-1)/2]) */
		double *probA;		/* pairwise probability information */
		double *probB;
		int *sv_indices;	/* sv_indices[0,...,nSV-1] are values in [1,...,num_training_data] to indicate SVs in the training set */

		/* for classification only */

		int *label;	/* label of each class (label[k]) */
		int *nSV;	/* number of SVs for each class (nSV[k]) */
				/* nSV[0] + nSV[1] + ... + nSV[k-1] = l */
		int free_sv;	/* 1 if svm_model is created by svm_load_model */
				/* 0 if svm_model is created by svm_train */
	};

    param describes the parameters used to obtain the model.

    nr_class is the number of classes. It is 2 for regression and one-class SVM.

    l is the number of support vectors. SV and sv_coef are the support
    vectors and the corresponding coefficients, respectively. Assume there are
    k classes. For data in class j, the corresponding sv_coef includes (k-1) y*alpha vectors,
    where the alpha's are solutions of the following two-class problems:
    1 vs j, 2 vs j, ..., j-1 vs j, j vs j+1, j vs j+2, ..., j vs k
    and y=1 for the first j-1 vectors, while y=-1 for the remaining k-j
    vectors. For example, if there are 4 classes, sv_coef and SV are like:

        +-+-+-+--------------------+
        |1|1|1|                    |
        |v|v|v|  SVs from class 1  |
        |2|3|4|                    |
        +-+-+-+--------------------+
        |1|2|2|                    |
        |v|v|v|  SVs from class 2  |
        |2|3|4|                    |
        +-+-+-+--------------------+
        |1|2|3|                    |
        |v|v|v|  SVs from class 3  |
        |3|3|4|                    |
        +-+-+-+--------------------+
        |1|2|3|                    |
        |v|v|v|  SVs from class 4  |
        |4|4|4|                    |
        +-+-+-+--------------------+

    See svm_train() for an example of assigning values to sv_coef.

    rho is the bias term (-b). probA and probB are parameters used in
    probability outputs. If there are k classes, there are k*(k-1)/2
    binary problems as well as rho, probA, and probB values. They are
    aligned in the order of the binary problems:
    1 vs 2, 1 vs 3, ..., 1 vs k, 2 vs 3, ..., 2 vs k, ..., k-1 vs k.

    sv_indices[0,...,nSV-1] are values in [1,...,num_training_data] that
    indicate the support vectors in the training set.

    label contains the labels in the training data.

    nSV is the number of support vectors in each class.

    free_sv is a flag used to determine whether the space of SV should
    be released in free_model_content(struct svm_model*) and
    free_and_destroy_model(struct svm_model**). If the model is
    generated by svm_train(), then SV points to data in svm_problem
    and should not be removed. For example, free_sv is 0 if svm_model
    is created by svm_train, but is 1 if created by svm_load_model.

- Function: double svm_predict(const struct svm_model *model,
			       const struct svm_node *x);

    This function does classification or regression on a test vector x
    given a model.

    For a classification model, the predicted class for x is returned.
    For a regression model, the function value of x calculated using
    the model is returned. For a one-class model, +1 or -1 is
    returned.

- Function: void svm_cross_validation(const struct svm_problem *prob,
	const struct svm_parameter *param, int nr_fold, double *target);

    This function conducts cross validation. The data are separated
    into nr_fold folds. Under the given parameters, each fold in turn
    is validated using the model obtained from training on the
    remaining folds. The predicted labels (of all of prob's instances)
    in the validation process are stored in the array called target.

    The format of prob is the same as that for svm_train().

- Function: int svm_get_svm_type(const struct svm_model *model);

    This function gives the svm_type of the model. Possible values of
    svm_type are defined in svm.h.

- Function: int svm_get_nr_class(const svm_model *model);

    For a classification model, this function gives the number of
    classes. For a regression or a one-class model, 2 is returned.

- Function: void svm_get_labels(const svm_model *model, int* label)

    For a classification model, this function outputs the names of the
    labels into an array called label. For regression and one-class
    models, label is unchanged.

- Function: void svm_get_sv_indices(const struct svm_model *model, int *sv_indices)

    This function outputs the indices of the support vectors into an array called sv_indices.
    The size of sv_indices is the number of support vectors, which can be obtained by calling svm_get_nr_sv.
    Each sv_indices[i] is in the range [1, ..., num_training_data].

- Function: int svm_get_nr_sv(const struct svm_model *model)

    This function gives the total number of support vectors.

- Function: double svm_get_svr_probability(const struct svm_model *model);

    For a regression model with probability information, this function
    outputs a value sigma > 0. For test data, we consider the
    probability model: target value = predicted value + z, where z
    follows the Laplace distribution e^(-|z|/sigma)/(2sigma).

    If the model is not an SVR model or does not contain the required
    information, 0 is returned.

- Function: double svm_predict_values(const svm_model *model,
				const svm_node *x, double* dec_values)

    This function gives decision values on a test vector x given a
    model, and returns the predicted label (classification) or
    the function value (regression).

    For a classification model with nr_class classes, this function
    gives nr_class*(nr_class-1)/2 decision values in the array
    dec_values, where nr_class can be obtained from the function
    svm_get_nr_class. The order is label[0] vs. label[1], ...,
    label[0] vs. label[nr_class-1], label[1] vs. label[2], ...,
    label[nr_class-2] vs. label[nr_class-1], where label can be
    obtained from the function svm_get_labels. The returned value is
    the predicted class for x. Note that when nr_class = 1, this
    function does not give any decision value.

    For a regression model, dec_values[0] and the returned value are
    both the function value of x calculated using the model. For a
    one-class model, dec_values[0] is the decision value of x, while
    the returned value is +1/-1.

- Function: double svm_predict_probability(const struct svm_model *model,
	    const struct svm_node *x, double* prob_estimates);

    This function does classification or regression on a test vector x
    given a model with probability information.

    For a classification model with probability information, this
    function gives nr_class probability estimates in the array
    prob_estimates. nr_class can be obtained from the function
    svm_get_nr_class. The class with the highest probability is
    returned. For regression/one-class SVM, the array prob_estimates
    is unchanged and the returned value is the same as that of
    svm_predict.

- Function: const char *svm_check_parameter(const struct svm_problem *prob,
					    const struct svm_parameter *param);

    This function checks whether the parameters are within the feasible
    range of the problem. It should be called before calling
    svm_train() and svm_cross_validation(). It returns NULL if the
    parameters are feasible; otherwise an error message is returned.

- Function: int svm_check_probability_model(const struct svm_model *model);

    This function checks whether the model contains the required
    information to do probability estimates. If so, it returns
    +1. Otherwise, 0 is returned. This function should be called
    before calling svm_get_svr_probability and
    svm_predict_probability.

- Function: int svm_save_model(const char *model_file_name,
			       const struct svm_model *model);

    This function saves a model to a file; it returns 0 on success, or
    -1 if an error occurs.

- Function: struct svm_model *svm_load_model(const char *model_file_name);

    This function returns a pointer to the model read from the file,
    or a null pointer if the model could not be loaded.

- Function: void svm_free_model_content(struct svm_model *model_ptr);

    This function frees the memory used by the entries in a model structure.

- Function: void svm_free_and_destroy_model(struct svm_model **model_ptr_ptr);

    This function frees the memory used by a model and destroys the model
    structure. It is equivalent to svm_destroy_model, which
    was deprecated after version 3.0.

- Function: void svm_destroy_param(struct svm_parameter *param);

    This function frees the memory used by a parameter set.

- Function: void svm_set_print_string_function(void (*print_func)(const char *));

    Users can specify their own output format via a function. Use
	svm_set_print_string_function(NULL);
    for default printing to stdout.

Java Version
============

The pre-compiled java class archive `libsvm.jar' and its source files are
in the java directory. To run the programs, use

java -classpath libsvm.jar svm_train <arguments>
java -classpath libsvm.jar svm_predict <arguments>
java -classpath libsvm.jar svm_toy
java -classpath libsvm.jar svm_scale <arguments>

Note that you need Java 1.5 (5.0) or above to run them.

You may need to add the Java runtime library (like classes.zip) to the classpath.
You may need to increase the maximum Java heap size.

Library usage is similar to the C version. These functions are available:

public class svm {
	public static final int LIBSVM_VERSION=324;
	public static svm_model svm_train(svm_problem prob, svm_parameter param);
	public static void svm_cross_validation(svm_problem prob, svm_parameter param, int nr_fold, double[] target);
	public static int svm_get_svm_type(svm_model model);
	public static int svm_get_nr_class(svm_model model);
	public static void svm_get_labels(svm_model model, int[] label);
	public static void svm_get_sv_indices(svm_model model, int[] indices);
	public static int svm_get_nr_sv(svm_model model);
	public static double svm_get_svr_probability(svm_model model);
	public static double svm_predict_values(svm_model model, svm_node[] x, double[] dec_values);
	public static double svm_predict(svm_model model, svm_node[] x);
	public static double svm_predict_probability(svm_model model, svm_node[] x, double[] prob_estimates);
	public static void svm_save_model(String model_file_name, svm_model model) throws IOException
	public static svm_model svm_load_model(String model_file_name) throws IOException
	public static String svm_check_parameter(svm_problem prob, svm_parameter param);
	public static int svm_check_probability_model(svm_model model);
	public static void svm_set_print_string_function(svm_print_interface print_func);
}

The library is in the "libsvm" package.
Note that in the Java version, svm_node[] is not terminated with a node
whose index = -1.

Users can specify their own output format by

	your_print_func = new svm_print_interface()
	{
		public void print(String s)
		{
			// your own format
		}
	};
	svm.svm_set_print_string_function(your_print_func);

Building Windows Binaries
=========================

Windows binaries are available in the directory `windows'. To re-build
them via Visual C++, use the following steps:

1. Open a DOS command box (or Visual Studio Command Prompt) and change
to the libsvm directory. If the environment variables of VC++ have not
been set, type

"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars64.bat"

You may have to modify the above command according to which version of
VC++ you have and where it is installed.

2. Type

nmake -f Makefile.win clean all

3. (optional) To build the shared library libsvm.dll, type

nmake -f Makefile.win lib

4. (optional) To build 32-bit windows binaries, you must
	(1) Run "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars32.bat" instead of vcvars64.bat
	(2) Change CFLAGS in Makefile.win: /D _WIN64 to /D _WIN32

Another way is to build them from the Visual C++ environment. See
details in the libsvm FAQ.

Additional Tools: Sub-sampling, Parameter Selection, Format checking, etc.
==========================================================================

See the README file in the tools directory.

MATLAB/OCTAVE Interface
=======================

Please check the file README in the directory `matlab'.

Python Interface
================

See the README file in the python directory.

Additional Information
======================

If you find LIBSVM helpful, please cite it as

Chih-Chung Chang and Chih-Jen Lin, LIBSVM : a library for support
vector machines. ACM Transactions on Intelligent Systems and
Technology, 2:27:1--27:27, 2011. Software available at
http://www.csie.ntu.edu.tw/~cjlin/libsvm

The LIBSVM implementation document is available at
http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf

For any questions and comments, please email cjlin@csie.ntu.edu.tw

Acknowledgments:
This work was supported in part by the National Science
Council of Taiwan via the grant NSC 89-2213-E-002-013.
The authors thank their group members and users
for many helpful discussions and comments. They are listed at
http://www.csie.ntu.edu.tw/~cjlin/libsvm/acknowledgements