|
virtual float | getInitialStepSize () const =0 |
| Parameter initialStepSize of a SVMSGD optimization problem.
|
|
virtual float | getMarginRegularization () const =0 |
| Parameter marginRegularization of a SVMSGD optimization problem.
|
|
virtual int | getMarginType () const =0 |
| Margin type, one of SVMSGD::MarginType.
|
|
virtual float | getShift ()=0 |
|
virtual float | getStepDecreasingPower () const =0 |
| Parameter stepDecreasingPower of a SVMSGD optimization problem.
|
|
virtual int | getSvmsgdType () const =0 |
| Algorithm type, one of SVMSGD::SvmsgdType.
|
|
virtual TermCriteria | getTermCriteria () const =0 |
| Termination criteria of the training algorithm. You can specify the maximum number of iterations (maxCount) and/or how much the error could change between the iterations to make the algorithm continue (epsilon).
|
|
virtual Mat | getWeights ()=0 |
|
virtual void | setInitialStepSize (float InitialStepSize)=0 |
|
virtual void | setMarginRegularization (float marginRegularization)=0 |
|
virtual void | setMarginType (int marginType)=0 |
|
virtual void | setOptimalParameters (int svmsgdType=SVMSGD::ASGD, int marginType=SVMSGD::SOFT_MARGIN)=0 |
| Sets optimal parameter values for the chosen SVM SGD model.
|
|
virtual void | setStepDecreasingPower (float stepDecreasingPower)=0 |
|
virtual void | setSvmsgdType (int svmsgdType)=0 |
|
virtual void | setTermCriteria (const cv::TermCriteria &val)=0 |
|
virtual float | calcError (const Ptr< TrainData > &data, bool test, OutputArray resp) const |
| Computes error on the training or test dataset.
|
|
virtual bool | empty () const CV_OVERRIDE |
| Returns true if the Algorithm is empty (e.g. in the very beginning or after an unsuccessful read).
|
|
virtual int | getVarCount () const =0 |
| Returns the number of variables in training samples.
|
|
virtual bool | isClassifier () const =0 |
| Returns true if the model is classifier.
|
|
virtual bool | isTrained () const =0 |
| Returns true if the model is trained.
|
|
virtual float | predict (InputArray samples, OutputArray results=noArray(), int flags=0) const =0 |
| Predicts response(s) for the provided sample(s)
|
|
virtual bool | train (const Ptr< TrainData > &trainData, int flags=0) |
| Trains the statistical model.
|
|
virtual bool | train (InputArray samples, int layout, InputArray responses) |
| Trains the statistical model.
|
|
| Algorithm () |
|
virtual | ~Algorithm () |
|
virtual void | clear () |
| Clears the algorithm state.
|
|
virtual String | getDefaultName () const |
|
virtual void | read (const FileNode &fn) |
| Reads algorithm parameters from a file storage.
|
|
virtual void | save (const String &filename) const |
|
virtual void | write (FileStorage &fs) const |
| Stores algorithm parameters in a file storage.
|
|
void | write (const Ptr< FileStorage > &fs, const String &name=String()) const |
| Simplified API for language bindings. This is an overloaded member function, provided for convenience. It differs from the above function only in what argument(s) it accepts.
|
|
Stochastic Gradient Descent SVM classifier.
SVMSGD provides a fast and easy-to-use implementation of the SVM classifier using the Stochastic Gradient Descent approach, as presented in [bottou2010large].
The classifier has following parameters:
- model type,
- margin type,
- margin regularization (\(\lambda\)),
- initial step size (\(\gamma_0\)),
- step decreasing power (\(c\)),
- and termination criteria.
The model type may have one of the following values: SGD and ASGD.
- SGD is the classic Stochastic Gradient Descent SVM classifier: at each step the weights vector is updated from a single, stochastically chosen training sample.
- ASGD is the Average Stochastic Gradient Descent SVM classifier: in addition, it averages the weights vector at each step of the algorithm by the formula \(\widehat{w}_{t+1} = \frac{t}{1+t}\widehat{w}_{t} + \frac{1}{1+t}w_{t+1}\).
The recommended model type is ASGD (following [bottou2010large]).
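For example, the model type can be selected explicitly with the setter listed above; a minimal sketch, assuming svmsgd is an already created Ptr<SVMSGD>:
svmsgd->setSvmsgdType(SVMSGD::ASGD);  // or SVMSGD::SGD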
The margin type may have one of the following values: SOFT_MARGIN or HARD_MARGIN.
- You should use HARD_MARGIN if your sets are linearly separable.
- You should use SOFT_MARGIN if your sets are not linearly separable or contain outliers.
- In the general case (if you know nothing about the linear separability of your sets), use SOFT_MARGIN.
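Both the model type and the margin type can be set in one call with setOptimalParameters() (see the member list above); a minimal sketch, again assuming an existing svmsgd object:
svmsgd->setOptimalParameters(SVMSGD::ASGD, SVMSGD::SOFT_MARGIN);  // sets both at once
svmsgd->setMarginType(SVMSGD::SOFT_MARGIN);                       // or set the margin type alone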
The other parameters may be described as follows:
- The margin regularization parameter controls how much the weights are shrunk at each step and how strongly outliers are penalized (the smaller the parameter, the lower the probability that an outlier will be ignored). The recommended value for the SGD model is 0.0001, for the ASGD model 0.00001.
- The initial step size parameter is the starting value of the step size \(\gamma(t)\). You will have to find the best initial step size for your problem.
- The step decreasing power is the power parameter \(c\) that controls how fast the step size \(\gamma(t)\) decreases over the iterations. The recommended value for the SGD model is 1, for the ASGD model 0.75.
Note that the margin regularization, initial step size, and step decreasing power parameters should be positive.
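As an illustration, the recommended ASGD values quoted above could be applied through the setters from the member list; a hedged sketch (the initial step size and termination values are illustrative, not recommendations), assuming an existing svmsgd object:
svmsgd->setMarginRegularization(0.00001f);  // lambda, recommended for ASGD
svmsgd->setInitialStepSize(0.05f);          // gamma_0, problem-dependent (illustrative value)
svmsgd->setStepDecreasingPower(0.75f);      // c, recommended for ASGD
svmsgd->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER + TermCriteria::EPS, 100000, 0.00001));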
To use the SVMSGD algorithm do as follows:
- first, create the SVMSGD object with SVMSGD::create(); optimal parameter values are set by default, but you can set your own if required (see setOptimalParameters() and the individual setters);
- then the SVM model can be trained using the train features and the corresponding labels by the method train();
- after that, the label of a new feature vector can be predicted using the method predict().
// Create an empty object (optimal parameters are set by default) and train it
Ptr<SVMSGD> svmsgd = SVMSGD::create();
svmsgd->train(trainData);
svmsgd->predict(samples, responses);  // predict labels for the new samples
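Building on the snippet above, a minimal sketch of how the TrainData object might be prepared and how the error can be checked afterwards; trainSamples, trainLabels, testSamples and testLabels are hypothetical matrices (CV_32F samples, one row per sample, one label per row, two classes):
#include <opencv2/ml.hpp>
using namespace cv;
using namespace cv::ml;

// Wrap the raw matrices into a TrainData object (one sample per row)
Ptr<TrainData> trainData = TrainData::create(trainSamples, ROW_SAMPLE, trainLabels);
Ptr<SVMSGD> svmsgd = SVMSGD::create();
svmsgd->train(trainData);

// Classification error (percent of misclassified samples) on a separate evaluation set;
// test=false so the whole evalData set is used, since no train/test split was defined on it
Ptr<TrainData> evalData = TrainData::create(testSamples, ROW_SAMPLE, testLabels);
Mat evalResponses;
float errorPercent = svmsgd->calcError(evalData, false, evalResponses);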