Low performance when training SVM model using “polynomial” kernel function
Hello,
I am trying to compare the performance of SVM regression (SVR) with the "rbf", "polynomial", and "linear" kernels.
Training works well with "rbf" and "linear" (e.g., R^2 of about 0.7–0.8).
However, when "polynomial" is used as the kernel function, performance degrades to an R^2 of about 0.001, or even negative.
I used the code:
Mdl = fitrsvm(X,Y,'Standardize',true,'KernelFunction','polynomial', ...
    'OptimizeHyperparameters',{'BoxConstraint','Epsilon','KernelScale','PolynomialOrder'}, ...
    'HyperparameterOptimizationOptions',struct('MaxObjectiveEvaluations',100))
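For context, this is a minimal sketch of the kernel comparison I am running (assuming X and Y are already in the workspace; it uses 5-fold cross-validation instead of the hyperparameter search above, and derives R^2 from the cross-validated MSE):

```matlab
% Sketch: compare cross-validated R^2 across SVR kernels
kernels = {'linear','rbf','polynomial'};
for k = 1:numel(kernels)
    % Train a cross-validated SVR model with the current kernel
    CVMdl = fitrsvm(X, Y, 'Standardize', true, ...
        'KernelFunction', kernels{k}, 'KFold', 5);
    mseCV = kfoldLoss(CVMdl);       % cross-validated mean squared error
    R2 = 1 - mseCV / var(Y, 1);     % R^2 computed from MSE
    fprintf('%s: R^2 = %.3f\n', kernels{k}, R2);
end
```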
Please help.
Thank you.