Unrecognized table variable name 'minLS' in Bayesian optimization of TreeBagger
Hello,
I tried to do hyperparameter tuning with Bayesian optimization for the random-forest model I built with TreeBagger. This is the code I used. I did not use Ensemble Bagged Trees in the Regression Learner app directly because each time I check the optimization, the minimum-MSE result is something like 498 learners and a minimum leaf size of 1 for 9 predictors.
264 inputTable = readtable('dataall_trainingregression.csv');
265 predictorNames = {'temp_diff', 'temp_median', 'NDVI', 'Clay', 'elevation', 'slope', 'TWI', 'sand', 'DOY'};
266 predictors = inputTable(:, predictorNames);
267 response = inputTable.daily_meanSM;
268 n = length(inputTable.daily_meanSM);
269 cvp = cvpartition(n,'KFold',5);
271 maxMinLS = 20;
272 minLS = optimizableVariable('minLS',[1,maxMinLS],'Type','integer');
273 numPTS = optimizableVariable('numPTS',[1,size(predictors,2)],'Type','integer');
274 hyperparametersRF = [minLS; numPTS];
275 fun = @(params)crossval('mse',predictors,response,'Predfun',@myfunction,'Partition',cvp);
277 results = bayesopt(fun,hyperparametersRF,...
    'AcquisitionFunctionName','expected-improvement-plus','Verbose',0);
280 function yfit = myfunction(params,predictors,response,test)
281 Mdl1 = TreeBagger(30,predictors,response,...
282     'Method',"regression",'Surrogate',"on",...
283     'PredictorSelection',"curvature",...
284     'OOBPredictorImportance',"on",'MinLeafSize',params.minLS,...
285     'NumPredictorsToSample',params.numPTS);
286 yfit = predict(Mdl1,test);
288 end
For this code I receive the following error message:
Error using crossval>evalFun
The function 'myfunction' generated the following error:
Unrecognized table variable name 'minLS'.
Error in crossval>getLossVal (line 529)
funResult = evalFun(funorStr,arg(1:end-1));
Error in crossval (line 428)
[funResult,outarg] = getLossVal(i, nData, cvp, data, predfun);
Error in model_hyperparameter_tuning>@(params)crossval('mse',predictors,response,'Predfun',@myfunction,'Partition',cvp) (line 275)
fun = @(params)crossval('mse',predictors,response,'Predfun',@myfunction,'Partition',cvp);
Error in BayesianOptimization/callObjNormally (line 13)
Objective = this.ObjectiveFcn(conditionalizeX(this, X));
Error in BayesianOptimization/callObjFcn (line 25)
= callObjNormally(this, X);
Error in BayesianOptimization/runSerial (line 24)
ObjectiveFcnObjectiveEvaluationTime, ObjectiveNargout] = callObjFcn(this, this.XNext);
Error in BayesianOptimization/run (line 9)
this = runSerial(this);
Error in BayesianOptimization (line 184)
this = run(this);
Error in bayesopt (line 323)
Results = BayesianOptimization(Options);
Error in model_hyperparameter_tuning (line 277)
results = bayesopt(fun,hyperparametersRF,...
Can anyone please help me solve this problem? Thanks.
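For reference, the trace points at a likely cause: `crossval` calls the `'Predfun'` handle as `predfun(Xtrain,Ytrain,Xtest)`, so `myfunction` receives the training predictor table in its `params` argument, and `params.minLS` then tries to index a table column named `minLS`, which does not exist. A minimal sketch of one possible fix, capturing the bayesopt hyperparameters through an anonymous function so they reach `TreeBagger` (argument names here are illustrative):

```matlab
% Objective for bayesopt: params holds the current hyperparameter values.
% The anonymous 'Predfun' binds params, so crossval's fold data arrives
% in Xtrain/Ytrain/Xtest rather than overwriting params.
fun = @(params)crossval('mse',predictors,response, ...
    'Predfun',@(Xtrain,Ytrain,Xtest)myfunction(params,Xtrain,Ytrain,Xtest), ...
    'Partition',cvp);

% Train on the training fold, predict the test fold.
function yfit = myfunction(params,Xtrain,Ytrain,Xtest)
Mdl1 = TreeBagger(30,Xtrain,Ytrain, ...
    'Method',"regression",'Surrogate',"on", ...
    'PredictorSelection',"curvature", ...
    'MinLeafSize',params.minLS, ...
    'NumPredictorsToSample',params.numPTS);
yfit = predict(Mdl1,Xtest);
end
```

This is only a sketch of the idea, not a tested drop-in replacement; the rest of the script (data loading, `cvpartition`, `optimizableVariable` setup, and the `bayesopt` call) would stay as written.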