How to determine feature importance using gradient boosting?
When using XGBoost in Python, you can train a model and then use XGBoost's built-in feature importance to determine which features are the most important.
In MATLAB there is no implementation of XGBoost, but there is fitrensemble, which is (as far as I know) similar. Is there a way to use it to determine feature importance? Or is there another way to compute feature importance the way XGBoost does?
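To make it concrete, here is a minimal sketch of the kind of thing I am after, assuming a regression problem with a predictor matrix X and response y (the data here is made up for illustration). fitrensemble with 'Method','LSBoost' fits gradient-boosted regression trees, and the fitted ensemble has a predictorImportance method that sums changes in the split criterion over all splits on each predictor. I am not sure whether this is actually equivalent to what XGBoost reports (e.g. its gain-based importance):

    % Hypothetical example data: 200 observations, 5 predictors,
    % where only predictors 2 and 4 drive the response.
    rng(1);                                      % for reproducibility
    X = rand(200, 5);
    y = 3*X(:,2) - 2*X(:,4) + 0.1*randn(200, 1);

    % LSBoost = least-squares gradient boosting for regression
    mdl = fitrensemble(X, y, 'Method', 'LSBoost', 'NumLearningCycles', 100);

    % Importance estimate per predictor, summed over all splits
    imp = predictorImportance(mdl);

    bar(imp);
    xlabel('Predictor index');
    ylabel('Importance estimate');

Is predictorImportance on an LSBoost ensemble the right analogue here, or is there a better-suited approach in MATLAB?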