xgboost's plotting API states:
xgboost.plot_importance(booster, ax=None, height=0.2, xlim=None, ylim=None, title='Feature importance', xlabel='F score', ylabel='Features', importance_type='weight', max_num_features=None, grid=True, **kwargs)
Plot importance based on fitted trees.
Parameters:
booster (Booster, XGBModel or dict) – Booster or XGBModel instance, or dict taken by Booster.get_fscore()
...
max_num_features (int, default None) – Maximum number of top features displayed on plot. If None, all features will be displayed.
In my environment, however, running:
from xgboost import XGBClassifier, plot_importance

booster_ = XGBClassifier(learning_rate=0.1, max_depth=3, n_estimators=100,
                         silent=False, objective='binary:logistic', nthread=-1,
                         gamma=0, min_child_weight=1, max_delta_step=0, subsample=1,
                         colsample_bytree=1, colsample_bylevel=1, reg_alpha=0,
                         reg_lambda=1, scale_pos_weight=1, base_score=0.5, seed=0)
booster_.fit(X_train, y_train)

plot_importance(booster_, max_num_features=10)
raises:
AttributeError: Unknown property max_num_features
Running it without the max_num_features parameter, by contrast, correctly plots the entire feature set (which in my case is gigantic, at ~10k features).
Any idea what's going on?
Thanks in advance.
Details:
> python -V
Python 2.7.12 :: Anaconda custom (x86_64)
> pip freeze | grep xgboost
xgboost==0.4a30
Try upgrading your xgboost library to 0.6; that should solve the problem, since the 0.4a30 release predates the max_num_features parameter. To upgrade the package, try this:
$ pip install -U xgboost
If you get an error, try this:
$ brew install gcc@5
$ pip install -U xgboost
(Refer to this GitHub issue: https://github.com/dmlc/xgboost/issues/1501.)
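After upgrading, you can confirm which version is actually imported:
$ python -c "import xgboost; print(xgboost.__version__)"
If you can't upgrade right away, one possible workaround (a sketch, not tested on 0.4a30) is to trim the importance dict yourself and pass it to plot_importance, which, per the signature quoted in the question, also accepts the dict returned by Booster.get_fscore(). This assumes your version's sklearn wrapper exposes the underlying Booster via .booster(); newer releases renamed it to .get_booster().

import operator
from xgboost import plot_importance

# Raw F-score dict from the underlying Booster
# (use booster_.get_booster() on newer versions).
scores = booster_.booster().get_fscore()

# Keep only the 10 highest-scoring features.
top10 = dict(sorted(scores.items(),
                    key=operator.itemgetter(1),
                    reverse=True)[:10])

# plot_importance also accepts a get_fscore()-style dict.
plot_importance(top10)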