How to obtain features' weights

Paul85 · Jan 21, 2014

I am dealing with a highly imbalanced data set, and my idea is to obtain the feature weights from my libSVM model. For now I am fine with the linear kernel, where I can obtain the feature weights, but when I use an rbf or poly kernel, I fail to reach my objective.

Here I am using sklearn for my model, and it's easy to obtain the feature weights for a linear kernel using .coef_. Can anyone help me do the same thing for the rbf or poly kernels? What I've tried so far is given below:

from sklearn.svm import SVC

svr = SVC(C=10, cache_size=200, class_weight='auto', coef0=0.0, degree=3,
          gamma=0.12, kernel='rbf', max_iter=-1, probability=True,
          random_state=0, shrinking=True, tol=0.001, verbose=False)
clf = svr.fit(data_train, target_train)
print(clf.coef_)
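For what it's worth, a minimal reproduction on made-up toy data (the arrays below are hypothetical, not from the question) shows exactly what happens: coef_ works for the linear kernel but raises an AttributeError for rbf:

```python
from sklearn.svm import SVC

# Hypothetical toy data: four points, two classes
X = [[0, 0], [1, 1], [2, 0], [3, 1]]
y = [0, 0, 1, 1]

lin = SVC(kernel='linear').fit(X, y)
print(lin.coef_)   # one weight per input feature

rbf = SVC(kernel='rbf', gamma=0.12).fit(X, y)
try:
    rbf.coef_      # raises: coef_ is only available when using a linear kernel
except AttributeError as e:
    print(e)
```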

Answer

BartoszKP picture BartoszKP · Jan 21, 2014

This is not only impossible, as stated in the documentation:

Weights assigned to the features (coefficients in the primal problem). This is only available in the case of linear kernel.

but it also doesn't make sense. In a linear SVM, the resulting separating hyperplane lives in the same space as your input features, so its coefficients can be viewed as weights of the input's "dimensions".
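Concretely, for a linear kernel the decision function is just a dot product with those weights plus the intercept, which you can verify on toy data (the arrays below are made up for illustration):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0], [3.0, 1.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel='linear').fit(X, y)

# The plane's coefficients weight the input dimensions directly:
# f(x) = w . x + b reproduces decision_function exactly
manual = X @ clf.coef_.ravel() + clf.intercept_
print(np.allclose(manual, clf.decision_function(X)))
```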

With other kernels, the separating plane exists in a different space - the result of the kernel transformation of the original space - so its coefficients are not directly related to the input space. In fact, for the rbf kernel the transformed space is infinite-dimensional (Wikipedia is a good starting point on this).
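What the model exposes instead is dual_coef_, the coefficients of the support vectors in the dual problem. A sketch on toy data (hypothetical arrays) shows that the rbf decision function is a kernel expansion over the support vectors, not a weight vector in input space:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0], [3.0, 1.0]])
y = np.array([0, 0, 1, 1])
gamma = 0.12

clf = SVC(kernel='rbf', gamma=gamma).fit(X, y)

# f(x) = sum_j dual_coef_[j] * K(x, sv_j) + b, with the RBF kernel
# K(x, z) = exp(-gamma * ||x - z||^2); no per-feature weight appears.
diffs = X[:, None, :] - clf.support_vectors_[None, :, :]
K = np.exp(-gamma * (diffs ** 2).sum(axis=-1))
manual = K @ clf.dual_coef_.ravel() + clf.intercept_
print(np.allclose(manual, clf.decision_function(X)))
```

Each dual coefficient weights a support vector, not an input feature, which is why there is no coef_ to read off.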