
python - How to obtain feature weights

I am dealing with a highly imbalanced data set, and my idea is to obtain the feature weights from my libSVM model. For now I am fine with the linear kernel, where I can obtain feature weights, but when I use the rbf or poly kernel I fail to reach my objective.

Here I am using sklearn for my model, and it's easy to obtain feature weights for a linear kernel using .coef_. Can anyone help me do the same thing for rbf or poly? What I've tried so far is given below:

from sklearn.svm import SVC

svr = SVC(C=10, cache_size=200, class_weight='balanced', coef0=0.0, degree=3, gamma=0.12,
          kernel='rbf', max_iter=-1, probability=True, random_state=0, shrinking=True,
          tol=0.001, verbose=False)
clf = svr.fit(data_train, target_train)  # data_train / target_train: my training set
print(clf.coef_)


1 Reply


Not only is this impossible, as stated in the documentation:

Weights assigned to the features (coefficients in the primal problem). This is only available in the case of linear kernel.

but it also doesn't make sense. In a linear SVM the resulting separating hyperplane lives in the same space as your input features, so its coefficients can be viewed as weights on the input "dimensions".
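
As a quick illustration (this snippet is not part of the original answer; it is a minimal sketch assuming scikit-learn's SVC and a toy dataset from make_classification), a linear-kernel model does expose one coefficient per input feature:

from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Hypothetical toy data, only to demonstrate the coef_ attribute.
X, y = make_classification(n_samples=100, n_features=4, random_state=0)
lin = SVC(kernel='linear', C=10).fit(X, y)
print(lin.coef_)  # shape (1, 4): one weight per input feature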

With other kernels the separating hyperplane exists in a different space - the result of a kernel transformation of the original input space - so its coefficients are not directly related to the input features. In fact, for the rbf kernel the transformed space is infinite-dimensional (the Wikipedia article on the RBF kernel is a good starting point).
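
By contrast (again only a sketch, mirroring the toy example above), a non-linear SVC exposes only dual_coef_, coefficients attached to the support vectors rather than to the input features, and accessing coef_ fails:

from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=100, n_features=4, random_state=0)
rbf = SVC(kernel='rbf', C=10, gamma=0.12).fit(X, y)
print(rbf.dual_coef_.shape)  # (1, n_support_vectors): one coefficient per support vector
print(rbf.coef_)             # AttributeError: coef_ is only available when using a linear kernel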



...