Is there any way to get variable importance with Keras?

user1367204 · May 22, 2017

I am looking for a proper or best way to get variable importance in a neural network created with Keras. Currently I take the weights (not the biases) connecting the input variables to the first layer, on the assumption that more important variables will have larger weights there. Is there another/better way of doing it?
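For reference, the heuristic described above can be sketched as follows. This is a minimal illustration, not a recommended method; the weight matrix `W` here is random stand-in data, whereas in Keras you would obtain it with `model.layers[0].get_weights()[0]` (shape `(n_inputs, n_units)`).

```python
import numpy as np

# Stand-in for a first-layer weight matrix of shape (n_inputs, n_units).
# In Keras: W = model.layers[0].get_weights()[0]
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))

# Score each input variable by the mean absolute weight it feeds
# into the first layer (the heuristic described in the question).
importance = np.abs(W).mean(axis=1)

# Rank inputs from most to least "important" under this heuristic.
ranking = np.argsort(importance)[::-1]
print(ranking)
```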

Answer

Daniel Möller · May 22, 2017

Since everything will be mixed up along the network, the first layer alone can't tell you about the importance of each variable. The following layers can also increase or decrease their importance, and even make one variable affect the importance of another variable. Every single neuron in the first layer will also give each variable a different importance, so it's not that straightforward.

I suggest you call model.predict(inputs) with inputs that are arrays of zeros, except for the one variable you want to study, which you set to 1.

That way, you see the result for each variable alone. Even so, this still won't help you with the cases where one variable increases the importance of another variable.
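The probing idea above can be sketched as a small helper. To keep this self-contained and runnable, the `predict` function here is a stand-in linear "network"; in practice you would pass `model.predict` from your Keras model. The helper name `one_hot_probe` is made up for illustration.

```python
import numpy as np

def one_hot_probe(predict, n_inputs):
    """Probe each variable alone: feed an all-zero input vector with
    only one slot set to 1, and collect the model's response."""
    responses = []
    for i in range(n_inputs):
        x = np.zeros((1, n_inputs))
        x[0, i] = 1.0  # only the variable under study is "on"
        responses.append(predict(x))
    return np.vstack(responses)  # one row of outputs per probed variable

# Stand-in for model.predict: a tiny fixed linear map with 3 inputs.
W = np.array([[2.0], [0.5], [-1.0]])
predict = lambda x: x @ W

responses = one_hot_probe(predict, 3)
print(responses)  # rows show the response to each variable alone
```

Comparing the magnitudes of the rows gives a rough per-variable effect, with the caveat from the answer: it cannot capture interactions where one variable changes the importance of another.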