I'm using Keras to build an LSTM and tuning it by doing gradient descent with an external cost function. So the weights are updated with:
weights := weights - alpha * gradient(cost)
I know that I can get the weights with model.get_weights(), but how can I perform the gradient descent step and update all the weights accordingly? I tried using an initializer, but I still couldn't figure it out. I only found some related code in TensorFlow, and I don't know how to translate it to Keras.
Any help, hint or advice will be appreciated!
model.set_weights()
is what you are looking for:
import numpy as np
from keras.layers import Dense
from keras.models import Sequential
model = Sequential()
model.add(Dense(10, activation='relu', input_shape=(10,)))
model.add(Dense(5, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy')  # single sigmoid output -> binary crossentropy
a = np.array(model.get_weights(), dtype=object) # save weights as an object array of np.arrays (per-layer shapes differ)
model.set_weights(a + 1) # add 1 to all weights in the neural network
b = np.array(model.get_weights(), dtype=object) # save weights a second time as an object array of np.arrays
print(b - a) # print changes in weights
Have a look at the relevant page of the Keras documentation here.
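Applied to your update rule, here is a minimal sketch of the actual descent step, assuming you have already computed the gradient of your external cost as a list of NumPy arrays with the same shapes and order as model.get_weights() (the name external_gradients below is hypothetical; how you obtain it depends on your cost function):
alpha = 0.01  # learning rate
# external_gradients: assumed list of np.arrays, one per weight tensor,
# matching the shapes and order returned by model.get_weights()
weights = model.get_weights()
new_weights = [w - alpha * g for w, g in zip(weights, external_gradients)]
model.set_weights(new_weights)  # write the updated weights back into the model
Note the minus sign: gradient descent on a cost function steps against the gradient, so moving in the positive gradient direction would increase the cost instead.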