Using pretrained gensim Word2vec embedding in keras

shivank01 · Sep 1, 2018

I have trained a word2vec model in gensim. In Keras, I want to use it to build a matrix representation of each sentence from that word embedding. Storing the matrices for all sentences explicitly would be very space and memory inefficient, so I want to create an embedding layer in Keras instead, whose output can be fed into further layers (e.g. an LSTM). Can you tell me in detail how to do this?

PS: This is different from other questions because I am using gensim for the word2vec training instead of Keras.

Answer

Seb · Aug 6, 2019

With a recent Gensim version this is pretty easy:

w2v_model.wv.get_keras_embedding(train_embeddings=False)

There you have your Keras embedding layer; train_embeddings=False keeps the pretrained vectors frozen during training.
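
Note that get_keras_embedding was removed in Gensim 4.0, so on current versions you typically build the layer yourself from the trained KeyedVectors. Below is a minimal sketch of that approach, assuming TensorFlow's bundled Keras and a model saved at the hypothetical path "word2vec.model"; the input to the model must be sequences of integer word indices taken from gensim's own vocabulary mapping.

import numpy as np
from gensim.models import Word2Vec
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

# Load the trained gensim model (hypothetical path) and grab its vectors.
wv = Word2Vec.load("word2vec.model").wv

# Build a frozen Keras Embedding layer from the pretrained weight matrix;
# trainable=False is the equivalent of train_embeddings=False above.
embedding_layer = Embedding(
    input_dim=wv.vectors.shape[0],   # vocabulary size
    output_dim=wv.vectors.shape[1],  # embedding dimensionality
    weights=[wv.vectors],            # copy in gensim's weight matrix
    trainable=False,
)

# Use it as the first layer of a model, e.g. in front of an LSTM.
model = Sequential([
    embedding_layer,
    LSTM(128),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Sentences are converted to index sequences with gensim's vocabulary
# (wv.key_to_index in Gensim 4.x); the lookup assumes every word was
# seen during word2vec training, and sequences should be padded to a
# common length before batching.
example = np.array([[wv.key_to_index[w] for w in ["hello", "world"]]])
predictions = model.predict(example)

This way the full sentence matrices never have to be materialized up front: the Embedding layer looks the vectors up on the fly from the single shared weight matrix, which is exactly the memory saving the question asks for.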