I'm looking for a TensorBoard embedding example, e.g. with iris data, like the embedding projector http://projector.tensorflow.org/.
But unfortunately I couldn't find one, just a little bit of information about how to do it at https://www.tensorflow.org/how_tos/embedding_viz/.
Does anyone know of a basic tutorial for this functionality?
Basics (a minimal sketch of all three steps follows this list):
1) Set up a 2D tensor variable that holds your embedding(s):
embedding_var = tf.Variable(....)
2) Periodically save your embeddings to a checkpoint in a LOG_DIR.
3) Associate metadata (e.g. labels) with your embedding.
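Here is a minimal sketch of those three steps, assuming TensorFlow 1.x; the names LOG_DIR and my_embedding, the 100x32 random matrix, and metadata.tsv are placeholders, not anything from a real model (the FastText example below fills them in with real data):
import os
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

LOG_DIR = 'log'
os.makedirs(LOG_DIR, exist_ok=True)

# 1) a 2D variable: one row per item, one column per embedding dimension
embedding_var = tf.Variable(tf.random_normal([100, 32]), name='my_embedding')

# 2) save the variable to a checkpoint in LOG_DIR (repeat during training)
saver = tf.train.Saver([embedding_var])
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, os.path.join(LOG_DIR, 'model.ckpt'), global_step=0)

# 3) point the projector at the tensor and at a labels file
# (metadata.tsv must be written separately: one label per line, same row order)
config = projector.ProjectorConfig()
emb = config.embeddings.add()
emb.tensor_name = embedding_var.name
emb.metadata_path = 'metadata.tsv'
projector.visualize_embeddings(tf.summary.FileWriter(LOG_DIR), config)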
I've used FastText's pre-trained word vectors with TensorBoard.
import os
import tensorflow as tf
import numpy as np
import fasttext
from tensorflow.contrib.tensorboard.plugins import projector
# load model
word2vec = fasttext.load_model('wiki.en.bin')
# copy every word vector into one matrix (one row per word)
embedding = np.empty((len(word2vec.words), word2vec.dim), dtype=np.float32)
for i, word in enumerate(word2vec.words):
    embedding[i] = word2vec[word]
# set up a TensorFlow session
tf.reset_default_graph()
sess = tf.InteractiveSession()
# feed the matrix in through a placeholder and assign it with
# validate_shape=False, so the large constant is not baked into the graph
X = tf.Variable([0.0], name='embedding')
place = tf.placeholder(tf.float32, shape=embedding.shape)
set_x = tf.assign(X, place, validate_shape=False)
sess.run(tf.global_variables_initializer())
sess.run(set_x, feed_dict={place: embedding})
# write the labels (one word per line, same order as the rows of the matrix)
os.makedirs('log', exist_ok=True)
with open('log/metadata.tsv', 'w') as f:
    for word in word2vec.words:
        f.write(word + '\n')
# create a TensorFlow summary writer
summary_writer = tf.summary.FileWriter('log', sess.graph)
# configure the projector: which tensor to visualize and where its labels live
config = projector.ProjectorConfig()
embedding_conf = config.embeddings.add()
embedding_conf.tensor_name = 'embedding:0'  # the variable defined above
# note: with some TensorBoard versions the metadata path must be given
# relative to the log dir, i.e. just 'metadata.tsv'
embedding_conf.metadata_path = os.path.join('log', 'metadata.tsv')
projector.visualize_embeddings(summary_writer, config)
# save a checkpoint of the embedding variable for the projector to read
saver = tf.train.Saver()
saver.save(sess, os.path.join('log', 'model.ckpt'))
Then run this command in your terminal:
tensorboard --logdir=log
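TensorBoard listens on http://localhost:6006 by default; open it in a browser and you should find the word vectors, with their labels, under the Projector tab.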