Is it possible to define a TensorFlow graph with more than one input? For instance, I want to give the graph two images and one text, each of which is processed by a stack of layers with an fc layer at the end. Then there is a node that computes a loss function that takes into account the three representations. The aim is to let the three nets backpropagate considering the joint representation loss. Is this possible? Is there any example/tutorial about it?
This is a completely straightforward thing to do. For "one input" you would have something like:
import tensorflow as tf

def build_column(x, input_size):
    # first layer: input_size -> 20, sigmoid activation
    w = tf.Variable(tf.random_normal([input_size, 20]))
    b = tf.Variable(tf.random_normal([20]))
    processing1 = tf.nn.sigmoid(tf.matmul(x, w) + b)
    # second layer: 20 -> 3, sigmoid activation
    w = tf.Variable(tf.random_normal([20, 3]))
    b = tf.Variable(tf.random_normal([3]))
    return tf.nn.sigmoid(tf.matmul(processing1, w) + b)

input1 = tf.placeholder(tf.float32, [None, 2])
output1 = build_column(input1, 2)  # 2-20-3 network
and you can simply add more such "columns" (each call to build_column creates its own fresh set of variables) and merge them whenever you want:
input1 = tf.placeholder(tf.float32, [None, 2])
output1 = build_column(input1, 2)

input2 = tf.placeholder(tf.float32, [None, 10])
output2 = build_column(input2, 10)

input3 = tf.placeholder(tf.float32, [None, 5])
output3 = build_column(input3, 5)

whole_model = output1 + output2 + output3  # valid since all three outputs have the same size
and you will get a network which looks like:
 2-20-3\
        \
10-20-3--SUM (dimension-wise)
        /
 5-20-3/
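To run such a merged graph you just feed all three placeholders at once. A minimal sketch, where the numpy batches x1, x2, x3 are made up purely for illustration:

import numpy as np

# hypothetical input batches; shapes must match the placeholders
x1 = np.random.rand(4, 2).astype(np.float32)
x2 = np.random.rand(4, 10).astype(np.float32)
x3 = np.random.rand(4, 5).astype(np.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    result = sess.run(whole_model,
                      feed_dict={input1: x1, input2: x2, input3: x3})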
or, to make a single-valued output instead of the dimension-wise sum:
# project each 3-dimensional column output down to a single value
w1 = tf.Variable(tf.random_normal([3, 1]))
w2 = tf.Variable(tf.random_normal([3, 1]))
w3 = tf.Variable(tf.random_normal([3, 1]))
whole_model = tf.matmul(output1, w1) + tf.matmul(output2, w2) + tf.matmul(output3, w3)
to get
 2-20-3\
        \
10-20-3--1
        /
 5-20-3/
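To get the joint backpropagation you asked about, simply define a loss on whole_model and minimize it; since all three columns live in the same graph, the gradients flow back into each of them automatically. A minimal sketch, assuming a regression target and plain gradient descent (both arbitrary choices here):

# hypothetical target for the single-valued output
target = tf.placeholder(tf.float32, [None, 1])

# any differentiable loss works; mean squared error is just an example
loss = tf.reduce_mean(tf.square(whole_model - target))

# minimizing the joint loss updates every variable the loss depends on,
# i.e. the weights of all three columns
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

There is nothing special to do for the multi-input case: TensorFlow treats the whole thing as one graph and the optimizer updates every variable the loss depends on.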