How to implement a matrix multiplication in Keras?

Rui Meng · May 3, 2017 · Viewed 10.9k times

I just want to implement a function that, given a matrix X, returns the covariance matrix of X (X·Xᵀ), which is just a simple matrix multiplication.

In TensorFlow this is easy: tf.matmul(X, tf.transpose(X))

But I didn't expect it to be such a nightmare in Keras. The Keras APIs such as multiply and dot don't fit my needs. I also tried different approaches (a Lambda layer, mixing in TF operations), but they all failed with lots of errors.

I hope someone can help. Thanks.

Answer

grovina · May 21, 2017

Actually, Keras does have an analogue for this: try K.dot(x, K.transpose(x)) using the backend.

A working example comparing the two platforms follows.

import keras.backend as K
import numpy as np
import tensorflow as tf


def cov_tf(x_val):
    # Plain TensorFlow: build the graph, then evaluate it in a session (TF 1.x-style API).
    x = tf.constant(x_val)
    cov = tf.matmul(x, tf.transpose(x))
    return cov.eval(session=tf.Session())

def cov_keras(x_val):
    # Keras backend: K.dot and K.transpose mirror the TF ops above.
    x = K.constant(x_val)
    cov = K.dot(x, K.transpose(x))
    return cov.eval(session=tf.Session())

if __name__ == '__main__':
    # Compare both implementations on the same random 4x5 matrix.
    x = np.random.rand(4, 5)
    delta = np.abs(cov_tf(x) - cov_keras(x)).max()
    print('Maximum absolute difference:', delta)

The maximum absolute difference is printed; for me it comes out to around 1e-7.
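
Note that cov.eval(session=tf.Session()) is TF 1.x-style graph code. If you are on TensorFlow 2.x, where tf.Session is gone and tf.keras.backend replaces keras.backend, the same comparison can be written with eager tensors. A minimal sketch under that assumption:

import numpy as np
import tensorflow as tf

x_val = np.random.rand(4, 5).astype('float32')

# Eager execution: ops return concrete tensors, no session required.
cov_tf = tf.matmul(x_val, tf.transpose(x_val)).numpy()

# tf.keras.backend mirrors the old keras.backend API.
x = tf.keras.backend.constant(x_val)
cov_keras = tf.keras.backend.dot(x, tf.keras.backend.transpose(x)).numpy()

print('Maximum absolute difference:', np.abs(cov_tf - cov_keras).max())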
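
The question also mentions trying a Lambda layer. If you need this product inside a Keras model, operating on batches of matrices rather than on constants, one way is to wrap K.batch_dot in a Lambda layer. This is only a sketch, assuming standalone Keras with the TF backend; the (4, 5) input shape is illustrative:

import numpy as np
import keras.backend as K
from keras.layers import Input, Lambda
from keras.models import Model

# Each sample is a 4x5 matrix; the batch dimension is implicit.
inp = Input(shape=(4, 5))

# Contract the last axis of x with itself per sample:
# output[b] = X_b @ X_b^T, giving shape (batch, 4, 4).
gram = Lambda(lambda x: K.batch_dot(x, x, axes=(2, 2)))(inp)

model = Model(inputs=inp, outputs=gram)

x_val = np.random.rand(1, 4, 5)
print(model.predict(x_val).shape)  # (1, 4, 4)

Using K.batch_dot rather than K.dot keeps the batch dimension intact, so each sample in the batch gets its own X·Xᵀ.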