Implement ReLU derivative in Python NumPy

Bon · Sep 25, 2017 · Viewed 41.7k times

I'm trying to implement a function that computes the ReLU derivative for each element in a matrix and then returns the result as a matrix. I'm using Python and NumPy.

Based on other Cross Validated posts, the ReLU derivative for x is 1 when x > 0, 0 when x < 0, and undefined or 0 when x == 0.
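
Written out as a piecewise function, that is:

$$
\frac{d}{dx}\,\mathrm{ReLU}(x) =
\begin{cases}
1 & \text{if } x > 0 \\
0 & \text{if } x < 0 \\
\text{undefined (or } 0 \text{ by convention)} & \text{if } x = 0
\end{cases}
$$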

Currently, I have the following code so far:

def reluDerivative(self, x):
    return np.array([self.reluDerivativeSingleElement(xi) for xi in x])

def reluDerivativeSingleElement(self, xi):
    if xi > 0:
        return 1
    elif xi <= 0:
        return 0

Unfortunately, xi is an array because x is a matrix, so the reluDerivativeSingleElement function doesn't work on it. So I'm wondering whether there is a way to map values in a matrix to another matrix using NumPy, like the exp function in NumPy?
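
For example (a minimal reproduction; the 2-D input below is made up), the per-element check fails as soon as xi is a whole row of the matrix:

import numpy as np

x = np.array([[1.0, -2.0], [0.5, -0.1]])   # made-up 2-D input
xi = x[0]                                   # iterating over a 2-D array yields whole rows
try:
    if xi > 0:                              # comparing an array to a scalar gives an array,
        pass                                # which has no single truth value
except ValueError as e:
    print(e)  # "The truth value of an array with more than one element is ambiguous..."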

Thanks a lot in advance.

Answer

Jakub Bartczuk · Sep 25, 2017

That's an exercise in vectorization.

This code

if x > 0:
  y = 1
elif x <= 0:
  y = 0

can be reformulated as

y = (x > 0) * 1

This works for NumPy arrays: a boolean expression involving an array is evaluated element-wise, producing an array of True/False values, and multiplying by 1 converts those booleans to 1s and 0s.
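
A minimal sketch of how that one-liner could replace the loop in the question (the function signature and the sample input below are just for illustration):

import numpy as np

def reluDerivative(x):
    # (x > 0) is an element-wise boolean array; multiplying by 1 turns True/False into 1/0.
    # Equivalent alternatives: (x > 0).astype(int) or np.where(x > 0, 1, 0).
    return (x > 0) * 1

x = np.array([[1.0, -2.0], [0.0, 3.5]])
print(reluDerivative(x))
# [[1 0]
#  [0 1]]

Because the comparison preserves the shape of x, the same function handles vectors and matrices alike, with no explicit loop.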