Its API documentation says "Computes rectified linear".
Is it Re(ctified) L(inear)... what is U then?
Re(ctified) L(inear) (U)nit
Usually, a layer in a neural network takes some input, say a vector, and multiplies it by a weight matrix, producing, e.g., again a vector.
Each value in the result (usually a float) is then considered an output. However, most layers in neural networks nowadays involve nonlinearities: an add-on function that, you might say, adds complexity to these output values. For a long time, these were sigmoids and tanhs.
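To make that concrete, here is a minimal NumPy sketch of such a layer (the shapes, the random weights, and the choice of tanh are just illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.standard_normal(4)       # input vector with 4 features (made up)
W = rng.standard_normal((3, 4))  # weight matrix mapping 4 inputs to 3 outputs

z = W @ x       # linear part: the result is again a vector (length 3)
a = np.tanh(z)  # classic nonlinearity, applied elementwise
print(a)
```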
But more recently, people use a function that results in 0 if the input is negative, and the input itself if the input is 0 or positive; in other words, max(0, x). This specific add-on function (or, better, "activation function") is called a ReLU.
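In code, that rule is just an elementwise max(0, x); a minimal NumPy sketch:

```python
import numpy as np

def relu(x):
    # 0 for negative entries, the entry itself for entries >= 0
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # -> [0.  0.  0.  1.5]
```

If the documentation you quoted is TensorFlow's `tf.nn.relu`, that is exactly what it computes.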