Neural network backpropagation with ReLU
The ReLU function is defined as f(x) = max(0, x): for x > 0 the output is x, and for x <= 0 the output is 0. Its derivative f'(x) is therefore: if x < 0, the output is 0; if x > 0, the output is 1. (At x = 0 the derivative is undefined, but in practice it is conventionally taken to be 0.)
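The definition and derivative above can be sketched in NumPy; during backpropagation the upstream gradient is simply masked by the derivative (a minimal illustration, not tied to any particular framework):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

def relu_grad(x):
    # f'(x): 1 where x > 0, 0 otherwise
    # (the undefined x == 0 case is conventionally assigned 0)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
upstream = np.ones_like(x)              # gradient arriving from the next layer
downstream = upstream * relu_grad(x)    # gradient passed back through ReLU

print(relu(x))       # zeros for x <= 0, x itself for x > 0
print(downstream)    # upstream gradient zeroed wherever x <= 0
```

Because the derivative is either 0 or 1, backpropagating through a ReLU never rescales the gradient; it only blocks it where the unit was inactive.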