[Solved] Is use of Leaky ReLU after Batch Normalization (BN) useful?


The BN layer normalizes its output toward zero mean by subtracting the batch mean (before applying its learned scale and shift), so we can expect a substantial fraction of its output values to be negative.

So a LeakyReLU following the BN layer will still receive negative values, and its negative slope will actually be exercised rather than wasted.
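A minimal NumPy sketch of the point above (the `batch_norm` and `leaky_relu` helpers here are simplified illustrations, with gamma=1 and beta=0 assumed): even when all raw inputs are positive, the zero-mean normalization forces negative outputs, which LeakyReLU then scales instead of zeroing.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch to zero mean, unit variance
    # (simplified: learned scale gamma=1 and shift beta=0 are omitted).
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def leaky_relu(x, alpha=0.01):
    # Pass positives through unchanged; scale negatives by alpha
    # instead of clamping them to zero as plain ReLU would.
    return np.where(x > 0, x, alpha * x)

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=1.0, size=(8, 4))  # inputs centered well above zero

bn_out = batch_norm(x)
print((bn_out < 0).any())  # True: zero-mean output necessarily contains negatives
print(leaky_relu(bn_out))  # those negatives survive, scaled by alpha
```

With a learned beta the post-BN mean can drift away from zero, but unless beta grows large the activation will keep seeing negative inputs.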
