[Solved] Neural network output is always 1


I didn’t check your code, but…
No, you can’t use a fixed weight like that. Each node in your hidden layer gets a larger and larger weighted sum as you increase the number of inputs, and the sigmoid squashes all of those large values to ~1.

Think about it:
Say you have 100 inputs, each with a “random” value of 0.1. For simplicity, let’s forget everything else. Since your weights are a constant 0.5, every node in the hidden layer gets the same value: the sigmoid of the sum of input*weight over all inputs, i.e. sigm(0.1 * 0.5 * 100) = sigm(5) ≈ 0.993, which is practically 1.
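Here’s a minimal sketch of that arithmetic in Python (assuming the standard logistic sigmoid, since your actual code isn’t shown):

```python
import math

def sigmoid(x):
    # Standard logistic sigmoid: squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# 100 inputs of 0.1 each, every weight fixed at 0.5
weighted_sum = 100 * 0.1 * 0.5   # = 5.0
print(sigmoid(weighted_sum))     # ~0.9933, i.e. practically 1
```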

So the more positive inputs you have with constant positive weights, the closer all the hidden layer outputs will get to 1.
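You can see the trend by sweeping the number of inputs in the same toy setup (again just an illustrative sketch, not your actual network):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Fixed weight 0.5, every input 0.1: each hidden node's output
# climbs toward 1 as the number of inputs grows.
for n in (1, 5, 10, 50, 100):
    print(n, round(sigmoid(n * 0.1 * 0.5), 4))
# 1 0.5125
# 5 0.5622
# 10 0.6225
# 50 0.9241
# 100 0.9933
```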
