Comments
-
BadFox @Fast-Nop so, the problem is that it can't differentiate whether more changes are needed in the next layer versus the previous layers?
-
Fast-Nop @BadFox actually, it's a consequence of how chained partial derivatives work. A very nice and free NN tutorial that makes the problem clear is http://neuralnetworksanddeeplearning.com/... specifically chapter 5 (why deep NNs are hard to train), though that won't make much sense without having read the earlier chapters first.
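A minimal sketch (not part of the original thread) of the effect Fast-Nop describes: backprop multiplies one sigma'(z) * w factor per layer, and with sigmoid activations sigma'(z) <= 0.25, so the gradient signal shrinks as it propagates back toward the input layer. The network shape, weights, and variable names here are illustrative assumptions, not taken from the tutorial.

```python
# Illustrative sketch: chained partial derivatives in backprop.
# Each layer contributes a factor sigma'(z) * w to the chain, so the
# gradient magnitude typically vanishes toward the input layer.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers = 10
w = rng.normal(size=n_layers)   # one weight per layer (1 neuron per layer)

# Forward pass, keeping the pre-activations z for the backward pass
a = rng.normal()                # scalar input
zs = []
for wi in w:
    z = wi * a
    zs.append(z)
    a = sigmoid(z)

# Backward pass: chain rule, starting from dL/da_output = 1
grad = 1.0
signal = []                     # |gradient| arriving at each layer, output -> input
for wi, z in zip(reversed(w), reversed(zs)):
    s = sigmoid(z)
    grad *= s * (1.0 - s) * wi  # one chained factor per layer
    signal.append(abs(grad))

# Largest near the output, vanishingly small near the input
for depth, g in enumerate(reversed(signal), start=1):
    print(f"gradient signal at layer {depth}: {g:.2e}")
```

Running this shows the gradient magnitudes growing from layer 1 (input side) to layer 10 (output side), which also answers the rant's title question: gradient descent applies the *smallest* changes closest to the input layer, not the largest.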
-
BadFox @Fast-Nop well, I guess I found my reading material for the next however long it takes. Thanks for the help. (^^)
Related Rants
Does gradient descent in artificial neural networks apply the most changes closest to the input layer?
question
neural network
algorithm
deep learning
neural networks
gradient