Comments
Root: It's amusing just how bad we are at this neural net thing.
In ten years this approach will be hilarious and/or cringe-worthy 😊
Gregozor2121: @Root
Just try out ALL neuron combinations! Come on, it's not that difficult. The time to compute that must be finite...
@Root Not sure about that; unlike the old logic- and deduction-based systems, DL actually works. Also, the maths and theory behind deep learning is pretty rigorous.
@Gregozor2121 Nobody does that, since most combinations would be pointless. The idea of DL is to chain as many neurons together as possible, which lets the network approximate the target function with the best representation possible. That doesn't leave many combinations to try.
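As an aside on why "just try out ALL neuron combinations" doesn't fly: treating each neuron as either kept or removed gives 2^n candidate sub-networks, which blows up fast. A minimal sketch with assumed toy sizes (the function name is just for illustration):

```python
def subset_count(n_neurons):
    """Each neuron is either kept or dropped: 2**n possible sub-networks."""
    return 2 ** n_neurons

# Even modest layer sizes make exhaustive search hopeless.
for n in (10, 100, 1000):
    print(f"{n} neurons -> {subset_count(n):.3e} sub-networks")
```

At 1000 neurons you're already past 10^300 combinations, so "finite" stops being a useful property.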
@orseji Depends on what part of ML you want to learn, how deep you want to go, and how good your knowledge of mathematics (calculus, probability & statistics, linear algebra, etc.) is.
It's kind of like my justification that "drinking kills brain cells... but only the weak ones."
Awlex: But the thing is, your brain cells die for real, while the NN's neurons just take a time out 🤷♂️
Dropout layers improve neural net training by randomly "killing" neurons during each forward pass, which prevents the network from overfitting to any one co-adapted set of units.
That's how I will justify my alcoholism from now on.
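For the curious, the "time out" above can be sketched in a few lines of plain Python. This is a hedged illustration of inverted dropout (the function name and toy values are assumptions, not from any comment here): each activation is zeroed with probability p during training, and survivors are scaled by 1/(1-p) so the expected output stays the same, meaning nothing needs to change at inference time.

```python
import random

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p while training,
    scaling survivors by 1/(1-p); at inference, pass values through untouched."""
    if not training or p == 0.0:
        return list(activations)
    scale = 1.0 / (1.0 - p)
    return [a * scale if random.random() >= p else 0.0 for a in activations]

layer = [0.2, 0.9, 0.5, 0.7]
print(dropout(layer, p=0.5))          # roughly half the units "killed" (zeroed)
print(dropout(layer, training=False))  # inference: identical to the input
```

Which is also why the "brain cells" analogy holds up better than expected: the zeroed neurons come right back on the next pass.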