33
Koumoul
6y

Dropout layers improve neural net learning by randomly "killing neurons" during training, thus preventing overfitting.

That's how I will justify my alcoholism from now on.
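For the curious, dropout is less violent than it sounds. Here is a minimal sketch of inverted dropout in NumPy (the helper name, shapes, and the 0.5 drop rate are just illustrative):

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True):
    # Inverted dropout: zero each unit with probability p_drop during
    # training, then scale the survivors by 1/(1 - p_drop) so the
    # expected activation is unchanged. At inference it's a no-op.
    if not training or p_drop == 0.0:
        return activations
    keep = np.random.rand(*activations.shape) >= p_drop
    return activations * keep / (1.0 - p_drop)

# Roughly half the units get "killed" on this forward pass; the rest are rescaled.
h = np.ones((2, 4))
print(dropout(h, p_drop=0.5))
```

The "killed" neurons come back on the next forward pass; only their contribution to this one is dropped.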

Comments
  • 3
    It's amusing just how bad we are at this neural net thing.

    In ten years this approach will be hilarious and/or cringe-worthy 😊
  • 0
    @Root

    Just try out ALL neuron combinations! Come on, it is not that difficult. Time to compute that must be finite...
  • 2
    @Gregozor2121 😅

Brute-force vs calculus.
The first one is easy, but takes millennia (see the back-of-envelope sketch at the end of the thread).
  • 0
@Root not sure about that: unlike the old logic- and deduction-based systems, DL actually works. Also, the maths and theory behind deep learning are pretty rigorous.

@Gregozor2121 nobody does that, since most combinations would be pointless. The idea of DL is to chain as many neurons together as possible; this lets it approximate the target function with the best representation possible. That doesn't leave many combinations worth trying.
  • 1
@orseji depends on what part of ML you want to learn, how deep you want to go, and how good your knowledge of mathematics (calculus, probability & statistics, linear algebra, etc.) is.
  • 1
It's kind of like my justification of "drinking kills brain cells... but only the weak ones"
  • 1
But the thing is, your brain cells die for real, while a NN's neurons just take a time-out 🤷‍♂️
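To put the brute-force idea from the thread in perspective, here's a back-of-envelope sketch of how fast the number of neuron subsets blows up (the 10^9 evaluations per second is a generous assumption):

```python
# Counting subsets of n units: 2**n explodes long before "finite" helps.
for n in (10, 100, 1000):
    combos = 2 ** n
    seconds = combos / 1e9            # assumed: 10**9 evaluations per second
    years = seconds / (3600 * 24 * 365)
    print(f"n={n}: {combos:.2e} subsets, ~{years:.2e} years to try them all")
```

Even at 100 neurons, "finite" already means roughly 4e13 years, which is why gradient descent beats enumeration.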