I’ve come back to the 3Blue1Brown YouTube series on neural networks a few times now, but only recently took the time to sit down and digest it thoroughly. This time around, I found myself wishing it had been my first stop when I started digging into deep learning earlier this year. At times, deep learning can feel like just a lot of funny mathematical notation, which in a sense it is, but these videos manage both to keep to that notation and to grant an intuitive perspective on what is going on and why, one of the hallmarks of a great teacher.

Just as a small example, when explaining backpropagation, he takes the time to mention the potential link to Hebbian theory, summed up in the phrase “neurons that fire together wire together.” Perhaps it’s a stretch to say that artificial neural nets really mimic the inner workings of a human brain, but it sure does solidify the concept.
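For the curious, the classic Hebbian rule can be sketched in a few lines. This is a toy illustration of “fire together, wire together” (a weight grows in proportion to the joint activity of the neurons it connects), not what backpropagation itself computes; the names `pre`, `post`, and `eta` are my own:

```python
def hebbian_update(w, pre, post, eta=0.1):
    """Toy Hebbian rule: delta_w_i = eta * pre_i * post.
    Weights between co-active neurons are strengthened."""
    return [wi + eta * xi * post for wi, xi in zip(w, pre)]

w = [0.0, 0.0]       # weights from two input neurons to one output neuron
pre = [1.0, 0.0]     # only the first input neuron fires
post = 1.0           # the output neuron fires
w = hebbian_update(w, pre, post)
print(w)  # [0.1, 0.0] — the co-active connection strengthens; the silent one does not
```

Notice there’s no error signal anywhere: the update is purely local, which is exactly why the resemblance to backpropagation (which propagates a global cost gradient backward) is an analogy rather than an equivalence.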


Source: 3Blue1Brown - What is backpropagation really doing?