First off, there's none of that "intro to programming" padding of any kind! The book starts off with a brief outline of what neural networks are and some general background on the structure of machine learning algorithms, as I expect some people will have neural networks as their first machine learning algorithm and be a bit confused about terms like "features" and "labels". Within short order, we're coding our first neurons, creating layers of neurons, building activation functions, calculating loss, and doing backpropagation with various optimizers.

Everything we do is shown first in pure, raw Python (no 3rd-party libraries). Then you're shown how to use NumPy (the go-to 3rd-party library in Python for doing mathematics) to do the same thing, since learning more about NumPy can be a great side benefit of the book. Everything is covered to code, train, and use a neural network from scratch in Python:

- Code and perform gradient computations using backpropagation and parameter updates using optimizers: Stochastic Gradient Descent (SGD), AdaGrad, RMSprop, and Adam.
- Program activation functions: Rectified Linear (ReLU), Softmax, Sigmoid, and Linear.
- Learn how to connect these neurons in layers.
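To give a flavor of the "pure Python first, then NumPy" progression described above, here is a minimal sketch of my own (not the book's actual code, and the specific weight values are made up): a single neuron as a weighted sum plus a bias, then the same idea as a three-neuron layer expressed as one NumPy matrix product.

```python
import numpy as np

# A single neuron in pure Python: weighted sum of inputs plus a bias.
inputs = [1.0, 2.0, 3.0]
weights = [0.2, 0.8, -0.5]
bias = 2.0
output = sum(w * x for w, x in zip(weights, inputs)) + bias  # -> 2.3

# The same idea as a layer of 3 neurons using NumPy:
# each row of the weight matrix is one neuron's weights,
# so the whole layer is a single matrix-vector product.
np_inputs = np.array(inputs)
layer_weights = np.array([[0.2, 0.8, -0.5],
                          [0.5, -0.91, 0.26],
                          [-0.26, -0.27, 0.17]])
layer_biases = np.array([2.0, 3.0, 0.5])
layer_output = layer_weights @ np_inputs + layer_biases
```

The first neuron in the NumPy layer uses the same weights and bias as the pure-Python version, so the first entry of `layer_output` matches `output`.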
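For the activation functions listed above, a rough sketch of three of them (again my own minimal versions, not the book's implementations) looks like this:

```python
import numpy as np

def relu(x):
    # ReLU: zero out negative values, pass positive values through.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any real input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Softmax: exponentiate (shifted by the max for numerical stability),
    # then normalize so the outputs form a probability distribution.
    e = np.exp(x - np.max(x))
    return e / e.sum()

z = np.array([-1.0, 0.0, 2.0])
relu_out = relu(z)        # negatives clipped to 0
softmax_out = softmax(z)  # entries sum to 1
```

A Linear activation, the fourth one mentioned, is just the identity function, typically used on a regression output layer.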
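And for the optimizer side, the core idea behind SGD (which AdaGrad, RMSprop, and Adam then refine with per-parameter adaptive step sizes) can be sketched on a toy one-dimensional problem; this hypothetical example is mine, not taken from the book:

```python
# Minimize f(w) = (w - 3)^2 with plain gradient descent.
# The gradient is df/dw = 2 * (w - 3).
w = 0.0
learning_rate = 0.1
for _ in range(100):
    grad = 2.0 * (w - 3.0)
    w -= learning_rate * grad  # SGD update: step against the gradient
# After 100 steps, w has converged very close to the minimizer 3.0.
```

In a real network the same update is applied to every weight and bias, with the gradients supplied by backpropagation rather than a hand-derived formula.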