These are my notes from using PyTorch.
Table of Contents:
- Dataset
- Initialization
- Others

Dataset

Initialization:
- Xavier
- Kaiming

Others:
- nn from scratch (a sketch follows below):
  - create weights and bias with torch.rand & torch.zeros with requires_grad
  - make a forward pass & call loss.backward to calculate gradients
  - update weights and bias with their gradients
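A minimal sketch of the nn-from-scratch steps above, assuming a one-layer linear model with mean-squared-error loss; the data shapes, learning rate, and epoch count are made up for illustration:

```python
import torch

# Toy data: 100 samples, 3 features, scalar target (made-up shapes).
x = torch.rand(100, 3)
y = torch.rand(100, 1)

# Create weights and bias with requires_grad so autograd tracks them.
weights = torch.rand(3, 1, requires_grad=True)
bias = torch.zeros(1, requires_grad=True)

lr = 0.01
for epoch in range(100):
    # Forward pass and mean-squared-error loss.
    pred = x @ weights + bias
    loss = ((pred - y) ** 2).mean()

    # Backward pass: fills in weights.grad and bias.grad.
    loss.backward()

    # Update the parameters with their gradients, outside the autograd graph,
    # then reset the gradients for the next iteration.
    with torch.no_grad():
        weights -= lr * weights.grad
        bias -= lr * bias.grad
        weights.grad.zero_()
        bias.grad.zero_()
```

For the Initialization items, torch.nn.init.xavier_uniform_ and torch.nn.init.kaiming_uniform_ are the built-in counterparts to Xavier and Kaiming initialization.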
Chapters 1-12 of this book cover a wide range of methods, from linear models to CNNs and RNNs. I am only interested in building an autograd framework, which starts in chapter 13. Therefore, my approach to this book is to build the framework first and then apply it to the rest of the book, instead of avoiding frameworks as chapters 1-12 do.
Chapter 2: How do machines learn?
- Deep Learning, Machine Learning, AI
- Parametric vs. nonparametric models
- Supervised vs. unsupervised learning

Chapter 3: Forward propagation, using numpy (a small sketch follows below).
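To make chapter 3's idea concrete, here is a sketch of forward propagation as a weighted sum of inputs; the feature values and weights below are made up for illustration:

```python
import numpy as np

# A single "neuron": the prediction is a weighted sum of the inputs.
def forward(inputs, weights):
    return inputs.dot(weights)

inputs = np.array([8.5, 0.65, 1.2])   # made-up feature values
weights = np.array([0.1, 0.2, 0.0])   # made-up weights
print(forward(inputs, weights))       # 0.98
```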
Main goals:
- Implement the autograd framework
- Implement SGD, Adam, etc.
- Analyze weight initialization, weight decay, and batch norm
- Use numpy
Chapter 1: Foundations. The author explains how derivatives and the chain rule are used, from basic functions up to matrix-matrix multiplication.
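To illustrate the chain-rule part, here is a small numpy check of an analytic derivative against a numerical one; the composed function f(x) = sigmoid(x**2) is my own example, not the book's:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def f(x):
    # f = sigmoid composed with square
    return sigmoid(x ** 2)

def f_grad(x):
    # Chain rule: f'(x) = sigmoid'(x**2) * d(x**2)/dx
    s = sigmoid(x ** 2)
    return s * (1 - s) * 2 * x

x = np.array([0.5, 1.0, 2.0])
eps = 1e-6
numeric = (f(x + eps) - f(x - eps)) / (2 * eps)  # central-difference estimate
print(np.allclose(f_grad(x), numeric))           # True
```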
Chapter 2: Fundamentals. The author explains linear regression, and presents neural networks as stacks of linear regressions with non-linear activations.
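A minimal sketch of that view: two linear layers stacked with a sigmoid in between; all shapes and weights here are random placeholders:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))      # batch of 32 samples, 4 features (made up)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

hidden = sigmoid(X @ W1 + b1)     # first "linear regression" + activation
output = hidden @ W2 + b2         # second linear layer: the regression head
print(output.shape)               # (32, 1)
```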
Chapter 3: Deep Learning from Scratch. Build a neural network with autograd.
Operations

class Operation(object):
- forward(input): call self._output() to calculate the result; store the input and the output.
- backward(output_grad): check that output_grad matches the shape of self.output; compute and return the gradient with respect to the input (sketch below).
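A runnable sketch of the Operation base class as described above, with a Sigmoid subclass as a usage example; the shape checks use plain asserts in place of whatever helper the book defines:

```python
import numpy as np

class Operation(object):
    """Base class for an operation in the network."""

    def forward(self, input_: np.ndarray) -> np.ndarray:
        # Store the input, compute the output via _output, and store it too.
        self.input_ = input_
        self.output = self._output()
        return self.output

    def backward(self, output_grad: np.ndarray) -> np.ndarray:
        # The incoming gradient must match the shape of the stored output.
        assert self.output.shape == output_grad.shape
        self.input_grad = self._input_grad(output_grad)
        assert self.input_.shape == self.input_grad.shape
        return self.input_grad

    def _output(self) -> np.ndarray:
        raise NotImplementedError()

    def _input_grad(self, output_grad: np.ndarray) -> np.ndarray:
        raise NotImplementedError()

class Sigmoid(Operation):
    def _output(self) -> np.ndarray:
        return 1 / (1 + np.exp(-self.input_))

    def _input_grad(self, output_grad: np.ndarray) -> np.ndarray:
        return self.output * (1 - self.output) * output_grad

# Usage: forward stores input/output, backward returns the input gradient.
s = Sigmoid()
out = s.forward(np.array([0.0, 1.0]))
grad = s.backward(np.ones_like(out))
```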
There are too many books to read. In this section, I list the main ideas of each book.
Reading path:
- Deep Learning from Scratch - Seth Weidman
- Programming PyTorch for Deep Learning - Ian Pointer
- Grokking Deep Learning

Table of Contents:
- [Deep Learning Book]()
- [Dive into Deep Learning]()
- Deep Learning with PyTorch - Eli Stevens, Luca Antiga
- Programming PyTorch for Deep Learning - Ian Pointer
- [Deep Learning for Computer Vision]()
- Hands-on Machine Learning with Scikit-Learn, Keras and TensorFlow 2