ABSTRACT

In the last two chapters, we've learned about tensors and automatic differentiation. In the upcoming two, we take a break from studying torch mechanics and instead find out what we can do with what we already have. Using nothing but tensors, supported by autograd, we can already do two things:

- minimize a function (i.e., perform numerical optimization), and
- build and train a neural network (both are sketched below).
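To make the first point concrete, here is a minimal sketch of function minimization by plain gradient descent, written against Python's torch; the target function, learning rate, and step count are illustrative choices, not taken from the chapters ahead:

```python
import torch

# Minimize f(x) = (x - 3)^2 by gradient descent, using only a
# tensor and autograd -- no optimizer object, no neural-network layers.
x = torch.tensor(0.0, requires_grad=True)
lr = 0.1  # learning rate (an arbitrary, illustrative choice)

for step in range(100):
    y = (x - 3) ** 2          # forward pass: evaluate the function
    y.backward()              # autograd populates x.grad
    with torch.no_grad():     # perform the update outside the graph
        x -= lr * x.grad
    x.grad.zero_()            # reset the gradient for the next step

print(x.item())  # converges toward the minimizer, 3.0
```

The whole loop leans on just two autograd facilities: backward() to fill in the gradient, and a no-grad context so the update step itself isn't tracked.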
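And for the second point, a similarly minimal sketch: a two-layer network whose parameters live in plain tensors and are updated by hand. All shapes, data, and hyperparameters here are placeholder assumptions.

```python
import torch

# A tiny two-layer network trained on synthetic data; parameters are
# plain tensors, and the training loop updates them manually.
n, d_in, d_hidden, d_out = 64, 3, 16, 1
X = torch.randn(n, d_in)       # synthetic inputs
y = torch.randn(n, d_out)      # synthetic targets

w1 = torch.randn(d_in, d_hidden, requires_grad=True)
b1 = torch.zeros(d_hidden, requires_grad=True)
w2 = torch.randn(d_hidden, d_out, requires_grad=True)
b2 = torch.zeros(d_out, requires_grad=True)

lr = 1e-3
for epoch in range(200):
    # forward pass: linear -> ReLU -> linear
    hidden = torch.relu(X @ w1 + b1)
    y_pred = hidden @ w2 + b2
    loss = ((y_pred - y) ** 2).mean()  # mean squared error

    loss.backward()  # gradients for all four parameter tensors
    with torch.no_grad():
        for p in (w1, b1, w2, b2):
            p -= lr * p.grad
            p.grad.zero_()
```

The structure is the same as in function minimization; the only difference is that the quantity being minimized is now a loss that depends on several parameter tensors at once.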