Chapter 0 - Fundamentals

Before embarking on this curriculum, it is necessary to understand the basics of deep learning, including basic machine learning terminology, what neural networks are, and how to train them.

In this chapter, you’ll learn about some coding best practices, become familiar with the PyTorch library, and build & train your own neural networks (CNNs and ResNets).

Note - this chapter is mainly for getting everyone up to the same level, so the rest of the program can proceed. Some of the participants will have more experience with ML than others, and everyone will have different backgrounds, so you might find some of this material more relevant and some less so.

You can find the actual content at this page.

Ray Tracing

These exercises involve practicing batched matrix operations in PyTorch by writing a basic graphics renderer. You’ll start with an extremely simplified case and work up to rendering your very own 3D Pikachu!
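As a taste of the kind of batched operations involved, here is a small sketch (with made-up ray counts, not the exercises' actual setup) showing how broadcasting lets you evaluate points along many rays at once, with no Python loops:

```python
import torch

# Hypothetical illustration: a batch of rays, each defined by an origin and
# a direction. Broadcasting evaluates every step along every ray at once.
n_rays, n_steps = 5, 100
origins = torch.zeros(n_rays, 3)     # all rays start at the origin
directions = torch.randn(n_rays, 3)  # random ray directions
t = torch.linspace(0, 1, n_steps)    # parameter values along each ray

# points[i, j] = origins[i] + t[j] * directions[i], via broadcasting:
points = origins[:, None, :] + t[None, :, None] * directions[:, None, :]
print(points.shape)  # torch.Size([5, 100, 3])
```

Inserting singleton dimensions with `None` is the standard trick for making shapes line up so that broadcasting does the batching for you.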

CNNs & ResNets

This section is designed to get you familiar with basic neural networks: how they are structured, the basic operations that go into them (such as linear layers and convolutions), and why they work as well as they do. You’ll build up to assembling ResNet34 from scratch.
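The central idea you'll build up to is the residual block. A hypothetical minimal sketch (not the exercises' exact module structure) looks like this: the block learns a residual function F(x) which is added back onto its input via a skip connection.

```python
import torch
import torch.nn as nn

# A minimal residual block sketch: output = ReLU(F(x) + x), where F is a
# small stack of convolutions that preserves the spatial shape.
class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.conv2(self.relu(self.conv1(x)))
        return self.relu(out + x)  # skip connection

block = ResidualBlock(channels=16)
x = torch.randn(1, 16, 32, 32)
print(block(x).shape)  # same shape as the input: torch.Size([1, 16, 32, 32])
```

The skip connection means the block only has to learn a correction to the identity, which is a large part of why very deep ResNets train well.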

Optimization & Hyperparameters

In these exercises, you’ll explore various optimization algorithms such as SGD and Adam. You’ll also learn how to efficiently train models & perform hyperparameter sweeps using Weights & Biases.
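To give a feel for the API (this is a toy objective, not one of the exercises), here is the standard PyTorch training-loop pattern: swapping `torch.optim.Adam` for `torch.optim.SGD` changes only the optimizer line, while the `zero_grad` / `backward` / `step` loop stays identical.

```python
import torch

# Toy example: minimise f(x) = (x - 3)^2 with Adam.
x = torch.tensor([0.0], requires_grad=True)
opt = torch.optim.Adam([x], lr=0.1)

for _ in range(500):
    opt.zero_grad()           # clear gradients from the previous step
    loss = (x - 3) ** 2
    loss.backward()           # compute d(loss)/dx
    opt.step()                # update x using Adam's moment estimates

print(round(x.item(), 2))  # should converge close to 3.0
```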

Backprop

In these exercises, you’ll build your very own system that can run the backpropagation algorithm in essentially the same way as PyTorch does. By the end of the day, you'll be able to train a multi-layer perceptron using your own backprop system!
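As a preview of the core mechanism (a toy scalar sketch, not the exercises' actual tensor-based API): each value records its parents and a function that propagates gradients backwards, and calling `backward` walks the computational graph in reverse topological order.

```python
# Toy reverse-mode autodiff on scalars, mirroring what PyTorch does with tensors.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self.parents = parents
        self.backward_fn = None  # propagates self.grad to parents

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            self.grad += other.data * out.grad   # d(xy)/dx = y
            other.grad += self.data * out.grad   # d(xy)/dy = x
        out.backward_fn = backward_fn
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad                # d(x+y)/dx = 1
            other.grad += out.grad
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # Reverse topological order ensures a node's grad is complete
        # before it is propagated to its parents.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            if v.backward_fn:
                v.backward_fn()

x, y = Value(2.0), Value(3.0)
z = x * y + x      # z = xy + x, so dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```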

GANs & VAEs

We conclude the first week by studying two important classes of generative image models: GANs and VAEs. These carry some important conceptual insights which will be relevant later in the course, and should also bring much of this week’s content full-circle.
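One such conceptual insight, which you'll meet when building VAEs, is the reparameterization trick. A minimal sketch (with arbitrary dimensions chosen for illustration): writing a sample as `z = mu + sigma * eps` with `eps ~ N(0, 1)` moves the randomness out of the parameters, so gradients can flow through `mu` and `sigma` despite the sampling step.

```python
import torch

# Reparameterization trick: sample z differentiably w.r.t. mu and log_sigma.
mu = torch.zeros(4, requires_grad=True)
log_sigma = torch.zeros(4, requires_grad=True)

eps = torch.randn(4)                  # noise, independent of the parameters
z = mu + torch.exp(log_sigma) * eps   # differentiable sampling

z.sum().backward()
print(mu.grad)  # dz/dmu = 1 elementwise, so the gradient is all ones
```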