study-group

I’ve recently started a reading group with some friends focused on deep learning. We meet each week at a great coffee shop, go over an assigned reading or project, and ask each other lots of questions. At least, that’s the goal!

As background, everyone in the group has experience with machine learning and is pretty mathematically sophisticated. But the only real prerequisites when I form a group like this are that everyone is friendly, collaborative, and not a show-off.

Weekly readings

Here are the readings we’ve done each week. This may be useful if you’d like a gentle way into deep learning or want to form a study group of your own. Of course, with hindsight, we would have changed some of the readings and their order, but this is still a good place to start.

Week 1

Read this survey paper and get on the same page about goals for the reading group.

Week 2

Go through this super simple tutorial on implementing a neural network in very few lines of code. There’s a part two that would also be appropriate to read here for those inclined.
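I won’t reproduce the tutorial’s code here, but a minimal sketch in the same spirit looks like this: a tiny network with one hidden layer, trained with plain numpy on a toy XOR-style dataset. The dataset, layer sizes, and iteration count below are my own illustrative choices, not the tutorial’s.

```python
import numpy as np

# Toy XOR-style dataset: 4 examples, 3 input features, 1 binary target each.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

np.random.seed(0)
W0 = 2 * np.random.random((3, 4)) - 1   # input -> hidden weights
W1 = 2 * np.random.random((4, 1)) - 1   # hidden -> output weights

for step in range(60000):
    # Forward pass.
    hidden = sigmoid(X.dot(W0))
    output = sigmoid(hidden.dot(W1))

    # Backward pass: error times the derivative of the sigmoid at each layer.
    output_delta = (y - output) * output * (1 - output)
    hidden_delta = output_delta.dot(W1.T) * hidden * (1 - hidden)

    # Weight updates (implicit learning rate of 1, as in the tutorial style).
    W1 += hidden.T.dot(output_delta)
    W0 += X.T.dot(hidden_delta)

print(output.round(3))  # should end up close to [[0], [1], [1], [0]]
```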

Week 3

Read the chapter on autoencoders from the Deep Learning Book. Autoencoders sort of blow my mind.

Week 4

Go through this example of an autoencoder on the MNIST dataset. It involves TensorFlow, so those who haven’t used TensorFlow before should also read this tutorial.
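For flavor, a minimal version might look something like the sketch below, written against the TensorFlow 1.x API: a single 64-unit bottleneck that encodes and reconstructs 784-pixel MNIST images. The code size, optimizer, and training schedule are arbitrary choices for illustration, not the example’s.

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data")  # downloads MNIST on first run

# 784-pixel images squeezed through a 64-unit bottleneck and reconstructed.
x = tf.placeholder(tf.float32, [None, 784])
W_enc = tf.Variable(tf.truncated_normal([784, 64], stddev=0.1))
b_enc = tf.Variable(tf.zeros([64]))
W_dec = tf.Variable(tf.truncated_normal([64, 784], stddev=0.1))
b_dec = tf.Variable(tf.zeros([784]))

code = tf.nn.sigmoid(tf.matmul(x, W_enc) + b_enc)       # encoder
recon = tf.nn.sigmoid(tf.matmul(code, W_dec) + b_dec)   # decoder

# The "label" is the input itself: minimize pixel-wise reconstruction error.
loss = tf.reduce_mean(tf.square(recon - x))
train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(5000):
        batch, _ = mnist.train.next_batch(128)
        _, l = sess.run([train_step, loss], feed_dict={x: batch})
        if step % 1000 == 0:
            print(step, l)
```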

We had a great argument (heated discussion?) about whether autoencoders count as supervised learning.

Week 5

Read Chapter 6 of the Deep Learning Book - it’s an introductory chapter on feedforward neural networks. Some of us also read the first 5 chapters of the book as a nice refresher and different perspective on linear algebra, numerical methods, machine learning, and probability. I also liked reading this great explanation of information theory (makes cross entropy crystal clear).

I really loved this chapter! Highly recommended.
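On that cross-entropy point, here’s how I keep it straight, as a small numpy sketch: cross-entropy H(p, q) = -Σ p(x) log q(x), which for a one-hot label collapses to the negative log probability the model assigns to the true class. The toy distributions below are made up for illustration.

```python
import numpy as np

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

# p is a one-hot "true" label; q is a model's predicted probabilities.
label = [0.0, 1.0, 0.0]
confident = [0.05, 0.90, 0.05]
unsure = [0.30, 0.40, 0.30]

print(cross_entropy(label, confident))  # ~0.105: low loss, most mass on the true class
print(cross_entropy(label, unsure))     # ~0.916: higher loss, less mass on the true class
```

Since a one-hot label has zero entropy, the cross-entropy here is also exactly the KL divergence from the prediction to the label, which is part of why it makes such a natural classification loss.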

Week 6

Implement a feedforward neural network to train on MNIST. It’s similar to the autoencoder exercise, but this time try to do it from scratch using TensorFlow 1.0, which just came out and is not exactly backwards compatible with the code we wrote earlier. Also spend some time getting familiar with TensorBoard.
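If you’re following along, a rough sketch of one way to set this up is below: one hidden ReLU layer, softmax cross-entropy, and a couple of TensorBoard scalar summaries. The layer sizes, optimizer, step counts, and the `logs` directory name are placeholder choices, not a prescribed solution.

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data", one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.float32, [None, 10])

# One hidden layer of 256 ReLU units, then a 10-way softmax output.
W1 = tf.Variable(tf.truncated_normal([784, 256], stddev=0.1))
b1 = tf.Variable(tf.zeros([256]))
W2 = tf.Variable(tf.truncated_normal([256, 10], stddev=0.1))
b2 = tf.Variable(tf.zeros([10]))

hidden = tf.nn.relu(tf.matmul(x, W1) + b1)
logits = tf.matmul(hidden, W2) + b2

loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
accuracy = tf.reduce_mean(
    tf.cast(tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1)), tf.float32))
train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)

# TensorBoard: log scalars to ./logs, then run `tensorboard --logdir logs`.
tf.summary.scalar("loss", loss)
tf.summary.scalar("accuracy", accuracy)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter("logs", sess.graph)
    sess.run(tf.global_variables_initializer())
    for step in range(3000):
        batch_x, batch_y = mnist.train.next_batch(128)
        _, summary = sess.run([train_step, merged], feed_dict={x: batch_x, y: batch_y})
        writer.add_summary(summary, step)
    print(sess.run(accuracy, feed_dict={x: mnist.test.images, y: mnist.test.labels}))
```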

Week 7

The plan is to move on to recurrent neural networks! I’m really psyched to auto-generate some poetry, move on to more interesting data sets, and get some GPU action going. Anyone want to give me free GPU credits on their favorite cloud platform?
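To give a sense of what “auto-generate some poetry” means in practice, here’s a very rough sketch of a character-level language model in TensorFlow 1.x: an LSTM reads a sequence of characters and is trained to predict the next character at every position. The vocabulary size, hidden size, and sequence length are placeholders, and actually sampling poetry would also need a loop that feeds the model’s predictions back in one character at a time, which isn’t shown.

```python
import tensorflow as tf

vocab_size = 65      # placeholder: number of distinct characters in the corpus
hidden_size = 128
seq_len = 50

# Integer-encoded character sequences: inputs and next-character targets.
inputs = tf.placeholder(tf.int32, [None, seq_len])
targets = tf.placeholder(tf.int32, [None, seq_len])

# One-hot encode characters and run them through a single LSTM layer.
x = tf.one_hot(inputs, vocab_size)
cell = tf.contrib.rnn.BasicLSTMCell(hidden_size)
outputs, _ = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

# Project each hidden state to logits over the character vocabulary.
W = tf.Variable(tf.truncated_normal([hidden_size, vocab_size], stddev=0.1))
b = tf.Variable(tf.zeros([vocab_size]))
logits = tf.reshape(tf.matmul(tf.reshape(outputs, [-1, hidden_size]), W) + b,
                    [-1, seq_len, vocab_size])

# Average next-character prediction loss over the whole sequence.
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=targets, logits=logits))
train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)
```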