Machine Learning Recipes with Josh Gordon

Hello World – Machine Learning Recipes #1

Six lines of Python is all it takes to write your first machine learning program! In this episode, we’ll briefly introduce what machine learning is and why it’s important. Then, we’ll follow a recipe for supervised learning (a technique to create a classifier from examples) and code it up.
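If you'd like to peek ahead, here's a minimal sketch of that kind of program using scikit-learn. The toy fruit data (weight in grams plus a texture flag) is illustrative; the point is just that a few lines are enough to train a classifier from examples and make a prediction.

```python
from sklearn import tree

# Toy training data: [weight in grams, texture (1 = smooth, 0 = bumpy)]
# These example values are illustrative, not a real dataset.
features = [[140, 1], [130, 1], [150, 0], [170, 0]]
labels = [0, 0, 1, 1]  # 0 = apple, 1 = orange

# Train a decision tree classifier on the examples
clf = tree.DecisionTreeClassifier()
clf = clf.fit(features, labels)

# Predict the label for a new, unseen example
print(clf.predict([[160, 0]]))  # heavy and bumpy, so expect orange (1)
```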

Visualizing a Decision Tree – Machine Learning Recipes #2

Last episode, we treated our Decision Tree as a black box. In this episode, we'll build one on a real dataset, add code to visualize it, and practice reading it, so you can see how it works under the hood. And hey, I may have gone a little fast through some parts. Just let me know and I'll slow down. Also: we'll do a Q&A episode down the road, so if anything is unclear, just ask!
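As a rough preview, here's a sketch of that workflow using scikit-learn's built-in Iris dataset. The held-out test indices and the output filename are illustrative choices; the export_graphviz call writes a .dot file you can render into a diagram of the tree with Graphviz.

```python
import numpy as np
from sklearn import tree
from sklearn.datasets import load_iris

# Load the Iris dataset and hold out one example of each species for testing
# (the indices 0, 50, 100 are an illustrative choice)
iris = load_iris()
test_idx = [0, 50, 100]
train_data = np.delete(iris.data, test_idx, axis=0)
train_target = np.delete(iris.target, test_idx)

# Train a decision tree on the remaining examples
clf = tree.DecisionTreeClassifier()
clf.fit(train_data, train_target)

# Export the tree in Graphviz .dot format so it can be rendered as a diagram
tree.export_graphviz(clf,
                     out_file="iris_tree.dot",
                     feature_names=iris.feature_names,
                     class_names=iris.target_names,
                     filled=True, rounded=True)

# Sanity check: the held-out examples should be classified correctly
print(iris.target[test_idx])
print(clf.predict(iris.data[test_idx]))
```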

What Makes a Good Feature? – Machine Learning Recipes #3

Good features are informative, independent, and simple. In this episode, we’ll introduce these concepts by using a histogram to visualize a feature from a toy dataset. Updates: many thanks for the supportive feedback! I’d love to release these episodes faster, but I’m writing them as we go. That way, I can see what works and (more importantly) where I can improve.
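Here's a small sketch of that kind of histogram using matplotlib. The two dog breeds, their average heights, and the sample sizes are made-up numbers for illustration; the idea is that where the two distributions overlap, the feature by itself can't tell the classes apart.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy data: heights (in inches) for two dog breeds, drawn from normal
# distributions. The breeds, means, and counts are illustrative only.
n_dogs = 500
grey_height = 28 + 4 * np.random.randn(n_dogs)   # greyhounds
lab_height = 24 + 4 * np.random.randn(n_dogs)    # labradors

# Overlapping histograms show how informative the feature is: in the region
# where the two distributions overlap, height alone can't separate the breeds.
plt.hist([grey_height, lab_height], stacked=True, color=['r', 'b'])
plt.xlabel('height (inches)')
plt.show()
```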

Let’s Write a Pipeline – Machine Learning Recipes #4

In this episode, we’ll write a basic pipeline for supervised learning with just 12 lines of code. Along the way, we’ll talk about training and testing data. Then, we’ll work on our intuition for what it means to “learn” from data.
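As a preview, here's a sketch of such a pipeline with scikit-learn. I'm using the current sklearn.model_selection import for train_test_split and a k-Nearest Neighbors classifier as the stand-in model; the 50/50 split is an illustrative choice.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load a dataset and split it into training and testing partitions
iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.5)

# Train a classifier on the training partition only
clf = KNeighborsClassifier()
clf.fit(X_train, y_train)

# Evaluate on the held-out test partition
predictions = clf.predict(X_test)
print(accuracy_score(y_test, predictions))
```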

Writing Our First Classifier – Machine Learning Recipes #5

Welcome back! It’s time to write our first classifier. This is a milestone if you’re new to machine learning. We’ll start with our code from episode #4 and comment out the classifier we imported. Then, we’ll code up a simple replacement – using a scrappy version of k-Nearest Neighbors.
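Here's a sketch of what a scrappy replacement might look like: a bare-bones 1-nearest-neighbor classifier with the same fit/predict interface as the one we comment out. The class name ScrappyKNN and the distance helper are illustrative, not a fixed API.

```python
from scipy.spatial import distance

def euc(a, b):
    # Euclidean distance between two feature vectors
    return distance.euclidean(a, b)

class ScrappyKNN:
    """A bare-bones 1-nearest-neighbor classifier (illustrative sketch)."""

    def fit(self, X_train, y_train):
        # "Training" here just means memorizing the training data
        self.X_train = X_train
        self.y_train = y_train
        return self

    def predict(self, X_test):
        # For each test point, return the label of its closest training point
        predictions = []
        for row in X_test:
            best_index = min(range(len(self.X_train)),
                             key=lambda i: euc(row, self.X_train[i]))
            predictions.append(self.y_train[best_index])
        return predictions
```

Because it exposes the same fit and predict methods, you can drop it into the pipeline from episode #4 in place of the imported classifier and compare accuracy.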

Train an Image Classifier with TensorFlow for Poets – Machine Learning Recipes #6

Monet or Picasso? In this episode, we'll train our own image classifier using TensorFlow for Poets. Along the way, I'll introduce Deep Learning and add context and background on why the classifier works so well. There are links below to learn more. Thanks for watching, and have fun!
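If you're curious what using the retrained model looks like in code, here's a sketch in the style of that codelab's label_image script, written against the TensorFlow 1.x API. The file paths and the tensor names ('final_result:0' and 'DecodeJpeg/contents:0') are assumptions based on what the retraining step produces, so adjust them to your setup.

```python
import tensorflow as tf

# Assumption: you've already run the TensorFlow for Poets retraining step,
# which produces retrained_graph.pb and retrained_labels.txt. The paths and
# tensor names below follow that codelab and may differ in your setup.
image_data = tf.gfile.FastGFile('test_image.jpg', 'rb').read()
labels = [line.rstrip() for line in tf.gfile.GFile('retrained_labels.txt')]

# Load the retrained graph into the default TensorFlow graph
with tf.gfile.FastGFile('retrained_graph.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

# Feed the raw JPEG bytes in and read the softmax scores out
with tf.Session() as sess:
    softmax_tensor = sess.graph.get_tensor_by_name('final_result:0')
    predictions = sess.run(softmax_tensor,
                           {'DecodeJpeg/contents:0': image_data})
    # Print the labels sorted by score, highest first
    for i in predictions[0].argsort()[::-1]:
        print(labels[i], predictions[0][i])
```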

Classifying Handwritten Digits with TF.Learn – Machine Learning Recipes #7

Last time, we wrote an image classifier using TensorFlow for Poets. This time, we'll write a basic one using TF.Learn. To make it easier for you to try this out, I wrote a Jupyter Notebook for this episode (https://goo.gl/NNlMNu), and I'll start with a quick screencast of installing TensorFlow using Docker and serving the notebook. This is a great way to get all the dependencies installed and properly configured. I've linked some additional notebooks below that you can try out, too.

Next, I'll start introducing a linear classifier. My goal here is just to get us started; if there's interest, I'd like to spend a lot more time on this in the next episode. I have a couple of alternate ways of introducing linear classifiers that I think would be helpful, and I've put some exceptional links below for you to check out to learn more, especially Colah's blog and CS231n. Finally, I'll show you how to reproduce those nifty images of weights from TensorFlow.org's basic MNIST tutorial.
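For reference, here's a sketch of that linear classifier, written against the 1.x-era tf.contrib.learn API used at the time (it has since been deprecated); the batch size and step count are just reasonable defaults, not tuned values.

```python
import tensorflow as tf

# The tf.contrib.learn API used here is the TensorFlow 1.x-era interface;
# it has since been deprecated.
learn = tf.contrib.learn

# Load MNIST: training and test images (784 pixels each) with integer labels
mnist = learn.datasets.load_dataset('mnist')
data = mnist.train.images
labels = mnist.train.labels.astype(int)
test_data = mnist.test.images
test_labels = mnist.test.labels.astype(int)

# A linear classifier learns one weight per (pixel, class) pair plus a bias
feature_columns = learn.infer_real_valued_columns_from_input(data)
classifier = learn.LinearClassifier(feature_columns=feature_columns,
                                    n_classes=10)
classifier.fit(data, labels, batch_size=100, steps=1000)

# Evaluate accuracy on the held-out test set
print(classifier.evaluate(test_data, test_labels)['accuracy'])
```

The weight images from the basic MNIST tutorial come from reshaping each class's learned weight vector back into a 28x28 grid and plotting it, so the bright regions show the pixels that vote for that digit.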
