### A Journey Through Fastbook (AJTFB) - Chapter 9: Tabular Modeling

In chapter 8 of “Deep Learning for Coders with fastai & PyTorch” we learned that the neural network version of the collaborative filtering model is in fact built on something called `TabularModel`, and that an `EmbeddingNN` is nothing but a `TabularModel` *without any continuous (or real) numbers*. “Structured” or “tabular” data describes datasets that look like an Excel spreadsheet or a relational database table, and may be composed of both categorical and/or continuous values. Working with such data is the subject of chapter 9, so let’s go!
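To make that claim concrete, here’s a minimal PyTorch sketch of the idea (not fastai’s actual implementation — the class name `MiniTabularModel` and its layer sizes are my own for illustration): one embedding per categorical column, concatenated with any continuous columns, feeding an MLP. Set `n_cont=0` and you have, in spirit, an `EmbeddingNN`.

```python
import torch
import torch.nn as nn

class MiniTabularModel(nn.Module):
    """A stripped-down tabular model: one embedding per categorical
    column, concatenated with the continuous columns, then an MLP."""
    def __init__(self, emb_szs, n_cont, out_sz, hidden=100):
        super().__init__()
        # emb_szs: list of (cardinality, embedding_dim), one per categorical column
        self.embeds = nn.ModuleList([nn.Embedding(c, d) for c, d in emb_szs])
        emb_dim = sum(d for _, d in emb_szs)
        self.layers = nn.Sequential(
            nn.Linear(emb_dim + n_cont, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_sz),
        )

    def forward(self, x_cat, x_cont):
        # Look up each categorical column's embedding, then concatenate
        # them with the continuous features before the MLP
        embs = [e(x_cat[:, i]) for i, e in enumerate(self.embeds)]
        x = torch.cat(embs + ([x_cont] if x_cont.shape[1] else []), dim=1)
        return self.layers(x)

# Two categorical columns (cardinalities 10 and 7) plus two continuous columns
model = MiniTabularModel(emb_szs=[(10, 4), (7, 3)], n_cont=2, out_sz=1)
x_cat = torch.randint(0, 7, (8, 2))   # batch of 8 rows, 2 categorical columns
x_cont = torch.randn(8, 2)            # same batch, 2 continuous columns
out = model(x_cat, x_cont)
print(out.shape)  # torch.Size([8, 1])
```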

### A Journey Through Fastbook (AJTFB) - Chapter 8: Collaborative Filtering

This chapter of “Deep Learning for Coders with fastai & PyTorch” moves us away from computer vision to collaborative filtering (think recommendation systems). We’ll explore building these models using the traditional “dot product” approach and also using a neural network, but we’ll begin by covering the idea of “latent factors,” which are important for both collaborative and tabular models. Let’s go!
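The “dot product” approach mentioned above can be sketched in a few lines of PyTorch (a simplified illustration, not the book’s exact code — it omits details like the bias terms and sigmoid range the chapter adds): each user and each item gets a learned vector of latent factors, and a predicted rating is simply their dot product.

```python
import torch
import torch.nn as nn

class DotProductCF(nn.Module):
    """Classic collaborative filtering: each user and each item gets a
    vector of latent factors; a rating is predicted by their dot product."""
    def __init__(self, n_users, n_items, n_factors=5):
        super().__init__()
        self.user_factors = nn.Embedding(n_users, n_factors)
        self.item_factors = nn.Embedding(n_items, n_factors)

    def forward(self, user_ids, item_ids):
        u = self.user_factors(user_ids)     # (batch, n_factors)
        i = self.item_factors(item_ids)     # (batch, n_factors)
        return (u * i).sum(dim=1)           # per-row dot product -> (batch,)

model = DotProductCF(n_users=100, n_items=50)
users = torch.tensor([0, 1, 2])
items = torch.tensor([10, 20, 30])
preds = model(users, items)
print(preds.shape)  # torch.Size([3])
```

Training then just minimizes, say, MSE between these predictions and the observed ratings, which nudges the latent factors toward capturing taste.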

### A Journey Through Fastbook (AJTFB) - Chapter 7: Advanced techniques for training image classification models

This chapter of "Deep Learning for Coders with fastai & PyTorch" details several techniques you can apply to get SOTA results with your image classification models! It’s the last chapter dedicated to computer vision before diving into collaborative filtering, tabular, and NLP models.

### A Journey Through Fastbook (AJTFB) - Chapter 6: Regression

It’s the "more things you can do with computer vision" chapter of "Deep Learning for Coders with fastai & PyTorch"! Having looked at both multiclass and multilabel classification, we now turn our attention to regression tasks. In particular, we’ll look at the key point regression models covered in chapter 6. Soooo let’s go!

### A Journey Through Fastbook (AJTFB) - Chapter 6: Multilabel Classification

It’s the "more things you can do with computer vision" chapter of "Deep Learning for Coders with fastai & PyTorch"! We’ll go over everything you need to know to get started with multi-label classification tasks, from datablocks to training and everything in between. In the next post we’ll look at regression tasks, in particular the key point regression models that are also covered in chapter 6. Soooo let’s go!

### A Journey Through Fastbook (AJTFB) - Chapter 5: Multiclass classification

It’s the image classification chapter of "Deep Learning for Coders with fastai & PyTorch"! We’ll go over everything you need to know to get started with multiclass classification, from setting up your DataBlock and loss function, to some of the core techniques for evaluating and improving your model’s predictions. So without further ado, let’s go …

### Contributing to fastai: Setup your local development environment & submit a PR

A few hours ago I was working on a PR for fastai, and as it had been a while, I realized I couldn’t quite remember all the steps required to do so. Fortunately, I figured it out pretty quickly and decided I’d better blog the steps for when I forget next (I am almost 50 after all). So for all you developers looking to contribute to fastai, or really any open source project, here’s everything you need to know to set up your local development environment and submit PRs to fastai. Enjoy!

### Multilingual Sequence Classification with the MBart Family

Need to do some multilingual sequence classification? Look no further, at least if you want to use MBart and/or the MBart-50 variety of models. Working against the `amazon_reviews_multi` dataset, I’ll show you how to use the `blurr` library to configure the huggingface objects, build DataLoaders, and train a model that you can use for classifying German text. I’ll throw in a bit of the inference code to boot, so that you can see how easy `blurr` makes it to use your trained model. Let’s go …

### A Journey Through Fastbook (AJTFB) - Chapter 4: Stochastic Gradient Descent

The fourth in a weekly-ish series where I revisit the fast.ai book, "Deep Learning for Coders with fastai & PyTorch", and provide commentary on the bits that jumped out to me chapter by chapter. So without further ado, let’s go!

### A Journey Through Fastbook (AJTFB) - Chapter 3: Data Ethics

The third in a weekly-ish series where I revisit the fast.ai book, "Deep Learning for Coders with fastai & PyTorch", and provide commentary on the bits that jumped out to me chapter by chapter. So without further ado, let’s go!

### A Journey Through Fastbook (AJTFB) - Chapter 2: Doing Deep Learning

The second in a weekly-ish series where I revisit the fast.ai book, "Deep Learning for Coders with fastai & PyTorch", and provide commentary on the bits that jumped out to me chapter by chapter. So without further ado, let’s go!

### A Journey Through Fastbook (AJTFB) - Chapter 1: The Basics of Deep Learning

The first in a weekly-ish series where I revisit the fast.ai book, "Deep Learning for Coders with fastai & PyTorch", and provide commentary on the bits that jumped out to me chapter by chapter. So without further ado, let’s go!

### Summarization with blurr

blurr is a library I started that integrates huggingface transformers with the world of fastai v2, giving fastai devs everything they need to train, evaluate, and deploy transformer-specific models. In this article, I provide a simple example of how to use blurr’s new summarization capabilities to train, evaluate, and deploy a BART summarization model.

### Finding DataBlock Nirvana with fast.ai v2 - Part 1

The path to enlightenment begins here!