fastai is a high-level framework for PyTorch. There was quite a lot of hype about it in 2018. I have tried it every once in a while for serious production use. Here is my list of the good, the bad, and the ugly ;).

fast.ai is the company, fastai the framework. (I reviewed the fast.ai courses earlier).


Introduction to fastai

fastai is a high-level framework similar to Keras. However, unlike Keras it builds on PyTorch. By default, PyTorch requires CUDA (GPU) support (does anyone know how well it works with AMD?). PyTorch also offers CPU-only packages called pytorch-cpu and torchvision-cpu. However, it seems like some of the NumPy-like functions in PyTorch are not supported on CPU.
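
For reference, a quick way to check which build is installed and whether PyTorch actually sees a GPU (plain PyTorch, nothing fastai-specific):

```python
import torch

print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # False on CPU-only builds such as pytorch-cpu
```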

A little summary upfront: I revisit fastai from time to time and realize that it is under heavy development, so my impression has changed over time. I think it is a very nice tool for learning the basics. However, for production use I prefer PyTorch or TensorFlow implementations with proper pipelines - much faster and more transparent than using fastai for this.

The good

Let’s start with the good stuff. fastai offers a fairly simple entry into the field of deep learning. With five lines of code, we are able to train a model. Moreover, the vision stack has a very strong focus on transfer learning, which is useful as long as we stick with standard neural network architectures. Further, it supports text (NLP) and tabular data. This is a big advantage, because prototyping neural networks (especially for tabular data) can be quite painful.
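
To give an idea of what those few lines look like, here is a minimal sketch roughly following the fastai v1 vision API; the data/ folder layout and the exact arguments are placeholders and may differ between fastai versions:

```python
from fastai.vision import *  # fastai v1 style import

# Assumes an ImageNet-style folder layout: data/train/<class>/... and data/valid/<class>/...
data = ImageDataBunch.from_folder('data', train='train', valid='valid', size=224)

# Transfer learning from a pretrained ResNet-34
learn = cnn_learner(data, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(4)
```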

An outstanding point is that fastai uses some very interesting training optimizations, such as the one-cycle learning-rate policy, to reduce training times - very cool stuff ;).
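
A concrete example, again in rough fastai v1 terms (assuming a learn object like the one above): the learning-rate finder combined with the one-cycle schedule.

```python
# Sweep the learning rate over a few mini-batches and plot loss vs. learning rate
learn.lr_find()
learn.recorder.plot()

# Train with the 1cycle policy, using a maximum learning rate picked from the plot
learn.fit_one_cycle(4, max_lr=1e-3)
```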

The bad

On the other hand, focusing on transfer learning makes us blind to many recent developments. There are other network architectures that seem to outperform the commonly used ones. This brings us to the next point: implementing custom architectures is much more complicated than with Keras. However, as with Keras, I prefer to implement custom models in either TensorFlow or PyTorch anyway.
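
For contrast, this is the kind of plain PyTorch sketch I would rather write for something custom; the layer sizes and the random data are made up purely for illustration:

```python
import torch
import torch.nn as nn

class SmallNet(nn.Module):
    """A made-up two-layer network, just to illustrate a custom architecture."""
    def __init__(self, n_in=20, n_hidden=64, n_out=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, n_out),
        )

    def forward(self, x):
        return self.net(x)

model = SmallNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One dummy training step on random data
x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```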

When it comes to documentation, I’m not sure how good or bad it really is. I’ve tried to read it, but for many of my real-world datasets and use cases I can’t find anything helpful.

The ugly

This brings us to the ugly thing with fastai. The data loading and preprocessing pipeline - especially for images - is a pain in the ass. Unless we are using ImageNet-style datasets, it is almost pointless. Most datasets that I (have to) use do not fit into this scheme, and quite honestly forcing them in wouldn’t make sense. Compared to my manual (and automated) data loading and preprocessing pipelines, fastai sucks. IMHO it lacks transparency and, more importantly, documentation - and no, I’m not going to read the source code line by line if I can simply use my standard TF or PyTorch pipelines that work well.
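
For comparison, this is roughly what such a standard PyTorch pipeline boils down to: a custom Dataset plus a DataLoader, where the file discovery and label logic are placeholders for whatever the real dataset requires:

```python
from pathlib import Path
from PIL import Image
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms

class FolderImageDataset(Dataset):
    """Loads images from a folder; the label logic is a placeholder."""
    def __init__(self, root, transform=None):
        self.paths = sorted(Path(root).glob('*.jpg'))
        self.transform = transform or transforms.Compose([
            transforms.Resize((224, 224)),
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        img = Image.open(self.paths[idx]).convert('RGB')
        label = 0  # placeholder: derive the label from the filename or a CSV
        return self.transform(img), label

loader = DataLoader(FolderImageDataset('data/images'), batch_size=32,
                    shuffle=True, num_workers=4)
```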