Transfer Learning, Finding Learning Rate, and More

Training a neural network from scratch is unnecessary when pretrained models are readily available. This notebook shows how to load a pretrained model directly from the torchvision.models subpackage, retrain it for the specific problem at hand, determine a good learning rate, define custom transforms, and finally assemble several models into an ensemble.


Once again, the examples are based on Chapter 4 of Ian Pointer’s “Programming PyTorch for Deep Learning”, released by O’Reilly in 2019.

Jupyter notebook of the model may be found on GitHub: https://tinyurl.com/yyzth69w

Also, the same notebook is available on Google Colab (where it can be tested against the GPU available there): https://tinyurl.com/y4rlnqfs

Convolutional Neural Network Image Classifier

Following the initial image classifier built with a feed-forward architecture, convolutional neural networks are the next step on the quest toward surpassing human-level accuracy. In essence, the nodes making up the layers of a CNN are not fully connected to the nodes adjacent to them; instead, each output node is connected only to the small set of inputs covered by a filter, or convolution kernel (hence the name).
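The local-connectivity idea can be seen directly in a single convolutional layer. This is a standalone sketch, not code from the notebook; the channel counts and kernel size are illustrative:

```python
import torch
import torch.nn as nn

# A single convolutional layer: 3 input channels (RGB), 16 filters,
# each a 3x3 kernel sliding across the image; padding=1 preserves
# the spatial dimensions.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

# Unlike a fully connected layer, each output value depends only on a
# local 3x3 neighbourhood of the input, and each kernel's weights are
# shared across every spatial position.
x = torch.randn(1, 3, 64, 64)   # a batch of one 64x64 RGB image
out = conv(x)
print(out.shape)                # torch.Size([1, 16, 64, 64])
```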


Such an approach allows the network to “compress” tensors and thus extract and (hopefully) learn important details about the images it is trying to classify. As a result, the network is both more accurate and faster to train from the get-go. It is also worth mentioning that the CNN used here is AlexNet, the historically important network that popularized this approach by winning the ImageNet classification challenge in 2012.

In case you’re interested in an entry level PyTorch deep learning textbook I wholeheartedly recommend the following book by Ian Pointer: https://tinyurl.com/y269xn26

Jupyter notebook of the model may be found on GitHub: https://tinyurl.com/y467epf9

Also, the same notebook is available on Google Colab (where it can be tested against the GPU available there): https://tinyurl.com/y3faorp8

A Simple Feed-Forward Image Classifier

Recently I dared to take my first steps into the exciting and vast world of deep learning, an endeavor that had seemed nearly impossible before I found a proper textbook to gently guide me into the field. While the text and the code examples found therein proved to be of great help, I decided to supplement them by writing my own examples (and commenting them extensively) and logging my progress in blog posts.

The first yield of this practice is thus before you: a simple feed-forward neural network used to classify images into two classes, namely cats and fish.
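The core of such a classifier can be sketched as a few fully connected layers. A minimal version, assuming 64x64 RGB inputs and illustrative hidden-layer sizes (the notebook's exact dimensions may differ):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleNet(nn.Module):
    """A minimal feed-forward classifier for two classes (cat vs. fish)."""

    def __init__(self):
        super().__init__()
        # 64x64 RGB images are flattened into 12288-element vectors.
        self.fc1 = nn.Linear(64 * 64 * 3, 84)
        self.fc2 = nn.Linear(84, 50)
        self.fc3 = nn.Linear(50, 2)  # one output per class

    def forward(self, x):
        x = x.view(x.size(0), -1)    # flatten each image in the batch
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)           # raw logits; pair with CrossEntropyLoss

net = SimpleNet()
logits = net(torch.randn(4, 3, 64, 64))  # a dummy batch of four images
print(logits.shape)                      # torch.Size([4, 2])
```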

Jupyter notebook of the model may be found on GitHub: https://tinyurl.com/yypx9ump

Also, the same notebook is available on Google Colab (where it can be tested against the GPU available there): https://tinyurl.com/y3d2jfuu