Reverse Mode Autodifferentiation Explained

This article is my attempt to explain reverse mode autodifferentiation to myself, and hopefully to anyone else who finds it useful. (Link to notebook) Why autodifferentiation? We prefer autodifferentiation over symbolic differentiation for its efficiency and simplicity: instead of writing out explicit derivatives, or parsing complex expressions and finding their symbolic derivatives, we can compute a derivative at a particular value directly....
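
Since the excerpt cuts off here, a minimal sketch of the idea may help. The hypothetical Value class below (illustrative only, not code from the post) records, for each operation, how to push gradients back to its inputs; one backward pass then yields every derivative at the given value.

    # A minimal reverse mode autodiff sketch (hypothetical Value class,
    # not the post's code). Each operation records a closure that pushes
    # gradients back to its inputs; backward() replays them in reverse.
    class Value:
        def __init__(self, data, parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = parents
            self._backward = lambda: None

        def __add__(self, other):
            out = Value(self.data + other.data, (self, other))
            def _backward():
                self.grad += out.grad   # d(out)/d(self) = 1
                other.grad += out.grad  # d(out)/d(other) = 1
            out._backward = _backward
            return out

        def __mul__(self, other):
            out = Value(self.data * other.data, (self, other))
            def _backward():
                self.grad += other.data * out.grad  # d(out)/d(self) = other
                other.grad += self.data * out.grad  # d(out)/d(other) = self
            out._backward = _backward
            return out

        def backward(self):
            # Topologically sort the graph, then apply the chain rule in reverse.
            order, seen = [], set()
            def visit(v):
                if v not in seen:
                    seen.add(v)
                    for p in v._parents:
                        visit(p)
                    order.append(v)
            visit(self)
            self.grad = 1.0
            for v in reversed(order):
                v._backward()

    # f(x, y) = x*y + x, so df/dx = y + 1 = 4 and df/dy = x = 2.
    x, y = Value(2.0), Value(3.0)
    f = x * y + x
    f.backward()
    print(x.grad, y.grad)  # 4.0 2.0

Note that this evaluates the derivatives at one particular point (x = 2, y = 3) without ever building a symbolic expression, which is exactly the trade described above.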

January 14, 2024

Why convolutions are effective

Convolutional neural networks have seen great success in computer vision. But why is this architecture so effective? This article tries to explain why convolutional networks work so well across so many computer vision tasks. We’ll approach this by training a convolutional network on the Fashion MNIST dataset. (Link to notebook) A brief look at the dataset First, we make some necessary imports:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from torch....
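
The excerpt stops at the imports. To make the setup concrete, here is a minimal sketch of the kind of small convolutional network one might train on Fashion MNIST; the layer sizes and names are my assumptions, not taken from the post.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # An illustrative small CNN for 28x28 grayscale Fashion MNIST images
    # (layer sizes are assumptions, not the post's model).
    class SmallConvNet(nn.Module):
        def __init__(self):
            super().__init__()
            # Each conv layer slides one small shared kernel over the image,
            # which is what makes convolutions parameter-efficient.
            self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)
            self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
            self.fc = nn.Linear(32 * 7 * 7, 10)  # 10 Fashion MNIST classes

        def forward(self, x):
            x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # 28x28 -> 14x14
            x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # 14x14 -> 7x7
            return self.fc(x.flatten(1))

    model = SmallConvNet()
    logits = model(torch.randn(8, 1, 28, 28))  # a batch of 8 dummy images
    print(logits.shape)  # torch.Size([8, 10])

Weight sharing and locality are the point: the two conv layers above use only a few thousand parameters, whereas a fully connected layer over the raw 28x28 pixels would need hundreds of weights per output unit.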

January 14, 2024

Optimizing Matrix Multiplication with Zig

I recently started playing with the Zig programming language and wanted to try it out for its speed. And what better way to do that than to try optimizing matrix multiplication? Since there is already a plethora of resources on how to multiply matrices efficiently (see the Resources section below), I won’t be doing anything intense in this article (though maybe I will in the future). The naive matrix multiplication algorithm is given below in Zig:...
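
The Zig listing itself is cut off in this excerpt. Purely as a reference for the algorithm, here is the naive triple loop sketched in Python (my rendering, not the post's Zig code):

    # Naive matrix multiplication: c[i][j] = sum over k of a[i][k] * b[k][j].
    # This is the O(n^3) baseline that optimized implementations start from.
    def matmul(a, b):
        n, m, p = len(a), len(b), len(b[0])
        assert len(a[0]) == m, "inner dimensions must match"
        c = [[0.0] * p for _ in range(n)]
        for i in range(n):
            for j in range(p):
                s = 0.0
                for k in range(m):
                    s += a[i][k] * b[k][j]
                c[i][j] = s
        return c

    print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]

The usual first optimizations on top of this baseline are loop reordering and blocking for cache locality, plus SIMD, which is the kind of territory the resources the post links to typically cover.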

May 7, 2023