This is my personal playground for writing an autodiff engine à la micrograd, but based on tensors instead of scalar values to pack more punch. The tensor class uses numpy arrays for both its data and its gradients.
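To make that concrete, here is a minimal sketch of what such a tensor-based autograd class can look like. The names (`Tensor`, `data`, `grad`, `backward`) and the details are illustrative assumptions, not necessarily this library's actual API, and broadcasting is omitted for brevity.

```python
import numpy as np

class Tensor:
    """Illustrative tensor: a numpy array for data plus a numpy array for its gradient."""

    def __init__(self, data, _children=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)   # gradient accumulated during backward()
        self._backward = lambda: None          # closure that pushes grad to parents
        self._prev = set(_children)            # tensors this one was computed from

    def __add__(self, other):
        other = other if isinstance(other, Tensor) else Tensor(other)
        out = Tensor(self.data + other.data, (self, other))

        def _backward():
            # assumes matching shapes (no broadcasting handling in this sketch)
            self.grad += out.grad              # d(out)/d(self) = 1
            other.grad += out.grad             # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def backward(self):
        # topological order of the graph, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(t):
            if t not in visited:
                visited.add(t)
                for child in t._prev:
                    build(child)
                topo.append(t)
        build(self)
        self.grad = np.ones_like(self.data)    # seed: d(self)/d(self) = 1
        for t in reversed(topo):
            t._backward()

# usage: build a tiny graph, then backpropagate
a = Tensor([1.0, 2.0])
b = Tensor([3.0, 4.0])
c = a + b
loss = c + c          # reuse c so gradients accumulate
loss.backward()
print(a.grad)         # [2. 2.] because loss = 2 * (a + b)
```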
My goal is to implement the following:
- tensor-based autograd
- linear layers
- train a dense network on MNIST
You can read about the process of building this library in Notes.md. Even though I write in a conversational tone, these are basically just notes for myself to deeply understand what I did. I hope they are also helpful to someone other than me, but there is no guarantee.