
micrograd with a bit more punch

This is my personal playground for writing an autodiff engine à la micrograd, but based on tensors instead of single scalar values to pack more punch. The tensor class uses NumPy arrays for both data and gradients.
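To make the idea concrete, here is a minimal sketch of what such a tensor could look like. It is an assumption about the general shape of a micrograd-style class backed by NumPy arrays, not the actual macrograd API, and it leaves broadcasting out for brevity.

```python
import numpy as np

# Hypothetical sketch of a micrograd-style tensor backed by NumPy arrays.
# Illustrative only (not macrograd's actual class); gradients assume
# matching shapes, i.e. broadcasting is not handled.
class Tensor:
    def __init__(self, data, _children=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        out = Tensor(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __matmul__(self, other):
        out = Tensor(self.data @ other.data, (self, other))
        def _backward():
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out._backward = _backward
        return out

    def sum(self):
        out = Tensor(self.data.sum(), (self,))
        def _backward():
            self.grad += np.ones_like(self.data) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule node by node
        topo, visited = [], set()
        def build(t):
            if t not in visited:
                visited.add(t)
                for child in t._prev:
                    build(child)
                topo.append(t)
        build(self)
        self.grad = np.ones_like(self.data)
        for t in reversed(topo):
            t._backward()
```

Used like the scalar version, except every node carries a whole array:

```python
x = Tensor(np.random.randn(4, 3))
w = Tensor(np.random.randn(3, 2))
loss = (x @ w).sum()
loss.backward()
print(w.grad.shape)  # (3, 2), filled in one backward pass
```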

goals

My personal goal is to implement the following:

  • tensor-based autograd
  • linear layers
  • train a dense network on MNIST (a toy sketch of the layer and update step follows this list)
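
For the last two goals, here is a toy sketch of a bias-free linear layer and a single SGD step built on the tensor sketch above. It is an assumption about one possible design, not the library's actual code; a real MNIST setup would also need biases (with broadcasting-aware gradients), a nonlinearity, and a proper loss such as softmax cross-entropy.

```python
# Hypothetical Linear layer and SGD step using only the ops sketched above.
class Linear:
    def __init__(self, in_features, out_features):
        # small random weight matrix; bias omitted to keep the sketch minimal
        self.w = Tensor(np.random.randn(in_features, out_features) * 0.01)

    def __call__(self, x):
        return x @ self.w

layer = Linear(784, 10)
x = Tensor(np.random.randn(32, 784))  # stand-in for a batch of MNIST images
loss = layer(x).sum()                 # toy scalar objective, not a real loss
loss.backward()
layer.w.data -= 0.1 * layer.w.grad    # plain SGD update on the raw arrays
```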

notes

You can read about the process of building this library in Notes.md. Even though I write in a conversational tone, these are basically notes to myself to deeply understand what I did. I hope they are also helpful to someone other than me, but there is no guarantee.
