micrograd-pp

Micrograd++ is a minimalistic wrapper around NumPy that adds support for automatic differentiation. Designed as a learning tool, Micrograd++ offers an accessible entry point for anyone who wants to understand automatic differentiation and backpropagation, or who is looking for a clean, educational reference.

Micrograd++ draws inspiration from Andrej Karpathy’s awesome micrograd library, prioritizing simplicity and readability over speed. Unlike micrograd, which operates on scalar values, Micrograd++ supports tensor inputs (specifically, NumPy arrays), which makes it practical to train larger networks.
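To make the idea concrete, the sketch below computes gradients for a tiny linear layer by hand with plain NumPy. This is not Micrograd++ code; it only illustrates the tensor-level backpropagation bookkeeping that an automatic differentiation library performs for you.

import numpy as np

# Tiny linear layer: y = x @ W, with loss = mean(y ** 2)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # batch of 4 inputs with 3 features
W = rng.normal(size=(3, 2))   # weight matrix

y = x @ W
loss = (y ** 2).mean()

# Backward pass written out by hand -- this is the chain-rule
# bookkeeping that autodiff automates.
dloss_dy = 2 * y / y.size     # gradient of mean(y ** 2) w.r.t. y
dloss_dW = x.T @ dloss_dy     # gradient through the matmul, w.r.t. W
dloss_dx = dloss_dy @ W.T     # gradient through the matmul, w.r.t. x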

Usage

Micrograd++ is not yet installable via pip. For now, clone the Micrograd++ repository into your home directory and add it to the Python path at the top of any script or notebook that uses it:

import os
import sys
# Make the cloned package importable (assumes the repository lives at ~/micrograd-pp)
sys.path.insert(0, os.path.expanduser("~/micrograd-pp/python"))
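With the path configured, the package can then be imported as usual. A minimal check, assuming the package directory under python/ is named micrograd_pp as in the repository:

import micrograd_pp as mpp
print(mpp.__file__)  # should point somewhere inside ~/micrograd-pp/python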

Examples