# Welcome to microgradpp 🚀
Hey there! Tired of clunky, hard-to-use C++ machine learning libraries? Say hello to microgradpp — a pure C++17 implementation of an automatic differentiation engine with a neural network library built right on top.
It's inspired by Andrej Karpathy's legendary micrograd, but brings all that simplicity and educational goodness straight to the C++ world.
## What is microgradpp?
microgradpp implements backpropagation (reverse-mode autodiff) over a dynamically built computational graph. In plain English: it figures out gradients automatically so you can train neural networks without doing the math by hand. Pretty sweet, right?
## Why choose microgradpp?
| Feature | microgradpp | Python micrograd |
|---|---|---|
| Language | Modern C++17 | Python |
| Performance | Fast (compiled) | Interpreted |
| Computer Vision | ✅ OpenCV support | ❌ |
| Header-Only Option | ✅ | ✅ |
## Features at a Glance
- ✅ Pure C++17 — high performance, no Python overhead
- ✅ Automatic Differentiation — backprop just works
- ✅ Computer Vision Support — OpenCV integration included
- ✅ Header-Only Option — drop it into any project easily
- ✅ Tensor Class — simplified data loading and manipulation
- ✅ Multiple Activation Functions — ReLU, Tanh, Sigmoid, and more
If you find microgradpp useful, consider giving it a ⭐ on GitHub!