Basic Usage
Alright, let's get to the fun part! Here's everything you need to know to start using microgradpp.
The Value Class
The Value class is the heart of microgradpp. Every number in your computation is wrapped in a Value, and microgradpp tracks all the operations so it can compute gradients automatically.
```cpp
#include <iostream>

#include "Value.h"

int main() {
    // Create values
    auto a = microgradpp::Value::create(2.0);
    auto b = microgradpp::Value::create(3.0);

    // Build a computation graph
    auto c = a * b;

    // Run backpropagation
    c->backProp();

    std::cout << "a->grad: " << a->grad << std::endl; // 3.0
    std::cout << "b->grad: " << b->grad << std::endl; // 2.0

    return 0;
}
```

When you call backProp(), microgradpp walks backward through your computation graph and fills in the .grad field of every Value involved. That's reverse-mode autodiff in action!
More Operations
You can chain operations and use powers too:
```cpp
auto x = microgradpp::Value::create(2.0);
auto y = microgradpp::Value::create(3.0);

// Chain multiple operations
auto z = x * y + x->pow(2);
z->backProp();

// x->grad and y->grad now hold the correct gradients
```

Building a Neural Network
Once you're comfortable with Value, building a full neural network is just a few lines away. microgradpp's MLP class makes it super easy:
```cpp
#include "microgradpp/MLP.hpp"

// Create a Multi-Layer Perceptron
// Input: 3 features | Hidden layers: 4, 4 neurons | Output: 1 neuron
MLP model(3, {4, 4, 1});

// Forward pass
auto output = model(input_data);

// Training loop
model.zeroGrad();  // Clear gradients from the previous step
loss->backProp();  // Compute gradients
// Update weights... // Adjust parameters
```

Always call model.zeroGrad() before backProp() in your training loop. Otherwise, gradients from previous iterations will accumulate and your training will go haywire!
Activation Functions
microgradpp supports several activation functions out of the box:
- ReLU — great for hidden layers
- Tanh — smooth and bounded
- Sigmoid — perfect for binary outputs
These can be configured when setting up your layers.