- a Mini Auto Differentiation / Gradient Descent library for teaching purposes.
Ref: micrograd and other online resources.
Probably one of the lightest neural-network libraries online, and much faster for small datasets.
Pros: small footprint, fast runtime, easy to use for small datasets.
Cons: not optimized for large datasets.
Prereqs: pip3 install numpy
Run: python3 example.py
Example Output: [0, 1, 1, 0]
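The four predictions above match the classic XOR truth table. For readers who want to see the whole experiment written from scratch, here is a standalone NumPy sketch (illustration only; it does not use this library's API, which I'm not assuming here): a tiny 2-8-1 network trained with plain gradient descent usually recovers the same output.

```python
# Standalone NumPy sketch of an XOR experiment (illustration only; it does
# NOT use this library's API): a tiny 2-8-1 network trained with plain
# gradient descent on a mean-squared-error loss.
import numpy as np

np.random.seed(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = np.random.randn(2, 8) * 0.5   # tanh hidden layer
b1 = np.zeros((1, 8))
W2 = np.random.randn(8, 1) * 0.5   # sigmoid output
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (chain rule) for L = 0.5 * sum((out - y)^2).
    d_out = (out - y) * out * (1 - out)      # dL/d(pre-sigmoid)
    d_h = (d_out @ W2.T) * (1 - h ** 2)      # dL/d(pre-tanh)

    # Plain gradient-descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print([int(round(p)) for p in out.ravel()])  # typically prints [0, 1, 1, 0]
```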
Math Explanation for .backward() (for educational purposes). It can be found there.
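For readers who want to see the chain rule in code before the math, here is a minimal micrograd-style sketch of what a reverse-mode backward() computes; it is illustrative only, not this library's actual implementation.

```python
# Minimal micrograd-style sketch of reverse-mode backward() (illustrative
# only; not this library's actual implementation).
class Value:
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # nodes this value was computed from
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __mul__(self, other):
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def __add__(self, other):
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def backward(self, grad=1.0):
        # Chain rule: accumulate dL/d(self), then push it to the parents.
        # Real implementations topologically sort the graph instead of
        # recursing, so shared subgraphs are not traversed repeatedly.
        self.grad += grad
        for parent, local in zip(self._parents, self._local_grads):
            parent.backward(grad * local)

# d(x*y + x)/dx = y + 1 = 3, d(x*y + x)/dy = x = 3
x, y = Value(3.0), Value(2.0)
out = x * y + x
out.backward()
print(x.grad, y.grad)  # 3.0 3.0
```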
Math Explanation for .backward_opt() (for better and faster convergence). It can be found there.
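The README does not spell out what the optimized variant changes, so purely as an assumption about the kind of change that gives better and faster convergence: pairing a sigmoid output with cross-entropy loss instead of MSE cancels the sigmoid-derivative factor, so gradients stay large even when the output saturates.

```python
# One classic change that yields better and faster convergence: replace MSE
# with binary cross-entropy at a sigmoid output, which cancels the sigmoid
# derivative. (This illustrates the general idea; it is NOT a claim about
# what .backward_opt() actually does.)
import numpy as np

def output_grad_mse(a, y):
    # dL/dz for L = 0.5*(a - y)^2 with a = sigmoid(z): keeps the a*(1-a) factor.
    return (a - y) * a * (1 - a)

def output_grad_xent(a, y):
    # dL/dz for L = -[y*log(a) + (1-y)*log(1-a)] with a = sigmoid(z):
    # the sigmoid derivative cancels, leaving simply (a - y).
    return a - y

a, y = np.array([0.999]), np.array([0.0])   # saturated, badly wrong prediction
print(output_grad_mse(a, y))                # ~0.001: tiny gradient, slow learning
print(output_grad_xent(a, y))               # ~0.999: large corrective step
```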
Math Explanation for multi-D y (./iris/minigrad_iris.py). It can be found there.
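For multi-dimensional y, the usual setup is a softmax output with cross-entropy loss, whose gradient with respect to the logits collapses to a simple difference. The sketch below shows that gradient for a multi-class problem like Iris; it is an illustration only, and the exact setup in ./iris/minigrad_iris.py may differ.

```python
# Sketch of the multi-class (multi-D y) case: softmax output with
# cross-entropy loss. (Illustration only; ./iris/minigrad_iris.py may differ.)
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)       # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def softmax_xent_grad(z, y_onehot):
    """Gradient of mean cross-entropy w.r.t. the logits z.

    z: (n, k) logits, y_onehot: (n, k) one-hot labels.
    The softmax/cross-entropy pair collapses to (p - y) / n, the multi-class
    analogue of the (a - y) gradient in the binary case.
    """
    p = softmax(z)
    return (p - y_onehot) / z.shape[0]
```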
The library uses the RAdam (Rectified Adam) optimizer by default. RAdam adds a rectification term on top of the popular Adam optimizer (which combines Momentum, RMSprop, and a bias-correction mechanism), and it performs well on small datasets.
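For reference, a single RAdam update step looks roughly like the sketch below, following the original RAdam paper (Liu et al., 2019); the function signature and hyperparameter defaults are mine, not the library's.

```python
# Minimal NumPy sketch of one RAdam update step (per Liu et al., 2019).
# Signature and defaults are assumptions, not this library's API.
import numpy as np

def radam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """Return updated (param, m, v) after one RAdam step at iteration t (1-based)."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (momentum), as in Adam
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (RMSprop-style)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment

    rho_inf = 2 / (1 - beta2) - 1               # max length of the approximated SMA
    rho_t = rho_inf - 2 * t * beta2 ** t / (1 - beta2 ** t)

    if rho_t > 4:                               # variance of the adaptive lr is tractable
        v_hat = np.sqrt(v / (1 - beta2 ** t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        param = param - lr * r_t * m_hat / (v_hat + eps)
    else:                                       # early steps: plain momentum SGD
        param = param - lr * m_hat
    return param, m, v
```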
For additional support for multi-D y, check out /iris. I separated it out so it's easier to study the basics, since this repo is more of a tutorial.