teenyarray is to numpy as teenygrad is to tinygrad (or PyTorch)
I'm building this as a personal project in hopes of producing a distillation of a library like NumPy. Just as teenygrad includes only the core deep-learning functionality needed to train on MNIST, my goal is to implement a core array API sufficient to train on MNIST, and then replace NumPy as teenygrad's dependency with this library.
- design outlines design choices
- developer-docs will help get things working
- python-api will eventually have the Python API
- auxillary-docs has external resources and documentation that I've found helpful or relied on
- build my first "real" C++ project
- build my first "under the hood" project that exposes Python bindings
- learn about NumPy internals
- practice lower-level systems concepts as I write (hopefully clean and efficient) C++
- (eventually) learn some GPU programming by porting this to Metal
- get the 4 main classes defined (skeleton) and exposed via bindings
- Dtype
- Tarray
- add array creation methods
- static methods (.zeros(), .ones(), etc.)
- figure out a way to bind fill to Python (takes a Python type, then dispatches to the correct instantiation of the template function for the matching C type)
- might need a new typeMap??
- from python data (.array())
- figure out a default datatype if no argument is provided
- getitem and setitem
- basic indexing
- slicing
- reshape
- transposing
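One way the dtype-dispatched fill above could work is a runtime switch that routes to the right template instantiation. A minimal sketch, assuming hypothetical names (`DType`, `fill_impl`) rather than the project's actual API:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Hypothetical dtype tag -- the real project would keep this in Dtype.
enum class DType { Float32, Int32 };

// Templated fill over a raw buffer of n elements.
template <typename T>
void fill_impl(void* data, std::size_t n, double value) {
    T* p = static_cast<T*>(data);
    for (std::size_t i = 0; i < n; ++i) p[i] = static_cast<T>(value);
}

// Runtime dtype -> compile-time type dispatch: the binding layer takes a
// Python number plus a dtype tag and routes to the right instantiation.
void fill(void* data, std::size_t n, double value, DType dtype) {
    switch (dtype) {
        case DType::Float32: fill_impl<float>(data, n, value); break;
        case DType::Int32:   fill_impl<std::int32_t>(data, n, value); break;
    }
}
```

The switch (or an equivalent type map) is the one place that knows every dtype, so adding a dtype means touching only the tag and the dispatch table.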
- Tfunc
- break the 28 ops into categories: Unary, Binary, Reduce, Ternary, and Load
- handle broadcasting
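The broadcasting step follows NumPy's rule: align shapes from the trailing dimension, where two dims are compatible if they are equal or either is 1. A sketch of computing the broadcast result shape, with shapes as plain vectors (an assumption about the eventual Tarray representation):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <optional>
#include <vector>

// NumPy-style broadcast shape: walk from the trailing dimension, treating
// missing leading dims as 1; each output dim is the larger of the pair.
// Returns nullopt when the shapes are incompatible.
std::optional<std::vector<std::int64_t>>
broadcast_shape(const std::vector<std::int64_t>& a,
                const std::vector<std::int64_t>& b) {
    const std::size_t n = std::max(a.size(), b.size());
    std::vector<std::int64_t> out(n);
    for (std::size_t i = 0; i < n; ++i) {
        std::int64_t da = i < a.size() ? a[a.size() - 1 - i] : 1;
        std::int64_t db = i < b.size() ? b[b.size() - 1 - i] : 1;
        if (da != db && da != 1 && db != 1) return std::nullopt;  // incompatible
        out[n - 1 - i] = std::max(da, db);
    }
    return out;
}
```

So `{4, 1, 5}` against `{3, 1}` broadcasts to `{4, 3, 5}`, while `{3}` against `{4}` fails.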
- ArrayScalar
- get element-wise ops working
- figure out how iteration will work
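One common answer to "how iteration will work" is strided views: an array is (shape, strides) over a flat buffer, and iteration maps each flat step to a buffer offset. A sketch under that assumption, with strides in elements rather than bytes and a hypothetical `offset_of` helper:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Map element k of a row-major walk over `shape` to its buffer offset,
// given per-dimension strides. Peeling digits off `flat` from the last
// dimension inward recovers the multi-dimensional index.
std::size_t offset_of(std::size_t flat,
                      const std::vector<std::size_t>& shape,
                      const std::vector<std::size_t>& strides) {
    std::size_t off = 0;
    for (std::size_t i = shape.size(); i-- > 0;) {
        off += (flat % shape[i]) * strides[i];  // index along dim i
        flat /= shape[i];
    }
    return off;
}
```

A contiguous 2x3 array has strides {3, 1}, so offsets equal flat indices; its transpose is the same buffer viewed as shape {3, 2} with strides {1, 3}. That is what lets transpose (and many reshapes/slices) stay zero-copy views.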
- work on memory management
- lazy evaluation
- copy on write?
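Copy-on-write could be as simple as sharing the buffer through a `shared_ptr` and deep-copying only when a write hits a shared buffer. A minimal sketch with hypothetical names (`CowBuffer`, `write`, `read`), not a committed design:

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <vector>

// Views share one buffer via shared_ptr; a write detaches (deep-copies)
// only when the buffer is actually shared, so reads and unshared writes
// stay cheap.
struct CowBuffer {
    std::shared_ptr<std::vector<float>> data;

    explicit CowBuffer(std::size_t n)
        : data(std::make_shared<std::vector<float>>(n)) {}

    void write(std::size_t i, float v) {
        if (data.use_count() > 1)  // shared: copy before mutating
            data = std::make_shared<std::vector<float>>(*data);
        (*data)[i] = v;
    }
    float read(std::size_t i) const { return (*data)[i]; }
};
```

One caveat worth noting: `use_count()` is only advisory under concurrent access, so a multithreaded version would need a different sharing test.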
- get a testing framework set up
- write tests for all the foundational methods
- add an overall line counter (resist the bloat!)
- do a write-up on why I built this, clearly outlining the process
- work on optimizations (matmul, etc.)
- once end-to-end CPU is working, port to Metal
- get some sort of benchmark going