Week#2: Baby Steps
Time Spent: ~10 Hours
Things done:
Tackled the Tinygrad docs with Perplexity. Identified the main components and the high-level conversion process. This covered ground my previous approach of just Cursor + Claude on the codebase did not.
First time I'd heard of JIT compilation, so I read up on it (Tinygrad uses it in places). Pretty cool. Will revisit in a while to read more in depth.
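To make the idea concrete for myself: the core of JIT compilation is generating and compiling specialized code at runtime, then reusing the compiled artifact on later calls. Here is a toy sketch of that idea in plain Python (this is just an illustration of the concept, not how Tinygrad actually does it; `make_power_fn` and the cache are hypothetical names I made up):

```python
# Toy JIT sketch: emit specialized source at runtime, compile it once,
# and serve cached compiled functions on repeat calls.
_cache = {}

def make_power_fn(n):
    """Return a function computing x**n, generated and compiled at runtime."""
    if n not in _cache:
        # Emit source specialized for this n, e.g. n=3 -> "x * x * x".
        src = "def f(x):\n    return " + " * ".join(["x"] * n)
        namespace = {}
        exec(compile(src, "<jit>", "exec"), namespace)  # runtime compilation
        _cache[n] = namespace["f"]
    return _cache[n]

cube = make_power_fn(3)
print(cube(2))                   # 8
print(make_power_fn(3) is cube)  # True: compiled once, reused afterwards
```

A real JIT (like the ones the Computerphile video covers) does this with machine code instead of Python source, but the compile-once-then-reuse shape is the same.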
Went through the Micrograd codebase to understand it. It ships with a cool usage example.
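The part of Micrograd that clicked for me is its scalar `Value` class: each value remembers how it was built, so gradients can flow backwards through the graph. A stripped-down sketch of that idea, assuming only `+` and `*` (the real class supports more ops; this is my own miniature, not Micrograd's exact code):

```python
# Minimal autograd scalar in the spirit of Micrograd's Value class.
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to push grad to children
        self._prev = set(_children)

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological order: a node's grad is complete before it propagates.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(3.0)
c = a * b + a   # c = a*b + a
c.backward()
print(c.data)   # 8.0
print(a.grad)   # dc/da = b + 1 = 4.0
print(b.grad)   # dc/db = a = 2.0
```

Seeing backprop fit in this little code is what made the bigger frameworks feel less magical.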
Thoughts: Slow week; other commitments came up. But progress is always cool. I am happy.
The main issue I faced was asking "What the heck does an ML engineer need this for?" about every ML framework I looked at. I have an approach in mind that might be fruitful; let's see how it pans out next week.
Resources:
- Tinygrad Documentation
- JIT Compilers: Computerphile has a great intro video, and I used Perplexity to crawl the internet for the rest.
- Micrograd Codebase