Try the 8×8 Samples
Select a digit to load its 8×8 bitmap (from samples/<digit>.txt) and render it below.
This page renders the 8×8 sample files directly from your repository.
Inference here is not executed in the browser; run the C demo locally:
make && ./build/demo samples/3.txt
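The demo consumes one sample file and runs the network on it. The sketch below shows how such a file could be read into an input buffer; the eight-rows-of-0/1 format and the helper name load_sample are assumptions for illustration, not the repository's actual loader.

```c
/* Minimal sketch of reading an 8x8 sample file into an input buffer.
 * Assumes each samples/<digit>.txt holds eight rows of eight 0/1
 * characters; the real format in this repo may differ. */
#include <stdio.h>

#define IMG_SIZE 64  /* 8x8 pixels, flattened */

/* Hypothetical helper: returns 0 on success, -1 on error. */
static int load_sample(const char *path, float pixels[IMG_SIZE]) {
    FILE *fp = fopen(path, "r");
    if (!fp) return -1;

    int count = 0, ch;
    while (count < IMG_SIZE && (ch = fgetc(fp)) != EOF) {
        if (ch == '0' || ch == '1')   /* skip whitespace and newlines */
            pixels[count++] = (float)(ch - '0');
    }
    fclose(fp);
    return (count == IMG_SIZE) ? 0 : -1;
}
```

Called with a path such as samples/3.txt, this fills a 64-element buffer that the inference code can consume.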
About
This project demonstrates a minimal end‑to‑end pipeline for TinyML:
- Train a small neural network for digit recognition in Python/Colab
- Export weights and biases
- Convert them into C arrays (a sketch of the generated header layout appears below)
- Run inference in C as if on a microcontroller (chosen target: 8051)
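To give a feel for the "C arrays" step, here is one possible layout for the generated header. The macro names, layer sizes, array identifiers, and the use of float are assumptions for illustration; the actual weights/weights.h produced from weights_raw/ may differ (for example, it could use fixed-point types).

```c
/* Illustrative layout of exported parameters as C arrays.
 * Names, sizes, and the float type are assumptions, not the
 * repository's generated header. */
#define INPUT_SIZE   64   /* 8x8 pixels                  */
#define HIDDEN_SIZE  16   /* assumed hidden layer width  */
#define OUTPUT_SIZE  10   /* digits 0..9                 */

extern const float W1[HIDDEN_SIZE][INPUT_SIZE];  /* input  -> hidden */
extern const float b1[HIDDEN_SIZE];
extern const float W2[OUTPUT_SIZE][HIDDEN_SIZE]; /* hidden -> output */
extern const float b2[OUTPUT_SIZE];
```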
Project Structure
- Training: notebooks/MicroProcessor.ipynb
- Raw weights: weights_raw/
- C arrays: weights/weights.c, weights/weights.h
- Inference code: src/, include/
- Samples: samples/
How It Works
- Prepare 8×8 digit samples (samples/<digit>.txt)
- Run the training notebook in Colab to generate weights
- Transform the weights into C arrays
- Compile and run inference on device (a minimal forward-pass sketch follows below)
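To make the inference step concrete, here is a minimal forward pass in plain C. It assumes the two-layer shape and identifiers from the header sketch above (W1, b1, W2, b2, a ReLU hidden layer, argmax over 10 scores); the actual network in src/ may be structured differently.

```c
/* Minimal fully connected forward pass: 64 pixels -> 10 class scores.
 * The architecture and array names are assumptions carried over from
 * the header sketch above. */
int predict_digit(const float pixels[INPUT_SIZE]) {
    float hidden[HIDDEN_SIZE];
    float scores[OUTPUT_SIZE];

    /* Input -> hidden layer with ReLU activation. */
    for (int i = 0; i < HIDDEN_SIZE; i++) {
        float acc = b1[i];
        for (int j = 0; j < INPUT_SIZE; j++)
            acc += W1[i][j] * pixels[j];
        hidden[i] = acc > 0.0f ? acc : 0.0f;
    }

    /* Hidden -> output layer (raw scores, no softmax needed for argmax). */
    for (int i = 0; i < OUTPUT_SIZE; i++) {
        float acc = b2[i];
        for (int j = 0; j < HIDDEN_SIZE; j++)
            acc += W2[i][j] * hidden[j];
        scores[i] = acc;
    }

    /* The predicted digit is the index of the largest score. */
    int best = 0;
    for (int i = 1; i < OUTPUT_SIZE; i++)
        if (scores[i] > scores[best]) best = i;
    return best;
}
```

Everything here fits in a few hundred bytes of code and a static weight table, which is what makes the 8051-class target plausible.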
This approach highlights how machine learning models can be distilled into minimal forms that fit on constrained devices.