This project implements a multilayer perceptron (MLP) neural network in Rust from scratch. It includes forward and backward propagation for different modules such as Linear, Sigmoid, and CrossEntropyLoss.
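As an illustration of the module design described above, here is a hedged sketch of a Sigmoid module's forward and backward passes in plain Rust. The function names and scalar signatures are hypothetical, chosen for clarity; the actual project operates on `ndarray` arrays rather than scalars.

```rust
// Hypothetical scalar sketch of a Sigmoid module's forward/backward passes.
// The real implementation works element-wise over ndarray arrays.

/// Forward pass: y = 1 / (1 + e^(-x)).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Backward pass: dL/dx = dL/dy * y * (1 - y), where y is the cached forward output.
fn sigmoid_backward(y: f64, grad_output: f64) -> f64 {
    grad_output * y * (1.0 - y)
}

fn main() {
    let y = sigmoid(0.0);
    let g = sigmoid_backward(y, 1.0);
    println!("{} {}", y, g); // 0.5 0.25
}
```

Caching the forward output `y` lets the backward pass reuse it instead of recomputing the exponential, a common trick in from-scratch autograd implementations.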
Download the MNIST dataset from Kaggle and place the files in the data folder.
- Training progress tracking: training and validation loss/accuracy are monitored using the `kdam` crate.
- Efficient matrix operations: layers and optimizers leverage the `ndarray` crate.
- CSV data handling: the dataset is loaded using the `polars` crate.
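To give a feel for the matrix operations the layers perform, the following is a minimal plain-`Vec` sketch of a Linear layer's forward pass (y = Wx + b). The struct and field names are illustrative assumptions, not the project's actual API; the real code uses `ndarray` for these computations.

```rust
// Hypothetical plain-Vec sketch of a Linear layer forward pass (y = Wx + b).
// The actual project uses the ndarray crate for efficient matrix operations.
struct Linear {
    weights: Vec<Vec<f64>>, // shape: [out_features][in_features]
    bias: Vec<f64>,         // shape: [out_features]
}

impl Linear {
    /// Computes y[i] = sum_j(weights[i][j] * x[j]) + bias[i].
    fn forward(&self, x: &[f64]) -> Vec<f64> {
        self.weights
            .iter()
            .zip(&self.bias)
            .map(|(row, b)| row.iter().zip(x).map(|(w, xi)| w * xi).sum::<f64>() + b)
            .collect()
    }
}

fn main() {
    let layer = Linear {
        weights: vec![vec![1.0, 2.0], vec![0.5, -1.0]],
        bias: vec![0.0, 1.0],
    };
    let y = layer.forward(&[1.0, 1.0]);
    println!("{:?}", y); // [3.0, 0.5]
}
```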
To see available options, run:
cargo run --release -- --help
Usage: simple-dnn-mnist [OPTIONS]
Options:
-t, --train <TRAIN> [default: data/mnist_train.csv]
-v, --validation <VALIDATION> [default: data/mnist_test.csv]
--training-batch-size <TRAINING_BATCH_SIZE> [default: 32]
--validation-batch-size <VALIDATION_BATCH_SIZE> [default: 64]
--lr <LR> [default: 1e-2]
-e, --epochs <EPOCHS> [default: 100]
-h, --help Print help
-V, --version Print version

Use the following command to train the model:
cargo run --release -- -t data/mnist_train.csv -v data/mnist_test.csv -e 100 --lr 1e-2