MNIST From Scratch

In an effort to get more familiar with backpropagation, I wrote a neural network, a small linear algebra library, and a simple neural network optimizer from scratch, and trained the network on the MNIST dataset. I implemented the following from scratch (a rough sketch of the core pieces follows the list):

- Backpropagation algorithm
- Neural network class
- Parallelization (with OpenMP, so not entirely from scratch)
- Necessary linear algebra operations (transpose, matrix multiplication, etc.)
- Loading dataset from binary distribution
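
The repository's actual classes aren't reproduced in this README, but as a hypothetical illustration of what these pieces involve, a from-scratch dense layer in C++ might look like the sketch below. Every name here (Mat, Dense, matmul, etc.) is illustrative, not the repo's API.

#include <cmath>
#include <vector>

// Minimal row-major matrix type.
struct Mat {
    int rows, cols;
    std::vector<float> d;
    Mat(int r, int c) : rows(r), cols(c), d(r * c, 0.0f) {}
    float& at(int r, int c) { return d[r * cols + c]; }
    float at(int r, int c) const { return d[r * cols + c]; }
};

// C = A * B, the core operation a from-scratch linear algebra library needs.
Mat matmul(const Mat& A, const Mat& B) {
    Mat C(A.rows, B.cols);
    for (int i = 0; i < A.rows; i++)
        for (int k = 0; k < A.cols; k++)
            for (int j = 0; j < B.cols; j++)
                C.at(i, j) += A.at(i, k) * B.at(k, j);
    return C;
}

Mat transpose(const Mat& A) {
    Mat T(A.cols, A.rows);
    for (int i = 0; i < A.rows; i++)
        for (int j = 0; j < A.cols; j++)
            T.at(j, i) = A.at(i, j);
    return T;
}

// One dense layer: y = sigmoid(W x + b). The backward pass computes the
// parameter gradients and returns dL/dx for the layer below, which is all
// backpropagation needs at each layer.
struct Dense {
    Mat W, b, x, y;   // x and y are cached for the backward pass
    Mat dW, db;
    Dense(int in, int out)
        : W(out, in), b(out, 1), x(in, 1), y(out, 1), dW(out, in), db(out, 1) {}

    Mat forward(const Mat& in) {
        x = in;
        y = matmul(W, x);
        for (int i = 0; i < y.rows; i++)
            y.at(i, 0) = 1.0f / (1.0f + std::exp(-(y.at(i, 0) + b.at(i, 0))));
        return y;
    }

    // dy is dL/dy from the layer above; returns dL/dx for the layer below.
    Mat backward(const Mat& dy) {
        Mat dz(y.rows, 1);                 // chain rule through sigmoid'
        for (int i = 0; i < y.rows; i++)
            dz.at(i, 0) = dy.at(i, 0) * y.at(i, 0) * (1.0f - y.at(i, 0));
        dW = matmul(dz, transpose(x));     // dL/dW = dz * x^T
        db = dz;                           // dL/db = dz
        return matmul(transpose(W), dz);   // dL/dx = W^T * dz
    }
};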

Features

- Batch-level and dataset-level parallelization with OpenMP (sketched below)
- Pretty printing of images and evaluation of an image by the network
- Zeroing out of small weights (just an experiment; also shown in the sketch below)
- Script to download the dataset (download_dataset.sh)
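
The repo's exact parallelization scheme isn't documented in this README; as a hedged sketch under my own assumptions, batch-level gradient accumulation with OpenMP, followed by the small-weight zeroing experiment, might look like this (all names, the stand-in gradient, and the threshold are hypothetical):

#include <omp.h>
#include <cmath>
#include <vector>

// Sketch of batch-level parallelism: each thread accumulates gradients for
// its share of the batch locally, then the partial sums are merged once.
void train_batch(const std::vector<std::vector<float>>& batch,
                 std::vector<float>& weights, float lr) {
    std::vector<float> grad(weights.size(), 0.0f);

    #pragma omp parallel
    {
        std::vector<float> local(weights.size(), 0.0f);

        #pragma omp for nowait
        for (int i = 0; i < (int)batch.size(); i++) {
            // The forward + backward pass for example i would fill `local`
            // here; as a stand-in, treat the raw pixels as the gradient.
            for (size_t j = 0; j < local.size() && j < batch[i].size(); j++)
                local[j] += batch[i][j];
        }

        #pragma omp critical
        for (size_t j = 0; j < grad.size(); j++)
            grad[j] += local[j];
    }

    // SGD step, averaged over the batch.
    for (size_t j = 0; j < weights.size(); j++)
        weights[j] -= lr * grad[j] / batch.size();

    // The "zero out small weights" experiment: prune near-zero weights
    // (the threshold value here is arbitrary).
    const float eps = 1e-4f;
    for (float& w : weights)
        if (std::fabs(w) < eps) w = 0.0f;
}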

Building and Running

cd MNISTFromScratch
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j
./MNISTFromScratch

Pretty printing output

(example output image)
