Talha Ilyas
Course: Advanced Neural Networks
Instructor: Prof. Hyongsuk Kim
Quarter: Winter 2019


Abstract

This repo builds a neural network from scratch using only the NumPy library. The network is tested on the MNIST-Digits dataset, a 10-class classification problem; MNIST is often treated as the 'hello world' dataset of the machine-learning community. Using only NumPy, this repo also implements popular optimizers:

* ADAM
* RMSprop
* Adagrad
* Gradient descent, with and without a momentum term

The effect of different activation functions (e.g. sigmoid, ReLU) is also studied. The goal is to walk through what happens under the hood of artificial neural networks created by high-level libraries such as:

* TensorFlow
* Keras
* PyTorch

Because this repo uses only NumPy, it runs on the CPU only and is therefore considerably slower than models running on GPUs.
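As a minimal sketch of the activation functions mentioned above, the sigmoid and ReLU nonlinearities and their derivatives (needed for backpropagation) can be written in plain NumPy. The function names here are illustrative, not necessarily the repo's actual API:

```python
import numpy as np

def sigmoid(z):
    # logistic function: 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # derivative of sigmoid: s * (1 - s)
    s = sigmoid(z)
    return s * (1.0 - s)

def relu(z):
    # rectified linear unit: max(0, z), applied element-wise
    return np.maximum(0.0, z)

def relu_prime(z):
    # derivative of ReLU: 1 where z > 0, else 0
    return (z > 0).astype(float)
```

The derivative functions are what the backward pass multiplies into the chain rule when propagating the loss gradient through each layer.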

Files