98% Accurate Neural Network

Learning Rate

image1

The highest Accuracies were achieved at the lowest Learning Rates.   
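The role of the Learning Rate can be sketched as the step size in each weight update. This is a minimal illustration, not the author's actual code:

```python
# Sketch (not the author's code): how the Learning Rate scales each
# weight update in gradient descent.
def update_weight(w, gradient, learning_rate):
    # A large learning_rate can overshoot the minimum; a small one
    # converges slowly but settles more precisely -- consistent with
    # the lowest rates giving the highest Accuracy here.
    return w - learning_rate * gradient

# Example: a step of 0.1 along a gradient of 0.5 moves the weight by 0.05
print(round(update_weight(1.0, 0.5, 0.1), 2))  # 0.95
```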

Epochs


The highest Accuracy was achieved at 10 or more Epochs (an Epoch is one complete pass of the training data through the Neural Net).
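The Epoch definition above amounts to an outer loop over the training data. A sketch, with `train_one_record` standing in for the net's real training step:

```python
# Sketch: an Epoch is one full pass over the training data.
# train_one_record is a stand-in for the network's real training step.
def train(records, epochs, train_one_record):
    for epoch in range(epochs):      # e.g. epochs = 10
        for record in records:       # every record is seen once per Epoch
            train_one_record(record)

# Example: with 3 records and 10 Epochs, the training step runs 30 times
calls = []
train([1, 2, 3], 10, calls.append)
print(len(calls))  # 30
```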

Number of Hidden Nodes

image2

The highest Accuracy was achieved with about half as many Hidden Nodes as Input Nodes. Further increases in Hidden Nodes did not improve Accuracy.
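That sizing fixes the shapes of the net's two weight matrices. A sketch with assumed values (784 Input Nodes for a 28x28 image, half as many Hidden Nodes, 10 Output Nodes for the digits):

```python
import numpy as np

# Sketch: weight-matrix shapes for a 3-layer net whose hidden layer has
# roughly half as many nodes as the input layer (sizes are assumptions).
input_nodes = 784                 # one per pixel of a 28x28 image
hidden_nodes = input_nodes // 2   # ~half the Input Nodes worked best here
output_nodes = 10                 # one per digit, 0-9

# Random initial weights: input->hidden and hidden->output
wih = np.random.normal(0.0, input_nodes ** -0.5, (hidden_nodes, input_nodes))
who = np.random.normal(0.0, hidden_nodes ** -0.5, (output_nodes, hidden_nodes))

print(wih.shape, who.shape)  # (392, 784) (10, 392)
```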

Activation Function

output signal = activation function(sum of input signals)

The signal emerging from a node is the activation function of the sum of signals entering the node.
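The rule above can be written directly in code. Sigmoid is used here as the activation function (an assumption; it is a common choice for nets like this):

```python
import numpy as np

# Sketch of the rule above: a node's output signal is the activation
# function applied to the sum of its input signals.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def node_output(input_signals):
    return sigmoid(np.sum(input_signals))

# Example: three incoming signals summing to 0 give an output of 0.5
print(node_output([0.4, -0.9, 0.5]))  # 0.5
```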


How my Net works

My neural network was written in Python, using a Jupyter notebook. It learned (trained) from one set of handwritten digits (0-9), then was tested on a second set. To see details of this neural network, such as samples of correct and incorrect predictions, see below. Find me at Steven J. Klatte on LinkedIn.
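Testing on the second set comes down to scoring predictions against the known labels. A minimal sketch (not the author's code) of how a 98% Accuracy figure arises:

```python
# Sketch: Accuracy is the fraction of test digits predicted correctly.
def accuracy(predictions, labels):
    correct = sum(1 for p, t in zip(predictions, labels) if p == t)
    return correct / len(labels)

# Example: 49 of 50 test digits predicted correctly -> 98% accurate
preds = [7] * 49 + [1]
labels = [7] * 50
print(accuracy(preds, labels))  # 0.98
```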

Predicted Numbers

image3

<- My Net correctly found this to be a "2"!

image4

The Net thought this figure was an "8" ->

The output signal for the "8" was very weak, only 17/99. Generally, the output signal, or "confidence of prediction," of a miss was much lower (52%) than the confidence of an accurate prediction (94%).
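One way to read off the prediction and its confidence is to take the largest output-layer signal (the naming here is an assumption, not the author's exact method):

```python
import numpy as np

# Sketch: treat the largest output-layer signal as the prediction,
# and its value as the "confidence of prediction".
def predict_with_confidence(outputs):
    outputs = np.asarray(outputs)
    return int(np.argmax(outputs)), float(np.max(outputs))

# Example: a confident "2" (0.94) -- a weak miss would peak much lower
digit, conf = predict_with_confidence([0.01, 0.02, 0.94, 0.01, 0.00,
                                       0.01, 0.00, 0.02, 0.03, 0.01])
print(digit, conf)  # 2 0.94
```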

image5

<- The MNIST dataset consists of 70,000 digits, from 0-9. Each digit is represented by 785 numbers. The first is the label: the intended number, 0-9 (in this case, a "6"). The remaining 784 numbers give the greyscale value (0-255) of each pixel of the 28x28 square into which the digit was written.
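Each 785-number record splits into a label plus a 28x28 pixel grid. A sketch, assuming the common CSV form of MNIST and scaling the greyscale values into a range the net can use:

```python
import numpy as np

# Sketch: split one 785-number MNIST record into its label and its
# 28x28 pixel grid, scaling greyscale (0-255) toward (0, 1).
def parse_record(record_line):
    values = record_line.split(',')
    label = int(values[0])                        # first number: the digit
    pixels = np.asarray(values[1:], dtype=float)  # remaining 784: greyscale
    scaled = pixels / 255.0 * 0.99 + 0.01         # keep inputs off 0 (an assumed convention)
    return label, scaled.reshape(28, 28)

# Example: a blank 28x28 image labelled "6"
label, image = parse_record('6,' + ','.join(['0'] * 784))
print(label, image.shape)  # 6 (28, 28)
```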

The data above represents about 200 runs on my PC. Run times varied from about 1 minute to a few hours, depending on Epochs, Learning Rates, and Hidden Nodes. I also wrote and explored Deep Neural Networks with two hidden layers, and "tilted" the MNIST digits by 10 degrees.