This is a self-written fully connected neural network with three layers, trained on the MNIST database of handwritten digits. For activation, I chose ReLU for the input and hidden layers and sigmoid for the output. The cost function is the mean squared error and the training algorithm is mini-batch stochastic gradient descent. Weights and biases are initialised with random numbers between -1 and 1, and I did not use any weight regularisation. The network trained for 200 epochs with 64 samples per batch and a static learning rate of 0.5, reaching 95% accuracy on the test set.
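The setup described above can be sketched roughly as follows. This is a minimal NumPy sketch, not the actual implementation: the hidden layer sizes (128 and 64) are assumptions, and a random mini-batch stands in for real MNIST data so the snippet is self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes: 784 inputs (28x28 pixels), 10 outputs (one per digit).
# The two hidden sizes are assumptions; the original post does not state them.
sizes = [784, 128, 64, 10]

# Uniform initialisation in [-1, 1], as described in the post.
W = [rng.uniform(-1, 1, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
b = [rng.uniform(-1, 1, (m, 1)) for m in sizes[1:]]

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """ReLU on the hidden layers, sigmoid on the output layer."""
    zs, acts = [], [x]
    a = x
    for i, (w, bias) in enumerate(zip(W, b)):
        z = w @ a + bias
        a = sigmoid(z) if i == len(W) - 1 else relu(z)
        zs.append(z)
        acts.append(a)
    return zs, acts

def train_batch(x, y, lr=0.5):
    """One mini-batch SGD step on the mean-squared-error cost."""
    m = x.shape[1]
    zs, acts = forward(x)
    # Output layer: for MSE with a sigmoid output, dC/dz = (a - y) * a * (1 - a).
    delta = (acts[-1] - y) * acts[-1] * (1.0 - acts[-1])
    for i in reversed(range(len(W))):
        gw = delta @ acts[i].T / m
        gb = delta.mean(axis=1, keepdims=True)
        if i > 0:
            # Backpropagate through the ReLU of the previous layer.
            delta = (W[i].T @ delta) * (zs[i - 1] > 0)
        W[i] -= lr * gw
        b[i] -= lr * gb

# Random stand-in for one MNIST mini-batch of 64 images with one-hot labels.
x = rng.random((784, 64))
y = np.eye(10)[:, rng.integers(0, 10, 64)]
train_batch(x, y)
```

In a real training run this step would be repeated over all mini-batches for 200 epochs, with the inputs scaled to [0, 1] as above.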
The MNIST dataset contains American-style digits, so the network might not recognize your way of writing certain numbers.

(Image: sample of the MNIST training data, from Wikipedia)
Below, the network can be tried out by drawing a digit inside the box and pressing the recognize button. Click the fast-forward button to the right of the upper canvas to progress more quickly.