# Neural network XOR example from scratch (no libs)

jacek

## Introduction

The XOR example is a toy problem, a "hello world" for introducing neural networks. The task is to build and train a neural network so that, given 2 inputs, it outputs what the XOR function would output (or at least something close to it). This isn't a math-heavy explanatory tutorial; you should already have at least a vague idea of how neural networks work. This article is intended to provide building blocks in the form of simple Python scripts. No libraries, not even numpy, are used to build this simple neural network. Beware: the style of the Python scripts is hackathon-ish, but I hope they are more easily understood this way.

## First script

This is a simple script, an implementation of the network shown in the image.

Here the neural network is just a bunch of loosely written variables. It is trained on the XOR examples for 10000 epochs using stochastic gradient descent (or a minibatch of size 1, if you like), so no matrix transpositions are needed. The learning rate is 0.1.

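The script itself isn't reproduced here, so below is a minimal sketch consistent with the description above: a 2-2-1 network with sigmoid activations, trained on one example at a time for 10000 epochs with a learning rate of 0.1. The variable names, the network shape, and the fixed random seed are my assumptions; only the hyperparameters come from the text.

```python
# Hypothetical reconstruction of the script described above.
# Assumptions: 2 hidden sigmoid neurons, 1 sigmoid output, weights in [-1, 1].
import math
import random

random.seed(0)  # assumption: fixed for reproducibility; remove to vary runs

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# weights and biases as loose variables: w[0..3] hidden, w[4..5] output
w = [random.uniform(-1, 1) for _ in range(6)]
b = [random.uniform(-1, 1) for _ in range(3)]  # b[0..1] hidden, b[2] output

data = [(0, 0, 0), (1, 0, 1), (0, 1, 1), (1, 1, 0)]  # (x1, x2, xor)
alpha = 0.1  # learning rate

def forward(x1, x2):
    h1 = sigmoid(w[0] * x1 + w[1] * x2 + b[0])
    h2 = sigmoid(w[2] * x1 + w[3] * x2 + b[1])
    o = sigmoid(w[4] * h1 + w[5] * h2 + b[2])
    return h1, h2, o

for epoch in range(1, 10001):
    mse = 0.0
    for x1, x2, y in data:
        h1, h2, o = forward(x1, x2)
        mse += (y - o) ** 2
        # backprop: squared-error gradient through the sigmoids
        d_o = (o - y) * o * (1 - o)
        d_h1 = d_o * w[4] * h1 * (1 - h1)
        d_h2 = d_o * w[5] * h2 * (1 - h2)
        # update output layer, then hidden layer (one example at a time)
        w[4] -= alpha * d_o * h1
        w[5] -= alpha * d_o * h2
        b[2] -= alpha * d_o
        w[0] -= alpha * d_h1 * x1
        w[1] -= alpha * d_h1 * x2
        b[0] -= alpha * d_h1
        w[2] -= alpha * d_h2 * x1
        w[3] -= alpha * d_h2 * x2
        b[1] -= alpha * d_h2
    if epoch % 1000 == 0:
        print(epoch, "mean squared error:", mse / len(data))

for x1, x2, _ in data:
    print(x1, x2, forward(x1, x2)[2])
```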

Example output:

```
1000 mean squared error: 0.24979266353990032
2000 mean squared error: 0.24831882619126208
3000 mean squared error: 0.23561863285516624
4000 mean squared error: 0.1780693775264198
5000 mean squared error: 0.06912242900384753
6000 mean squared error: 0.029067840008850473
7000 mean squared error: 0.01615164457711759
8000 mean squared error: 0.01062363347939824
9000 mean squared error: 0.007720927162456013
10000 mean squared error: 0.005980352776240471
0 0 0.08988830233230768
1 0 0.9260414851726995
0 1 0.9254344628052803
1 1 0.06936586304646092
```

Your mileage may vary. Sometimes this simple net will diverge and output roughly 0.666... for all inputs, or it will need more iterations to train. That's normal, as it is more sensitive to the starting random weights than more complex models are. NN libraries suffer from that too, but they can mitigate it with smarter weight initialization. You can play around with the learning rate (alpha).
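As one illustration of that "smarter weight initialization": many libraries default to something like Xavier/Glorot uniform initialization, which scales the random range by a layer's fan-in and fan-out instead of using a fixed [-1, 1] range. A sketch (the function name and the 2-2-1 layer sizes are my assumptions):

```python
# Sketch of Xavier/Glorot uniform initialization, stdlib only.
import math
import random

def xavier_uniform(fan_in, fan_out):
    # draw from uniform [-limit, limit], limit = sqrt(6 / (fan_in + fan_out))
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return random.uniform(-limit, limit)

# for the 2-2-1 net above: hidden layer sees 2 inputs, feeds 2 neurons;
# output layer sees 2 inputs, feeds 1 neuron
hidden_w = [xavier_uniform(2, 2) for _ in range(4)]
output_w = [xavier_uniform(2, 1) for _ in range(2)]
```

The idea is to keep the variance of activations roughly constant across layers, which makes convergence less dependent on a lucky random draw.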