A simple NumPy Python script, run from the command line, that builds a 2-input, three-layer network (two input neurons, two hidden-layer neurons, one output neuron) to illustrate how an AI model can learn the logical XOR operation. The script initializes this tiny neural network with random weights and trains it using the backpropagation algorithm; after training, the network correctly performs XOR on its two inputs. The key to solving the XOR problem with a neural network is a non-linear activation function, such as the sigmoid used here, combined with a hidden layer that can form the necessary non-linear decision boundaries.
An adaptive learning rate is used during training to drive the loss down further.
The trained network produces a working XOR with an error under 1% for every input combination.
However, the outputs are never exactly 1.0 or 0.0 as they would be for a true Boolean XOR gate; the sigmoid outputs only approach those values.
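Below is a minimal sketch of the kind of script described above, assuming a 2-2-1 sigmoid network trained with plain gradient descent and a simple "bold driver" style adaptive learning rate. The random seed, iteration count, mean-squared-error loss, and the exact rate schedule are illustrative assumptions, not necessarily the script's own choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(s):
    # derivative of the sigmoid, expressed in terms of its output s
    return s * (1.0 - s)

rng = np.random.default_rng(0)

# XOR truth table: four input pairs and their expected outputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights and zero biases for the 2-2-1 network
W1 = rng.standard_normal((2, 2))   # input -> hidden
b1 = np.zeros((1, 2))
W2 = rng.standard_normal((2, 1))   # hidden -> output
b2 = np.zeros((1, 1))

lr = 0.5                # initial learning rate (assumed value)
prev_loss = np.inf

for epoch in range(20000):
    # forward pass through the hidden and output layers
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # mean squared error over the four training cases
    loss = np.mean((output - y) ** 2)

    # adaptive learning rate ("bold driver" heuristic, an assumed scheme):
    # grow slightly while the loss improves, cut back sharply when it worsens
    if loss < prev_loss:
        lr = min(lr * 1.05, 2.0)
    else:
        lr = max(lr * 0.5, 1e-3)
    prev_loss = loss

    # backward pass: propagate the output error back through the network
    d_output = (output - y) * sigmoid_deriv(output)
    d_hidden = (d_output @ W2.T) * sigmoid_deriv(hidden)

    # gradient descent updates for both layers
    W2 -= lr * hidden.T @ d_output
    b2 -= lr * d_output.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

# After training, outputs approach (but never exactly reach) 0.0 and 1.0
hidden = sigmoid(X @ W1 + b1)
pred = sigmoid(hidden @ W2 + b2)
print(np.round(pred, 3))   # close to [0, 1, 1, 0]
```

Because the output neuron is a sigmoid, its value is asymptotic to 0 and 1; rounding the predictions (or thresholding at 0.5) recovers the exact Boolean XOR table.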