### Theory

### Activation functions:-

**In computational networks, the activation function of a node defines the output of that node given an input or a set of inputs.**

Fig 1. General structure of an artificial neural network with a single perceptron.

### Types of activation functions:-

**1. Hard-limit Activation Function**

**2. Soft-limit (Sigmoidal) Activation Function**

**3. Piecewise Linear Activation Function**

**4. Signum Activation Function**

### Working:-

Let us consider the problem of building an OR gate using a single-layer perceptron. Following is the truth table of the OR gate.

Referring to the above neural network and truth table, X and Y are the two inputs corresponding to X1 and X2. Let Y' be the output of the perceptron and let Z' be the output of the neural network after applying the activation function (Signum in this case). Let the weights be W1=1 and W2=1.

Now,

Y' = X*W1 + Y*W2

Z' = F(Y') ; F is the Activation Function. Let us assume the threshold is 0.5.

Thus, Z' = F(Y') will be defined as

**Z' = 1 , Y' >= 0.5
= 0 , Y' < 0.5**
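As a minimal sketch, the thresholded activation defined above can be written in Python (the name `activation` and the default threshold of 0.5 follow the worked example; they are illustrative choices, not a fixed API):

```python
def activation(y_prime, threshold=0.5):
    """Hard-threshold activation: 1 if the weighted sum reaches the threshold, else 0."""
    return 1 if y_prime >= threshold else 0
```

For example, `activation(0)` returns 0 and `activation(1)` returns 1, matching the piecewise definition of Z' above.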

For X = 0 & Y = 0

Y' = X*W1 + Y*W2

Y' = 0*1 + 0*1

Y' = 0 + 0

Y' = 0

Z' = F(Y')

Z' = F(0)

For the Signum activation function, F(x) = 0 ; x < 0.5

Z' = 0

For X = 0 & Y = 1

Y' = X*W1 + Y*W2

Y' = 0*1 + 1*1

Y' = 0 + 1

Y' = 1

Z' = F(Y')

Z' = F(1)

For the Signum activation function, F(x) = 1 ; x > 0.5

Z' = 1

For X = 1 & Y = 0

Y' = X*W1 + Y*W2

Y' = 1*1 + 0*1

Y' = 1 + 0

Y' = 1

Z' = F(Y')

Z' = F(1)

For the Signum activation function, F(x) = 1 ; x > 0.5

Z' = 1

For X = 1 & Y = 1

Y' = X*W1 + Y*W2

Y' = 1*1 + 1*1

Y' = 1 + 1

Y' = 2

Z' = F(Y')

Z' = F(2)

For the Signum activation function, F(x) = 1 ; x > 0.5

Z' = 1
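The four cases worked out above can be verified in a single loop. A minimal Python sketch using the same weights W1 = W2 = 1 and threshold 0.5 (the function name `perceptron_or` is chosen here for illustration):

```python
# Perceptron weights and threshold from the worked example above.
W1, W2 = 1, 1
THRESHOLD = 0.5

def perceptron_or(x, y):
    y_prime = x * W1 + y * W2                # weighted sum Y'
    return 1 if y_prime >= THRESHOLD else 0  # activation output Z'

# Check all four rows of the OR truth table.
for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x, y), "->", perceptron_or(x, y))
```

The loop reproduces the OR truth table: only (0, 0) maps to 0, and the other three input pairs map to 1.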

Thus, we can plot a graph as shown below, where ^ represents 0 and X represents 1.

Here, the line represents the decision boundary which separates the two classes.

**The decision boundary will separate the two classes if similar points lie on the same side of the decision boundary.** Here, the equation of the line will be:

*X+Y=0.5*

Thus, as you can see, point (0,0) will have X+Y < 0.5 and will lie on the origin side of the line.

While points (0,1), (1,0), (1,1) will have X+Y > 0.5 and will lie on the non-origin side of the line.

Also, from the truth table, (0,0) has OR output 0 and (0,1), (1,0), (1,1) have OR output 1. This is why we can say that this **decision boundary correctly classifies the points.** In the simulation you can try various values of W1, W2, and the threshold, and find that there are many such decision boundaries that can correctly classify these points.
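This classification check can be sketched in Python: for each input pair we test which side of the line X+Y = 0.5 it falls on and compare against its OR output (the helper name `non_origin_side` is illustrative):

```python
# OR truth table: input pair -> expected output.
points = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}

def non_origin_side(x, y, threshold):
    """Return 1 if (x, y) lies on the non-origin side of the line x + y = threshold."""
    return 1 if x + y > threshold else 0

# The boundary x + y = 0.5 puts every point on the side matching its OR output.
assert all(non_origin_side(x, y, 0.5) == out for (x, y), out in points.items())
```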

But if the decision boundary is such that similar points are not on the same side of the line, then it is not a correct decision boundary. For example, consider the following image:

Here, the equation of the line will be:

*X+Y=1.5*

Thus, as you can see, points (0,0), (0,1), (1,0) will have X+Y < 1.5 and will lie on the origin side of the line.

While point (1,1) will have X+Y > 1.5 and will lie on the non-origin side of the line.

But from the truth table, (0,0) has OR output 0 and (0,1), (1,0), (1,1) have OR outputs 1. This is the reason why this **decision boundary is incorrect.**
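The same side-of-line check makes the failure of X+Y = 1.5 explicit; a minimal sketch (the helper name `non_origin_side` is illustrative):

```python
# OR truth table: input pair -> expected output.
points = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}

def non_origin_side(x, y, threshold):
    """Return 1 if (x, y) lies on the non-origin side of the line x + y = threshold."""
    return 1 if x + y > threshold else 0

# With x + y = 1.5, the points (0,1) and (1,0) fall on the origin side
# even though their OR output is 1, so they are misclassified.
misclassified = [p for p, out in points.items() if non_origin_side(*p, 1.5) != out]
print(misclassified)  # [(0, 1), (1, 0)]
```

Only a boundary that places (0,0) alone on one side, such as X+Y = 0.5, classifies all four points correctly.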