Implementation of XOR Gate Using Multi-Layer Perceptron / Error Back Propagation

Post Test

1. Why is the XOR problem exceptionally interesting to neural network researchers?

A. Because it can be expressed in a way that allows you to use a neural network
B. Because it is a complex binary operation that cannot be solved using neural networks
C. Because it can be solved by a single layer perceptron
D. Because it is the simplest linearly inseparable problem that exists

2. What is back propagation?

A. It is another name given to the curvy function in the perceptron
B. It is the transmission of error back through the network to adjust the inputs
C. It is the transmission of error back through the network to allow weights to be adjusted so that the network can learn
D. None of the mentioned

3. What type of learning algorithm is used in an error-back-propagation MLP (EBP-MLP)?

A. Supervised learning
B. Reinforcement learning
C. Active learning
D. Unsupervised learning

4. What effect does the learning rate have?

A. Always increases the rate of change of weights
B. Always decreases the rate of change of weights
C. Increases the rate if value too high and decreases the rate if value too low
D. No effect



Hints: Try the following values and verify that you obtain the correct outputs.

1. MLP:
W11 = W12 = W21 = W22 = 1,
b1 = 1.5, b2 = 0.5 and b3 = 0.5,
V1 = -2 and V2 = 1.

2. EBP:
W11 = 1, W12 = -1, W21 = 2 and W22 = 3,
b1 = b2 = b3 = -1,
V1 = -1 and V2 = -2,
Learning rate = 0.75 and number of iterations = 1,000,000.
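The EBP hint can likewise be tried in code. Below is a minimal sketch of error back propagation on the XOR data, assuming a 2-2-1 network with sigmoid activations, squared-error loss, and per-pattern (online) weight updates; the indexing convention (Wij connecting input i to hidden unit j) is an assumption, not something the hint specifies:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR truth table: (inputs, target)
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def train(iterations=1_000_000, lr=0.75):
    # Initial weights from the hint (Wij: input i -> hidden unit j)
    W = [[1.0, -1.0], [2.0, 3.0]]
    b = [-1.0, -1.0]          # hidden biases b1, b2
    V = [-1.0, -2.0]          # hidden-to-output weights V1, V2
    b3 = -1.0                 # output bias
    for _ in range(iterations):
        for x, t in DATA:
            # Forward pass
            h = [sigmoid(x[0] * W[0][j] + x[1] * W[1][j] + b[j]) for j in range(2)]
            y = sigmoid(h[0] * V[0] + h[1] * V[1] + b3)
            # Backward pass: deltas use the sigmoid derivative y * (1 - y)
            d_out = (y - t) * y * (1 - y)
            d_hid = [d_out * V[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Gradient-descent weight updates
            for j in range(2):
                V[j] -= lr * d_out * h[j]
                b[j] -= lr * d_hid[j]
                for i in range(2):
                    W[i][j] -= lr * d_hid[j] * x[i]
            b3 -= lr * d_out
    return W, b, V, b3

def predict(x, W, b, V, b3):
    h = [sigmoid(x[0] * W[0][j] + x[1] * W[1][j] + b[j]) for j in range(2)]
    return sigmoid(h[0] * V[0] + h[1] * V[1] + b3)
```

With the full 1,000,000 iterations this is slow in pure Python; far fewer epochs are typically enough to see the squared error fall, which is an easy way to check that the gradient signs are right.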