Multilayer Perceptron in Machine Learning

A Multilayer Perceptron, or MLP, is one of the simplest feed-forward neural networks. During training, an MLP processes data in two directions: inputs are propagated forward through the network to produce a prediction, and errors are propagated backward to update the weights. If you want to learn about the Multilayer Perceptron in machine learning, then this article is for you. In this article, I will take you through an introduction to the Multilayer Perceptron and its implementation using Python.

Multilayer Perceptron

Some machine learning practitioners confuse the Perceptron with the Multilayer Perceptron. The Perceptron is the most basic neural network architecture, also known as a single-layer neural network, and it is designed for binary classification problems. An MLP, by contrast, stacks multiple layers of such units, which lets it learn far more complex, non-linear patterns.

A Multilayer Perceptron has an input layer and an output layer with one or more hidden layers. In MLPs, all neurons in one layer are connected to all neurons in the next layer. Here, the input layer receives the input signals and the desired task is performed by the output layer. And the hidden layers are responsible for all the calculations. Here is the architecture of the multilayer perceptrons:

(Image: architecture of a Multilayer Perceptron)
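To make the "fully connected" idea concrete, here is a minimal NumPy sketch of a single layer (the sizes here are illustrative, not taken from the figure): each neuron computes a weighted sum over every output of the previous layer, adds a bias, and applies an activation function.

```python
import numpy as np

def dense_layer(x, weights, bias):
    # Every input connects to every neuron: one weighted sum per neuron
    z = weights @ x + bias
    # ReLU activation: negative sums become 0
    return np.maximum(z, 0)

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # 3 input signals
W = rng.normal(size=(4, 3))   # 4 neurons, each with 3 weights
b = np.zeros(4)               # one bias per neuron

h = dense_layer(x, W, b)
print(h.shape)  # (4,) — one output per neuron
```

Stacking several such layers, with the outputs of one layer feeding the next, is all a Multilayer Perceptron is.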

I hope you now understand what multilayer perceptrons are in machine learning. Now, in the section below, I will take you through their implementation using Python.

Multilayer Perceptron using Python

We can use the Keras library in Python to build the architecture of a Multilayer Perceptron. So let’s see how to build a Multilayer Perceptron using the Keras library:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
# First hidden layer: 64 neurons, expecting 2 input features
model.add(Dense(64, input_dim=2))
model.add(Activation("relu"))
# Second hidden layer: 32 neurons
model.add(Dense(32))
model.add(Activation("relu"))
# Third hidden layer: 16 neurons
model.add(Dense(16))
model.add(Activation("relu"))
# Output layer: 2 neurons with softmax for class probabilities
model.add(Dense(2))
model.add(Activation("softmax"))
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_4 (Dense)              (None, 64)                192       
_________________________________________________________________
activation_4 (Activation)    (None, 64)                0         
_________________________________________________________________
dense_5 (Dense)              (None, 32)                2080      
_________________________________________________________________
activation_5 (Activation)    (None, 32)                0         
_________________________________________________________________
dense_6 (Dense)              (None, 16)                528       
_________________________________________________________________
activation_6 (Activation)    (None, 16)                0         
_________________________________________________________________
dense_7 (Dense)              (None, 2)                 34        
_________________________________________________________________
activation_7 (Activation)    (None, 2)                 0         
=================================================================
Total params: 2,834
Trainable params: 2,834
Non-trainable params: 0
_________________________________________________________________
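The Param # column can be checked by hand: a Dense layer with n inputs and m neurons has n × m weights plus m biases. A quick sketch of that arithmetic for the layers above:

```python
# (inputs, neurons) for each Dense layer in the model above
layers = [(2, 64), (64, 32), (32, 16), (16, 2)]

# weights (n_in * n_out) plus one bias per neuron (n_out)
params = [n_in * n_out + n_out for n_in, n_out in layers]
print(params)       # [192, 2080, 528, 34] — matches the summary
print(sum(params))  # 2834 total trainable parameters
```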

In the above neural network architecture, I have added:

  1. an input layer that receives 2 features;
  2. 64 neurons in the first hidden layer;
  3. 32 neurons in the second hidden layer;
  4. 16 neurons in the third hidden layer;
  5. and 2 neurons in the output layer.

This is how you can build a Multilayer Perceptron using Python.
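To see that this stack of layers really is just repeated matrix multiplications, here is a NumPy sketch of the same 2 → 64 → 32 → 16 → 2 architecture with random, untrained weights (an illustration of the forward pass only, not Keras itself); whatever the weights, the softmax output always sums to 1:

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(z):
    return np.maximum(z, 0)

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

# Random weights and zero biases for the 2 -> 64 -> 32 -> 16 -> 2 stack
sizes = [2, 64, 32, 16, 2]
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes, sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

x = np.array([0.5, -1.2])  # one sample with 2 features
h = x
for i, (W, b) in enumerate(zip(weights, biases)):
    z = W @ h + b
    # ReLU on hidden layers, softmax on the output layer
    h = softmax(z) if i == len(weights) - 1 else relu(z)

print(h.shape)  # (2,) — one probability per class
```

Training would then adjust `weights` and `biases` by backpropagation, which is what Keras does for us when we call model.fit.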

Summary

To sum up: in an MLP, every neuron in one layer is connected to every neuron in the next; the input layer receives the input signals, the hidden layers perform the computations, and the output layer produces the final prediction. I hope you liked this article on an introduction to the Multilayer Perceptron and its implementation using Python. Feel free to ask your valuable questions in the comments section below.

Aman Kharwal

I'm a writer and data scientist on a mission to educate others about the incredible power of data📈.