
ReLU Activation Function

Kajal Pawar

3 years ago

Table of Contents
  • What is an Activation Function?
  • ReLU Activation Function
  • How to write a ReLU function and its derivative in Python?
  • Advantages of the ReLU function
  • Disadvantages of the ReLU function
  • A simple implementation of the ReLU activation function in Python

What is an Activation Function?

          An activation function is a simple function that converts an input, or a set of inputs, into an output. Different types of activation functions do this job in different ways.
Activation functions can be divided into three categories:
  • Ridge functions 
  • Radial functions 
  • Fold functions
In this article, we study the ReLU activation function, which is an example of a ridge function.

ReLU Activation Function

          ReLU stands for Rectified Linear Unit. It is one of the most widely used activation functions in deep learning models and appears in almost all convolutional neural networks.
[Figure: graph of the ReLU function]
The ReLU function takes the maximum of zero and its input: positive inputs pass through unchanged and negative inputs are mapped to zero.
The equation of the ReLU function is given by:
f(x) = max(0, x)
For example, f(-3) = max(0, -3) = 0 and f(5) = max(0, 5) = 5.
Note:
The ReLU function is not differentiable everywhere (it has a kink at zero), but we can take a sub-gradient there, as shown in the figure below. Although ReLU is simple, it has been an important achievement of recent years for deep learning researchers.
[Figure: the ReLU (Rectified Linear Unit) function]
The ReLU (Rectified Linear Unit) function is an activation function that is currently more popular than the sigmoid and tanh functions.
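To see how ReLU acts as an activation inside a single neuron, here is a minimal sketch; the weights, inputs and bias are made-up illustrative numbers, not taken from this article:
# one neuron: weighted sum of the inputs followed by the ReLU activation
weights = [0.5, -0.75, 0.25]   # illustrative values
inputs  = [1.0, 2.0, 3.0]
bias    = -0.25

z = sum(w * x for w, x in zip(weights, inputs)) + bias   # 0.5 - 1.5 + 0.75 - 0.25 = -0.5
activation = max(0.0, z)                                 # ReLU clips the negative sum to 0.0
print(z, activation)                                     # -0.5 0.0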
   
Recommended blog for you: Sigmoid Activation Function
    

How to write a ReLU function and its derivative in Python?

          Writing a ReLU function and its derivative in Python is quite easy: we simply define a function that implements the formula. It is done as shown below:
ReLU function:
def relu_function(z):
    # return z for positive inputs, 0 otherwise
    return max(0, z)
ReLU function derivative:
def relu_prime_function(z):
    # sub-gradient: 1 for positive inputs, 0 otherwise (including at z = 0)
    return 1 if z > 0 else 0
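In practice, ReLU is applied element-wise to whole arrays or tensors. A minimal NumPy sketch of the same two functions is shown below; NumPy is an assumption here, as the rest of this article sticks to plain Python:
import numpy as np

def relu_array(z):
    # element-wise max(0, z_i) over an entire array
    return np.maximum(0.0, z)

def relu_prime_array(z):
    # 1.0 where z > 0, 0.0 elsewhere
    return (z > 0).astype(float)

z = np.array([-3.0, -0.5, 0.0, 2.0, 7.5])
print(relu_array(z))        # [0.  0.  0.  2.  7.5]
print(relu_prime_array(z))  # [0. 0. 0. 1. 1.]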

Advantages of the ReLU function

  • When the input is positive, there is no gradient saturation problem.
  • It is easy to implement and very fast.
  • Computation is very cheap: ReLU only needs a simple comparison, so both the forward and the backward pass are much faster than with tanh and sigmoid, which have to evaluate an exponential (see the short sketch after this list).
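To make the saturation point concrete, here is a minimal sketch; the sample inputs are illustrative, not from this article. The sigmoid gradient shrinks toward zero as the input grows, while the ReLU sub-gradient stays at 1 for any positive input:
import math

def sigmoid_prime(z):
    s = 1.0 / (1.0 + math.exp(-z))
    return s * (1.0 - s)          # derivative of the sigmoid

def relu_prime(z):
    return 1.0 if z > 0 else 0.0  # sub-gradient of ReLU

for z in (1.0, 5.0, 10.0):
    print(z, sigmoid_prime(z), relu_prime(z))
# sigmoid_prime shrinks: ~0.197, ~0.0066, ~0.000045; relu_prime stays 1.0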

Disadvantages of the ReLU function

  • When the input is negative, ReLU outputs zero and the corresponding neuron stops learning; this is known as the Dead Neurons (or dying ReLU) problem. During forward propagation this is not a problem in itself: some regions of the input are simply active while others are not. But during back propagation, the gradient for a negative input is exactly zero, so the weights feeding that neuron never get updated, which is the same kind of problem as the saturation of the sigmoid and tanh functions.
  • The output of the ReLU function is either 0 or a positive number, which means that ReLU activations are not zero-centered.
  • The ReLU function is normally used only in the hidden layers of a neural network model.
To overcome the Dead Neurons problem of the ReLU function, a modification called Leaky ReLU was introduced. It adds a small slope for negative inputs to keep the updates alive and so avoids the dead neurons problem of ReLU (a minimal sketch is shown below).
Another variant, built on both ReLU and Leaky ReLU, is the Maxout function, which we will discuss in detail in other articles.
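Returning to Leaky ReLU, a minimal sketch could look like the following; the slope of 0.01 is a common default, used here purely for illustration:
def leaky_relu(z, alpha=0.01):
    # pass positives through; scale negatives by a small slope instead of zeroing them
    return z if z > 0 else alpha * z

def leaky_relu_prime(z, alpha=0.01):
    # the gradient is alpha (not zero) for negative inputs, so updates never fully stop
    return 1.0 if z > 0 else alpha

print(leaky_relu(-4.0), leaky_relu(3.0))   # -0.04 3.0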

A simple implementation of the ReLU activation function in Python

# importing libraries
from matplotlib import pyplot
 
# create rectified linear function
def rectified(x):
    return max(0.0, x)
 
# define a series of inputs
series_in = [x for x in range(-10, 11)]
# calculate outputs for our inputs
series_out = [rectified(x) for x in series_in]
# line plot of raw inputs to rectified outputs
pyplot.plot(series_in, series_out)
pyplot.show()
Output: The plot of the ReLU activation function is shown below.
[Figure: plot of the ReLU activation function]
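As a quick extension, a similar sketch can plot the sub-gradient defined earlier, which makes the flat region for negative inputs easy to see:
# plot the ReLU sub-gradient over the same range of inputs
from matplotlib import pyplot

def rectified_prime(x):
    return 1.0 if x > 0.0 else 0.0

series_in = [x for x in range(-10, 11)]
grad_out = [rectified_prime(x) for x in series_in]
pyplot.step(series_in, grad_out, where='post')
pyplot.show()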
I hope you enjoyed reading this article and that you now have a clear picture of the ReLU activation function.
To know more about the Python programming language, follow the InsideAIML YouTube channel.
For more such blogs and courses on data science, machine learning, artificial intelligence and other emerging technologies, visit us at InsideAIML.
Thanks for reading…
Happy Learning…
    