Can you explain the function of the sigmoid?

by Uneeb Khan

The sigmoid function was widely used in the early days of deep learning. It is a smooth function that is easy to implement. The name comes from the Greek letter sigma, and the curve has an “S” shape.

The sigmoid is a logistic function, and “sigmoidal” refers to any function that preserves this “S” shape. The tanh(x) function is another example: it has a similar shape but ranges between -1 and 1, whereas the standard sigmoid ranges between 0 and 1. A sigmoid function is also differentiable everywhere, which means we can easily compute the slope of the curve at any point.
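A minimal sketch comparing the two ranges in NumPy; it also checks the standard identity tanh(x) = 2 * sigmoid(2x) - 1 that relates them:

import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid, output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(sigmoid(x))      # all values fall between 0 and 1
print(np.tanh(x))      # all values fall between -1 and 1
print(np.allclose(np.tanh(x), 2 * sigmoid(2 * x) - 1))   # True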

Looking at the sigmoid function, we see that its output lies in the open interval (0,1). The output can be read as a probability-like score, although it is not a probability in the strict sense. The sigmoid was once the dominant choice of activation function. One way to think about it is as the rate at which a neuron fires along its axon: the region of the cell that responds most readily to stimulation sits in the middle, where the slope of the curve is steep, while the inhibitory regions lie toward the sides, where the curve flattens out.
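A minimal sketch of that probability-style reading, assuming a single logistic unit; the weights w, bias b, and input x below are made-up illustrative values:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Made-up parameters and input for one logistic unit
w = np.array([0.8, -0.4])
b = 0.1
x = np.array([1.5, 2.0])

p = sigmoid(np.dot(w, x) + b)   # score in (0, 1), often read as P(class = 1)
print(p, int(p > 0.5))          # the score and the thresholded class label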

A few issues exist with the sigmoid function itself.

1) The gradient of the function approaches 0 as the input moves away from the origin. During backpropagation in a neural network we use the chain rule of differentiation to compute the gradient of the loss with respect to each weight w. Every time a gradient passes back through a sigmoid it is multiplied by the sigmoid’s derivative, which is small away from the origin, so after passing through several sigmoid layers the gradient becomes tiny and the early weights have little influence on the loss function. This setting is not conducive to weight optimization. The problem goes by the names gradient saturation and gradient dispersion, and is today usually called the vanishing gradient problem (a numerical sketch follows this list).

2) Weight updates are less efficient because the function’s output is not centered on 0.

3) The sigmoid requires an exponential calculation, which makes it comparatively expensive to compute.
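The sketch below illustrates point 1 numerically; the chain of pre-activation values is made up, and the point is only that each sigmoid layer scales the backpropagated gradient by its derivative, which never exceeds 0.25:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1 - s)

print(sigmoid_prime(0.0))   # 0.25, the maximum of the derivative
print(sigmoid_prime(5.0))   # ~0.0066, near zero far from the origin

# Toy chain: the backpropagated gradient is scaled by one derivative per layer
grad = 1.0
for layer_input in [2.0, -1.5, 3.0, 0.5, -2.5]:   # made-up pre-activations
    grad *= sigmoid_prime(layer_input)
print(grad)   # already a very small number after only five layers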

Listed below are some of the advantages and disadvantages of the sigmoid function:

The sigmoid function provides the following advantages:

It provides a smooth gradient, which prevents sudden “jumps” in the output values.

It limits each neuron’s output to the range 0–1, which normalizes the values.

It gives clear predictions: for strongly positive or negative inputs the output is pushed very close to 1 or 0, which helps the model make decisive classifications (a short sketch after this list illustrates this saturation).
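A small sketch of that saturating behaviour:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(6.0))    # ~0.9975, very close to 1
print(sigmoid(-6.0))   # ~0.0025, very close to 0
print(sigmoid(0.0))    # 0.5, the undecided middle of the curve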

Sigmoid functions suffer from the following drawbacks:

It is especially susceptible to the vanishing gradient problem.

Its output is not zero-centered (a short sketch after this list illustrates the effect).

The exponential operation it requires adds to the computational cost of the model.
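A minimal sketch of the zero-centering drawback, using made-up numbers: because sigmoid outputs are always positive, the gradient for every weight feeding a given unit shares the sign of that unit’s upstream gradient, which tends to make weight updates zig-zag.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
a = sigmoid(rng.normal(size=5))   # previous-layer activations: all positive
delta = -0.7                      # made-up upstream gradient for one unit
grad_w = delta * a                # gradient w.r.t. that unit's incoming weights
print(a)                          # all entries positive
print(grad_w)                     # all entries share the sign of delta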

How do you implement the sigmoid function and its derivative in Python?

Writing the sigmoid function and its derivative in Python is not difficult; we simply define a function for each formula.

import numpy as np

# Sigmoid function
def sigmoid(z):
    return 1.0 / (1 + np.exp(-z))

# Derivative of the sigmoid function
def sigmoid_prime(z):
    return sigmoid(z) * (1 - sigmoid(z))
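As a quick check using the definitions above, the output at zero is the midpoint 0.5 and the derivative there is at its maximum of 0.25:

print(sigmoid(0.0))         # 0.5
print(sigmoid_prime(0.0))   # 0.25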

Python code showing a simple implementation of the sigmoid activation function, plotted together with its derivative:

# Import libraries
import numpy as np
import matplotlib.pyplot as plt

# Define the sigmoid function and its derivative
def sigmoid(x):
    s = 1 / (1 + np.exp(-x))
    ds = s * (1 - s)
    return s, ds

a = np.arange(-6, 6, 0.01)

# Centre the axes
fig, ax = plt.subplots(figsize=(9, 5))
ax.spines['left'].set_position('center')
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.yaxis.set_ticks_position('left')

# Create and show the figure
ax.plot(a, sigmoid(a)[0], color="#307EC7", linewidth=3, label="sigmoid")
ax.plot(a, sigmoid(a)[1], color="#9621E2", linewidth=3, label="derivative")
ax.legend(loc="upper right", frameon=False)
fig.show()

Output:

The preceding code plots the sigmoid function and its derivative, as illustrated below.


Summary

I hope you enjoyed reading this post and that you now have a better grasp of the sigmoid activation function and how to implement it in Python.

Similar articles and courses on data science, machine learning, artificial intelligence, and other fascinating new technologies may be found at InsideAIML.

Thank you for reading so carefully…

Happy Studying…
