
Activation Function in Neural Networks

  • proximaalpha2
  • Aug 1, 2022
  • 1 min read



The activation function transforms a neuron's weighted sum into an output value, often bounded between two limits. This transformation is usually non-linear.


Activation functions are crucial to an artificial neural network. Before the input is passed to the next layer of neurons, or used to produce the final output, they apply a non-linear transformation that determines whether a neuron should be activated or not.
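As a minimal sketch of this idea, the snippet below builds a single neuron by hand: it computes the weighted sum of its inputs plus a bias, then passes that sum through a sigmoid activation. The function names (`sigmoid`, `neuron`) and the example numbers are illustrative, not from any particular library.

```python
import math

def sigmoid(x):
    # squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # weighted sum of the inputs, then a non-linear activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# example: three inputs, three weights, one bias
print(neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], 0.2))
```

Whatever the raw weighted sum is, the sigmoid maps it into (0, 1), so the neuron's output can be read as a degree of activation.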




Choosing an activation function for the network's nodes is one of the most important decisions an ML engineer must make. The choice depends on the network's architecture, the dataset, and the goal of the task.



Why are activation functions necessary?


A neuron's activation status is determined by an activation function. In other words, the function performs a simple mathematical operation to decide whether the neuron's input is important to the prediction process or not.
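ReLU is the simplest illustration of this gating behaviour: it passes positive signals through unchanged and suppresses negative ones to zero, effectively deciding that those inputs do not contribute. A quick sketch:

```python
def relu(x):
    # positive signals pass through; negative signals are zeroed out,
    # i.e. the neuron is "not activated" for those inputs
    return max(0.0, x)

for z in [-2.0, -0.5, 0.0, 1.5]:
    print(z, "->", relu(z))
```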




The purpose of the activation function is to introduce non-linearity into an artificial neural network and to generate a layer's output from the set of input values fed to it.




Why non-linear activation functions are necessary:




Without an activation function, a neural network is essentially just a linear regression model. The activation function transforms the input in a non-linear way, enabling the network to learn and carry out more complex tasks.
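This collapse to a linear model can be checked directly: two stacked layers with no activation between them are mathematically equivalent to a single layer whose weight matrix is the product of the two. A small demonstration using NumPy (the matrix sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # first "layer" weights
W2 = rng.normal(size=(3, 2))  # second "layer" weights
x = rng.normal(size=4)        # an arbitrary input vector

# two stacked linear layers, with no activation in between...
h = x @ W1
y = h @ W2

# ...produce exactly the same output as one linear layer
# whose weight matrix is W1 @ W2
y_single = x @ (W1 @ W2)
print(np.allclose(y, y_single))  # True
```

No matter how many such layers are stacked, the network can only represent a linear map; the non-linear activation is what breaks this equivalence.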




Conclusion

The activation function transforms the input in a non-linear way, enabling the network to learn and carry out more complex tasks. All hidden layers typically use the same activation function. For best results, the ReLU activation function should be used only in the hidden layers.
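Putting the pieces together, here is a minimal sketch of a forward pass through a small network that follows this advice: ReLU in the hidden layer, and a plain linear output layer. The weights are random placeholders, and the function name `forward` is my own choice for illustration.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, W1, b1, W2, b2):
    # ReLU non-linearity in the hidden layer only;
    # the output layer stays linear
    h = relu(x @ W1 + b1)
    return h @ W2 + b2

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)  # 2 inputs -> 8 hidden units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)  # 8 hidden units -> 1 output

print(forward(np.array([0.3, -0.7]), W1, b1, W2, b2))
```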


You can read about what ReLU stands for here in detail.


 
 
 



learnpython

©2022 by learnpython. Proudly created with Wix.com
