
Deep Neural Networks Need Nonlinear Activation Functions

  • Writer: Arturo Devesa
  • Mar 2, 2021
  • 1 min read

Almost any process imaginable can be represented as a function computed by a neural network, provided that the activation function is non-linear.



Non-linear activation functions address two problems with a linear activation function:

  1. They make backpropagation useful: their derivative depends on the input, so the gradient carries information about how each weight should change.

  2. They allow “stacking” of multiple layers of neurons to create a deep neural network; without a non-linearity, stacked layers collapse into a single linear transformation (see the sketch after this list). Multiple hidden layers of neurons are needed to learn complex data sets with high levels of accuracy.
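The original post has no code, but here is a minimal NumPy sketch of point 2 (the names `linear_stack` and `relu_stack` are mine, purely illustrative): two stacked linear layers reduce to a single linear layer, while inserting a ReLU between them breaks linearity, which is what lets extra depth add expressive power.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two layers with purely linear "activations".
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def linear_stack(x):
    # Layer 1 followed by layer 2, with no non-linearity in between.
    return W2 @ (W1 @ x + b1) + b2

# The stacked linear layers collapse into one equivalent linear layer.
W_eq = W2 @ W1
b_eq = W2 @ b1 + b2

x = rng.normal(size=3)
assert np.allclose(linear_stack(x), W_eq @ x + b_eq)  # stacking gained nothing

def relu_stack(x):
    # The same two layers with a ReLU non-linearity between them.
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# An affine map f satisfies f(x + y) = f(x) + f(y) - f(0); the ReLU network
# generally does not, so it can represent functions no single linear layer can.
y = rng.normal(size=3)
lhs = relu_stack(x + y)
rhs = relu_stack(x) + relu_stack(y) - relu_stack(np.zeros(3))
print("deviation from linearity:", np.linalg.norm(lhs - rhs))  # generally non-zero
```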


https://missinglink.ai/guides/neural-network-concepts/7-types-neural-network-activation-functions-right/