Deep Neural Networks Need Nonlinear Activation Functions

Almost any process imaginable can be approximated by a neural network, provided that the activation function is non-linear.

Non-linear functions address the problems of a linear activation function:

  1. They allow backpropagation, because their derivative is itself a function of the input, so gradients can be computed and propagated through each layer.

  2. They allow “stacking” of multiple layers of neurons to create a deep neural network. Multiple hidden layers are needed to learn complex data sets with high accuracy; without a non-linearity, a stack of layers collapses into a single linear map.
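Both points can be sketched numerically. The snippet below (a minimal NumPy illustration, with ReLU chosen as one common non-linear activation) shows that two purely linear layers are equivalent to a single matrix, while inserting a non-linearity breaks that equivalence, and that the activation's derivative depends on its input, which is exactly what backpropagation uses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with linear (identity) activations:
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

linear_stack = W2 @ (W1 @ x)   # layer 2 applied to layer 1's output
collapsed = (W2 @ W1) @ x      # a single equivalent linear layer
assert np.allclose(linear_stack, collapsed)  # stacking gained nothing

# A non-linear activation (ReLU) between the layers breaks this collapse:
def relu(z):
    return np.maximum(z, 0.0)

nonlinear_stack = W2 @ relu(W1 @ x)  # no single matrix reproduces this map

# ReLU's derivative is a function of its input (1 where z > 0, else 0),
# which is what backpropagation needs to route gradients:
def relu_grad(z):
    return (z > 0).astype(float)
```

A linear activation's derivative is a constant, so every layer would receive the same scaling during backpropagation and the stack could never represent anything beyond a single linear transformation.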
