Dr. Mark Humphrys

School of Computing. Dublin City University.



Deep learning

There has been massive growth in neural network research since around 2005.
  

New approaches

A series of new approaches, including fixes for memory and credit assignment across many layers, smarter weight initialisation, activation functions with more zero-output nodes, GPUs for parallel hardware, and other modifications, led to "deep neural networks".

  
Breakthrough paper for deep neural networks:
  Hinton, G. E., Osindero, S., & Teh, Y. W. (2006). "A fast learning algorithm for deep belief nets". Neural Computation, 18(7), 1527-1554.
    The weights are initialised by an unsupervised, layer-by-layer "pre-training" phase, rather than randomly.
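
To give a flavour of how that works: each layer is trained, unsupervised, as a Restricted Boltzmann Machine (RBM), typically with one step of "contrastive divergence" (CD-1). The JavaScript below is a minimal, illustrative sketch of one CD-1 update for one layer; the function names, sizes and learning rate are mine, not from the paper.

  // Sketch of one CD-1 weight update for one RBM layer.
  // Illustrative only: names, sizes and learning rate are arbitrary.
  function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }
  function sampleBinary(p) { return Math.random() < p ? 1 : 0; }

  // hidden[j] = sigmoid( sum_i W[j][i]*v[i] + c[j] )
  function hiddenProbs(W, c, v) {
    return W.map((row, j) =>
      sigmoid(row.reduce((s, w, i) => s + w * v[i], c[j])));
  }

  // visible[i] = sigmoid( sum_j W[j][i]*h[j] + b[i] )
  function visibleProbs(W, b, h) {
    return b.map((bi, i) =>
      sigmoid(h.reduce((s, hj, j) => s + W[j][i] * hj, bi)));
  }

  // One CD-1 update on one training vector v0.
  function cd1Update(W, b, c, v0, rate) {
    const h0p = hiddenProbs(W, c, v0);
    const h0  = h0p.map(sampleBinary);    // sample hidden states
    const v1  = visibleProbs(W, b, h0);   // "reconstruction" of the input
    const h1p = hiddenProbs(W, c, v1);
    for (let j = 0; j < W.length; j++)
      for (let i = 0; i < W[j].length; i++)
        W[j][i] += rate * (h0[j] * v0[i] - h1p[j] * v1[i]);
    for (let i = 0; i < b.length; i++) b[i] += rate * (v0[i] - v1[i]);
    for (let j = 0; j < c.length; j++) c[j] += rate * (h0p[j] - h1p[j]);
  }

Once one layer is trained, its hidden activations become the training data for the next layer up, and the whole stack is then fine-tuned. The point is simply that the network starts learning from structured weights instead of random ones.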



Rectifier function: Zero-output nodes

Deep learning research uncovered problems with using the Sigmoid function (and other smooth, saturating functions) as the activation function. In backpropagation the error signal is multiplied by the activation function's derivative at every layer, and since these derivatives are small (at most 0.25 for the Sigmoid), the gradient shrinks towards zero as it passes back through many layers: the "vanishing gradient" problem.

It has been discovered that a much simpler activation function, the Rectifier (also called "ReLU", the Rectified Linear Unit), has important properties for deep neural networks.
The Rectifier function is:

f(x) = max(0,x)
  

The "Rectifier" activation function (blue).
From here.
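
In code the Rectifier is one line. A JavaScript version of the function, with its derivative (using the usual convention that the derivative at x = 0 is taken as 0):

  // The Rectifier ("ReLU") activation function and its derivative.
  function rectifier(x)      { return Math.max(0, x); }
  function rectifierDeriv(x) { return x > 0 ? 1 : 0; }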

Properties:

  f(x) = 0 for all x <= 0, so a node can output exactly zero, not just a small value.
  For x > 0 the derivative is exactly 1; for x < 0 it is 0.
  Much cheaper to compute than the Sigmoid (no exponential).

Impact on neural network learning:

  No vanishing gradient: wherever nodes are active, the error signal passes back through them unchanged, no matter how many layers deep.
  Sparse activations: for any given input, many nodes output exactly zero, so different inputs effectively use different sub-networks of the net. See the numeric sketch below.
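
A small numeric sketch of the gradient point above. In backpropagation the error signal is multiplied by the activation function's derivative at every layer it passes through. The Sigmoid's derivative is at most 0.25; the Rectifier's is exactly 1 at any active node. The 20 layers here are just for illustration:

  function sigmoidDeriv(x) {
    const s = 1 / (1 + Math.exp(-x));
    return s * (1 - s);               // at most 0.25 (at x = 0)
  }

  // Error signal passing back through 20 layers of active nodes:
  let gSigmoid = 1, gRectifier = 1;
  for (let layer = 0; layer < 20; layer++) {
    gSigmoid   *= sigmoidDeriv(0);    // 0.25, the best case for Sigmoid
    gRectifier *= 1;                  // Rectifier derivative on an active node
  }
  console.log(gSigmoid);    // 0.25^20, about 9e-13: the gradient has vanished
  console.log(gRectifier);  // 1: the signal survives intact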

  

Parallel Hardware: Neural networks on GPUs

Clearly a neural network maps naturally onto parallel hardware. It consists almost entirely of simple calculations that could be done in parallel, as if there were a simple processor at each node.

It is very wasteful to implement a neural network on serial hardware.

Modern computers already have massively parallel systems for doing simple calculations: GPUs.
So implementing neural networks on GPUs became an important part of Deep Learning.
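
The core computation of one layer makes this concrete: each output node's weighted sum is independent of every other node's. In the (serial) JavaScript sketch below, every iteration of the outer loop could run on its own GPU thread:

  // One layer: output[j] = rectifier( sum_i weights[j][i] * inputs[i] )
  // Each j is independent of all the others ("embarrassingly parallel").
  function layerForward(weights, inputs) {
    const outputs = new Array(weights.length);
    for (let j = 0; j < weights.length; j++) {  // parallel over j on a GPU
      let sum = 0;
      for (let i = 0; i < inputs.length; i++)
        sum += weights[j][i] * inputs[i];
      outputs[j] = Math.max(0, sum);            // Rectifier activation
    }
    return outputs;
  }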

  

Neural networks in JS on GPU
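
As one illustration (there are several such libraries), the gpu.js library compiles a restricted JavaScript function into a GPU program. A hedged sketch of the same layer computation as a GPU kernel; the sizes here are arbitrary:

  // Sketch using the gpu.js library, which compiles the kernel
  // function to run on the GPU. Sizes here are arbitrary.
  const { GPU } = require('gpu.js');
  const gpu = new GPU();

  const layerForward = gpu.createKernel(function (weights, inputs) {
    let sum = 0;
    for (let i = 0; i < this.constants.nInputs; i++)
      sum += weights[this.thread.x][i] * inputs[i];
    return Math.max(0, sum);                       // Rectifier activation
  }, { constants: { nInputs: 4 }, output: [3] });  // 3 output nodes

  // Each of the 3 output nodes is computed by its own GPU thread.
  const out = layerForward(
    [[0.1, 0.2, 0.3, 0.4],
     [0.5, 0.6, 0.7, 0.8],
     [0.9, 1.0, 1.1, 1.2]],
    [1, 0, 1, 0]);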




Sample applications of Neural networks




The brain

The brain has about 100 billion neurons, each with up to 15,000 connections to other neurons. (These figures actually include the entire nervous system, distributed over the body, which can be seen as an extension of the brain.)

The adult brain of H. sapiens is the most complex known object in the universe, perhaps the most complex object that has ever existed. One brain is far more complex than the entire world telephone system / Internet (which has a smaller number of nodes, and much less connectivity).

If we consider each neuron as roughly the equivalent of a simple CPU with 100 KB of memory, then we have 100 billion CPUs with 10,000 terabytes of memory, all working in parallel and massively interconnected with hundreds of trillions of connections.
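
For the curious, the back-of-envelope arithmetic, checked in JavaScript:

  const neurons = 100e9;               // 100 billion neurons
  const bytesPerNeuron = 100 * 1000;   // 100 KB of memory each
  const maxConnections = 15000;        // up to 15,000 per neuron

  console.log(neurons * bytesPerNeuron / 1e12);  // 10000 terabytes
  console.log(neurons * maxConnections / 1e12);  // 1500 trillion connections
                                                 // (an upper bound; the average is lower)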

It is not surprising that the brain is so complex, and at the same time that consciousness and intelligence are mysterious. What would be surprising would be if the brain were a simple object.

  

