Deep Learning

I worked in the field of artificial intelligence during the "AI winter" (a backlash against overly optimistic predictions of achieving "real AI"), and to this day I avoid getting too optimistic about huge short-term gains in our field. That said, in the last several months a few things have been stirring up my old optimism!

I have been enjoying Geoffrey Hinton's Coursera course Neural Networks for Machine Learning, and I was pleased to see a front-page New York Times article this morning on deep learning. Another really nice reference for deep learning is a very large PDF/viewgraph presentation by Richard Socher, Yoshua Bengio, and Chris Manning.

Another very good resource is the Deep Learning Tutorial that provides the theory, math, and working Python example code.

Deep neural networks have many hidden layers and have traditionally been difficult to train.

In addition to very fast processors (graphics chipsets), a very neat engineering trick is pre-training the weights of deep networks by stacking Restricted Boltzmann Machines and similar models. After pre-training, the weights can be fine-tuned using backpropagation.
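The stacking idea can be sketched in a few lines of NumPy: train one RBM on the data with one step of contrastive divergence (CD-1), feed its hidden activations into the next RBM, and keep the learned weight matrices as the initialization for backpropagation. The layer sizes, learning rate, and toy data below are all my own assumptions, just to make the greedy layer-wise procedure concrete.

```python
# A minimal sketch of greedy layer-wise RBM pre-training (CD-1).
# Layer sizes, learning rate, and toy data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0, lr=0.1):
        # One step of contrastive divergence: up, down, up again,
        # then nudge weights toward the data statistics.
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        n = len(v0)
        self.W += lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (h0 - h1).mean(axis=0)

# Toy binary data: 200 samples, 8 visible units
data = (rng.random((200, 8)) < 0.3).astype(float)

# Greedily pre-train a stack of RBMs: 8 -> 6 -> 4 units
layer_sizes = [8, 6, 4]
rbms = []
x = data
for n_v, n_h in zip(layer_sizes[:-1], layer_sizes[1:]):
    rbm = RBM(n_v, n_h)
    for epoch in range(50):
        rbm.cd1_update(x)
    rbms.append(rbm)
    x = rbm.hidden_probs(x)  # hidden activations feed the next RBM

# These weights would initialize a deep net before backprop fine-tuning
pretrained_weights = [r.W for r in rbms]
print([w.shape for w in pretrained_weights])  # [(8, 6), (6, 4)]
```

The point of the trick is in the loop: each RBM is trained unsupervised on the layer below it, so by the time backpropagation starts, the deep network's weights are already in a sensible region instead of random.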

I haven't been this optimistic about (relatively) short term progress in AI since the early 1980s. Hoorah!

Comments

  1. Interesting article. I've discovered a new blog for myself.

  2. Great news.


