Monday, August 4, 2014

DEEP LEARNING

Deep Learning and Unsupervised Feature Learning
ANDREW NG
http://cs.stanford.edu/people/ang/?portfolio=deep-learning-and-unsupervised-feature-learning

Unsupervised Feature Learning and Deep Learning
http://ufldl.stanford.edu/wiki/index.php/UFLDL_Tutorial
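The UFLDL tutorial builds feature learning up from sparse autoencoders trained with backpropagation. As a rough companion sketch (not code from the tutorial, which uses MATLAB), here is one gradient step for a plain single-hidden-layer autoencoder in numpy; the layer sizes, learning rate, and squared-error loss are illustrative assumptions, and the sparsity penalty the tutorial adds is omitted:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 64-dim inputs, 25 hidden units (not prescribed by the tutorial).
rng = np.random.default_rng(0)
n_in, n_hid, lr = 64, 25, 0.1
W1 = rng.normal(0, 0.1, (n_hid, n_in)); b1 = np.zeros(n_hid)   # encoder weights
W2 = rng.normal(0, 0.1, (n_in, n_hid)); b2 = np.zeros(n_in)    # decoder weights

def autoencoder_step(x):
    """One SGD step minimizing squared reconstruction error for one example."""
    global W1, b1, W2, b2
    h = sigmoid(W1 @ x + b1)          # encode
    x_hat = sigmoid(W2 @ h + b2)      # decode (reconstruct the input)
    # Backprop through the loss 0.5*||x_hat - x||^2.
    d_out = (x_hat - x) * x_hat * (1 - x_hat)
    d_hid = (W2.T @ d_out) * h * (1 - h)
    W2 -= lr * np.outer(d_out, h); b2 -= lr * d_out
    W1 -= lr * np.outer(d_hid, x); b1 -= lr * d_hid
    return 0.5 * np.sum((x_hat - x) ** 2)

loss = autoencoder_step(rng.random(n_in))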


https://www.youtube.com/results?search_query=unsupervised+deep+learning
Andrew Ng: Deep Learning, Self-Taught Learning and Unsupervised Feature Learning
Deep Learning London Meetup: Brains, Data, Machine Intelligence & Cortical Learning with Jeff Hawkins
Geoff Hinton - Recent Developments in Deep Learning

Yann LeCun tutorial on deep learning
http://www.cs.nyu.edu/~yann/talks/lecun-ranzato-icml2013.pdf


MATLAB

Neural Network Toolbox™ User’s Guide, R2014a
Mark Hudson Beale, Martin T. Hagan, Howard B. Demuth

RNN

RNNLM Toolkit: Recurrent Neural Network language modeling toolkit (C)
http://rnnlm.org/
Hands-on tutorial on the RNNLM toolkit
http://research.microsoft.com/apps/video/default.aspx?id=172643


Miscellaneous Code for Neural Networks, Reinforcement Learning, and Other Fun Stuff (matlab)
http://www.cs.colostate.edu/~anderson/code/

Layer recurrent neural network
http://www.mathworks.com/help/nnet/ref/layrecnet.html
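layrecnet is MATLAB's layer-recurrent (Elman-style) network. For readers without the toolbox, a minimal numpy sketch of the forward pass of one such layer is below; the tanh activation, weight shapes, and sizes are illustrative assumptions, not the toolbox internals:

import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 3, 8, 2                # illustrative sizes
Wxh = rng.normal(0, 0.3, (n_hid, n_in))     # input -> hidden
Whh = rng.normal(0, 0.3, (n_hid, n_hid))    # feedback from the layer's previous output
Why = rng.normal(0, 0.3, (n_out, n_hid))    # hidden -> output

def rnn_forward(xs):
    """Run a sequence of input vectors through one recurrent layer."""
    h = np.zeros(n_hid)
    ys = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h)      # hidden state carries context across time steps
        ys.append(Why @ h)
    return np.array(ys)

outputs = rnn_forward([rng.random(n_in) for _ in range(5)])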

A tutorial on training recurrent neural networks, covering BPTT, RTRL, EKF and the "echo state network" approach
Herbert Jaeger
Fraunhofer Institute for Autonomous Intelligent Systems (AIS)
since 2003: International University Bremen

http://minds.jacobs-university.de/sites/default/files/uploads/papers/ESNTutorialRev.pdf
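The tutorial's echo state network idea is that the recurrent reservoir is generated randomly and left untrained, and only a linear readout is fit. A compact numpy sketch under that assumption (reservoir size, spectral-radius scaling, ridge parameter, and the toy prediction task are all illustrative):

import numpy as np

rng = np.random.default_rng(2)
n_in, n_res = 1, 100                                  # illustrative sizes
Win = rng.uniform(-0.5, 0.5, (n_res, n_in))           # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))            # fixed reservoir weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # scale spectral radius below 1

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence; the reservoir itself is never trained."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(Win @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout with ridge regression on a toy task.
u_seq = rng.uniform(-1, 1, (500, n_in))
targets = np.roll(u_seq[:, 0], 1)                     # e.g. predict the previous input
X = run_reservoir(u_seq)
ridge = 1e-6
Wout = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ targets)
pred = X @ Wout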

Context Dependent Recurrent Neural Network Language Model
Tomas Mikolov
Brno University of Technology
Czech Republic
Geoffrey Zweig
Microsoft Research
Redmond, WA USA
http://research.microsoft.com/pubs/176926/rnn_ctxt.pdf
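The paper extends a basic recurrent language model by feeding an extra context vector (for example an LDA topic distribution) into both the hidden and output layers. Below is a rough numpy sketch of one forward step along those lines, with the LDA part omitted; vocabulary, hidden, and context sizes are illustrative, and this is a paraphrase of the architecture rather than the authors' code:

import numpy as np

rng = np.random.default_rng(3)
V, H, C = 1000, 64, 20           # vocab, hidden, context sizes (illustrative)
U = rng.normal(0, 0.1, (H, V))   # input word -> hidden
W = rng.normal(0, 0.1, (H, H))   # recurrent hidden -> hidden
F = rng.normal(0, 0.1, (H, C))   # context vector -> hidden
Vo = rng.normal(0, 0.1, (V, H))  # hidden -> output
G = rng.normal(0, 0.1, (V, C))   # context vector -> output

def step(word_id, h_prev, f):
    """One forward step: distribution over the next word given history and context vector f."""
    x = np.zeros(V); x[word_id] = 1.0
    h = 1.0 / (1.0 + np.exp(-(U @ x + W @ h_prev + F @ f)))   # sigmoid hidden state
    logits = Vo @ h + G @ f
    p = np.exp(logits - logits.max()); p /= p.sum()           # softmax
    return h, p

h, p = step(42, np.zeros(H), rng.dirichlet(np.ones(C)))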

Google search: finding latent semantics with recurrent neural networks
https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=finding%20latent%20semantic%20with%20Recurrent%20neural%20networks

Google search: RNN matlab
https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=RNN%20matlab

Word2Vec implementation in DL4J
http://deeplearning4j.org/word2vec.html
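Word2Vec, which DL4J wraps, learns word vectors by predicting nearby words. Below is a minimal numpy sketch of a single skip-gram update with negative sampling to show the core idea; it is not the DL4J API, and the vector dimension, learning rate, and sampling scheme are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(4)
vocab_size, dim, lr = 50, 16, 0.05               # illustrative sizes
W_in = rng.normal(0, 0.1, (vocab_size, dim))     # "input" word vectors (the embeddings)
W_out = rng.normal(0, 0.1, (vocab_size, dim))    # "output" (context) vectors

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def skipgram_step(center, context, n_neg=5):
    """One negative-sampling update: pull (center, context) together, push negatives apart."""
    negatives = rng.integers(0, vocab_size, n_neg)
    v = W_in[center]
    grad_v = np.zeros(dim)
    for idx, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        u = W_out[idx]
        g = sigmoid(v @ u) - label               # prediction error for this pair
        grad_v += g * u
        W_out[idx] -= lr * g * v
    W_in[center] -= lr * grad_v

skipgram_step(center=3, context=7)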

Deep Learning Videos


Geoff Hinton - Recent Developments in Deep Learning
May 30, 2015
https://www.youtube.com/watch?v=vShMxxqtDDs

Deep Learning: The Theoretician's Nightmare or Paradise? (LeCun, NYU, August 2012)

Beyond Deep Learning – 3rd Generation Neural Nets
3rd Gen Spiking Neural Nets (SNNs)
Posted by William Vorhies on October 4, 2016
Summary: If Deep Learning is powered by 2nd-generation neural nets, what will the 3rd generation look like? What new capabilities does that imply, and when will it get here?

http://www.datasciencecentral.com/profiles/blogs/beyond-deep-learning-3rd-generation-neural-nets

By far the fastest-expanding frontier of data science is AI, and specifically the rapid advances in Deep Learning.  Advances in Deep Learning have depended on artificial neural nets, and especially Convolutional Neural Nets (CNNs).  In fact, our use of the word “deep” in Deep Learning refers to the fact that CNNs have large numbers of hidden layers.  Microsoft recently won the annual ImageNet competition with a CNN comprising 152 layers.  Compare that with the 2, 3, or 4 hidden layers that are still typical when we use ordinary back-prop NNs for traditional predictive analytic problems.
Two things are happening. 
  • First, CNNs have come close to achieving 100% accuracy for image, speech, and text recognition.  Now that there are industrial-strength platforms for CNNs, we are in the age of exploitation, where these features are rapidly being incorporated into the apps we use every day.
  • Second, we are rapidly recognizing the limitations of CNNs, which are 2nd-generation neural nets, and we’re ready to move on to 3rd-generation and eventually 4th-generation neural nets.  Needless to say, 3rd-gen NNs didn’t get started yesterday.  It’s research that has been ongoing for some years and will still take a year or three to become mainstream.  We want to share a little of what you can expect.
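For a concrete feel for what "spiking" means, here is a toy leaky integrate-and-fire neuron in numpy: the membrane potential leaks toward rest, integrates input current, and emits a discrete spike when it crosses a threshold. All constants are illustrative and not taken from the article:

import numpy as np

# Toy leaky integrate-and-fire neuron; parameters are illustrative only.
dt, tau, v_rest, v_thresh, v_reset = 1.0, 20.0, 0.0, 1.0, 0.0
steps = 200
rng = np.random.default_rng(5)
input_current = rng.uniform(0.0, 0.15, steps)   # random input drive per time step

v = v_rest
spike_times = []
for t in range(steps):
    # Leak toward rest, integrate the input current.
    v += dt * (-(v - v_rest) / tau + input_current[t])
    if v >= v_thresh:            # threshold crossing -> emit a spike, reset the potential
        spike_times.append(t)
        v = v_reset

print(f"{len(spike_times)} spikes, first few at steps {spike_times[:5]}")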

Richard Socher

2016 Stanford CS224d: Deep Learning for NLP
YouTube 15 videos 18 hours

Andrej Karpathy


2016 Stanford CS231n: Convolutional Neural Networks for Visual Recognition
https://www.youtube.com/playlist?list=PLlJy-eBtNFt6EuMxFYRiNRS07MCWN5UIA
https://www.youtube.com/playlist?list=PLLvH2FwAQhnpj1WEB-jHmPuUeQ8mX-XXG
CS231n: Convolutional Neural Networks for Visual Recognition
http://cs231n.stanford.edu/syllabus.html
https://github.com/cs231n/cs231n.github.io
YouTube 19 videos

Deep Learning textbook by Ian Goodfellow (OpenAI), Yoshua Bengio, and Aaron Courville (MIT Press)
http://www.deeplearningbook.org/
Free HTML rendition







