Richard Socher and James Hong and Sameep Bagadia and David Dindi and B. Ramsundar and N. Arivazhagan and Qiaojing Yan
http://academictorrents.com/details/dd9b74b50a1292b4b154094b7338ec1d66e8894d
2016 CS224D Lecture Videos
https://www.youtube.com/playlist?list=PLmImxx8Char9Ig0ZHSyTqGsdhb9weEGam
Online Singular Value Decomposition Calculator
http://comnuan.com/cmnn01004/
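For offline experimentation with the same decomposition the calculator performs, NumPy's SVD works on a small co-occurrence matrix; the matrix below is a made-up toy example, not data from the lectures.

```python
import numpy as np

# Toy word-word co-occurrence counts (rows/columns: a hypothetical vocabulary).
X = np.array([
    [0, 2, 1, 0],
    [2, 0, 0, 1],
    [1, 0, 0, 3],
    [0, 1, 3, 0],
], dtype=float)

# Full SVD: X = U * diag(s) * Vt, singular values in descending order.
U, s, Vt = np.linalg.svd(X)

# Keep the top-k components as low-dimensional word vectors.
k = 2
word_vectors = U[:, :k] * s[:k]
print(word_vectors)
```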
ACL 2012 + NAACL 2013 Tutorial: Deep Learning for NLP (without Magic)
http://www.socher.org/index.php/DeepLearningTutorial/DeepLearningTutorial
Richard Socher, Chris Manning and Yoshua Bengio
Slides
- NAACL2013-Socher-Manning-DeepLearning.pdf (22mb) - 204 slides - Updated slides for the NAACL 2013 tutorial
- http://nlp.stanford.edu/courses/NAACL2013/NAACL2013-Socher-Manning-DeepLearning.pdf
- SocherBengioManning-DeepLearning-ACL2012-20120707.pdf (25MB) - 184 slides
Updated Version of Tutorial at NAACL 2013
Videos
- High quality video of the 2013 NAACL tutorial version is up here: http://techtalks.tv/events/312/573/
- A lower quality video of the 2012 ACL version is on YouTube
Reasoning With Neural Tensor Networks
for Knowledge Base Completion
http://nlp.stanford.edu/~socherr/SocherChenManningNg_NIPS2013.pdf
http://wordnet.princeton.edu/
http://en.wikipedia.org/wiki/Markov_random_field
http://nlp.stanford.edu/software/CRF-NER.shtml
softmax in NLP
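A quick reminder of what the softmax referenced above computes (a generic, numerically stable version; an illustrative sketch, not code from any of the linked projects):

```python
import numpy as np

def softmax(scores):
    """Convert a vector of real-valued scores into a probability distribution."""
    shifted = scores - np.max(scores)   # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / exps.sum()

print(softmax(np.array([2.0, 1.0, 0.1])))  # e.g. scores over three labels
```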
Recursive Deep Learning for Natural Language Processing and Computer Vision
Ph.D. dissertation, Department of Computer Science, Stanford University
Richard Socher
August 2014
http://nlp.stanford.edu/~socherr/thesis.pdf
www.socher.org
GloVe: Global Vectors for Word Representation
Stanford NLP
GloVe
http://stanford.edu/~jpennin/papers/glove.pdf
Best word vectors so far? 11% more accurate than word2vec, fast to train, statistically efficient, good task accuracy
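For reference, GloVe fits word and context vectors to log co-occurrence counts with a weighted least-squares objective; the sketch below just evaluates that loss in NumPy on made-up data (x_max and alpha follow the paper's stated defaults, everything else is illustrative).

```python
import numpy as np

def glove_loss(W, W_ctx, b, b_ctx, X, x_max=100.0, alpha=0.75):
    """Weighted least-squares GloVe objective over nonzero co-occurrence counts X."""
    loss = 0.0
    rows, cols = np.nonzero(X)
    for i, j in zip(rows, cols):
        weight = min((X[i, j] / x_max) ** alpha, 1.0)          # f(X_ij)
        diff = W[i] @ W_ctx[j] + b[i] + b_ctx[j] - np.log(X[i, j])
        loss += weight * diff ** 2
    return loss

# Tiny made-up example: 5 words, 3-dimensional vectors.
rng = np.random.default_rng(0)
V, d = 5, 3
X = rng.integers(0, 4, size=(V, V)).astype(float)
print(glove_loss(rng.normal(size=(V, d)), rng.normal(size=(V, d)),
                 np.zeros(V), np.zeros(V), X))
```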
Related Tutorials
• See “Neural Net Language Models” Scholarpedia entry
• Deep Learning tutorials:
http://deeplearning.net/tutorials
• Stanford deep learning tutorials with simple programming assignments and reading list
http://deeplearning.stanford.edu/wiki/
• Recursive Autoencoder class project
http://cseweb.ucsd.edu/~elkan/250B/learningmeaning.pdf
• Graduate Summer School: Deep Learning, Feature Learning
http://www.ipam.ucla.edu/programs/gss2012/
• ICML 2012 Representation Learning tutorial
http://www.iro.umontreal.ca/~bengioy/talks/deep-learning-tutorial-2012.html
• More reading (including tutorial references):
http://nlp.stanford.edu/courses/NAACL2013/
Papers
Parsing Natural Scenes and Natural Language
with Recursive Neural Networks
http://www-nlp.stanford.edu/pubs/SocherLinNgManning_ICML2011.pdf
Recursive Deep Models for Semantic Compositionality
Over a Sentiment Treebank
http://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf
http://www.scalanlp.org/api/breeze/index.html#breeze.linalg.softmax$
http://en.wikipedia.org/wiki/Natural_language_processing
http://en.wikipedia.org/wiki/Natural_language_understanding
http://en.wikipedia.org/wiki/Mathematica
http://en.wikipedia.org/wiki/CUDA
http://blog.wolfram.com/2010/11/15/the-free-form-linguistics-revolution-in-mathematica/
http://www.wolfram.com/language/?source=nav
Deeply Moving: Deep Learning for Sentiment Analysis
http://nlp.stanford.edu/sentiment/
Stanford Named Entity Recognizer (NER)
http://nlp.stanford.edu/software/CRF-NER.shtml
Stanford NER is also known as CRFClassifier. The software provides a general implementation of (arbitrary order) linear chain Conditional Random Field (CRF) sequence models
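Stanford NER itself is a Java tool; as a language-agnostic illustration of what a linear chain CRF sequence model does at prediction time, here is a minimal Viterbi decoder over toy per-token and transition scores (a sketch of the general technique, not the CRFClassifier API).

```python
import numpy as np

def viterbi(emission, transition):
    """Most likely label sequence for a linear-chain model.

    emission:   (n_tokens, n_labels) per-token label scores (log potentials)
    transition: (n_labels, n_labels) score of moving from label i to label j
    """
    n_tokens, n_labels = emission.shape
    score = emission[0].copy()                    # best score ending in each label
    backptr = np.zeros((n_tokens, n_labels), dtype=int)
    for t in range(1, n_tokens):
        cand = score[:, None] + transition + emission[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    labels = [int(score.argmax())]
    for t in range(n_tokens - 1, 0, -1):          # trace the best path backwards
        labels.append(int(backptr[t, labels[-1]]))
    return labels[::-1]

# Toy example: 3 tokens, labels {0: O, 1: PERSON}.
emission = np.array([[0.1, 2.0], [1.5, 0.2], [0.3, 1.0]])
transition = np.array([[0.5, -0.1], [0.0, 1.0]])
print(viterbi(emission, transition))
```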
Google Word2Vec
https://code.google.com/p/word2vec/
http://www.i-programmer.info/news/105-artificial-intelligence/6264-machine-learning-applied-to-natural-language.html
Representing words as high dimensional vectors
https://plus.google.com/+ResearchatGoogle/posts/VwBUvQ7PvnZ
Efficient Estimation of Word Representations in Vector Space(http://goo.gl/ZvBp8F)
http://arxiv.org/pdf/1301.3781.pdf
http://radimrehurek.com/2014/02/word2vec-tutorial/
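The gensim tutorial linked above boils down to a few lines, roughly like the sketch below (the toy corpus is made up, and parameter names vary across gensim versions, e.g. size vs. vector_size).

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (made up for illustration).
sentences = [
    ["deep", "learning", "for", "nlp"],
    ["word", "vectors", "capture", "meaning"],
    ["deep", "learning", "uses", "word", "vectors"],
]

# Train a small skip-gram model; parameter names follow recent gensim
# releases (older releases use size= instead of vector_size=).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Query the learned vectors.
print(model.wv["word"][:5])
print(model.wv.most_similar("deep", topn=3))
```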
all things numenta and cortical -
The Path to Machine Intelligence
http://numenta.com/assets/pdf/whitepapers/Numenta%20-%20Path%20to%20Machine%20Intelligence%20White%20Paper.pdf
http://numenta.com/assets/pdf/whitepapers/hierarchical-temporal-memory-cortical-learning-algorithm-0.2.1-en.pdf
PROPERTIES OF SPARSE DISTRIBUTED REPRESENTATIONS
And Their Application To HTM
(DRAFT)
SUBUTAI AHMAD AND JEFF HAWKINS
NUMENTA TECHNICAL REPORT
NTA-2014-01
OCTOBER 28, 2014
©Numenta, Inc. 2014
http://numenta.com/assets/pdf/whitepapers/SDR_Properties%20draft%2010-28-14.pdf
Numenta open source project NuPIC
http://numenta.org/
Numenta · GitHub
https://github.com/numenta
https://github.com/numenta/nupic/wiki/Using-NuPIC
NuPIC NLP
https://github.com/numenta/nupic/wiki/Natural-Language-Processing
NuPIC is not currently tuned for NLP, but it should be capable of some basic NLP functions. If letters are used as categories, it should be able to recognize common word and sentence structures. However, without a hierarchy, it will not be able to form a deep understanding of input text, because it is limited to one small region of the brain within its model.
Even with this limitation, some interesting experiments could still be performed. For example, words could be encoded into SDRs externally, through the cortical.io API, and fed directly into the CLA using a "pass-through" encoder.
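A rough sketch of that experiment: fetch a word fingerprint (SDR) from an external service and hand the raw bits through a pass-through style encoder. All of the glue below is hypothetical (the endpoint URL, the get_fingerprint helper, and the encoder stand-in are placeholders, not the real cortical.io or NuPIC APIs).

```python
import requests
import numpy as np

API_URL = "https://example-cortical-api/terms"   # placeholder, not the real endpoint

def get_fingerprint(word, sdr_size=16384):
    """Hypothetical helper: ask an external service for a word's SDR bit positions."""
    resp = requests.get(API_URL, params={"term": word})
    resp.raise_for_status()
    positions = resp.json()["positions"]          # assumed response shape
    sdr = np.zeros(sdr_size, dtype=np.uint8)
    sdr[positions] = 1
    return sdr

def pass_through(sdr):
    """Stand-in for a pass-through encoder: the bits are already an SDR,
    so they are forwarded to the model unchanged."""
    return sdr

# word_sdr = pass_through(get_fingerprint("fox"))
# model.run({"token": word_sdr})                  # hypothetical CLA model call
```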
https://github.com/numenta/nupic/wiki/Encoders
https://www.youtube.com/watch?v=3gjVVNPnPYA&feature=youtu.be&t=2m40s
Arbitrary names converted into an SDR
http://comments.gmane.org/gmane.comp.ai.nupic/757
[nupic-discuss] How are SDRs created in higher layers?
http://comments.gmane.org/gmane.comp.ai.nupic/3969
Online Prediction Framework OPF
Online Prediction Framework (OPF) is a framework for working with and deriving predictions from online learning algorithms, including Numenta’s Cortical Learning Algorithm (CLA). OPF is designed to work in conjunction with a larger architecture, as well as in a standalone mode (i.e. directly from the command line). It is also designed such that new model algorithms and functionalities can be added with minimal code changes.
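Running an OPF model usually looks roughly like the sketch below, patterned after NuPIC's example applications; the import path and field names here are assumptions and the parameter dictionary is left out, so treat this as an outline rather than a verified recipe.

```python
# Approximate OPF usage, in the style of NuPIC's example apps.
from nupic.frameworks.opf.model_factory import ModelFactory  # module path differs in older releases

model_params = {}   # placeholder: a full OPF model parameter dictionary is required here

model = ModelFactory.create(model_params)
model.enableInference({"predictedField": "consumption"})   # field name is illustrative

# Feed records one at a time and read out the prediction for the next step.
for record in [{"consumption": 5.3}, {"consumption": 5.5}, {"consumption": 5.1}]:
    result = model.run(record)
    print(result.inferences.get("multiStepBestPredictions"))
```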
all things cortical
http://www.cortical.io/contexts.html
Retina can be found in the Information Retrieval literature under the name of Word Space. This was first described by Hinrich Schütze; see also his Distributional Semantics work.
Word Space (1993)
by Hinrich Schütze
Advances in Neural Information Processing Systems 5
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.8856
http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=162AD9E06D0E3F582827B36498DF356D?doi=10.1.1.41.8856&rep=rep1&type=pdf
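The Word Space idea is essentially representing each word by the words it co-occurs with; here is a toy construction of such a co-occurrence space (illustrative corpus and window size, not Schütze's original setup):

```python
from collections import defaultdict

# Toy corpus (made up); each word is represented by counts of nearby words.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
]
window = 2

cooc = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    for i, word in enumerate(sentence):
        for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
            if j != i:
                cooc[word][sentence[j]] += 1

# "cat" and "dog" end up with similar context profiles.
print(dict(cooc["cat"]))
print(dict(cooc["dog"]))
```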
Hinrich Schütze
http://www.web-geek.com/Computers/Artificial_Intelligence/People.html
Collaborated with Stanford NLP (Christopher Manning).
Manning, Christopher - Stanford University. Probabilistic parsing, grammar induction, text categorization and clustering, electronic dictionaries, information extraction and presentation, and linguistic typology.
Schütze, Hinrich - Stanford University. Statistical NLP, text mining; co-author of "Foundations of Statistical Natural Language Processing" with Christopher Manning.
Magnus Sahlgren's dissertation, The Word-Space Model
http://su.diva-portal.org/smash/get/diva2:189276/FULLTEXT01
http://web.stanford.edu/~jpennin/papers/glove.pdf
GloVe: Global Vectors for Word Representation
Jeffrey Pennington, Richard Socher, Christopher D. Manning
Computer Science Department, Stanford University, Stanford, CA 94305
jpennin@stanford.edu, richard@socher.org, manning@stanford.edu
Evaluation methodology: compares GloVe against word2vec
On the importance of comparing apples to apples: a case study using the GloVe model
Yoav Goldberg, 10 August 2014
https://docs.google.com/document/d/1ydIujJ7ETSZ688RGfU5IMJJsbxAi-kRl8czSwpti15s/mobilebasic?pli=1
all things R
Split the Elements of a Character Vector
https://stat.ethz.ch/R-manual/R-devel/library/base/html/strsplit.html
Implications of the NuPIC Geospatial Encoder
http://inbits.com/2014/08/implications-of-the-geospatial-encoder/
SUN, OCT 07, 2012
Wait, The Brain Is A Bloom Filter? - @Petrillic
Ian Danforth
Engineering
http://numenta.com/blog/wait-the-brain-is-a-bloom-filter.html
http://numenta.com/assets/pdf/whitepapers/SDR_Properties%20draft%2010-28-14.pdf
References from the Numenta 2014 SDR whitepaper (section 5, page 24):
[15] Olshausen, Bruno A., and David J. Field. "Sparse coding with an overcomplete basis set: A strategy employed by V1." Vision Research 37.23 (1997): 3311-3325.
[16] Olshausen, Bruno A., and David J. Field. "Sparse coding of sensory inputs." Current Opinion in Neurobiology 14.4 (2004): 481-487.
[17] Tibshirani, Robert. "Regression shrinkage and selection via the lasso." Journal of the Royal Statistical Society, Series B (Methodological) (1996): 267-288.
[18] Vinje, William E., and Jack L. Gallant. "Sparse coding and decorrelation in primary visual cortex during natural vision." Science 287.5456 (2000): 1273-1276.
numenta open source SDR
IS OUR NEOCORTEX A GIANT SEMANTIC BLOOM FILTER? OF NATURAL INTELLIGENCE, MACHINE LEARNING & JEFF HAWKINS
http://doubleclix.wordpress.com/2013/04/14/is-our-neocortex-a-giant-semantic-bloom-filter-of-natural-intelligence-machine-learning-jeff-hawkins/
http://doubleclix.wordpress.com/category/machine-learning/
Felix Andrews
Hackathon demo: cortical.io encoder
Last weekend I joined Numenta’s Fall 2014 Hackathon. A fantastic event. It underscores Numenta’s approach of being totally open with their work and supportive of the community.
http://www.neurofractal.org/felix/
Towards exhaustive pairwise matching in large image collections
K. Srijan, 2012, Association for Computing Machinery
http://dl.acm.org/citation.cfm?id=2403335
Search snippet mentions Michael Mitzenmacher's "Compressed Bloom Filters" (IEEE/ACM) and bio-inspired models of cortical processing of retinal signals - Bloom Filter
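Since both posts lean on the Bloom filter analogy, here is a minimal Bloom filter in plain Python to make the comparison concrete (a standard textbook construction, not code from either post).

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: set membership with false positives, no false negatives."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [0] * size

    def _positions(self, item):
        # Derive several bit positions by hashing the item with different seeds.
        for seed in range(self.num_hashes):
            digest = hashlib.sha256(f"{seed}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("neocortex")
print("neocortex" in bf)   # True
print("cerebellum" in bf)  # almost certainly False
```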
Cortical concepts, theory and technology
http://www.cortical.io/contexts_representations.html
http://cortical-io.uservoice.com/knowledgebase/articles/346291-introduction-to-cortical-io-s-semantic-fingerprint