Saturday, August 2, 2014

Research papers on unsupervised learning


Large-scale Deep Unsupervised Learning using Graphics Processors

http://robotics.stanford.edu/~ang/papers/icml09-LargeScaleUnsupervisedDeepLearningGPU.pdf



Artificial Intelligence and Machine Learning
http://research.google.com/pubs/ArtificialIntelligenceandMachineLearning.html


Label Partitioning For Sublinear Ranking
http://www.thespermwhale.com/jaseweston/papers/label_partitioner.pdf


Deep Neural Networks for Unsupervised Learning


Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations
http://web.eecs.umich.edu/~honglak/icml09-ConvolutionalDeepBeliefNetworks.pdf

Greedy layer-wise training of deep networks

http://papers.nips.cc/paper/3048-greedy-layer-wise-training-of-deep-networks.pdf

Y Bengio, P Lamblin, D Popovici… - Advances in neural …, 2007 - books.google.com
... Instead, the network without pre-training sees a “random” transformation of the input ...
representationally efficient than shallow ones such as SVMs and one-hidden-layer neural nets.
We study Deep Belief Networks applied to supervised learning tasks, and the principles that ...

Building high-level features using large scale unsupervised learning

http://research.google.com/pubs/pub38115.html

QV Le - Acoustics, Speech and Signal Processing (ICASSP), …, 2013 - ieeexplore.ieee.org
... Fig. 3. Visualization of the cat face neuron (top left) and human body neuron (top right), and
top stimuli for some of the neurons in the network (bottom). ... [9] DC Ciresan, U. Meier, LM
Gambardella, and J. Schmidhuber, "Deep big simple neural nets excel on handwritten digit ...

Unsupervised feature learning for audio classification using convolutional deep belief networks

http://papers.nips.cc/paper/3674-unsupervised-feature-learning-for-audio-classification-using-convolutional-deep-belief-networks.pdf


A unified architecture for natural language processing: Deep neural networks with multitask learning

R Collobert, J Weston - … international conference on Machine learning, 2008 - dl.acm.org
... help develop a unified architecture which would presumably be necessary for deeper semantic
tasks ... This is achieved by training a deep neural network, building upon work by (Bengio & ... We
define a rather general convolutional network architecture and describe its application ...

[HTML] Sparse feature learning for deep belief networks

Y Boureau, YL Cun - … in neural information processing systems, 2008 - papers.nips.cc
... A future avenue of work is to understand the reasons for this “coincidence”, and deeper
connections between these two strategies. ... Reducing the dimensionality of data with neural
networks. Science, 313(5786):504–507, 2006. ... Greedy layer-wise training of deep networks...

[PDF] An analysis of single-layer networks in unsupervised feature learning

A Coates, AY Ng, H Lee - International Conference on …, 2011 - machinelearning.wustl.edu
... algorithms to build single-layer models that are composed to build deeper structures. ... Neural
Network [16] 93.4% (6.6%) Deep Boltzmann Machine [26] 92.8% (7.2%) Deep Belief Network
[20] 95.0% (5.0%) (Best result of [11]) 94.4% (5.6%) Deep neural network [27] 97.13 ...

[HTML] Sparse deep belief net model for visual area V2

H Lee, C Ekanadham, AY Ng - Advances in neural information …, 2008 - papers.nips.cc
... study that is done in a similar spirit, only extending the comparisons to a deeper area in ... While
the former responses suggest a simple linear computation of V1 neural responses, the latter
responses ... 6 Conclusions We presented a sparse variant of the deep belief network model. ...


Neural Networks and the Backpropagation Algorithm
Posted on December 9, 2012 by j2kun
Neurons, as an Extension of the Perceptron Model
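The linked post builds neural networks up from the perceptron model. As a reminder of that starting point, here is a minimal sketch of the classic perceptron learning rule (pure Python; illustrative code, not taken from the post):

```python
# Minimal perceptron: learns a linearly separable rule (logical AND).
def predict(w, b, x):
    # Step activation: fire iff the weighted sum clears the threshold.
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            err = target - predict(w, b, x)
            # Perceptron rule: nudge weights toward misclassified inputs.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # AND truth table: [0, 0, 0, 1]
```

Backpropagation, the subject of the post, generalizes this idea to many layers by replacing the step activation with a differentiable one and propagating the error gradient backward.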

Google DeepMind Code
iamtrask
How to Code and Understand DeepMind's Neural Stack Machine
Learning to Transduce with Unbounded Memory
Posted by iamtrask on February 25, 2016
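The Grefenstette et al. paper behind this tutorial makes a stack differentiable by letting push and pop be fractional strengths rather than discrete operations. A stripped-down sketch of that idea (scalar values and hand-set strengths; the paper uses vector values and learns the strengths with an RNN controller):

```python
# Continuous ("neural") stack in the spirit of "Learning to Transduce
# with Unbounded Memory": every entry carries a strength in [0, 1],
# so pushes and pops can be partial and the whole structure stays
# differentiable end to end.

def step(values, strengths, v, d, u):
    """One update: pop with strength u, then push value v with strength d."""
    # Pop: remove strength u from the top downward.
    remaining = u
    new_s = []
    for s in reversed(strengths):
        take = min(s, remaining)
        remaining -= take
        new_s.append(s - take)
    strengths = list(reversed(new_s))
    return values + [v], strengths + [d]

def read(values, strengths):
    """Read: blend values from the top until total weight 1 is consumed."""
    budget, out = 1.0, 0.0
    for v, s in zip(reversed(values), reversed(strengths)):
        w = min(s, budget)
        out += w * v
        budget -= w
        if budget <= 0:
            break
    return out

vals, strs = [], []
vals, strs = step(vals, strs, 3.0, d=1.0, u=0.0)  # push 3 fully
vals, strs = step(vals, strs, 5.0, d=0.5, u=0.0)  # push 5 at half strength
print(read(vals, strs))  # 0.5*5 + 0.5*3 = 4.0
```

Because the read mixes the top entries in proportion to their strengths, gradients can flow through `d` and `u`, which is what lets the controller in the paper learn when to push and pop.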



Higher Order Cognition using Computers: Learning Abstract Concepts with Recursive Graph-based Self Organizing Maps
http://s3-eu-west-1.amazonaws.com/braintree-ai/pdf/Higher_Order_Cognition_using_Computers_AlifeXV.pdf

Peter J. Bentley¹,², Alexander Kurashov¹ and Soo Ling Lim¹,²
¹ Braintree Limited, London, United Kingdom
² Department of Computer Science, University College London, United Kingdom
p.bentley@cs.ucl.ac.uk

Abstract

Abstract concepts are rules about relationships such as identity or sameness. Instead of learning that specific objects belong to specific categories, the abstract concept of same/different applies to any objects that an organism might encounter, even if those objects have never been seen before. In this paper we investigate learning of abstract concepts by computer, in order to recognize same/different in novel data never seen before. To do so, we integrate recursive self-organizing maps with the data they are processing into a single graph to enable a brain-like self-adaptive learning system. We perform experiments on simple same/different datasets designed to resemble those used in animal experiments and then show an example of a practical application of same/different learning using the approach.
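For context, the paper's recursive graph-based approach builds on the standard self-organizing map, in which the best-matching node and its grid neighbors move toward each input so that nearby nodes come to represent similar data. A minimal sketch of that basic SOM update (not the paper's recursive variant; all parameters here are illustrative):

```python
import random

def bmu(nodes, x):
    """Index of the best-matching unit: the node nearest to input x."""
    return min(range(len(nodes)),
               key=lambda i: sum((w - xi) ** 2 for w, xi in zip(nodes[i], x)))

def train_som(data, n_nodes=4, epochs=50, lr=0.5, radius=1):
    random.seed(0)
    nodes = [[random.random() for _ in data[0]] for _ in range(n_nodes)]
    for _ in range(epochs):
        for x in data:
            winner = bmu(nodes, x)
            for i in range(n_nodes):
                if abs(i - winner) <= radius:  # 1-D grid neighborhood
                    # Move the winner and its neighbors toward the input.
                    nodes[i] = [w + lr * (xi - w) for w, xi in zip(nodes[i], x)]
    return nodes

# Two well-separated clusters: the map assigns them to different nodes.
data = [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.0]]
nodes = train_som(data)
print(bmu(nodes, [0.0, 0.0]) != bmu(nodes, [1.0, 1.0]))  # True
```

The paper's contribution is to make such maps recursive and graph-based, feeding a map's own responses back in as input so that relational concepts like same/different, rather than fixed categories, can be learned.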

Peter J. Bentley, Alexander Kurashov, and Soo Ling Lim (2016). Higher Order Cognition using Computers: Learning Abstract Concepts with Recursive Graph-based Self Organizing Maps. International Conference on the Synthesis and Simulation of Living Systems (ALIFE), in press.
From Complex Adaptive Systems

Proceedings of the Artificial Life Conference 2016
https://mitpress.mit.edu/sites/default/files/titles/free_download/9780262339360_ALIFE_2016.pdf





