Thought this was cool: Resources about deep learning
This post is not finished; I will add more resources in the future.
Restricted Boltzmann Machine
- A Practical Guide to Training Restricted Boltzmann Machines. This article helps you implement an RBM.
- A Fast Learning Algorithm for Deep Belief Nets. How to train a deep network with stacked RBMs.
- Biasing Restricted Boltzmann Machines to Manipulate Latent Selectivity and Sparsity. How to make an RBM sparse so that each hidden unit represents a simple feature.
- Sparse deep belief net model for visual area V2. How to train a sparse RBM.
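To give a feel for what "training an RBM" involves, here is a minimal sketch of one-step contrastive divergence (CD-1), the update rule the practical guide above describes. The toy data, layer sizes, and learning rate are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary-binary RBM trained with one step of contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def cd1_update(self, v0):
        # Positive phase: hidden probabilities given the data
        h0 = sigmoid(v0 @ self.W + self.b_h)
        h_sample = (rng.random(h0.shape) < h0).astype(float)
        # Negative phase: one step of Gibbs sampling back to the visible layer
        v1 = sigmoid(h_sample @ self.W.T + self.b_v)
        h1 = sigmoid(v1 @ self.W + self.b_h)
        # Approximate gradient: <v h>_data - <v h>_model
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)   # reconstruction error

# Toy data: two repeated binary patterns
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)
rbm = RBM(n_visible=4, n_hidden=2)
for epoch in range(500):
    err = rbm.cd1_update(data)
print("reconstruction error:", round(err, 4))
```

The reconstruction error is only a rough progress signal, not the quantity CD actually optimizes; the practical guide above discusses better ways to monitor training.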
RBM in Text Mining
- Replicated Softmax: an Undirected Topic Model. This paper discusses how to use the softmax activation function to model a multinomial distribution.
- Training Restricted Boltzmann Machines on Word Observations. The softmax has high time complexity; this paper discusses how to improve its performance.
- A Neural Autoregressive Topic Model. This paper considers the order of words in a document: given the first n - 1 words, it predicts the n-th word. The model performs well.
- Its idea comes from The Neural Autoregressive Distribution Estimator, a paper that introduces how to convert an RBM into a Bayesian network.
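The "softmax models a multinomial" idea behind Replicated Softmax can be illustrated in a few lines: a softmax over the vocabulary turns arbitrary scores into word probabilities, and a document of N words is then one draw from the resulting multinomial. The vocabulary and scores below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def softmax(scores):
    # Subtract the max before exponentiating for numerical stability
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

# Hypothetical vocabulary and unnormalized scores for one "topic"
vocab = ["deep", "learning", "rbm", "topic", "model"]
scores = np.array([2.0, 1.5, 0.5, 0.2, 0.1])

p = softmax(scores)              # word distribution, sums to 1
doc = rng.multinomial(100, p)    # a 100-word document as word counts

for w, prob, count in zip(vocab, p, doc):
    print(f"{w:10s} p={prob:.3f} count={count}")
```

In Replicated Softmax the scores come from the hidden units of an RBM and the softmax is shared across all word positions, which is also why its time complexity grows with the vocabulary size.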
Sparse Coding
The sparse coding algorithm consists of two alternating steps:
- Given the basis, learn the sparse representations of the samples.
- Given the sparse representations of the samples, learn the basis.
The first step is a quadratic optimization problem under L1 regularization, and it is very time consuming. The following methods have been proposed to solve this problem:
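The two alternating steps can be sketched in plain NumPy. This is only an illustration under assumptions of my own: ISTA (proximal gradient with soft thresholding) for the L1-regularized step, a least-squares fit with column renormalization for the basis step, and arbitrary sizes and hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def ista_codes(X, B, lam=0.1, n_iter=100):
    """Step 1: given basis B, solve min_S 0.5*||X - B S||^2 + lam*||S||_1
    with ISTA (gradient step on the quadratic part, then soft thresholding)."""
    L = np.linalg.norm(B, 2) ** 2          # Lipschitz constant of the gradient
    S = np.zeros((B.shape[1], X.shape[1]))
    for _ in range(n_iter):
        G = B.T @ (B @ S - X)              # gradient of the quadratic part
        S = S - G / L
        S = np.sign(S) * np.maximum(np.abs(S) - lam / L, 0.0)  # soft threshold
    return S

def update_basis(X, S):
    """Step 2: given codes S, fit the basis by least squares, then renormalize
    the columns so the L1 penalty stays meaningful."""
    B = X @ S.T @ np.linalg.pinv(S @ S.T)
    return B / np.maximum(np.linalg.norm(B, axis=0), 1e-8)

# Toy problem: 20-dimensional samples, an overcomplete basis of 30 atoms
X = rng.normal(size=(20, 100))
B = rng.normal(size=(20, 30))
B /= np.linalg.norm(B, axis=0)
for _ in range(10):                        # alternate the two steps
    S = ista_codes(X, B)
    B = update_basis(X, S)
print("mean active coefficients per sample:", (np.abs(S) > 1e-6).sum(0).mean())
```

The inner ISTA loop is exactly the expensive part the post mentions: it runs many iterations per sample, which is why faster solvers for this L1-regularized quadratic problem have been proposed.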
Auto Encoder
An auto encoder is a neural network that tries to reconstruct its input at the output layer.
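A minimal sketch of that idea, assuming a single sigmoid hidden layer trained by gradient descent on squared reconstruction error; the 4-2-4 sizes, toy one-hot data, and learning rate are choices of mine, not from any of the papers above:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: 4-dimensional one-hot inputs, compressed through 2 hidden units
X = np.eye(4)
W1 = rng.normal(0, 0.5, (4, 2)); b1 = np.zeros(2)   # encoder weights
W2 = rng.normal(0, 0.5, (2, 4)); b2 = np.zeros(4)   # decoder weights
lr = 0.5

for _ in range(10000):
    H = sigmoid(X @ W1 + b1)          # encode
    Y = sigmoid(H @ W2 + b2)          # decode (reconstruction)
    dY = (Y - X) * Y * (1 - Y)        # backprop through the output sigmoid
    dH = (dY @ W2.T) * H * (1 - H)    # backprop into the hidden layer
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)

err = np.mean((Y - X) ** 2)
print("reconstruction error:", round(err, 4))
```

Because the hidden layer is narrower than the input, the network is forced to learn a compressed code; that bottleneck is what makes auto encoders useful as a building block for deep networks.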
[Book] Monte Carlo Statistical Methods
from xlvector – Recommender System: http://xlvector.net/blog/?p=888