File Exchange


Restricted Boltzmann Machine

version 1.3.0 (123 KB) by BERGHOUT Tarek
Contrastive divergence for training an RBM is presented in detail.


Updated 15 Apr 2019


Restricted Boltzmann machines (RBMs) are among the earliest neural networks used for unsupervised learning, popularized by Geoffrey Hinton (University of Toronto).
The aim of an RBM is to find patterns in data by reconstructing the inputs using only two layers: the visible layer and the hidden layer. In the forward pass, the RBM translates the visible layer into a set of numbers that encodes the inputs; in the backward pass, it translates that set of numbers back into the visible layer to regenerate the inputs.
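The forward/backward translation described above can be sketched in a few lines. This is a Python illustration, not the submission's MATLAB code; the layer sizes, weight scale, and function names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumption): 6 visible units, 3 hidden units
n_visible, n_hidden = 6, 3
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))  # shared weight matrix
b_vis = np.zeros(n_visible)                       # visible-layer bias
b_hid = np.zeros(n_hidden)                        # hidden-layer bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(v):
    """Visible layer -> hidden activations (the 'set of numbers' that encodes the input)."""
    return sigmoid(v @ W + b_hid)

def backward(h):
    """Hidden activations -> reconstructed visible layer, using the same weights transposed."""
    return sigmoid(h @ W.T + b_vis)

v = rng.integers(0, 2, n_visible).astype(float)  # a binary input vector
h = forward(v)                                   # encode
v_reconstructed = backward(h)                    # regenerate the input
```

Note that the backward pass reuses the same weight matrix transposed; the two layers share one set of weights, which is what makes the network "restricted" to a bipartite structure.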
This code introduces a very simple algorithm based on contrastive divergence training. The method is explained step by step in the comments inside the code.
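As a rough sketch of what one contrastive-divergence update looks like (a Python CD-1 sketch under standard assumptions about binary units; the actual MATLAB implementation and its parameter names may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b_vis, b_hid, lr=0.1, rng=rng):
    """One contrastive-divergence (CD-1) update on a single binary input vector v0."""
    # Positive phase: hidden probabilities given the data, plus a binary sample
    p_h0 = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: reconstruct the visible layer, then recompute hidden probabilities
    p_v1 = sigmoid(h0 @ W.T + b_vis)
    p_h1 = sigmoid(p_v1 @ W + b_hid)
    # Approximate gradient: <v h>_data - <v h>_reconstruction
    W += lr * (np.outer(v0, p_h0) - np.outer(p_v1, p_h1))
    b_vis += lr * (v0 - p_v1)
    b_hid += lr * (p_h0 - p_h1)
    return W, b_vis, b_hid
```

Repeating this step over the training set drives the reconstructions toward the data, without ever computing the intractable exact gradient of the log-likelihood.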

Cite As

BERGHOUT Tarek (2019). Restricted Boltzmann Machine, MATLAB Central File Exchange. Retrieved .


New descriptive image


In the previous version we mistakenly trained the RBM with scalar units in the visible and hidden layers. After changing the representation of these units to binary units during training, we obtained a substantial improvement in accuracy.
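The switch from scalar to binary units amounts to sampling each unit's state from its activation probability rather than using the probability itself. A minimal sketch of that sampling step (Python illustration; the function name is an assumption):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_binary(prob, rng=rng):
    """Turn activation probabilities into binary 0/1 unit states via Bernoulli sampling."""
    return (rng.random(prob.shape) < prob).astype(float)

# Each entry of the result is 0.0 or 1.0, drawn with the given probability
p_hidden = np.array([0.9, 0.1, 0.5])
h_binary = sample_binary(p_hidden)
```

Using sampled binary states in the positive phase injects noise that acts as a regularizer, which is consistent with the accuracy improvement reported above.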

MATLAB Release Compatibility
Created with R2013b
Compatible with any release
Platform Compatibility
Windows macOS Linux
