At this moment, I am finishing my PhD on Boltzmann Machines under the guidance of Prof. Ferran Mazzanti at UPC. A Boltzmann Machine (BM) is a neural network capable of learning and extrapolating probability distributions. The original model was conceived as a parallel implementation of the Simulated Annealing optimization algorithm, but it was later shown that a learning algorithm could be applied to it, turning it into a Hopfield-like model. Learning in BMs is usually carried out by gradient descent, which requires Monte Carlo simulations over the neural network to estimate the quantities needed to update its connections; this makes the learning process slow. We are currently working on mathematical methods that can be used to speed up this process for any BM topology.
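To make the Monte Carlo bottleneck concrete, here is a minimal sketch (not code from the thesis) of one BM learning step on ±1 units: the data and model correlations are compared, and the model correlations are estimated by Gibbs sampling, which is the slow part. All names, the toy weights, and the fixed "data" set are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample(w, b, n_steps, rng):
    """One chain of Gibbs sampling over the +-1 units of a BM."""
    n = len(b)
    s = rng.choice([-1, 1], size=n)
    for _ in range(n_steps):
        for i in range(n):
            # Local field on unit i from the other units plus its bias.
            h = w[i] @ s - w[i, i] * s[i] + b[i]
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h))  # P(s_i = +1 | rest)
            s[i] = 1 if rng.random() < p_up else -1
    return s

# Toy setup: 3 fully connected units, symmetric weights, zero diagonal.
w = np.array([[0.0, 0.5, -0.3],
              [0.5, 0.0, 0.2],
              [-0.3, 0.2, 0.0]])
b = np.zeros(3)

# Correlations from a (toy) data set.
data = np.array([[1, 1, -1], [1, 1, 1], [-1, -1, 1]])
corr_data = (data.T @ data) / len(data)

# Model correlations estimated by Monte Carlo: the expensive step that
# decimation-style methods aim to replace with an analytic computation.
samples = np.array([gibbs_sample(w, b, 50, rng) for _ in range(200)])
corr_model = (samples.T @ samples) / len(samples)

# One gradient-ascent step on the log-likelihood of the data.
eta = 0.1
w += eta * (corr_data - corr_model)
np.fill_diagonal(w, 0.0)
```

Every learning iteration repeats the sampling loop, so the cost grows quickly with network size, which motivates analytic alternatives.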
The PhD started with the analysis of a process known as decimation, due to L. Saul and M. Jordan, which can be used to compute analytically the quantities required to carry out the learning process of a BM. However, this method only applies to certain BM topologies, referred to as Boltzmann Trees, which limits its applicability.
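As a small illustration of the idea behind decimation (a textbook special case, not the general Saul-Jordan construction): a ±1 hidden unit connected by weights w1 and w2 to two other units can be summed out exactly, leaving an effective direct coupling w' = ½ ln[cosh(w1+w2)/cosh(w1−w2)] between them, which also satisfies tanh(w') = tanh(w1)·tanh(w2). The function name below is my own.

```python
import numpy as np

def decimate_chain(w1, w2):
    """Effective coupling after exactly summing out a +-1 hidden unit
    that connects two visible units with weights w1 and w2."""
    return 0.5 * np.log(np.cosh(w1 + w2) / np.cosh(w1 - w2))

# Brute-force check: summing the hidden unit out must reproduce the
# Boltzmann factors up to a spin-independent constant C.
w1, w2 = 0.7, -0.4
w_eff = decimate_chain(w1, w2)
C = 2.0 * np.sqrt(np.cosh(w1 + w2) * np.cosh(w1 - w2))
for si in (-1, 1):
    for sj in (-1, 1):
        exact = sum(np.exp(h * (w1 * si + w2 * sj)) for h in (-1, 1))
        assert np.isclose(exact, C * np.exp(w_eff * si * sj))
```

Repeating such exact eliminations is what makes tree-shaped topologies analytically tractable; loops break this simple scheme, which is where the limitation to Boltzmann Trees comes from.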
We have proposed an extension of this method that consists in applying a Walsh-Hadamard transform over the neural network, thus allowing any topology to be decimated. The method can also be used on High-Order Boltzmann Machines (that is, BMs whose weights connect more than two units) and is able to compute analytically, for a BM of any order, the exact values needed to carry out the learning process.
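To illustrate why the Walsh-Hadamard transform is a natural tool here (an illustrative sketch, not the thesis method itself): any energy function on n ±1 units, including high-order terms, expands exactly in the Walsh basis of spin products, and a fast transform over the 2^n states recovers every coupling coefficient. The toy energy and function names below are assumptions for the example.

```python
import numpy as np
from itertools import product

def fwht(a):
    """Fast Walsh-Hadamard transform (input length a power of 2)."""
    a = np.asarray(a, dtype=float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

# Toy high-order energy on 3 units: two pairwise terms plus one
# third-order term, as found in a High-Order Boltzmann Machine.
n = 3
def energy(s):
    return 0.5 * s[0] * s[1] - 0.3 * s[1] * s[2] + 0.2 * s[0] * s[1] * s[2]

# Enumerate all 2^n states, mapping bit b -> spin s = 1 - 2b, so that
# state index k has s[i] on bit (n-1-i).
states = [tuple(1 - 2 * b for b in bits) for bits in product((0, 1), repeat=n)]
E = np.array([energy(s) for s in states])

# The WHT of the energy vector, divided by 2^n, yields the Walsh
# coefficient of each subset of units, i.e. every coupling at once.
coeffs = fwht(E) / 2 ** n
```

In this encoding the coefficient at index m corresponds to the subset of units whose bits are set in m, so the three couplings above appear at indices 6, 3, and 7 and all other coefficients vanish.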