Investigating the dynamics of spontaneous activity in energy-based neural networks

TAUSANI, LORENZO
2021/2022

Abstract

Generative models are machine learning models capable of producing data samples that reflect the properties of the training distribution. This feature makes them particularly interesting from a computational neuroscience perspective, since it makes it possible to investigate the functional role of spontaneous brain activity. A widely studied class of generative models is that of Restricted Boltzmann Machines (RBMs), which are used as building blocks for unsupervised deep learning architectures. Despite their relevance, the generative capabilities of RBMs have not been systematically studied. In this work we propose a method to explore the generation of data samples from RBMs trained on a classic dataset of handwritten digits (MNIST). Our approach exploits the gradual tuning of a temperature parameter to investigate whether the model dynamics can be driven into meta-stable states, which would allow heterogeneous data samples to be generated by visiting nearby attractor basins. Although the proposed method does allow different digits to be generated starting from a given (and possibly biased) network state, it cannot produce samples belonging to all digit classes in a single generation round.
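The temperature-tuning idea described above can be sketched in a few lines: in a binary RBM, dividing the pre-activations by a temperature T flattens (T > 1) or sharpens (T < 1) the conditional distributions used in block Gibbs sampling, so gradually raising and lowering T lets the chain escape one attractor basin and settle into a nearby one. The sketch below is illustrative only, assuming a standard binary-binary RBM with sigmoid units; the toy random weights and all names (`gibbs_step`, `n_vis`, `n_hid`) are assumptions, not the thesis implementation, where a model trained on MNIST would be loaded instead.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_bernoulli(p, rng):
    # Draw binary states with per-unit activation probabilities p.
    return (rng.random(p.shape) < p).astype(np.float64)

def gibbs_step(v, W, b_v, b_h, T, rng):
    """One block-Gibbs step v -> h -> v', with pre-activations scaled by 1/T."""
    p_h = sigmoid((v @ W + b_h) / T)          # P(h=1 | v) at temperature T
    h = sample_bernoulli(p_h, rng)
    p_v = sigmoid((h @ W.T + b_v) / T)        # P(v=1 | h) at temperature T
    v_new = sample_bernoulli(p_v, rng)
    return v_new, p_v

# Toy RBM: MNIST-sized visible layer, random weights standing in for a trained model.
n_vis, n_hid = 784, 128
W = rng.normal(0.0, 0.01, size=(n_vis, n_hid))
b_v = np.zeros(n_vis)
b_h = np.zeros(n_hid)

# Anneal the temperature: ramp T up to loosen the current attractor,
# then ramp it back down so the chain settles into a (possibly different) basin.
v = sample_bernoulli(np.full(n_vis, 0.5), rng)
schedule = np.concatenate([np.linspace(1.0, 2.0, 50), np.linspace(2.0, 1.0, 50)])
for T in schedule:
    v, p_v = gibbs_step(v, W, b_v, b_h, T, rng)
```

At the end of the schedule, `v` holds a binary sample and `p_v` the corresponding visible-unit probabilities; with a trained model, `p_v` reshaped to 28x28 would be inspected to see which digit class the chain has settled into.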
Keywords: Spontaneous activity, Energy-based models, RBM, Generative models
Files in this record:
Tausani_Lorenzo.pdf (restricted access, 10.4 MB, Adobe PDF)
The text of this website © Università degli Studi di Padova. Full texts are published under a non-exclusive license. Metadata are released under a CC0 license.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/42071