- Author
- Date
- 7-11-2016
- Title
- Deep Spiking Networks
- Number of pages
- 16
- Publisher
- Ithaca, NY: arXiv
- Document type
- Working paper
- Faculty
- Faculty of Science (FNWI)
- Institute
- Informatics Institute (IVI)
- Abstract
-
We introduce an algorithm to do backpropagation on a spiking network. Our network is "spiking" in the sense that our neurons accumulate their activation into a potential over time, and only send out a signal (a "spike") when this potential crosses a threshold and the neuron is reset. Neurons only update their states when receiving signals from other neurons. Total computation of the network thus scales with the number of spikes caused by an input rather than network size. We show that the spiking Multi-Layer Perceptron behaves identically, during both prediction and training, to a conventional deep network of rectified-linear units, in the limiting case where we run the spiking network for a long time. We apply this architecture to a conventional classification problem (MNIST) and achieve performance very close to that of a conventional Multi-Layer Perceptron with the same architecture. Our network is a natural architecture for learning based on streaming event-based data, and is a stepping stone towards using spiking neural networks to learn efficiently on streaming data.
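The accumulate-and-fire behaviour described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the function name, the threshold value, and the choice of subtracting the threshold on reset are assumptions for demonstration.

```python
def step(potential, incoming, threshold=1.0):
    """One event-driven update of a spiking neuron (illustrative sketch).

    Accumulate the incoming weighted signal into the membrane potential;
    when the potential crosses the threshold, emit a spike and reset by
    subtracting the threshold. Names and reset rule are assumptions, not
    taken verbatim from the paper.
    """
    potential += incoming
    spiked = potential >= threshold
    if spiked:
        potential -= threshold  # reset; residual charge is retained
    return potential, spiked
```

Because the neuron only updates when it receives a signal, total work scales with the number of events (spikes) rather than with network size, as the abstract notes.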
- Link
- Submitted manuscript
- Language
- English
- Note
- February 2016 version also available on arXiv.org
- Persistent Identifier
- https://hdl.handle.net/11245.1/cff00ce7-dba6-4962-8116-13712c1682b2
- Downloads
-
1602.08323v2.pdf (Submitted manuscript)