Abstract
The recent development of power-efficient neuromorphic hardware offers great opportunities for applications where power consumption is a primary concern, ranging from mobile platforms to server farms. However, designing spiking neural networks (SNNs) to perform pattern recognition on such hardware remains a challenging task. We present an SNN for digit recognition that relies on mechanisms commonly available on neuromorphic hardware, namely exponential synapses with spike-timing-dependent plasticity, lateral inhibition, and an adaptive threshold. Unlike most other approaches, we do not present any class labels to the network; the network learns in a purely unsupervised manner. The performance of our network scales well with the number of neurons used. Intuitively, the algorithm is comparable to k-means and to competitive learning algorithms such as vector quantization and self-organizing maps: each neuron learns a representation of a part of the input space, similar to a centroid in k-means. Our architecture achieves 95% accuracy on the MNIST benchmark, outperforming other unsupervised learning methods for SNNs. The fact that we used no domain-specific knowledge points toward a more general applicability of the network design.