
Randomized unregulated step descent for limited precision synaptic elements


Muller, Lorenz K; Nair, Manu V; Indiveri, Giacomo (2017). Randomized unregulated step descent for limited precision synaptic elements. In: IEEE International Symposium on Circuits and Systems (ISCAS) 2017, Baltimore, USA, 29 May 2017 - 6 January 2017.

Abstract

Training neural networks with low-resolution synaptic weights has raised much interest recently, and inference in neural networks with binary activations and binary weights has been shown to achieve near state-of-the-art performance on a wide range of tasks. However, current methods for training such networks rely on high-resolution gradients or update probabilities. Low-resolution training methods would be useful for neuromorphic architectures that support lower-power hardware implementations, as well as for emerging memory technologies based on memristive devices that do not always support fine-grained state changes. In this paper, we propose a training method, Randomized Unregulated Step Descent (RUSD), as an alternative to gradient descent that uses only a single bit of information about the gradient; we show that it is compatible with low-resolution integer arithmetic platforms and resilient to some of the prominent non-idealities of memristive memories. We verify the performance of RUSD on several standard machine-learning benchmarks.
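The abstract says RUSD uses only a single bit of information about the gradient, i.e. its sign, to update low-resolution integer weights. As a rough illustration of that general idea only (not the paper's exact algorithm; the step size `step`, update probability `p`, and weight range `w_min`/`w_max` below are assumptions for the sketch), a randomized sign-based integer update might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def rusd_update(weights, grads, step=1, p=0.5, w_min=-127, w_max=127):
    """Hypothetical sketch: move a randomly chosen subset of integer
    weights by a fixed step against the sign of their gradient.
    Only the gradient's sign (a single bit per weight) is used."""
    mask = rng.random(weights.shape) < p                      # random subset of weights
    delta = -np.sign(grads).astype(weights.dtype) * step      # single-bit gradient info
    updated = weights + mask * delta
    return np.clip(updated, w_min, w_max)                     # limited-precision range

w = np.zeros(8, dtype=np.int32)
g = np.array([0.3, -1.2, 0.0, 2.5, -0.1, 0.7, -0.9, 1.1])
w = rusd_update(w, g)
```

Because the update is a fixed integer step, this kind of rule needs no high-resolution gradient or update-probability storage, which is consistent with the hardware motivation given in the abstract.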


Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Event End Date: 6 January 2017
Deposited On: 23 Feb 2018 09:55
Last Modified: 14 Mar 2018 18:01
Publisher: Proceedings of IEEE International Symposium on Circuits and Systems (ISCAS) 2017
Series Name: Proceedings-IEEE International Symposium on Circuits and Systems
OA Status: Closed
Free access at: Official URL. An embargo period may apply.
Publisher DOI: https://doi.org/10.1109/ISCAS.2017.8050217
Official URL: http://ieeexplore.ieee.org/document/8050217/
