Simultaneous localization and mapping (SLAM) is one of the core tasks of mobile autonomous robots. Finding power-efficient, embedded solutions for SLAM is an important challenge when building controllers for small and agile robots. The biological neural systems of even simple animals remain unmatched in their ability to localize themselves in an unknown environment. Neuromorphic engineering offers ultra-low-power and compact computing hardware in which biologically inspired neuronal architectures for SLAM can be realised. In this paper, we propose an on-chip approach for one of the components of SLAM: path integration. Our solution takes inspiration from biology and uses motor command information to estimate the orientation of an agent solely in a spiking neural network. We realise this network on a neuromorphic device that implements artificial neurons and synapses with analog electronics. The neural network receives visual input from an event-based camera and uses this information to correct the on-chip spiking neurons' estimate of the robot's orientation. This system can be easily integrated with other on-chip localization and mapping components and is a step towards a fully neuromorphic SLAM system.
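The underlying path-integration principle can be illustrated in a minimal, non-spiking sketch: heading is dead-reckoned by integrating angular-velocity motor commands, and an occasional visual heading estimate corrects the accumulated drift. This plain-Python example is an assumption-laden illustration of the idea only; the function names, the correction gain, and the rate-based formulation are ours, not the paper's spiking network or hardware.

```python
import math

def wrap_angle(a):
    """Wrap an angle to the interval (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def path_integrate(theta, omega, dt):
    """Dead-reckon heading from an angular-velocity motor command."""
    return wrap_angle(theta + omega * dt)

def visual_correction(theta, theta_visual, gain):
    """Nudge the integrated heading toward a visual heading estimate.

    The gain is an illustrative parameter: 0 ignores vision, 1 trusts
    it completely. The paper's network instead corrects its estimate
    through event-camera input to the spiking neurons.
    """
    error = wrap_angle(theta_visual - theta)
    return wrap_angle(theta + gain * error)

# Example: integrate a constant 0.5 rad/s turn for 1 s (100 steps),
# then apply one visual correction toward a cue at 0.6 rad.
theta = 0.0
for _ in range(100):
    theta = path_integrate(theta, omega=0.5, dt=0.01)
theta = visual_correction(theta, theta_visual=0.6, gain=0.2)
```

In a spiking implementation, the same computation is carried out by a ring of neurons whose activity bump encodes the heading; the sketch above only captures the integrate-then-correct structure.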