Event-driven, spike-based processing systems offer new possibilities for real-time vision. Signals are encoded asynchronously in time, so the timing of each event is preserved. We examine this form of coding using experimental data from a multi-layered, multi-chip system consisting of an artificial retina, a convolution filterbank, and a winner-take-all network that detects the position of a moving object. The spike output of the convolution stage is well described by an inhomogeneous Poisson process with a Gaussian rate profile, even though the underlying building blocks are completely deterministic and exhibit only a small amount of variation. We discuss a method for measuring the accuracy of the asynchronous spiking representation in both time and value, thereby quantifying the performance of the winner-take-all network in determining the position of a ball rotating in front of the system.
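The statistical model mentioned above can be illustrated with a minimal sketch: spike counts drawn from a Poisson distribution whose mean rate follows a Gaussian profile over unit position, with a winner-take-all readout reduced to an argmax over counts. All parameter values (64 units, peak rate, width, window length) are hypothetical and not taken from the system described here.

```python
import numpy as np

def poisson_spike_counts(rates, duration, rng):
    """Draw spike counts for each unit from a Poisson distribution
    with the given mean rates (Hz) over an observation window (s)."""
    return rng.poisson(rates * duration)

# Hypothetical parameters: 64 convolution-stage units indexed by position,
# with a Gaussian firing-rate profile centred on the object position.
rng = np.random.default_rng(0)
positions = np.arange(64)
true_pos, sigma, peak_rate = 40.0, 4.0, 200.0  # assumed values
rates = peak_rate * np.exp(-(positions - true_pos) ** 2 / (2 * sigma ** 2))

# A winner-take-all readout over spike counts amounts to picking the
# unit that fired most within the window.
counts = poisson_spike_counts(rates, duration=0.05, rng=rng)
wta_estimate = positions[np.argmax(counts)]
```

Under this toy model, the estimate scatters around the true position with a spread set by the Gaussian width and the number of spikes collected, which is one way to frame the time-versus-value accuracy trade-off the abstract refers to.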