Good visual proficiency and a complex set of eye movements frequently coexist. Even during fixation, our eyes keep moving in a microscopic and erratic fashion, which prevents stationary scenes from fading perceptually by counteracting retinal adaptation. We artificially replicate this functionality of biological vision by exploiting the same active strategy with an event-based camera. The resulting neuromorphic active system redistributes the low temporal frequency power of a static image into a range the sensor can detect, encoding it in the timing of events. A spectral analysis of its output confirmed both the whitening and the amplification effects postulated in biology, depending on whether or not the stimulus contrast matched the 1/k falloff typical of natural images. Further evaluations revealed that the isotropic statistics of fixational eye movements are crucial for equalizing the response of the system across all possible stimulus orientations. Finally, the design of a biologically-realistic spiking neural network allowed the detection of local stimulus orientation through anisotropic spatial summation of synchronous activity of both ON and OFF polarities.
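As a minimal numerical sketch of the whitening effect mentioned above (the frequency range and transfer function here are illustrative assumptions of ours, not values from the study): if natural-image amplitude spectra fall off as 1/k with spatial frequency k, then a system transfer that grows linearly with k flattens (whitens) the output spectrum, whereas a stimulus with any other falloff would instead be selectively amplified or attenuated.

```python
import numpy as np

# Illustrative spatial-frequency axis (cycles/deg); range chosen arbitrarily.
k = np.linspace(1.0, 50.0, 200)

# Natural scenes: amplitude spectrum falling off as ~1/k.
natural_spectrum = 1.0 / k

# Assumed whitening transfer: gain growing linearly with k, as postulated
# for the combined effect of fixational eye movements and retinal encoding.
transfer = k

# Output spectrum: flat (whitened), since (1/k) * k = 1 at every frequency.
output_spectrum = natural_spectrum * transfer
print(np.allclose(output_spectrum, 1.0))  # flat spectrum across all k
```

A stimulus whose contrast does not follow the 1/k law, e.g. a flat input spectrum, would under the same transfer come out amplified in proportion to k rather than equalized.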