Abstract
Animal‐borne accelerometers have been used across more than 120 species to infer biologically significant information such as energy expenditure and broad behavioural categories. The accelerometer's high sensitivity to movement and fast response time offer an unprecedented opportunity to resolve fine‐scale behaviour, but exploiting this opportunity requires general, automated methods for analysing the nonstationary signals generated by the nonlinear processes that govern the erratic, impulsive movements characteristic of fine‐scale behaviour.
We address this issue by conceptualising fine‐scale behaviour in terms of characteristic microevents: impulsive movements producing brief (<1 s) shock signals in accelerometer data. We propose a ‘seek‐and‐learn’ approach: a novel microevent detection step first locates where shock signals occur (‘seek’) by searching for peaks in envelopes of acceleration data. Robust machine learning (‘learn’) employing meaningful features then separates the detected microevents. We showcase the application of our method on tri‐axial accelerometer data collected on 10 free‐living meerkats Suricata suricatta for four fine‐scale foraging behaviours – searching for digging sites, one‐armed digging, two‐armed digging and head jerks during prey ingestion. Annotated videos served as ground truth, and performance was benchmarked against that of a variety of classical machine learning approaches.
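To make the ‘seek’ step concrete, the sketch below locates candidate microevents as peaks in an envelope of the dynamic acceleration magnitude. It is a minimal illustration, not the study's exact procedure: the envelope choice (Hilbert magnitude), the threshold and the minimum peak separation are assumptions introduced here for clarity.

```python
# Illustrative 'seek' step: candidate microevents as envelope peaks.
# Parameter values and the Hilbert-envelope choice are assumptions.
import numpy as np
from scipy.signal import hilbert, find_peaks

def detect_microevents(acc_xyz, fs, min_separation_s=0.2, threshold_g=0.5):
    """Return sample indices of candidate microevents (brief shock signals).

    acc_xyz : (n_samples, 3) array of tri-axial acceleration in g
    fs      : sampling frequency in Hz
    """
    # Remove the static (gravitational) component per axis to keep dynamic acceleration
    dyn = acc_xyz - acc_xyz.mean(axis=0)
    # Collapse the three axes into one magnitude signal
    mag = np.linalg.norm(dyn, axis=1)
    # Envelope of the magnitude signal via the analytic signal
    envelope = np.abs(hilbert(mag))
    # Peaks in the envelope mark impulsive (<1 s) movements
    peaks, _ = find_peaks(envelope,
                          height=threshold_g,
                          distance=max(1, int(min_separation_s * fs)))
    return peaks
```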
Microevent identification (μEvId) with eight features in a three‐node hierarchical classification scheme employing logistic regression at each node achieved a mean overall accuracy of >85% during leave‐one‐individual‐out cross‐validation, and exceeded that of the best classical machine learning approach by 8.6%. μEvId was found to be robust not only to inter‐individual variation but also to large changes in model parameters.
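A minimal sketch of a three‐node hierarchical scheme with logistic regression at each node is shown below, assuming per‐microevent feature vectors as input. The order of the binary splits and the label names are hypothetical; the features stand in for the eight descriptors used in μEvId.

```python
# Hypothetical three-node hierarchy of binary logistic regressions.
# Split order and label names are assumptions for illustration.
from sklearn.linear_model import LogisticRegression

class HierarchicalMicroeventClassifier:
    """Node 1: head jerk vs digging-related; node 2: searching vs digging;
    node 3: one-armed vs two-armed digging (split order assumed)."""

    def __init__(self):
        self.nodes = {k: LogisticRegression(max_iter=1000) for k in (1, 2, 3)}

    def fit(self, X, y):
        # X : (n, d) array of microevent features; y : (n,) array of labels
        # drawn from {'head_jerk', 'search', 'dig_1arm', 'dig_2arm'}
        self.nodes[1].fit(X, y == 'head_jerk')
        m2 = y != 'head_jerk'
        self.nodes[2].fit(X[m2], y[m2] == 'search')
        m3 = m2 & (y != 'search')
        self.nodes[3].fit(X[m3], y[m3] == 'dig_2arm')
        return self

    def predict_one(self, x):
        # Descend the hierarchy one binary decision at a time
        x = x.reshape(1, -1)
        if self.nodes[1].predict(x)[0]:
            return 'head_jerk'
        if self.nodes[2].predict(x)[0]:
            return 'search'
        return 'dig_2arm' if self.nodes[3].predict(x)[0] else 'dig_1arm'
```

Leave‐one‐individual‐out cross‐validation would then fit this hierarchy on all but one animal's microevents and evaluate on the held‐out individual, repeating across individuals.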
Our results show that microevents can be modelled as impulse responses of the animal body‐and‐sensor system. The microevent detection step retains only informative regions of the signal, which results in the selection of discriminative features that reflect biomechanical differences between microevents. Moving‐window‐based classical machine learning approaches lack this prefiltering step, and were found to be suboptimal for capturing the nonstationary dynamics of the recorded signals. The general, automated technique of μEvId, together with existing models that can identify broad behavioural categories, provides future studies with a powerful toolkit to exploit the full potential of accelerometers for animal behaviour recognition.