Digital Transformation in Animal and Agricultural Sciences

Vision-based detection of pain and nest-building behaviors in sows within commercial farrowing pens

Authors: Helf, P. & Oczak, M.

Abstract

Pain indicators and preparturient nest-building are precursors of farrowing, yet continuous quantification remains challenging. We present a non-invasive computer-vision system that detects pain-associated behaviors (back-arching, tail-flicking,
back leg forward, trembling) and nest-building behaviors (manipulation of pen
components, pawing, exploration) from top-view videos. We analyzed 748 h of
RGB footage (25 fps) from 11 sows on a single farm, spanning from 64 h pre-farrowing
to 4 h after the birth of the first piglet. Using a defined ethogram, 46,010 events were
annotated with inter-annotator agreement κ = 0.724. To assess generalization, we
used a sow-level split: 8 sows for training and 3 for validation. Behaviors were
detected with a modified DeepEthogram architecture, combining RGB data and
optical flow. Both streams were processed by separate ResNet3D-34 encoders.
Optical flow was estimated using a state-of-the-art DPFlow model. Training em-
ployed focal loss to address class imbalance, alongside geometric and photometric
augmentations for robustness to camera placement and lighting. Clips of 11 frames
at 8.33 fps (≈1.32 s) were used. On held-out sows, per-class F1 scores were 0.875
for manipulation of pen components, 0.634 for pawing, 0.820 for exploration, 0.639 for back-arching, 0.778 for tail-flicking, 0.871 for back leg forward, and 0.443 for trembling.
These results indicate that pen-installed vision can identify key behaviors in a
non-invasive way, supporting scalable monitoring. Limitations include modest
dataset size and limited diversity, i.e., a single farm and a single breed. Ongoing
work will expand the dataset and leverage behavior dynamics for time-to-farrowing
estimation.
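
The focal loss used to address class imbalance can be sketched as follows. This is a minimal NumPy illustration of binary focal loss (Lin et al., 2017), not the paper's implementation; the γ and α values are standard defaults, not the settings used in the study.

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0, alpha=0.25):
    """Binary focal loss: down-weights easy, well-classified examples
    so training focuses on rare or hard behavior classes.
    `probs` are predicted probabilities of the positive class;
    `targets` are 0/1 labels. gamma/alpha are illustrative defaults."""
    probs = np.clip(probs, 1e-7, 1 - 1e-7)       # numerical stability
    p_t = np.where(targets == 1, probs, 1 - probs)     # prob. of true class
    alpha_t = np.where(targets == 1, alpha, 1 - alpha) # class weighting
    return -alpha_t * (1 - p_t) ** gamma * np.log(p_t)

# A confident correct prediction contributes far less loss than a hard one,
# which is the mechanism that counteracts dominance of frequent classes:
easy = focal_loss(np.array([0.95]), np.array([1]))[0]
hard = focal_loss(np.array([0.30]), np.array([1]))[0]
```

The modulating factor (1 − p_t)^γ is what distinguishes focal loss from plain weighted cross-entropy: as p_t → 1 the example's gradient contribution vanishes, so abundant easy frames (e.g., a resting sow) do not swamp the rare pain-associated events.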

Keywords:

How to Cite: Helf, P. & Oczak, M. (2026) “Vision-based detection of pain and nest-building behaviors in sows within commercial farrowing pens”, Proceedings of the Austrian Symposium on AI, Robotics, and Vision. 3(1).