Monitoring sow postures, feeding and nursing activity using the combination of deep learning and image segmentation methods
Abstract
Monitoring sow activity is valuable for moving towards more flexible housing during lactation, as sow activity strongly influences piglet survival. Detecting sows that are calm, free of health problems and nurse their litter efficiently is necessary to develop welfare-friendly systems. We developed procedures to automatically analyse sow activity, including postures, feeding and nursing. The method was trained on nearly one million images collected from 10 sows over 5 days each. Sow activity was recorded using two colour cameras positioned to observe the sow from the front and from the back. Three neural networks were developed, using the top front view, the top rear view and both views; they were combined so that a lack of consistency between the predictions of the two single-view analyses triggers the third, two-view analysis. Sequential analysis of a few successive images confirms each detection. The software uses the neural networks to identify 10 main activities. Image segmentation was added to measure the intensity of nursing activity, distinguishing pre- and post-milk-ejection massage. The sensitivity of posture identification ranges from 90% to 98%. Sensitivity for sitting and for feeding (head in the trough) increased significantly when both cameras were used. The image-processing software can be run with only the rear camera, or with both cameras to increase prediction accuracy. In the near future, the software will be tested on large image databases before being made available. The analysis is already optimised for real-time monitoring.
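To make the described two-view decision logic more concrete, the following is a minimal Python sketch, assuming hypothetical network objects exposing a `predict` method and a simple majority vote over a short window of successive frames; the function names, interfaces and window size are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter

def classify_frame(front_img, rear_img, net_front, net_rear, net_both):
    """Classify one pair of synchronised frames.

    Hypothetical sketch: the two single-view networks are queried first,
    and the combined-view network is only invoked when their predictions
    disagree, mirroring the fallback described in the abstract.
    """
    label_front = net_front.predict(front_img)
    label_rear = net_rear.predict(rear_img)
    if label_front == label_rear:
        return label_front
    # Single-view predictions are inconsistent: fall back to the two-view network.
    return net_both.predict(front_img, rear_img)

def confirm_detection(recent_labels, window=5):
    """Confirm an activity only if it dominates a short run of frames.

    The abstract states that a few successive images are analysed to
    confirm each detection; a majority vote is one plausible reading.
    """
    label, count = Counter(recent_labels[-window:]).most_common(1)[0]
    return label if count >= (window // 2 + 1) else None
```

Only the rear camera is strictly required by the software, so a single-camera deployment would simply skip the consistency check and use the rear-view network's prediction directly.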