
Abstract

Size estimation is a hard computer vision problem with widespread applications in quality control in manufacturing and processing plants, livestock management, and research on animal behaviour. Image-based size estimation is typically facilitated by either well-controlled imaging conditions, the provision of global cues, or both. Reference-free size estimation remains challenging because objects of vastly different sizes can appear identical if they are of similar shape. Here, we explore the feasibility of implementing automated and reference-free body size estimation to facilitate large-scale experimental work in a key model species in sociobiology: the leaf-cutter ants. Leaf-cutter ants are a suitable testbed for reference-free size estimation because their workers differ vastly in both size and shape; in principle, it is therefore possible to infer body mass (a proxy for size) from relative body proportions alone. Inspired by earlier work by E.O. Wilson, who trained himself to discern ant worker size from visual cues alone, we deployed deep learning techniques to achieve the same feat automatically, quickly, at scale, and from reference-free images: Wilson Only Looks Once (WOLO). Using 150,000 hand-annotated and 100,000 computer-generated images, a set of deep convolutional neural networks was trained to estimate the body mass of ant workers from image cutouts. The best-performing WOLO networks achieved errors as low as 11% on unseen data, approximately matching or exceeding human performance, measured for a small group of both experts and non-experts, but were about 1000 times faster. Further refinement may thus enable accurate, high-throughput, and non-intrusive body mass estimation in behavioural work, and so eventually contribute to a more nuanced and comprehensive understanding of the rules that underpin the complex division of labour that characterises polymorphic insect societies.
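To make the approach summarised above concrete, the sketch below shows a minimal VGG-style convolutional network that regresses body mass from an image cutout, together with the mean absolute percentage error (MAPE) used as the relative-error metric (e.g. the ~11% error reported on unseen data). The layer counts, input resolution, and example values are illustrative assumptions only, not the authors' published configuration; the actual WOLO networks and training code are available in the linked repository (https://github.com/FabianPlum/WOLO).

```python
# Illustrative sketch (assumed architecture, not the authors' exact model):
# a small VGG-style CNN that maps an RGB image cutout to a single
# continuous body-mass estimate, plus a MAPE helper for evaluation.
import torch
import torch.nn as nn


class MassRegressor(nn.Module):
    def __init__(self):
        super().__init__()

        def block(c_in, c_out):
            # Two 3x3 convolutions followed by 2x2 max pooling (VGG-style).
            return nn.Sequential(
                nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
            )

        self.features = nn.Sequential(
            block(3, 32), block(32, 64), block(64, 128), block(128, 256),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(256, 128), nn.ReLU(inplace=True),
            nn.Linear(128, 1),  # single continuous output: body mass
        )

    def forward(self, x):
        return self.head(self.features(x)).squeeze(1)


def mape(pred, target):
    # Mean absolute percentage error: the "relative error" metric.
    return (100.0 * (pred - target).abs() / target.abs()).mean()


if __name__ == "__main__":
    model = MassRegressor()
    cutouts = torch.rand(4, 3, 128, 128)            # batch of RGB cutouts (assumed size)
    masses = torch.tensor([5.2, 12.7, 1.8, 30.4])   # hypothetical ground-truth masses (mg)
    print(f"MAPE of untrained model: {mape(model(cutouts), masses):.1f}%")
```

In this sketch the network is trained as a regressor; the same backbone could instead end in a softmax layer over discrete size classes, which is the classification variant mentioned in the revision notes below.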

Competing Interest Statement

The authors have declared no competing interest.

Footnotes

* In response to reviewer feedback we: Added a comprehensive Data Availability Statement; Defined a clear goal in the introduction (Line 100); Shifted focus to a single VGG-style CNN for regression and classification (L170 to L176); Simplified evaluation using standard metrics: categorical accuracy, relative error (MAPE), and prediction stability (L219 to L278); Moved data collection details to Supplementary Information (L681 to L829); Presented detailed results in supplementary tables, so the main text focuses on key findings; Retrained all networks, simplified evaluation, and streamlined the manuscript overall; Revised equations for clarity (e.g., distinguishing ground truth and estimates); Clarified terminology (e.g., epochs vs. iterations, accuracy definitions); Addressed linguistic issues (e.g., reduced excessive hyphenation); Corrected inaccuracies (e.g., cross-entropy phrasing, loss discussion).

* https://zenodo.org/records/11167521

* https://zenodo.org/records/11167946

* https://zenodo.org/records/14747391

* https://zenodo.org/records/14746456

* https://github.com/FabianPlum/WOLO

Details

Title
WOLO: Wilson Only Looks Once – Estimating ant body mass from reference-free images using deep convolutional neural networks
Publication title
bioRxiv; Cold Spring Harbor
Publication year
2025
Publication date
Jan 27, 2025
Section
New Results
Publisher
Cold Spring Harbor Laboratory Press
Source
BioRxiv
Place of publication
Cold Spring Harbor
Country of publication
United States
University/institution
Cold Spring Harbor Laboratory Press
ISSN
2692-8205
Source type
Working Paper
Language of publication
English
Document type
Working Paper
Publication history
Milestone dates
2024-05-17 (Version 1); 2024-06-26 (Version 2)
ProQuest document ID
3160209611
Document URL
https://www.proquest.com/working-papers/wolo-wilson-only-looks-once-estimating-ant-body/docview/3160209611/se-2?accountid=208611
Copyright
© 2025. This article is published under http://creativecommons.org/licenses/by-nc-nd/4.0/ (“the License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-01-28
Database
ProQuest One Academic