  • Data Descriptor
  • Open access
  • Published: 29 January 2026

BaleUAVision: Hay Bales UAV Captured Dataset

  • Georgios D. Karatzinis  ORCID: orcid.org/0000-0003-4674-41631,
  • Socratis Gkelios1 &
  • Athanasios Ch. Kapoutsis  ORCID: orcid.org/0000-0002-1688-036X1 

Scientific Data (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note that errors affecting the content may be present, and all legal disclaimers apply.

Subjects

  • Agriculture
  • Databases
  • Environmental sciences

Abstract

Efficient hay bale detection and counting are essential tasks in modern precision agriculture, with direct impact on yield estimation, logistics, and sustainable resource management. To address current limitations in dataset quality and environmental representation, we introduce BaleUAVision, a comprehensive dataset of 2,599 high-resolution RGB images, each containing multiple human-annotated hay bales. Captured by Unmanned Aerial Vehicles (UAVs) across 16 diverse agricultural fields in Northern Greece, the dataset covers varying flight altitudes (50–100 m), flight speeds (3.7–5 m/s), and image-overlap strategies to ensure robust data representation. BaleUAVision provides rich annotations through polygon-based semantic segmentation in multiple formats (COCO, CSV, JSON, YOLO, segmentation masks), together with high-quality orthomosaics for precise spatial analysis. Technical validation demonstrated the dataset's effectiveness for training robust hay bale detection models with YOLOv11, achieving high precision and recall under varying geographic and altitude conditions. In particular, models trained on the dataset generalized effectively across geographically distinct areas (the Xanthi and Drama regions) and across altitudes, highlighting its utility in real-world UAV operations. The dataset is publicly available on Zenodo, and the supplementary tools, scripts, and analyses on GitHub, following FAIR principles to support broad applicability within the research community.
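
Because the annotations ship in YOLO format among others, a segmentation model such as YOLOv11 can be fine-tuned directly with the Ultralytics package. The sketch below is a minimal illustration rather than the authors' exact training pipeline; the dataset YAML file, hyperparameters, and sample image name are assumptions.

    # Minimal sketch: fine-tune a YOLOv11 segmentation model on BaleUAVision.
    # "baleuavision.yaml" is a hypothetical dataset config describing the
    # YOLO-format train/val splits and the single "bale" class.
    from ultralytics import YOLO

    model = YOLO("yolo11n-seg.pt")          # pretrained nano segmentation weights
    model.train(data="baleuavision.yaml",   # assumed dataset description
                epochs=100, imgsz=640)

    metrics = model.val()                   # precision/recall on the val split
    results = model("field_sample.jpg")     # hypothetical UAV image
    print(f"Detected {len(results[0].boxes)} bales")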

Data availability

All data comprising the BaleUAVision dataset are publicly available in the Zenodo repository [29].
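
For programmatic access, the files can also be retrieved through Zenodo's public REST API; a minimal sketch is shown below. The record ID is inferred from the DOI in ref. 29, and this is a generic example rather than an official download script.

    # Sketch: list and download the BaleUAVision files via the Zenodo REST API.
    # Record ID 15304715 is inferred from the DOI 10.5281/zenodo.15304715.
    import requests

    record = requests.get("https://zenodo.org/api/records/15304715").json()
    for f in record["files"]:
        name, url = f["key"], f["links"]["self"]
        print(f"Downloading {name} ...")
        with requests.get(url, stream=True) as r:
            r.raise_for_status()
            with open(name, "wb") as out:
                for chunk in r.iter_content(chunk_size=1 << 20):
                    out.write(chunk)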

Code availability

All related code, including data statistics and insights, usage Python scripts, and indicative examples, is provided on the living page of the project [30].
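
Since the annotations are also distributed in COCO format, per-image bale counts can be derived directly from the annotation JSON, as in the sketch below; the annotation file name is hypothetical and should be replaced with the one in the released archive.

    # Sketch: count annotated bales per image from a COCO-format annotation file.
    # "annotations_coco.json" is a hypothetical file name used for illustration.
    import json
    from collections import Counter

    with open("annotations_coco.json") as f:
        coco = json.load(f)

    counts = Counter(ann["image_id"] for ann in coco["annotations"])
    id_to_name = {img["id"]: img["file_name"] for img in coco["images"]}

    for image_id, n_bales in counts.most_common(5):
        print(f"{id_to_name[image_id]}: {n_bales} bales")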

References

  1. Henning, J. & Lawrence, L. Production and Management of Hay and Haylage (Academic Press, 2025).

  2. Pölönen, I., Suokannas, A. & Juntunen, A. Cloud-based approach for tracking and monitoring of hay bales in smart agriculture. Technology Innovation Management Review 11 (2021).

  3. Wolfert, S., Ge, L., Verdouw, C. & Bogaardt, M.-J. Big data in smart farming–a review. Agricultural Systems 153, 69–80 (2017).

  4. Mulla, D. J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosystems Engineering 114, 358–371 (2013).

  5. Zamar, D. S., Gopaluni, B. & Sokhansanj, S. A constrained k-means and nearest neighbor approach for route optimization in the bale collection problem. IFAC-PapersOnLine 50, 12125–12130 (2017).

  6. Tsouros, D. C., Bibi, S. & Sarigiannidis, P. G. A review on UAV-based applications for precision agriculture. Information 10, 349 (2019).

  7. Del Cerro, J. et al. Unmanned aerial vehicles in agriculture: A survey. Agronomy 11, 203 (2021).

  8. Raptis, E. K. et al. CoFly: An automated, AI-based open-source platform for UAV precision agriculture applications. SoftwareX 23, 101414 (2023).

  9. Krestenitis, M. et al. Overcome the fear of missing out: Active sensing UAV scanning for precision agriculture. Robotics and Autonomous Systems 172, 104581 (2024).

  10. Weyler, J. et al. Joint plant and leaf instance segmentation on field-scale UAV imagery. IEEE Robotics and Automation Letters 7, 3787–3794 (2022).

  11. Chebrolu, N. et al. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. https://www.ipb.uni-bonn.de/data/sugarbeets2016/ (2017).

  12. Krestenitis, M. et al. CoFly-WeedDB: A UAV image dataset for weed detection and species identification. https://zenodo.org/records/6697343 (2022).

  13. Mujkic, E., Christiansen, M. P. & Ravn, O. Object detection for agricultural vehicles: Ensemble method based on hierarchy of classes. Sensors 23, 7285 (2023).

  14. Kamilaris, A. & Prenafeta-Boldú, F. X. Deep learning in agriculture: A survey. Computers and Electronics in Agriculture 147, 70–90 (2018).

  15. Yamada, W., Zhao, W. & Digman, M. Automated bale mapping using machine learning and photogrammetry. Remote Sensing 13, 4675 (2021).

  16. Zhao, W. et al. Augmenting crop detection for precision agriculture with deep visual transfer learning–a case study of bale detection. Remote Sensing 13, 23 (2020).

  17. Xu, J. et al. HarvestNet: A dataset for detecting smallholder farming activity using harvest piles and remote sensing. https://figshare.com/s/45a7b45556b90a9a11d2 (2024).

  18. Kirillov, A. et al. Segment Anything. In Proceedings of the IEEE/CVF International Conference on Computer Vision, 4015–4026 (2023).

  19. Osco, L. P. et al. The Segment Anything Model (SAM) for remote sensing applications: From zero to one shot. International Journal of Applied Earth Observation and Geoinformation 124, 103540 (2023).

  20. Gui, B., Bhardwaj, A. & Sam, L. Evaluating the efficacy of Segment Anything Model for delineating agriculture and urban green spaces in multiresolution aerial and spaceborne remote sensing images. Remote Sensing 16, 414 (2024).

  21. Li, Y., Wang, D., Yuan, C., Li, H. & Hu, J. Enhancing agricultural image segmentation with an agricultural Segment Anything Model adapter. Sensors 23, 7884 (2023).

  22. Wasil, M., Drak, A., Penfold, B., Scarton, L., Johenneken, M., Asteroth, A. & Houben, S. Parameter-efficient fine-tuning of vision foundation model for forest floor segmentation from UAV imagery. arXiv preprint arXiv:2505.08932 (2025).

  23. Tripathy, P., Baylis, K., Wu, K., Watson, J. & Jiang, R. Investigating the Segment Anything foundation model for mapping smallholder agriculture field boundaries without training labels. arXiv preprint arXiv:2407.01846 (2024).

  24. Ye, Z. et al. A comparison between pixel-based deep learning and object-based image analysis (OBIA) for individual detection of cabbage plants based on UAV visible-light images. Computers and Electronics in Agriculture 209, 107822 (2023).

  25. Horning, N. et al. Mapping of land cover with open-source software and ultra-high-resolution imagery acquired with unmanned aerial vehicles. Remote Sensing in Ecology and Conservation 6, 487–497 (2020).

  26. Lottes, P. et al. UAV-based crop and weed classification for smart farming. In 2017 IEEE International Conference on Robotics and Automation (ICRA) (2017).

  27. Karatzinis, G. D. et al. Towards an integrated low-cost agricultural monitoring system with unmanned aircraft system. In 2020 International Conference on Unmanned Aircraft Systems (ICUAS) (2020).

  28. Wilkinson, M. D. et al. The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data 3, 1–9 (2016).

  29. Karatzinis, G., Gkelios, S. & Kapoutsis, A. BaleUAVision: Hay bales UAV captured dataset. Zenodo https://doi.org/10.5281/zenodo.15304715 (2025).

  30. Karatzinis, G., Gkelios, S. & Kapoutsis, A. BaleUAVision: High-resolution UAV dataset for automated hay bale detection and counting. GitHub https://github.com/georkara/BaleUAVision (2025).

  31. Ravi, N. et al. SAM 2: Segment Anything in images and videos. arXiv preprint arXiv:2408.00714 (2024).

  32. Minderer, M., Gritsenko, A. & Houlsby, N. Scaling open-vocabulary object detection. Advances in Neural Information Processing Systems 36, 72983–73007 (2023).

Acknowledgements

This research was funded by European Union’s Horizon Europe Innovation Action iDriving, under Grant Agreement No 101147004. Views and opinions expressed are, however, those of the authors only and do not necessarily reflect those of the European Union or CINEA. Neither the European Union nor the granting authority can be held responsible for them.

Author information

Authors and Affiliations

  1. Information Technologies Institute (ITI), Centre of Research and Technology Hellas (CERTH), Thessaloniki, Greece

    Georgios D. Karatzinis, Socratis Gkelios & Athanasios Ch. Kapoutsis

Contributions

Conceptualization, G.K.; data acquisition, G.K.; data annotation, G.K.; methodology, G.K. and S.G.; technical validation, G.K. and S.G.; formal analysis, G.K. and S.G.; writing—original draft preparation, G.K., S.G. and A.K.; writing—review and editing, G.K., S.G. and A.K.; visualization, G.K., S.G. and A.K. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Georgios D. Karatzinis.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Cite this article

Karatzinis, G.D., Gkelios, S. & Kapoutsis, A.C. BaleUAVision: Hay Bales UAV Captured Dataset. Sci Data (2026). https://doi.org/10.1038/s41597-026-06622-8

  • Received: 09 May 2025

  • Accepted: 13 January 2026

  • Published: 29 January 2026

  • DOI: https://doi.org/10.1038/s41597-026-06622-8
