Deep learning for mango (Mangifera indica) panicle stage classification
Journal contribution, posted on 2021-03-15, 03:33, authored by Anand Koirala, Kerry Walsh, Zhenglin Wang, Nicholas Anderson
Automated assessment of the number of panicles by developmental stage can provide information on the time spread of flowering and thus inform farm management. A pixel-based segmentation method for estimating flowering level from tree images was confounded by developmental stage. Therefore, single-stage and two-stage deep learning frameworks (YOLO and R2CNN, respectively) were considered, using either upright or rotated bounding boxes. For a validation image set and a total panicle count, the models MangoYOLO(-upright), MangoYOLO-rotated, YOLOv3-rotated, R2CNN(-rotated) and R2CNN-upright achieved weighted F1 scores of 76.5, 76.1, 74.9, 74.0 and 82.0, respectively. For a test set of images of another cultivar taken with a different camera, the R² between machine-vision and human counts of panicles per tree was 0.86, 0.80, 0.83, 0.81 and 0.76 for the same models, respectively. Thus, there was no consistent benefit from the use of rotated over upright bounding boxes. The YOLOv3-rotated model was superior in terms of total panicle count, and the R2CNN-upright model was more accurate for panicle stage classification. To demonstrate practical application, panicle counts were made weekly for an orchard of 994 trees, with a peak detection routine applied to document multiple flowering events.
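The abstract mentions a peak detection routine applied to weekly orchard-level panicle counts to identify multiple flowering events. A minimal sketch of such a routine is shown below; the function name, the local-maximum criterion, and the `min_prominence` parameter are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of peak detection on a weekly panicle-count series.
# A week is flagged as a flowering event if its count is a local maximum
# rising at least `min_prominence` above the lower of its two neighbours.
# This is an illustrative assumption, not the method used in the paper.

def find_flowering_peaks(weekly_counts, min_prominence=1):
    """Return indices of weeks identified as flowering-event peaks."""
    peaks = []
    for i in range(1, len(weekly_counts) - 1):
        left, mid, right = weekly_counts[i - 1], weekly_counts[i], weekly_counts[i + 1]
        if mid > left and mid >= right and mid - min(left, right) >= min_prominence:
            peaks.append(i)
    return peaks

# Example: two flowering events, peaking in weeks 2 and 6.
counts = [5, 40, 120, 80, 30, 60, 150, 90, 20]
print(find_flowering_peaks(counts))  # → [2, 6]
```

A prominence threshold of this kind suppresses minor week-to-week fluctuations so that only distinct flowering flushes are reported.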
Category 3 - Industry and Other Research Income
Number of Pages: 21
Full Text URL
Additional Rights: CC BY 4.0
Author Research Institute
- Institute for Future Farming Systems