
Abstract

A major bottleneck in the crop improvement pipeline is our limited ability to phenotype crops quickly and efficiently. Image-based, high-throughput phenotyping has a number of advantages because it is nondestructive and reduces human labor, but a new challenge arises in extracting meaningful information from large quantities of image data. Deep learning, a type of artificial intelligence, is an approach for analyzing image data and making predictions on unseen images, ultimately reducing the need for human input in the analysis. Here, we review the basics of deep learning, how its success is assessed, examples of its applications in plant phenomics, best practices, and open challenges.
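To make the workflow described above concrete, the following is a minimal sketch (not taken from the review) of how a deep learning model can be trained on labeled plant images and then used to predict the class of unseen images. It assumes PyTorch/torchvision and a hypothetical directory of images organized into one sub-folder per class; the directory name, class labels, and hyperparameters are illustrative assumptions only.

```python
# Minimal sketch: train an image classifier on labeled plant images, then
# predict on an unseen image. Directory layout and classes are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

# Standard preprocessing: resize every image and convert it to a tensor.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical labeled training set, e.g. plant_images/train/healthy, .../stressed
train_data = datasets.ImageFolder("plant_images/train", transform=preprocess)
train_loader = DataLoader(train_data, batch_size=16, shuffle=True)

# Transfer learning: start from a network pretrained on ImageNet and replace
# the final layer so it predicts the plant classes in the training set.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                      # a few epochs, for illustration only
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()                     # backpropagation updates the weights
        optimizer.step()

# After training, the model can label images it has never seen.
model.eval()
with torch.no_grad():
    example = torch.rand(1, 3, 224, 224)    # stands in for a new, unlabeled image
    predicted = train_data.classes[model(example).argmax(dim=1).item()]
    print(predicted)
```

Starting from a pretrained network rather than training from scratch is a common choice when labeled plant images are scarce, since the early layers already encode general visual features.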

