• Media type: E-Article
  • Title: Citrus yield prediction using deep learning techniques: a combination of field and satellite data
  • Contributor: Moussaid, Abdellatif [Author]; El Fkihi, Sanaa [Author]; Zennayi, Yahya [Author]; Kassou, Ismail [Author]; Bourzeix, François [Author]; Lahlou, Ouiam [Author]; El Mansouri, Loubna [Author]; Imani, Yasmina [Author]
  • Published: 2023
  • Published in: Journal of open innovation; 9(2023), no. 2, June 2023, Article ID 100075, pages 1-9
  • Language: English
  • DOI: 10.1016/j.joitmc.2023.100075
  • Keywords: Deep learning ; Machine learning ; Citrus yield prediction ; Precision farming ; Spectral data ; Open innovation ; Journal article
  • Description: The goal of this paper is to develop a deep learning model for predicting citrus yield. The data come from two sources: (1) field data, including fertilization and phytosanitary treatment products, water quantities used for irrigation, climatic data (temperature, precipitation, humidity, wind speed, and solar radiation), parcel sizes, and rootstock types for each parcel; and (2) images representing the normalized difference vegetation index (NDVI) and the normalized difference water index (NDWI), extracted from Sentinel-2 images taken before the harvest period. The data were collected over five years, from 2015 to 2019, and pertain to 50 parcels within a Moroccan orchard. Following data preparation, we constructed a deep neural network with multiple layers and parameters, taking the information from each parcel as training input. We then evaluated the model on new data from additional parcels located at various sites within our orchard. The test phase yielded the following scores: 0.0458 (Mean Squared Error), 0.1450 (Mean Absolute Error), and 0.10 (Percentage Error). These scores reflect the strong predictive capability of our approach.
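The abstract describes a multi-layer feed-forward network that maps per-parcel features (field data plus NDVI/NDWI statistics) to a yield value and is scored with MSE and MAE. The paper does not specify its architecture or framework, so the following is only a minimal NumPy sketch of that idea: the feature dimensions, layer sizes, learning rate, and synthetic data are all illustrative assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's inputs: each parcel is a feature
# vector combining field data (irrigation, climate, parcel size, ...) and
# NDVI/NDWI statistics; the target is a normalized yield value.
n_parcels, n_features = 200, 12
X = rng.normal(size=(n_parcels, n_features))
true_w = rng.normal(size=n_features)
y = np.tanh(X @ true_w) * 0.5 + 0.5 + rng.normal(scale=0.05, size=n_parcels)

# One-hidden-layer MLP trained with full-batch gradient descent on MSE.
hidden = 16
W1 = rng.normal(scale=0.1, size=(n_features, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=hidden)
b2 = 0.0
lr = 0.01

for _ in range(500):
    h = np.maximum(X @ W1 + b1, 0.0)      # ReLU hidden layer
    pred = h @ W2 + b2                    # linear output (regression)
    err = pred - y                        # gradient of MSE w.r.t. pred (up to 2/n)
    gW2 = h.T @ err / n_parcels
    gb2 = err.mean()
    dh = np.outer(err, W2) * (h > 0)      # backprop through ReLU
    gW1 = X.T @ dh / n_parcels
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Same evaluation metrics as reported in the abstract.
h = np.maximum(X @ W1 + b1, 0.0)
pred = h @ W2 + b2
mse = np.mean((pred - y) ** 2)
mae = np.mean(np.abs(pred - y))
print(f"MSE={mse:.4f}  MAE={mae:.4f}")
```

In practice a model like this would be trained on parcels from some years and evaluated on held-out parcels, as the paper's test on additional orchard sites does; the sketch trains and evaluates on the same synthetic data only to keep it self-contained.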
  • Access State: Open Access
  • Rights information: Attribution (CC BY)