Document Type: Original Research Paper
Authors
1 Department of GIS and Remote Sensing, Faculty of Natural Resources and Environment, Islamic Azad University-Science and Research Branch, Tehran, Iran
2 Department of Surveying Engineering, Faculty of Civil Engineering, Shahid Rajaee Teacher Training University, Tehran, Iran
Abstract
Background and Objectives: Rice, a strategic crop for food security, holds a significant position in both national economies and the global food supply. Its role in meeting the dietary needs of populations has placed strong emphasis on collecting accurate, up-to-date data on rice fields, particularly on the quantity and quality of production. Remote sensing offers a cost-effective way to gather such data over extensive areas, and drones, with their superior spatial resolution and higher precision in many monitoring tasks compared to satellites, hold relative advantages in this context. This research applies deep learning to estimate the cultivation area of rice seedlings or seedbeds from RGB images captured by drones in the Wufeng region of Taichung Province, Taiwan, leveraging deep neural networks as an effective tool for analyzing complex data and achieving high accuracy in distinguishing different types of rice seedling or seedbed cultivation areas.
Methods: This study employs DenseNet, an advanced deep convolutional architecture, to model and predict the rice seedling or seedbed cultivation area in RGB images taken by drones. Through its stacked processing layers, the network extracts high-level abstract features from the data. DenseNet's distinguishing feature is its dense connectivity: each layer receives the concatenated feature maps of all preceding layers rather than only the output of the previous one, which encourages feature reuse and reduces the number of weights and parameters while increasing network efficiency. Because the trained network can process images immediately after acquisition, it supports near-real-time analysis and prediction of the rice seedling or seedbed cultivation area, providing the information needed for optimal farm management.
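The dense connectivity described above can be illustrated with a minimal sketch. This is not the authors' implementation; it is a toy dense block (with a 1x1-convolution-like linear map standing in for the full BN-ReLU-Conv composite) showing how each layer consumes the concatenation of all earlier feature maps and adds a fixed number of new channels, the "growth rate":

```python
import numpy as np

def dense_block(x, num_layers, growth_rate, rng):
    """Toy dense block: each layer sees the concatenation of all
    preceding feature maps and contributes `growth_rate` new channels."""
    features = [x]  # list of (H, W, C_i) feature maps
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)       # concatenate, not sum
        c_in = inp.shape[-1]
        w = rng.standard_normal((c_in, growth_rate)) * 0.01
        out = np.maximum(inp @ w, 0.0)                # 1x1-conv-like map + ReLU
        features.append(out)                          # new map joins the pool
    return np.concatenate(features, axis=-1)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 16))                   # toy feature tile from an RGB image
y = dense_block(x, num_layers=4, growth_rate=12, rng=rng)
print(y.shape)  # channels grow to 16 + 4*12 = 64: (8, 8, 64)
```

The channel count after L layers is the initial count plus L times the growth rate, which is why dense blocks achieve deep feature reuse with comparatively few parameters per layer.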
Findings: The model achieved an accuracy exceeding 99.8% on validation data. This exceptionally high figure indicates the strong capability of the DenseNet deep learning method in estimating the cultivation area of rice seedlings or seedbeds. Beyond demonstrating the model's performance in identifying and predicting rice cultivation areas, this accuracy supports its practical use: the model gives farmers and farm managers accurate, timely awareness of their farms' status, facilitating better decision-making in cultivation and productivity.
Conclusion: This study convincingly shows the viability of employing drones in conjunction with sophisticated deep learning techniques for accurately estimating the cultivation area of rice seedlings or seedbeds. This approach proves feasible, especially in geographical areas similar to Wufeng in Taichung Province, Taiwan. The integration of drones and deep learning represents a notable technological leap in monitoring capabilities, offering substantial assistance to pertinent authorities involved in agricultural management and ensuring food security.
COPYRIGHTS
© 2023 The Author(s). This is an open-access article distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license (https://creativecommons.org/licenses/by-nc/4.0/)