Abstract
The Red Palm Weevil (RPW) is one of the most destructive pests affecting palm trees worldwide, leading to severe agricultural and economic losses. Early detection is essential for effective intervention; however, conventional methods such as pheromone traps and manual inspections frequently fail to identify early-stage infestations. To address this challenge, this study developed an automated RPW detection framework using thermal image processing and deep learning. A Convolutional Neural Network (CNN) model was trained to analyze thermal images and detect early signs of infestation. The proposed model achieved a detection accuracy of 98.5%, outperforming traditional machine learning techniques in both precision and response speed. For real-time deployment, the trained CNN was integrated into a Raspberry Pi 4B, enabling a low-cost, scalable, and non-invasive monitoring solution suitable for field applications. While the system demonstrated strong performance under controlled conditions, the work also identified key limitations, including the limited penetration depth of thermal imaging and the need for large-scale field validation to ensure robustness under diverse real-world environments.
Introduction
The Red Palm Weevil (RPW), Rhynchophorus ferrugineus, is a highly destructive insect that silently infests palm trees by tunnelling into their trunks, inflicting severe internal damage that frequently goes undetected until it is too late. Conventional detection methods such as visual observation, pheromone traps, and acoustic sensors are either delayed or insufficiently accurate, making early intervention challenging. This work is based on the hypothesis that thermal image anomalies, specifically the subtle heat variations produced by internal RPW metabolic activity, can be used to accurately detect early-stage infestations before external symptoms appear. To counter these limitations, this project proposes an automated detection system that combines thermal imaging with a Convolutional Neural Network (CNN) and sophisticated image feature extraction techniques1.
Compared to existing RPW detection approaches that rely on acoustic sensors, pheromone traps, RGB images, or standalone machine learning models, the proposed framework makes a distinct contribution by integrating thermal imaging with Convolutional Neural Network (CNN) based analysis in an IoT-enabled environment. Figure 1a, b and c show a coconut palm infested by the Red Palm Weevil. The thermal images are preprocessed and examined to obtain features such as entropy, edge density, contrast, and correlation; texture descriptors such as the Gray-Level Co-occurrence Matrix (GLCM) and Local Binary Pattern (LBP); Hu and Haralick moments; Histogram of Oriented Gradients (HOG) features; and thermal indicators such as hotspot patterns and mean intensity. These are combined with a CNN to effectively classify palm trees as RPW infested or not. A custom CNN was chosen over lightweight models such as MobileNet and EfficientNet because thermal images contain low-texture patterns, and the shallower CNN demonstrated higher accuracy and lower computational cost, making it more suitable for Raspberry Pi based IoT deployment. The system is embedded in a Flask-based web platform through which users can upload thermal images, view visualized processing outcomes, and receive real-time predictions. This early detection method not only improves the accuracy of pest diagnosis but also facilitates environmentally friendly agriculture by minimizing unnecessary pesticide application and allowing for targeted intervention2. The framework thus provides a non-invasive, automated tool for proactive RPW monitoring, enhancing efficiency in agricultural pest management and supporting environmentally friendly intervention strategies.
Traditional Red Palm Weevil (RPW) detection methods rely mostly on manual inspection, visual surveys, and pheromone traps, which tend to cause delayed and incorrect identification. They are exhausting, time-consuming, and highly dependent on the skill of field officers. They also have limited utility in the early stages of infestation, when the presence of the weevil is hardly detectable3. Conventional methods likewise fail in large-scale monitoring situations because of their limited capacity to cover large areas and because RPW damage is complex in nature and often goes unnoticed until extensive harm has been caused. Conversely, contemporary methods, including the machine learning model used in this project, provide faster, more precise, and automated RPW detection from thermal images, extracting significant features such as entropy, edge density, and shape complexity that are more efficient and trustworthy for early detection and extensive monitoring. Several studies have proposed different techniques for detecting Red Palm Weevil (RPW) infestation in palm trees, highlighting the importance of early diagnosis4. The detection mechanisms proposed are either image processing-based5 or acoustic sensing and processing-based6,7. One of these studies proposed a machine learning-based approach for detecting the most prevalent palm tree diseases, including RPW pests, leaf spots, and blight spots, using a thermal image database. Various image processing methods were applied to the images, followed by a comparative analysis of Support Vector Machine (SVM) and VGG-based Convolutional Neural Network (VGG-CNN) classifiers: the SVM achieved 92.8% accuracy, while the VGG-CNN recorded a 97.9% detection accuracy. Alsanea et al.8 proposed a real-time, image-based RPW pest detection and localization model.
Using a Region-based CNN (R-CNN) for localization and a CNN for feature extraction and classification, they achieved a 100% detection rate on the tested dataset. Al-Saqer SM and Hassan GM applied two supervised learning models, Scaled Conjugate Gradient and Conjugate Gradient with Powell/Beale restarts, to detect RPW from an image database; their 3-layer ANN models achieved accuracies of 93% and 93.5%, respectively. The shortcomings of existing detection algorithms are summarized in Table 1.
To provide stronger comparative context, Table 1 highlights the strengths and limitations of prior RPW detection methods, but a broader performance interpretation is essential to contextualize the novelty of the proposed system. Earlier methods such as visual surveys and pheromone traps, while low-cost, failed to support early-stage detection. Machine learning and deep learning models such as SVM, Random Forest, and VGG reported moderate accuracy, but they often suffered from computational complexity, high deployment cost, and limited scalability for real-time field applications. The shortcomings of traditional RPW detection methods, such as delayed diagnosis, high labor dependency, and reduced reliability during early infestation, make them time-consuming, error-prone, and ineffective at identifying internal damage in its initial stages. To address this research gap, the present work proposes a smart IoT-based thermal imaging approach designed for early and accurate detection of RPW infestation.
The novelty and main contributions of the proposed work are highlighted as follows:
- The system is implemented on a Raspberry Pi 4B, providing a cost-effective, real-time, and scalable solution suitable for direct field deployment, an advantage over many existing laboratory-based approaches.
- The proposed thermal-CNN framework demonstrates strong potential for practical in-field pest monitoring and can be extended to other agricultural bio-sensing and early-diagnosis applications, making it a versatile tool for precision agriculture.
System framework
The architecture of the proposed Red Palm Weevil (RPW) detection system begins with an image acquisition stage, where thermal images of palm trunks are captured using a USB-based long-wave infrared thermal camera module interfaced with a Raspberry Pi. The images are then organised into a dataset labelled as RPW infested or healthy (Non-RPW) to support supervised learning, as shown in Fig. 2.
Thermal camera specification
Thermal images were captured using a compact USB-type infrared thermal camera, shown in Fig. 3. The device operates in the long-wave infrared (LWIR) spectral range of 8–14 μm, suitable for detecting biological heat anomalies inside palm trunks. The module provides a thermal resolution of 320 × 240 pixels, a temperature detection range of −4 °C to 300 °C, and supports USB 2.0 plug-and-play connectivity. The camera was calibrated using two-point calibration with ambient and warm reference surfaces, and temperature normalization was applied to ensure consistent, reproducible thermal readings across environmental conditions. This calibration ensured high reliability of thermal signatures during in-field RPW detection. Table 2 presents the detailed specifications of the thermal camera.
In the pre-processing stage, thermal images are enhanced and standardised through resizing, noise removal, contrast and brightness adjustment, and data augmentation through rotation, flipping, or scaling to increase dataset diversity.
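The geometric augmentations named above (rotation and flipping) can be sketched as follows; this is an illustrative NumPy sketch, not the authors' pipeline, and scaling and the enhancement steps are omitted for brevity:

```python
import numpy as np

def augment(img):
    """Generate simple geometric augmentations of a 2-D thermal frame:
    rotations by 90/180/270 degrees plus horizontal and vertical flips."""
    return [
        img,
        np.rot90(img, 1),
        np.rot90(img, 2),
        np.rot90(img, 3),
        np.fliplr(img),
        np.flipud(img),
    ]

frame = np.arange(12, dtype=np.float32).reshape(3, 4)  # toy "thermal" frame
aug = augment(frame)
print(len(aug))        # 6 augmented views per input image
print(aug[1].shape)    # a 90-degree rotation swaps height and width: (4, 3)
```

Each input image thus yields six views, multiplying the effective dataset size without collecting new field data.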
Ambient Temperature Normalisation:
As field temperatures vary with the time of day and environmental heat sources, each thermal frame undergoes normalisation. Equation 1 normalises each pixel as Tnorm = (Tpixel − Tambient)/Tref, where Tambient is the ambient temperature estimated through neighborhood averaging, Tpixel is the measured pixel temperature, and Tref is a reference temperature span. This normalisation highlights the minute temperature deviations caused by RPW larval movement. Although thermal images are unaffected by visible light, environmental factors such as sun exposure, bark heating, and wind-induced temperature fluctuation introduce noise. The attribute selection step identifies and extracts the most relevant features, such as color, texture, and shape-based features, to improve classification accuracy. The dataset is then split into training and testing subsets, typically with 70% of the data for training and 30% for testing and validation. A CNN model is then trained on these data. The trained model performs RPW detection on unseen images by analyzing extracted features and predicting the presence or absence of RPW, producing a final label of RPW or Non-RPW to indicate the status of the palm. The proposed detection system integrates image processing and deep learning for accurate identification of RPW activity. It consists of four main stages: data acquisition, feature extraction, model training, and model evaluation and classification, as shown in Fig. 3.
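The ambient normalisation described above can be sketched in NumPy. The exact form of Equation 1 and the neighbourhood used for averaging are not fully specified, so the deviation-over-reference form and the border-based ambient estimate below are assumptions made for illustration:

```python
import numpy as np

def normalise_frame(t_pixel, t_ref=10.0, border=4):
    """Ambient-normalise a thermal frame: estimate T_ambient by
    neighbourhood averaging (here, the mean of a border region assumed
    to show background bark), then express every pixel as a deviation
    scaled by a reference span T_ref -- an assumed form of Eq. 1."""
    mask = np.zeros_like(t_pixel, dtype=bool)
    mask[:border, :] = mask[-border:, :] = True
    mask[:, :border] = mask[:, -border:] = True
    t_ambient = t_pixel[mask].mean()
    return (t_pixel - t_ambient) / t_ref

frame = np.full((32, 32), 30.0)      # uniform trunk surface at 30 C
frame[14:18, 14:18] = 33.5           # simulated larval hotspot, +3.5 C
norm = normalise_frame(frame)
print(norm.max())                    # hotspot stands out at 0.35
```

The subtraction removes the ambient baseline, so a small metabolic hotspot becomes a clear positive deviation regardless of the absolute field temperature.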
Data acquisition
The image data were recorded in the field. A locally maintained dataset comprising approximately 6000 RPW and Non-RPW thermal images is used for training and testing. RPW thermal image samples are drawn from test databases containing both infested and non-infested tree recordings. Thermal images were captured at a fixed sensor-to-trunk distance of 1.2–1.5 m during early morning (6:30–8:30 AM) to ensure stable ambient conditions. Images of Red Palm Weevil (RPW) and Non-RPW activity were collected from three different trunk locations of coconut trees showing potential signs of infestation, with the kind support of Mr. Thangadurai, on a 17-acre coconut grove in Orathanadu, Thanjavur District. Figures 4a, b and c and 5a, b and c show thermal images of an RPW-infested trunk in axial and bottom views.
Feature extraction
The raw thermal images were preprocessed and filtered to enhance their quality before feature analysis. The feature extraction stage then computes important features such as entropy, edge density, contrast, correlation, energy, homogeneity, shape complexity, and texture features such as Haralick descriptors and HOG. These features are essential for discriminating the minute differences between healthy and infested palms.
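Two of the simplest features named above, histogram entropy and edge density, can be sketched in plain NumPy (the GLCM, LBP, Hu/Haralick, and HOG descriptors would normally come from a library such as scikit-image; the thresholds below are illustrative choices, not the paper's values):

```python
import numpy as np

def entropy(img, bins=32):
    """Shannon entropy of the grey-level histogram (texture randomness)."""
    hist, _ = np.histogram(img, bins=bins, range=(img.min(), img.max() + 1e-9))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def edge_density(img, thresh=1.0):
    """Fraction of pixels whose gradient magnitude exceeds a threshold."""
    gy, gx = np.gradient(img.astype(np.float64))
    mag = np.hypot(gx, gy)
    return float((mag > thresh).mean())

flat = np.full((64, 64), 25.0)                               # uniform, healthy-like patch
noisy = flat + np.random.default_rng(0).normal(0, 2.0, flat.shape)
print(entropy(flat) < entropy(noisy))           # textured patch has higher entropy
print(edge_density(flat) < edge_density(noisy)) # and higher edge density
```

A uniform (healthy-like) region scores near zero on both measures, while thermally irregular regions score higher, which is the discriminative signal the classifier exploits.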
Model training
The extracted features are used to train a machine learning model, a convolutional neural network (CNN), that learns intricate patterns from the features14. The CNN was trained on a curated dataset of 6000 thermal images to identify the characteristic temperature irregularities and texture deviations present in infested palms. The dataset was split into 70% training, 15% validation, and 15% test sets, using random sampling without stratification.
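The random, non-stratified 70/15/15 split described above can be sketched as follows (the seed is an arbitrary illustrative choice):

```python
import random

def split_indices(n, train=0.70, val=0.15, seed=42):
    """Random (non-stratified) 70/15/15 split of n sample indices,
    mirroring the split described for the 6000-image dataset."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    n_train = int(n * train)
    n_val = int(n * val)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

tr, va, te = split_indices(6000)
print(len(tr), len(va), len(te))   # 4200 900 900
```

Shuffling before slicing ensures the three subsets are disjoint random samples of the full dataset.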
Model evaluation
The trained model is tested and validated against classical RPW detection techniques to assure its accuracy and resilience across infestation stages. Model performance is assessed in terms of various metrics, including accuracy, precision, recall, and F1-score. By exploiting the extracted features, the model discriminates efficiently between infested and healthy palms and provides rapid, accurate detection. This AI-based method facilitates early identification and can be used for bulk monitoring in agronomic settings.
RPW detection pipeline using CNN
The proposed CNN architecture for RPW detection from thermal images comprises several key components. The input layer accepts pre-processed thermal images of size 224 × 224 × 1. Convolutional layers then perform feature extraction by applying multiple learnable filters that detect local patterns such as temperature anomalies. The model consists of four convolutional blocks, each containing a Conv2D layer followed by ReLU activation and max-pooling. The first block uses 32 filters, the second 64, the third 128, and the fourth 256, all with a kernel size of 3 × 3. To reduce over-fitting, a dropout layer (rate = 0.3) is added after the third and fourth blocks, as shown in Figs. 6 and 7.
A ReLU activation function introduces non-linearity after each convolution, enabling the network to learn complex patterns. Pooling layers reduce the spatial dimensions of the feature maps while retaining the most prominent features. The fully connected (FC) layer maps the high-level learned features into a lower-dimensional output. A softmax layer converts the raw logits into class probabilities for RPW or Non-RPW, and finally the classification layer outputs the predicted class, indicating whether the palm tree is infested or healthy. The total number of trainable parameters in the model is 1.32 million, making it lightweight and suitable for real-time deployment.
The following essential elements make up the CNN architecture for RPW detection from thermal images:
Input layer
Accepts preprocessed thermal images of size 224 × 224 × 1.
Convolutional layers
Apply several learnable filters to extract features such as temperature anomalies.
Activation function (ReLU)
After convolution, non-linearity is introduced: f(x) = max(0,x).
Pooling layers (max pooling)
Reduce the spatial dimensions of the feature maps while keeping the most noticeable features.
Fully connected (FC) layer
Maps the learned high-level features into a lower-dimensional output space.
Softmax layer
Converts raw logits into RPW or Non-RPW class probabilities.
Classification layer
Produces the final class prediction.
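The dimension and parameter flow implied by the four-block architecture above can be checked with a short arithmetic sketch. It assumes 'same' padding and 2 × 2 pooling, which are not stated explicitly in the text; on those assumptions, the convolutional layers alone account for about 0.39 million of the reported 1.32 million trainable parameters, with the remainder in the fully connected layers:

```python
def conv_block(h, w, c_in, c_out, k=3, pool=2):
    """One Conv2D(k x k, 'same' padding) + ReLU + 2x2 max-pool block:
    returns the output spatial size, channel count and the block's
    trainable parameter count (weights plus biases)."""
    params = (k * k * c_in + 1) * c_out
    return h // pool, w // pool, c_out, params

h, w, c = 224, 224, 1                       # input: 224 x 224 x 1 thermal image
total = 0
for filters in (32, 64, 128, 256):          # the four blocks described above
    h, w, c, p = conv_block(h, w, c, filters)
    total += p
print(h, w, c)     # feature map after four pools: 14 14 256
print(total)       # convolutional parameters: 387840
```

The 14 × 14 × 256 map is then flattened and passed to the FC and softmax layers, which hold the remaining parameters of the 1.32 million total.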
The pre-processed thermal images are fed into a specialized convolutional neural network (CNN) designed to learn hierarchical feature representations of the input data. The convolutional layers recognize regional spatial patterns, such as thermal anomalies, textural inconsistencies, and contour features indicative of pest infestation. Pooling layers, typically max pooling, reduce the spatial dimensions of feature maps while retaining dominant features15. The resulting feature maps are then flattened and passed through fully connected layers to learn high-level abstractions, with ReLU activation functions applied after each convolution to introduce nonlinearity. The final output layer, using either a sigmoid or softmax classifier depending on whether the task is binary or multi-class, provides the probability of RPW infestation in the image16.
Figure 8 shows the proposed CNN flowchart. The CNN is trained on a curated dataset of 6000 thermal images (4500 RPW and 1500 Non-RPW) using binary cross-entropy loss and the Adam optimizer, with a learning rate of 0.0001, a batch size of 32, and 50 epochs. To improve model interpretability, Grad-CAM (Gradient-weighted Class Activation Mapping) was employed to visualize the regions that contribute most to the CNN's decision.
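The binary cross-entropy objective quoted above can be computed directly; this is a generic stdlib sketch of the loss formula, not the authors' training code:

```python
import math

def bce(y_true, p_pred, eps=1e-7):
    """Binary cross-entropy averaged over a batch, the training objective
    quoted above: -[y*log(p) + (1 - y)*log(1 - p)]."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1.0 - eps)   # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident correct predictions give low loss; confident wrong ones high loss.
print(round(bce([1, 0], [0.95, 0.05]), 4))   # 0.0513
print(round(bce([1, 0], [0.05, 0.95]), 4))   # 2.9957
```

The asymmetry between the two cases is what lets the optimizer push predicted probabilities toward the correct class label.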
Results and discussion
The proposed RPW detection model processes thermal image data through a feature extraction pipeline and classifies it using a Convolutional Neural Network (CNN). The model is trained using cross-entropy loss and optimized with the Adam algorithm. The deep learning-based framework effectively distinguishes RPW-infested samples from non-infested ones, thereby supporting early pest monitoring and control. Table 3 presents the dataset statistics, and the class distribution is illustrated in Fig. 9.
Data acquisition and filtering
Thermal images were captured under controlled environmental conditions to ensure consistency and reliability. The dataset includes thermal signatures16 from both RPW-infested and healthy palm trees. These images form the foundation for subsequent pre-processing steps, including filtering and segmentation, as illustrated in Figs. 10 and 11.
This work investigates the performance of an artificial-intelligence-based detection system for recognizing images of the Red Palm Weevil (RPW) and classifying palm trees as healthy (Non-RPW) or infested (RPW). To present a balanced assessment, the RPW classification model was trained on a dataset consisting of both RPW and Non-RPW samples. Table 4 shows infested and non-infested thermal images of coconut palm. A brood of Red Palm Weevil larvae was discovered deep inside the coconut trunk, indicating an advanced stage of infestation, and thermal imaging successfully identified the presence of the brood within the infested palm region. RPW larvae feed continuously on the internal tissues, causing severe structural damage to the tree, so early detection of a brood helps prevent the spread of infestation to nearby palms. The proposed CNN-based model is assessed using standard performance metrics, such as accuracy, precision, recall, and F1-score. Complete details of the dataset and the evaluation outcomes of the proposed model are elaborated in the following sections17.
Evaluation results of deep learning method
In the current research work, the raw thermal images are subjected to a sequence of preprocessing operations: resizing, conversion to grayscale, and normalization. The preprocessed thermal images are then input to a Convolutional Neural Network (CNN) to ascertain the presence or absence of RPW infestation. The CNN is designed to automatically identify spatial features, such as thermal intensity patterns and temperature anomalies, indicative of RPW infestation.
Table 5 shows the model performance as measured by a classification report and accuracy scores. The CNN model achieved 98.5% overall accuracy, proving exceedingly efficient at classifying RPW and Non-RPW samples. Visual inspection and classical image-based diagnostic techniques for RPW infestation have important limitations: late diagnoses, usually only after extensive internal damage has already been done; inability to access real-time data in field conditions; and a high incidence of false positives due to visually similar diseases or abiotic effects such as sunlight-induced heating. These limitations make classical visual examination ineffective for the early intervention that is imperative to control RPW infestation before extensive, irreversible harm is inflicted on the vascular system of the tree. The proposed CNN overcomes these limitations by learning temperature-based structural features. To further strengthen the evaluation, additional performance analyses were incorporated. A confusion matrix was generated to visualize the class-wise prediction distribution, clearly highlighting the true positives, true negatives, false positives, and false negatives for both RPW and Non-RPW categories. Furthermore, the model's discriminative ability was quantified using the Receiver Operating Characteristic (ROC) curve and the corresponding Area Under the Curve (AUC), achieving an AUC value of 0.99. A precision-recall curve was also included to provide deeper insight into model performance under class-imbalance conditions. To ensure robustness, 5-fold cross-validation was performed. Additionally, an error analysis was conducted to identify common false positives and false negatives: false negatives mainly occurred in images with weak thermal contrast, while false positives were mostly due to sun-exposed regions or environmental heat reflections.
Representative misclassified samples were evaluated to aid future dataset improvements.
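The reported metrics follow mechanically from confusion-matrix counts. A stdlib sketch of the formulas is shown below; the counts are hypothetical, chosen only to illustrate the arithmetic, and are not the paper's actual confusion matrix:

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall and F1 from a binary confusion matrix,
    the metrics used to evaluate the RPW / Non-RPW classifier."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts for a 900-image test set (illustrative only).
acc, prec, rec, f1 = metrics(tp=660, fp=5, fn=9, tn=226)
print(round(acc, 4))   # 0.9844
print(prec > rec)      # few false positives but more false negatives
```

With few false positives and comparatively more false negatives, precision exceeds recall, the same pattern the training results report.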
This disparity arises because validation accuracy and validation loss assess different facets of model performance. While loss measures the degree of confidence in the model's predictions, accuracy only shows the percentage of correctly classified samples. In this research, a number of predictions were made with probability spread across classes, even though 98.5% of the samples were correctly classified; this explains the elevated validation loss, since low-confidence predictions incur a higher loss value even when the classification is correct. For early detection, thermal imaging has proved a promising non-invasive approach: RPW activity produces localized internal metabolic heat that causes minute surface-level thermal abnormalities, generally invisible to the naked eye but detectable with high-resolution infrared detectors. Much previous image-based work used hand-crafted features or generic machine learning classifiers, which could not generalize over a large range of thermal environments18. Some more recent approaches used deep learning and transfer learning models but were over-fitted or not domain-specifically optimized for thermal data. Table 6 shows the comparison of different classifiers.
In contrast to these approaches, our method employs a specifically tailored Convolutional Neural Network (CNN) trained end-to-end on thermal palm stem images to automatically learn the low-level and high-level thermal features typical of RPW infestation. Pre-processing techniques such as bilateral filtering, histogram equalization, and Otsu's thresholding were employed to enhance image quality and separate the region of interest19. During training, precision (99.2%) was higher than recall (87.6%), indicating a mild class imbalance in the dataset. Although the model achieved a validation accuracy of 98.5%, the validation loss (0.97) was relatively high. Robustness was assessed with a 5-fold cross-validation procedure, in which the dataset is randomly shuffled, divided into five equal subsets, and iteratively trained on four folds while the remaining fold is used for validation; the process repeats five times, and the final performance is computed as the average of all evaluation metrics.
Overview of the process
Shuffle the dataset to ensure random distribution of samples, then split it into five equal folds20.
Iteration 1
Train on folds 1,2,3,5 and validate on fold 4.
Iteration 2
Train on folds 1,3,4,5 and validate on fold 2.
Iteration 3
Train on folds 1,2,4,5 and validate on fold 3.
Iteration 4
Train on folds 1,2,3,4 and validate on fold 5.
Iteration 5
Train on folds 2,3,4,5 and validate on fold 1.
Compute the performance metrics from all five iterations and average the results to obtain the final model performance. The proposed CNN model outperformed the baseline classifiers and popular pre-trained models such as VGG16, Random Forest, and SVM on precision, recall, and F1-score. Moreover, model interpretability was boosted using the Grad-CAM technique, which indicated the particular thermal areas contributing to the classification decision21. This provided visual validation and made the model's outputs more explainable to plant pathologists and agricultural experts. Figure 12 shows the confusion matrix of the classifier. The ROC curves for RPW and Non-RPW are included in Figs. 13 and 14, and Fig. 15 shows a bar chart comparing model performance for RPW detection.
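The fold assignment used by the procedure above can be sketched with the stdlib; the seed is an arbitrary illustrative choice:

```python
import random

def five_fold(n_samples, seed=7):
    """Assign shuffled sample indices to five equal folds; fold i then
    serves as the validation set in iteration i, the rest as training."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    size = n_samples // 5
    return [idx[i * size:(i + 1) * size] for i in range(5)]

folds = five_fold(6000)
for i, val_fold in enumerate(folds):
    # each iteration trains on the other four folds (4800 samples)
    train = [s for j, f in enumerate(folds) if j != i for s in f]
print(len(folds), len(folds[0]))   # 5 folds of 1200 samples each
```

Because every sample appears in exactly one validation fold, averaging the five scores uses the whole dataset for validation exactly once.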
Raspberry Pi experimental setup
The Raspberry Pi system is powered by a standard 5 V supply through its USB-C port22, which provides enough current to run the Raspberry Pi 4B along with all attached peripheral devices. The USB mouse and keyboard used to interact with the system connect to the Raspberry Pi's USB ports, as shown in Fig. 16. Model inference on the Raspberry Pi 4B required a compact model size (≈ 5–25 MB), with memory optimized using INT8 quantization (8-bit integer values); the device consumed about 4–8 W, and its temperature rose by 10–30 °C during inference.
These input devices allow the user to navigate the graphical user interface, enter commands, and manage applications running on the Pi, including the Red Palm Weevil detection software23. A cell phone or a USB camera is also plugged into one of the USB ports; it either captures photos of palm trees and possible weevil infestations or transfers previously taken photos for processing24. The pictures are then processed by machine learning models, such as Convolutional Neural Networks (CNNs) or Support Vector Machines (SVMs), running directly on the Raspberry Pi 4B25. The HDMI port of the Raspberry Pi is used for video output; because the monitor is only VGA-capable, an HDMI-to-VGA adapter is used in the connection, allowing the Raspberry Pi's desktop environment to be displayed on the monitor, where image processing output, detection results, and system logs can be viewed in real time26. In total, this hardware setup converts the Raspberry Pi into a small but powerful embedded AI system that can identify Red Palm Weevil infestations through image analysis, as shown in Figs. 17 and 18.
In addition to the physical setup, quantitative hardware performance metrics are essential for validating the practical feasibility of the Raspberry Pi based implementation. Important parameters such as inference latency (the time from image input to classification output), model size in megabytes, memory footprint during execution, and the energy consumption and heat generation of the Raspberry Pi during continuous inference should be evaluated and reported27. The inference latency, defined as the interval between acquiring an input image and generating the predicted label, was recorded on the Raspberry Pi 4B: the non-quantized model required an average of ~412 ms per image, corresponding to an effective throughput of approximately 2.4 FPS. These measurements strengthen the claim that the system is suitable for real-time field deployment.
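A latency measurement of the kind described above can be sketched with `time.perf_counter`; the `infer` callable below is a dummy stand-in for the deployed model's predict call, not the actual CNN:

```python
import time

def measure_latency(infer, image, warmup=3, runs=20):
    """Average per-image inference latency (seconds) and throughput (FPS)
    for a callable `infer`, timed over several runs after a warm-up."""
    for _ in range(warmup):
        infer(image)                 # let caches settle before timing
    start = time.perf_counter()
    for _ in range(runs):
        infer(image)
    latency = (time.perf_counter() - start) / runs
    return latency, 1.0 / latency

# Dummy workload standing in for the CNN. The reported ~412 ms per image
# on the Pi 4B implies roughly 1 / 0.412 = 2.43 FPS, matching the figure above.
lat, fps = measure_latency(lambda img: sum(img), list(range(1000)))
print(lat > 0.0 and fps > 0.0)
```

Averaging over many runs after a warm-up smooths out scheduler jitter, which matters on a small board running a desktop environment alongside inference.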
Figure 19a and b show the verification of Red Palm Weevil infestation through stem dissection. The infestation status of each palm was verified using a systematic and reliable ground-truthing process. First, trained agricultural specialists examined the trees for visible signs of Red Palm Weevil (RPW) activity, including boreholes, frass deposits, crown collapse, and early tissue decay. When visual inspection suggested probable infestation, selective stem dissection was carried out to directly confirm the presence of larvae, pupae, or internal feeding tunnels. In cases where destructive sampling was not possible, non-invasive acoustic monitoring was used to detect the distinctive feeding and movement sounds produced by RPW larvae inside the trunk.
Real-time Red Palm Weevil (RPW) detection is integrated into the experimental setup. Thermal images are shown on the monitor, demonstrating the detection outcomes of the proposed deep learning model, which combines image classification with thermal feature extraction28. The Raspberry Pi acts as the data acquisition module, sending real-time image data (taken by a connected camera) to the display via HDMI and USB29. It processes the data using the CNN model to classify images and display outcomes such as "Detected RPW". The Raspberry Pi is also connected to a smartphone, which acts as a portable input source for capturing RPW-related images in field conditions.
Conclusion
This study introduced a CNN-based deep learning framework for early detection of Red Palm Weevil infestations using thermal imaging. Unlike conventional image-based approaches that depend on visible symptoms, the model leverages raw thermal data, processed and enhanced through a dedicated pipeline, to identify temperature anomalies caused by RPW larval activity before severe damage occurs. Carefully curated thermal images, along with extensive pre-processing and augmentation, enabled the training of a robust CNN architecture that achieved 98.5% accuracy, outperforming both traditional machine learning and popular deep learning baselines. By eliminating reliance on post-infestation symptoms and enabling early-stage intervention, this thermal imaging-based approach offers a critical advantage in managing RPW outbreaks. It supports site-specific pest control strategies, reducing dependence on blanket pesticide applications and promoting environmentally sustainable agricultural practices. A key finding of the improved RPW detection framework is that it reduced false positives compared with acoustic algorithms. However, the study has limitations: model validation was conducted only on a restricted palm species and within a single geographic region, which may affect generalizability, and environmental factors such as humidity, surface moisture, and seasonal temperature variations may influence thermal patterns and detection consistency. These constraints highlight the need for broader validation and environmental adaptation.
Limitation and future scope
Although the proposed framework demonstrated promising results under controlled testing conditions, several limitations remain. The performance of thermal-based detection may vary with external environmental factors such as humidity, ambient temperature, direct sunlight exposure, and seasonal fluctuation, which can influence the thermal contrast between healthy and infested palm trees. Additionally, the dataset used in this study was limited to a specific palm species and geographic region; further validation across multiple palm varieties and diverse climatic conditions is therefore required to ensure generalizability. Future research may explore the integration of multimodal sensing, combining thermal imaging with acoustic signatures or biochemical markers, to enhance early detection sensitivity. Incorporating the model into UAV platforms or IoT-enabled smart farming systems may support scalable, automated, and continuous monitoring at the plantation level. Expanding the dataset, improving environmental compensation algorithms, and optimizing the model for low-power devices will further strengthen the applicability of the system for precision agriculture. The proposed RPW detection system aligns with key Sustainable Development Goals (SDGs), particularly SDG 2 (Zero Hunger), by protecting crops and preventing yield losses, and SDG 12 (Responsible Consumption and Production), by enabling targeted and sustainable pest management. Through early and non-destructive detection, the system contributes to sustainable agricultural practices and long-term environmental protection.
Data availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
References
Arasi, M. A. et al. Enhancing red palm weevil detection using bird swarm algorithm with deep learning model. IEEE Access. 12, 1542–1551. https://doi.org/10.1109/ACCESS.2023.3348412 (2023).
Baydoun, M. & Al-Alaoui, M. A. Modified edge detection for segmentation. In 2015 International Symposium on Signals, Circuits and Systems (ISSCS) (pp. 1–4). IEEE. https://doi.org/10.1109/isscs.2015.7204001 (2015).
Kalra, A. & Chhokar, R. L. A hybrid approach using Sobel and Canny operators for digital image edge detection. In 2016 International Conference on Micro-Electronics and Telecommunication Engineering (ICMETE) (pp. 305–310). IEEE. https://doi.org/10.1109/icmete.2016.49 (2016).
Kapadia, H., Patel, R., Shah, Y., Patel, J. B. & Patel, P. V. An improved image pre-processing method for concrete crack detection. In International Conference on ISMAC in Computational Vision and Bio-Engineering (pp. 1611–1621). Springer, Cham. https://doi.org/10.1007/978-3-030-00665-5_149 (2018).
Fletcher, R. R. et al. The use of mobile thermal imaging and deep learning for prediction of surgical site infection. In 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC) (pp. 5059–5062). IEEE. https://doi.org/10.1109/embc46164.2021.9630094 (2021).
Ejiyi, C. J. et al. ResfEANet: ResNet-fused external attention network for tuberculosis diagnosis using chest X-ray images. Comput. Methods Programs Biomed. Update. 5, 100133. https://doi.org/10.1016/j.cmpbup.2023.100133 (2024).
Dehner, C., Olefir, I., Chowdhury, K. B., Jüstel, D. & Ntziachristos, V. Deep-learning-based electrical noise removal enables high spectral optoacoustic contrast in deep tissue. IEEE Trans. Med. Imaging. 41 (11), 3182–3193. https://doi.org/10.1109/tmi.2022.3180115 (2022).
Alsanea, M., Habib, S. & Khan, N. F. A deep-learning model for real-time red palm weevil detection and localization. J. Imaging. 8, 170 (2022).
Kalra, R., Modi, J. N. & Vyas, R. Involving postgraduate's students in undergraduate small group teaching promotes active learning in both. Int. J. Appl. Basic. Med. Res. 5 (Suppl 1), S14–S17 (2015).
Eldin, H. A. et al. A survey on detection of Red Palm Weevil inside palm trees: Challenges and applications. In Proceedings of the 9th International Conference on Software and Information Engineering (119–125) (2020).
Alaa, H. et al. An intelligent approach for detecting palm trees diseases using image processing and machine learning. Int. J. Adv. Comput. Sci. Appl. 11 (7), 434–441 (2020).
Al-Saqer, S. M. & Hassan, G. M. Artificial Neural Networks Based Red Palm Weevil (Rynchophorus Ferrugineous, Olivier) Recognition System. Am. J. Agric. Biol. Sci. 6 (3), 356–364 (2011).
Upadhyay, N., Sharma, D. K. & Bhargava, A. 3SW-Net: A feature fusion network for semantic weed detection in precision agriculture. Food Anal. Methods. 18, 2241–2257 (2025).
Karthik, K. G. V. S., Nithin, K., Dhanush, B., Praveen, K. & Sarath, S. Data augmentation of neonatal thermal images using deep learning. In 2021 12th International Conference on Computing Communication and Networking Technologies (ICCCNT) (pp. 1–6). IEEE. https://doi.org/10.1109/icccnt51525.2021.9579769 (2021).
Huang, C. H., Lin, C. F., Chen, C. A., Hwang, C. H. & Huang, D. C. Real-time rehabilitation exercise performance evaluation system using deep learning and thermal image. In 2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC) (pp. 1–6). IEEE. https://doi.org/10.1109/i2mtc43012.2020.9129146 (2020).
Selvakarthi, D. et al. Experimental analysis using deep learning techniques for safety and riskless transport-a sustainable mobility environment for post covid-19. In 2021 6th International Conference on Inventive Computation Technologies (ICICT) (pp. 980–984). IEEE. (2021). https://doi.org/10.1109/icict50816.2021.9358749
Rezaei, A. et al. An unobtrusive human activity recognition system using low resolution thermal sensors, machine and deep learning. IEEE Trans. Biomed. Eng. 70 (1), 115–124. https://doi.org/10.1109/tbme.2022.3186313 (2022).
Chen, D. Y., Zou, H. S. & Hsieh, A. T. Thermal image based remote heart rate measurement on dynamic subjects using deep learning. In 2020 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-Taiwan) (pp. 1–2). IEEE. https://doi.org/10.1109/icce-taiwan49838.2020.9258129 (2020).
Gallagher, J. E. & Oughton, E. J. Assessing thermal imagery integration into object detection methods on air-based collection platforms. Sci. Rep. 13 (1), 8491. https://doi.org/10.1038/s41598-023-34791-8 (2023).
Behera, S. K., Rath, A. K. & Sethy, P. K. Maturity status classification of Papaya fruits based on machine learning and transfer learning approach. Inform. Process. Agric. 8 (2), 244–250. https://doi.org/10.1016/j.inpa.2020.05.003 (2021).
Nijaguna, G. S., Babu, J. A., Parameshachari, B. D., de Prado, R. P. & Frnda, J. Quantum fruit fly algorithm and ResNet50-VGG16 for medical diagnosis. Appl. Soft Comput. 136, 110055. https://doi.org/10.1016/j.asoc.2023.110055 (2023).
Saulle, C. C., Claus, A., Gonçalves, A. G. & De Mio, L. L. M. Photoinactivation by cationic porphyrins reduces germination and severity of phakopsora pachyrhizi, the cause of Asian soybean rust. J. Plant Dis. Prot. 132 (5). https://doi.org/10.1007/s41348-025-01160-8 (2025).
Cabrera, V. A., Sosa, C., Rondan Dueñas, J. C. & Lax, P. Structural modifications and development of galls in tobacco (Nicotiana tabacum) induced by the false root-knot nematode, Nacobbus celatus. J. Plant Dis. Prot. 132 (5). https://doi.org/10.1007/s41348-025-01159-1 (2025).
Saranya, S. & Martin, B. Exploring the Role of Deep Learning in Coconut Palm Diseases and Detecting Red Palm Weevil in Early Stage Amidst Advances and Challenges—a Review. In 2024 Second International Conference on Emerging Trends in Information Technology and Engineering (ICETITE) (pp. 1–8). IEEE. https://doi.org/10.1109/ic-ETITE58242.2024.10493357 (2024).
Irdani, T., Cutino, I., Strangi, A. & Torre, R. Fungal endophytes from wild solanum torvum seeds: diversity and antagonism against plant pathogens. J. Plant Dis. Prot. 132 (5). https://doi.org/10.1007/s41348-025-01151-9 (2025).
Keerthana, B. et al. Acoustic detection of Callosobruchus maculatus (F.) (Coleoptera: Chrysomelidae) in green gram and Cowpea using MEMS microphone system. J. Plant Dis. Prot. 132 (5). https://doi.org/10.1007/s41348-025-01152-8 (2025).
Yeleç, G., Mamay, M., Özgen, İ. & Saeed, A. Artificial overwintering shelters enhance monitoring of beneficial and harmful insects in pomegranate orchards. J. Plant Dis. Prot. 132 (5). https://doi.org/10.1007/s41348-025-01153-7 (2025).
Saranya, S. & Martin, B. Machine learning-enabled acoustic sensing for RPW infestation detection. Sci. Rep. 15 (1). https://doi.org/10.1038/s41598-025-22306-6 (2025).
Martin, B., Sugumaran, G. S., Subhadra, M. & Marshiana, D. Generalized Innovative Approach to Detect Pest Signal based on Frequency using STM32. In 2025 International Conference on Electronics and Renewable Systems (ICEARS) (pp. 146–153). IEEE. https://doi.org/10.1109/ICEARS64219.2025.10940375 (2025).
Acknowledgements
We sincerely thank SASTRA Deemed to be University, India for providing available resources.
Funding
We appreciate CSIR-ASPIRE EMR-II 22WS (0039)/2023-24/EMR-II/ASPIRE, Government of India for financial assistance.
Author information
Authors and Affiliations
Contributions
Saranya S. contributed towards the methodology design, formal analysis, and original draft writing. Betty Martin provided supervision, project administration, validation, and contributed to the review and editing of the manuscript. Pakalapati Jathin Chowdary assisted in editing, analysis, and writing the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Consent to publish
All authors have agreed to the publication of this manuscript.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
Cite this article
Martin, B., Saranya, S. & Chowdary, P.J. Smart IoT thermal imaging approach for early identification of Red Palm Weevil (RPW) infestation on palms. Sci Rep 16, 5392 (2026). https://doi.org/10.1038/s41598-025-32783-4