Using UAV Image Histogram Kurtosis and Skewness to Automatically Differentiate Typhoon-Damaged Rice Field Regions from Undamaged Regions

Document Type

Conference Proceeding

Publication Date



Typhoons can cause extensive damage to rice fields, leading to revenue loss for farmers if not addressed early. However, timely damage assessment is difficult due to the scale of rice fields. In this study, we utilized images captured by a commercially available unmanned aerial vehicle (UAV) to create a model that can identify rice plant lodging caused by typhoons. Local officials helped gather and establish ground truth data captured within seven to ten days after a typhoon. Rectangular annotations of the 79 identified damaged portions were extracted. These, together with an equal number of randomly selected undamaged portions with matching areas from the same images, comprised the image clippings dataset. The corresponding 8-feature numeric dataset was derived from the kurtosis and skewness of the image histograms in four color channels (red, green, blue, and greyscale), produced after a 3x3 median filter was applied to every image clipping instance. Using the numeric dataset, several machine learning models were explored to classify the damaged and undamaged clippings. Results from tenfold stratified cross-validation showed that the MLPClassifier achieved an accuracy of 79.25% and an F-score of 80.24%, followed by the Support Vector Machine at 79.21% accuracy and 78.72% F-score, while the Random Forest and Naive Bayes classifiers performed comparably with accuracies of 74.75% and 76.08% and F-scores of 74.69% and 74.62%, respectively. This shows that a model can be created to distinguish damage caused by typhoons in images captured by commercial UAVs.
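The feature-extraction pipeline described in the abstract (3x3 median filter, then histogram skewness and kurtosis per channel) can be sketched as follows. This is a hypothetical reconstruction, not the authors' code: the greyscale conversion, kurtosis convention (excess vs. raw), and library choices (`scipy`, `numpy`) are assumptions for illustration.

```python
# Hypothetical sketch of the 8-feature extraction: apply a 3x3 median filter
# to an RGB clipping, then compute the skewness and kurtosis of the pixel
# intensity distribution in the red, green, blue, and greyscale channels.
# The exact greyscale formula and kurtosis convention used in the paper are
# not given in the abstract; plain channel averaging and scipy's excess
# kurtosis are used here as placeholders.
import numpy as np
from scipy.ndimage import median_filter
from scipy.stats import kurtosis, skew

def extract_features(rgb):
    """rgb: HxWx3 uint8 image clipping -> 8-element feature vector
    ordered as (skew, kurtosis) for R, G, B, then greyscale."""
    grey = rgb.mean(axis=2)  # assumed greyscale conversion
    feats = []
    for channel in (rgb[..., 0], rgb[..., 1], rgb[..., 2], grey):
        smooth = median_filter(channel.astype(float), size=3)  # 3x3 median
        vals = smooth.ravel()
        feats.append(skew(vals))      # histogram skewness
        feats.append(kurtosis(vals))  # histogram (excess) kurtosis
    return np.array(feats)

# Example on a synthetic clipping; real inputs would be the annotated
# damaged/undamaged rectangles cropped from the UAV orthophotos.
rng = np.random.default_rng(0)
clip = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
features = extract_features(clip)
print(features.shape)  # one row of the 8-feature numeric dataset
```

Stacking these vectors over all 158 clippings (79 damaged, 79 undamaged) would yield the numeric dataset fed to the classifiers compared in the study.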