Lightweight Networks for COVID-19 Detection from Chest X-Ray Images inside a Low-Tier Android Device
Document Type
Conference Proceeding
Publication Date
2022
Abstract
The effort to inoculate the majority of the population has been slower than expected, especially in lower-income countries. This problem has caused considerable concern and further accentuates the importance of timely and effective mass testing, given the emergence of newer variants. RT-PCR remains the gold-standard diagnostic test for COVID-19 detection, but its limitations have led researchers and scientists to explore supplementary screening methods. One effective tool to consider is chest X-ray (CXR) imaging, and combining it with deep learning has drawn attention from the artificial intelligence (AI) community. To further contribute to this research area, this work focuses on creating, evaluating, and comparing lightweight, mobile-phone-suitable COVID-19 detection models. These transfer learning models, together with their corresponding dynamic-range quantized versions, are first evaluated on classification performance. Afterwards, the models are deployed to a low-tier phone to measure their resource consumption and inference times. Results show that transfer learning with the EfficientNetB0 and MobileNetV3 (Small & Large) architectures, without any quantization, can produce at least 91% overall average accuracy on a 3-class classification scheme. For systems requiring more efficient models, the quantized versions of the transfer learning models, particularly those built on EfficientNetB0 and MobileNetV3Large, incur at most 0.79% accuracy loss while still achieving F1-scores above 95% for the COVID-19 class.
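The workflow described in the abstract (an ImageNet-pretrained lightweight backbone fine-tuned for 3-class CXR classification, then dynamic-range quantized for on-device inference) can be sketched with TensorFlow and the TFLite converter. The input size, classifier head, and class ordering below are illustrative assumptions, not the paper's exact configuration.

```python
import tensorflow as tf

# Transfer learning: MobileNetV3Small backbone pretrained on ImageNet,
# with a small classification head (hypothetical; the paper's head may differ).
base = tf.keras.applications.MobileNetV3Small(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for feature extraction

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(3, activation="softmax"),  # e.g., COVID-19 / normal / pneumonia
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...)  # CXR dataset not shown here

# Dynamic-range quantization: weights are stored as 8-bit integers,
# shrinking the model for deployment on a low-tier Android device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("covid_cxr_mobilenetv3small_dr.tflite", "wb") as f:
    f.write(tflite_model)
```

The exported .tflite file can then be bundled into an Android app and executed with the TensorFlow Lite Interpreter to measure on-device inference time and resource usage, as done in the paper's evaluation.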
Recommended Citation
Bacad, D.J.A., & Abu, P.A.R. (2022). Lightweight Networks for COVID-19 Detection from Chest X-Ray Images inside a Low-Tier Android Device. TENCON 2022 - 2022 IEEE Region 10 Conference (TENCON), Hong Kong, Hong Kong, 2022, 1-7. https://doi.org/10.1109/TENCON55691.2022.9978124