Microbubble Measurements using Image Processing with the YOLOv8 Comparison Model
DOI:
https://doi.org/10.24853/jasat.6.3.109-116
Keywords:
Microbubble, Image Processing, YOLOv8, Matrix Evaluation
Abstract
Gas-liquid two-phase flow is a condition in which a gas phase and a liquid phase coexist. Gas that forms discrete regions within the liquid gives rise to bubble flow, and the parameters of that flow carry important information about bubble behavior and characteristics. In this research, bubble size and area were detected using YOLOv8-based image processing, and the performance of several YOLOv8 models was compared with the aim of improving inference time, increasing accuracy, and reducing computational load. Bubble images were collected with a 0.4 mm copper wire placed in the frame as a reference for converting pixels to millimetres; the images were then labeled and used to train the various YOLOv8 models. The confusion matrix, precision, and recall were used as evaluation metrics for comparing the YOLOv8 models and selecting the one with the best performance. In this study, the model whose precision-recall curve AUC is closest to 1 is YOLOv8m, at 0.990. Based on the metric comparison, the best model is YOLOv8m, with an mAP of 99.00% and an F1-score of 96.86%. Microbubble sizes are calculated from the output of the YOLOv8 model by converting pixel units to mm. The model used for the bubble measurements is the one with the best evaluation results and the smallest measured radius, taking measurement uncertainty into account, namely YOLOv8m with a minimum radius of 0.66 ± 0.04 mm.
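As a concrete illustration of the pixel-to-millimetre conversion described in the abstract, the sketch below uses the Ultralytics YOLOv8 Python API to read detected bounding boxes and scale them by a calibration factor derived from the 0.4 mm copper wire. The weights file, image name, and wire pixel width are hypothetical placeholders chosen for the example, not values reported in the paper.

# pip install ultralytics
from ultralytics import YOLO

# Calibration: the 0.4 mm copper wire in the frame spans a measured number of pixels.
WIRE_DIAMETER_MM = 0.4
WIRE_DIAMETER_PX = 12                   # hypothetical pixel width of the wire in the image
MM_PER_PX = WIRE_DIAMETER_MM / WIRE_DIAMETER_PX

# Load a trained model (YOLOv8m gave the best evaluation results in the paper).
model = YOLO("yolov8m_bubbles.pt")      # hypothetical trained weights
results = model("bubble_frame.jpg")     # hypothetical test frame

# Convert each detected bounding box from pixels to an approximate bubble radius in mm.
for box in results[0].boxes.xywh:       # (x_center, y_center, width, height) in pixels
    w_px, h_px = float(box[2]), float(box[3])
    radius_mm = 0.25 * (w_px + h_px) * MM_PER_PX   # average of half-width and half-height
    print(f"bubble radius ~ {radius_mm:.2f} mm")

For the model comparison itself, the F1-score reported above is the harmonic mean of precision and recall, F1 = 2PR / (P + R), computed from the same confusion-matrix counts used for the precision-recall curve.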
License
COPYRIGHT POLICY
The author(s) of an article published in the Journal of Applied Sciences and Advanced Technology (JASAT) retain ownership of the intellectual property rights in the work(s).
PUBLISHING RIGHTS
The author(s) of an article published in the Journal of Applied Sciences and Advanced Technology (JASAT) retain unrestricted publication rights. The authors grant the Journal of Applied Sciences and Advanced Technology (JASAT) the right to publish the article and designate the Faculty of Engineering Universitas Muhammadiyah Jakarta Publishing as the original publisher of the article.
LICENSING POLICY
JASAT is an open-access journal published under the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0), which provides that:
Under this license, reusers must give appropriate credit, provide a link to the license, and indicate if changes were made. They may do so in any reasonable manner, but not in any way that suggests the licensor endorses them or their use.
Please take the time to read the full license agreement (https://creativecommons.org/licenses/by-nc/4.0/). As long as reusers follow the license terms, the licensor cannot revoke these freedoms. The license comprises the following components:
Attribution: Users must give appropriate credit, provide a link to the license, and indicate if changes were made. They may do so in any reasonable manner, but not in any way that suggests the licensor endorses them or their use.
NonCommercial: Users may not use the material for commercial purposes.