Osagie, Efosa (ORCID: https://orcid.org/0009-0004-3462-7175) and Balasundaram, Rebecca (2026) Investigating Epistemic Uncertainty in PCB Defect Detection: A Comparative Study Using Monte Carlo Dropout. Journal of Experimental and Theoretical Analyses, 4 (12), pp. 93-108.
Text: Investigating Epistemic Uncertainty in PCB Defect Detection Accepted Manuscript.pdf - Published Version, available under a Creative Commons Attribution licence.
Abstract
Deep learning models have become central to automated Printed Circuit Board (PCB) defect detection. However, recent work has raised concerns about how reliably these models express confidence in their predictions, particularly when deployed in safety-critical inspection systems. This study conducts an empirical investigation of epistemic uncertainty across representative architectures used in PCB inspection: the two-stage Faster R-CNN detector, the one-stage YOLOv8 detector, and their corresponding classification counterparts, ResNet-50 and YOLOv8-Cls. Monte Carlo Dropout (MCD) was applied during inference to compute predictive entropy, mutual information, softmax variance, and bounding-box variability across multiple stochastic forward passes on both multiclass and binary inspection datasets. On the multiclass SolDef_AI dataset, Faster R-CNN achieved substantially stronger detection performance (mAP = 0.7607, F1 = 0.9304) and lower predictive entropy, with more stable localisation. In contrast, YOLOv8 produced markedly weaker performance (mAP = 0.2369, F1 = 0.3130) alongside higher entropy and greater bounding-box variability. On the binary Jiafuwen datasets, the YOLOv8-Cls model achieved higher overall performance (F1 = 0.6493) compared with the ResNet-50 classifier (F1 = 0.4904), reflecting its strength in simpler binary inspection tasks. Across uncertainty metrics, predictive entropy and mutual information were more sensitive to dataset size, showing higher and more variable values in the smaller multiclass dataset, whereas softmax variance and bounding-box variability appeared more architecture-dependent. These findings demonstrate that architectural choice, dataset structure, and task formulation jointly influence both performance and uncertainty behaviour. By integrating conventional metrics with uncertainty estimates, this study provides a transparent benchmark for assessing model confidence in automated optical inspection of PCBs.
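The abstract's uncertainty metrics can be illustrated concretely. The following is a minimal NumPy sketch, not the authors' implementation: it assumes that Monte Carlo Dropout has already produced `T` stochastic softmax outputs for a single input, and computes predictive entropy, mutual information (the epistemic component), and softmax variance from them. The function name and array shapes are illustrative assumptions.

```python
import numpy as np

def mc_dropout_uncertainty(probs):
    """Uncertainty metrics from Monte Carlo Dropout samples.

    probs: array of shape (T, C) -- softmax outputs from T stochastic
    forward passes (dropout active at inference) over C classes.
    Returns (predictive_entropy, mutual_information, softmax_variance).
    """
    eps = 1e-12  # numerical guard for log(0)
    # Mean predictive distribution over the T passes.
    mean_p = probs.mean(axis=0)
    # Predictive entropy: entropy of the mean distribution, H[E_t p_t].
    pred_entropy = -np.sum(mean_p * np.log(mean_p + eps))
    # Expected entropy: mean of the per-pass entropies, E_t H[p_t].
    exp_entropy = -np.mean(np.sum(probs * np.log(probs + eps), axis=1))
    # Mutual information (BALD): disagreement between passes,
    # commonly read as the epistemic part of the uncertainty.
    mutual_info = pred_entropy - exp_entropy
    # Softmax variance: per-class variance across passes, averaged.
    softmax_var = probs.var(axis=0).mean()
    return pred_entropy, mutual_info, softmax_var
```

When all passes agree exactly, the mutual information and softmax variance are zero and only the predictive entropy (the aleatoric-looking spread of the class distribution) remains; disagreement between passes raises both epistemic measures, which is the behaviour the study compares across architectures and datasets.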
| Item Type: | Article |
|---|---|
| Status: | Published |
| DOI: | 10.3390/jeta4010011 |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; Q Science > QA Mathematics > QA76 Computer software; Q Science > QA Mathematics > QA76.9.H85 Human-Computer Interaction; Virtual Reality; Mixed Reality; Augmented Reality; Extended Reality |
| School/Department: | London Campus |
| URI: | https://ray.yorksj.ac.uk/id/eprint/14062 |