The Power of Certainty: How Confident Models Lead to Better Segmentation

Authors: Tugberk Erol, Tuba Caglikantar, Duygu Sarikaya
Affiliations:

Abstract:
Deep learning models have shown strong performance in automatic polyp detection and segmentation during colonoscopy. However, large models are prone to overfitting and generalize poorly across diverse datasets. Knowledge distillation and self-distillation address these limitations, but often incur high memory and computational costs. We propose Dynamic Confidence-Based Self-Distillation (DCSD), a novel approach that leverages only the model's own predictions from the previous training iteration, adding no extra computation at test time. By weighting the distillation loss with a dynamic confidence coefficient, DCSD improves model reliability, consistency, and generalization across multiple datasets.
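As an illustration of the idea in the abstract, the sketch below shows one plausible way to weight a self-distillation loss by a dynamic confidence coefficient, using the previous iteration's logits as the teacher. This is a minimal NumPy sketch under stated assumptions, not the paper's implementation: the confidence coefficient here is the teacher's mean max-probability, and `temperature`, `alpha`, and the KL formulation are illustrative choices not taken from the source.

```python
import numpy as np

def softmax(logits, axis=-1, temperature=1.0):
    """Numerically stable softmax with optional temperature scaling."""
    z = logits / temperature
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def dcsd_loss(student_logits, prev_logits, labels, temperature=2.0, alpha=0.5):
    """Confidence-weighted self-distillation loss (illustrative sketch).

    student_logits, prev_logits: (N, C) per-pixel class logits from the
    current and previous training iterations; labels: (N,) ground truth.
    Hyperparameters `temperature` and `alpha` are assumptions, not the
    paper's values.
    """
    n = len(labels)

    # Supervised cross-entropy against the ground-truth labels.
    probs = softmax(student_logits)
    ce = -np.log(probs[np.arange(n), labels] + 1e-12).mean()

    # Soft targets: the previous iteration acts as the teacher.
    teacher = softmax(prev_logits, temperature=temperature)
    student = softmax(student_logits, temperature=temperature)

    # Dynamic confidence coefficient: mean max-probability of the teacher.
    # A confident teacher increases the weight of the distillation term.
    confidence = teacher.max(axis=-1).mean()

    # KL divergence between teacher and student soft distributions.
    kl = (teacher * (np.log(teacher + 1e-12)
                     - np.log(student + 1e-12))).sum(axis=-1).mean()

    # Distillation term scaled by the dynamic confidence coefficient.
    return (1 - alpha) * ce + alpha * confidence * (temperature ** 2) * kl
```

When the teacher and student logits coincide, the KL term vanishes and the loss reduces to the supervised cross-entropy term alone, so the distillation signal only contributes when the two iterations disagree.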


Highlights


Methodology

Architecture

Dynamic Confidence-Based Self-Distillation (DCSD)

Conclusion

DCSD is a simple yet effective confidence-weighted self-distillation approach that improves segmentation performance and generalization without extra inference cost. Future work will explore adaptive temperature scaling and confidence estimation to further improve robustness in clinical settings.

Contact: tugberk.erol@gazi.edu.tr
arXiv: 2507.10490v1