Result: Knowledge distillation and teacher-student learning in medical imaging: Comprehensive overview, pivotal role, and future directions.
Knowledge Distillation (KD) is a technique for transferring knowledge from a complex model to a simpler one. It has been widely used in natural language processing and computer vision, where it has achieved state-of-the-art results. Recently, research on KD in medical image analysis has grown rapidly. In combination with the medical domain, the definition of knowledge has been further expanded, and the role of KD is no longer limited to simplifying models. This paper comprehensively reviews the development and application of KD in the medical imaging field. Specifically, we first introduce the basic principles, explaining the definition of knowledge and the classical teacher-student network framework. Then, we present research progress in medical image classification, segmentation, detection, reconstruction, registration, radiology report generation, privacy protection, and other application scenarios, organizing these scenarios according to the role KD plays in each. We summarize eight main roles of KD techniques in medical image analysis, including model compression, semi-supervised learning, weakly supervised learning, and class balancing, among others, and analyze how these roles perform across the application scenarios. Finally, we discuss the challenges in this field and propose potential solutions. KD is still developing rapidly in medical imaging; we identify five potential development directions and research hotspots. A comprehensive literature list for this survey is available at https://github.com/XiangQA-Q/KD-in-MIA.
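As context for the classical teacher-student framework the abstract refers to, below is a minimal PyTorch sketch of the standard soft-label distillation objective (Hinton et al., 2015), in which a student is trained against both the ground-truth labels and the teacher's temperature-softened output distribution. The function name and the hyperparameter values T and alpha are illustrative assumptions, not settings taken from the survey.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Classic soft-label knowledge distillation loss (illustrative sketch).

    Combines a KL-divergence term between temperature-softened teacher and
    student distributions with standard cross-entropy on the hard labels.
    T (temperature) and alpha (mixing weight) are assumed example values.
    """
    # Soften both output distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence, scaled by T^2 so gradient magnitudes stay comparable
    # across temperatures (as in the original formulation).
    kd_term = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * (T * T)
    # Standard supervised loss on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, targets)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In medical imaging applications surveyed here, variants of this objective are adapted beyond compression, e.g., for semi-supervised or class-balanced training, but the teacher-student structure above is the common starting point.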
(Copyright © 2025. Published by Elsevier B.V.)
Declaration of competing interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.