Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/138665
Type: Journal article
Title: Structured Knowledge Distillation for Dense Prediction
Author: Liu, Y.
Shu, C.
Wang, J.
Shen, C.
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023; 45(6):7035-7049
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Issue Date: 2023
ISSN: 0162-8828 (print); 2160-9292 (online)
Statement of Responsibility: Yifan Liu, Changyong Shu, Jingdong Wang, and Chunhua Shen
Abstract: In this work, we consider transferring the structure information from large networks to compact ones for dense prediction tasks in computer vision. Previous knowledge distillation strategies used for dense prediction tasks often directly borrow the distillation scheme for image classification and perform knowledge distillation for each pixel separately, leading to sub-optimal performance. Here we propose to distill structured knowledge from large networks to compact networks, taking into account the fact that dense prediction is a structured prediction problem. Specifically, we study two structured distillation schemes: i) pair-wise distillation that distills the pair-wise similarities by building a static graph; and ii) holistic distillation that uses adversarial training to distill holistic knowledge. The effectiveness of our knowledge distillation approaches is demonstrated by experiments on three dense prediction tasks: semantic segmentation, depth estimation and object detection. Code is available at https://git.io/StructKD
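
Note: The pair-wise scheme described in the abstract matches similarities between spatial locations of the student's and teacher's feature maps, rather than distilling each pixel prediction independently. Below is a minimal sketch of such a loss, assuming PyTorch-style (B, C, H, W) feature maps; the function names and the choice of cosine similarity with a mean-squared penalty are illustrative assumptions, not the authors' released implementation at https://git.io/StructKD.

    # Minimal sketch of a pair-wise distillation loss (illustrative, not the paper's code).
    import torch
    import torch.nn.functional as F

    def pairwise_similarity(feat: torch.Tensor) -> torch.Tensor:
        """Cosine similarity between every pair of spatial locations.

        feat: (B, C, H, W) feature map; returns a (B, H*W, H*W) affinity graph.
        """
        b, c, h, w = feat.shape
        nodes = feat.view(b, c, h * w)                  # each spatial location is a graph node
        nodes = F.normalize(nodes, p=2, dim=1)          # unit-length feature vectors
        return torch.bmm(nodes.transpose(1, 2), nodes)  # (B, N, N) pair-wise similarities

    def pairwise_distillation_loss(student_feat: torch.Tensor,
                                   teacher_feat: torch.Tensor) -> torch.Tensor:
        """Squared error between the student's and teacher's affinity graphs."""
        return F.mse_loss(pairwise_similarity(student_feat),
                          pairwise_similarity(teacher_feat))

The holistic scheme mentioned in the abstract would additionally train a discriminator on whole output maps and feed its adversarial signal back to the student as an extra loss; that part is not shown here.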
Keywords: Structured knowledge distillation; adversarial training; knowledge transferring; dense prediction
Rights: © 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
DOI: 10.1109/tpami.2020.3001940
Published version: http://dx.doi.org/10.1109/tpami.2020.3001940
Appears in Collections:Computer Science publications

Files in This Item:
There are no files associated with this item.

