This article is shared for academic purposes only. If there is any infringement, please contact us and it will be removed.
References:
[1] Mwiti D. Research Guide: Model Distillation Techniques for Deep Learning [[link]] (may require a VPN to access)
[2] Hinton G, Vinyals O, Dean J. Distilling the Knowledge in a Neural Network NIPSW 2014 [[paper]] (soft-target loss sketched after this list)
[3] Bucilua C, Caruana R, Niculescu-Mizil A. Model Compression KDD 2006 [[paper]]
[4] Li J, Zhao R, Huang J-T, Gong Y. Learning Small-Size DNN with Output-Distribution-Based Criteria InterSpeech 2014 [[paper]]
[5] Ba L J, Caruana R. Do Deep Nets Really Need to be Deep? NIPS 2014 [[paper]]
[6] Romero A, Ballas N, Kahou S E, Chassang A, Gatta C, Bengio Y. FitNets: Hints for Thin Deep Nets ICLR 2015 [[paper]]
[7] Zagoruyko S, Komodakis N. Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer ICLR 2017 [[paper]] [[code]]
[8] Simonyan K, Vedaldi A, Zisserman A. Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps ICLRW 2014 [[paper]]
[9] Chen G, Choi W, Yu X, Han T, Chandraker M. Learning Efficient Object Detection Models with Knowledge Distillation NIPS 2017 [[paper]]
[10] Huang Z, Wang N. Like What You Like: Knowledge Distill via Neuron Selectivity Transfer arXiv 2017 [[paper]]
[11] Chen Y, Wang N, Zhang Z. DarkRank: Accelerating Deep Metric Learning via Cross Sample Similarities Transfer AAAI 2018 [[paper]]
[12] Yim J, Joo D, Bae J, Kim J. A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning CVPR 2017 [[paper]]
[13] Tian Y, Krishnan D, Isola P. Contrastive Representation Distillation arXiv 2019 [[paper]] [[code]]
[14] Mirzadeh S-I, Farajtabar M, Li A, Ghasemzadeh H. Improved Knowledge Distillation via Teacher Assistant: Bridging the Gap Between Student and Teacher arXiv 2019 [[paper]]
[15] Cho J H, Hariharan B. On the Efficacy of Knowledge Distillation ICCV 2019 [[paper]]
[16] Heo B, Kim J, Yun S, Park H, Kwak N, Choi J Y. A Comprehensive Overhaul of Feature Distillation ICCV 2019 [[paper]]
[17] Heo B, Lee M, Yun S, Choi J Y. Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons AAAI 2019 [[paper]]
[18] Saputra M R U, de Gusmao P P B, Almalioglu Y, Markham A, Trigoni N. Distilling Knowledge from a Deep Pose Regressor Network ICCV 2019 [[paper]]
[19] Jin X, Peng B, Wu Y, Liu Y, Liu J, Liang D, Yan J, Hu X. Knowledge Distillation via Route Constrained Optimization arXiv 2019 [[paper]]
[20] Liu Y, Jia X, Tan M, Vemulapalli R, Zhu Y, Green B, Wang X. Search to Distill: Pearls are Everywhere but not the Eyes arXiv 2019 [[paper]]
[21] Phuong M, Lampert C. Towards Understanding Knowledge Distillation ICML 2019 [[paper]] (on the theory of distillation)
[22] Lopez-Paz D, Bottou L, Schölkopf B, Vapnik V. Unifying Distillation and Privileged Information ICLR 2016 [[paper]] (on the theory of distillation)
[23] Furlanello T, Lipton Z, Tschannen M, Itti L, Anandkumar A. Born-Again Neural Networks ICML 2018 [[paper]]
[24] Mishra A, Marr D. Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy ICLR 2018 [[paper]]
[25] Polino A, Pascanu R, Alistarh D. Model Compression via Distillation and Quantization ICLR 2018 [[paper]]
[26] Liu Y, Chen K, Liu C, Qin Z, Luo Z, Wang J. Structured Knowledge Distillation for Semantic Segmentation CVPR 2019 [[paper]] [[code]]
[27] Liu Y, Shu C, Wang J, Shen C. Structured Knowledge Distillation for Dense Prediction arXiv 2019 [[paper]]
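Almost every method in the list above builds on the soft-target loss of reference [2], so a minimal PyTorch sketch of that loss is included below for orientation. The function name, the temperature T, and the mixing weight alpha are illustrative choices here, not values prescribed by any of the cited papers.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation loss in the style of Hinton et al. [2].

    T and alpha are illustrative hyperparameters, not values from the paper.
    """
    # kl_div expects log-probabilities as input and probabilities as target.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # The T^2 factor keeps the soft-target gradients on a scale
    # comparable to the hard-label term, as noted in [2].
    kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy against the ground-truth hard labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In practice the teacher logits are computed under torch.no_grad() so that only the student receives gradients, and T and alpha are tuned per task.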
Source: 深度学习这件小事