---
_id: '6569'
abstract:
- lang: eng
  text: 'Knowledge distillation, i.e. one classifier being trained on the outputs of another classifier, is an empirically very successful technique for knowledge transfer between classifiers. It has even been observed that classifiers learn much faster and more reliably if trained with the outputs of another classifier as soft labels, instead of from ground truth data. So far, however, there is no satisfactory theoretical explanation of this phenomenon. In this work, we provide the first insights into the working mechanisms of distillation by studying the special case of linear and deep linear classifiers. Specifically, we prove a generalization bound that establishes fast convergence of the expected risk of a distillation-trained linear classifier. From the bound and its proof we extract three key factors that determine the success of distillation: data geometry – geometric properties of the data distribution, in particular class separation, have an immediate influence on the convergence speed of the risk; optimization bias – gradient descent optimization finds a very favorable minimum of the distillation objective; and strong monotonicity – the expected risk of the student classifier always decreases when the size of the training set grows.'
article_processing_charge: No
author:
- first_name: Phuong
  full_name: Bui Thi Mai, Phuong
  id: 3EC6EE64-F248-11E8-B48F-1D18A9856A87
  last_name: Bui Thi Mai
- first_name: Christoph
  full_name: Lampert, Christoph
  id: 40C20FD2-F248-11E8-B48F-1D18A9856A87
  last_name: Lampert
  orcid: 0000-0001-8622-7887
citation:
  ama: 'Phuong M, Lampert C. Towards understanding knowledge distillation. In: Proceedings of the 36th International Conference on Machine Learning. Vol 97. ML Research Press; 2019:5142-5151.'
  apa: 'Phuong, M., & Lampert, C. (2019). Towards understanding knowledge distillation. In Proceedings of the 36th International Conference on Machine Learning (Vol. 97, pp. 5142–5151). Long Beach, CA, United States: ML Research Press.'
  chicago: Phuong, Mary, and Christoph Lampert. “Towards Understanding Knowledge Distillation.” In Proceedings of the 36th International Conference on Machine Learning, 97:5142–51. ML Research Press, 2019.
  ieee: M. Phuong and C. Lampert, “Towards understanding knowledge distillation,” in Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, United States, 2019, vol. 97, pp. 5142–5151.
  ista: 'Phuong M, Lampert C. 2019. Towards understanding knowledge distillation. Proceedings of the 36th International Conference on Machine Learning. ICML: International Conference on Machine Learning vol. 97, 5142–5151.'
  mla: Phuong, Mary, and Christoph Lampert. “Towards Understanding Knowledge Distillation.” Proceedings of the 36th International Conference on Machine Learning, vol. 97, ML Research Press, 2019, pp. 5142–51.
  short: M. Phuong, C. Lampert, in:, Proceedings of the 36th International Conference on Machine Learning, ML Research Press, 2019, pp. 5142–5151.
conference:
  end_date: 2019-06-15
  location: Long Beach, CA, United States
  name: 'ICML: International Conference on Machine Learning'
  start_date: 2019-06-10
date_created: 2019-06-20T18:23:03Z
date_published: 2019-06-13T00:00:00Z
date_updated: 2023-10-17T12:31:38Z
day: '13'
ddc:
- '000'
department:
- _id: ChLa
file:
- access_level: open_access
  checksum: a66d00e2694d749250f8507f301320ca
  content_type: application/pdf
  creator: bphuong
  date_created: 2019-06-20T18:22:56Z
  date_updated: 2020-07-14T12:47:33Z
  file_id: '6570'
  file_name: paper.pdf
  file_size: 686432
  relation: main_file
file_date_updated: 2020-07-14T12:47:33Z
has_accepted_license: '1'
intvolume: ' 97'
language:
- iso: eng
month: '06'
oa: 1
oa_version: Published Version
page: 5142-5151
publication: Proceedings of the 36th International Conference on Machine Learning
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: Towards understanding knowledge distillation
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 97
year: '2019'
...
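
The abstract describes the setting the paper analyzes: a student classifier trained by gradient descent on a teacher's soft outputs rather than ground-truth labels. As a rough illustration of that setup (not taken from the paper; all names, the data, and the hyperparameters below are assumptions for a minimal binary, sigmoid-output sketch in numpy):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical setup: a fixed linear "teacher" provides soft labels, and a linear
# "student" is fit to them by gradient descent on the cross-entropy between the
# student's and the teacher's output probabilities (a distillation-style objective).
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))           # training inputs
w_teacher = rng.normal(size=d)        # teacher's linear classifier (assumed given)
soft_labels = sigmoid(X @ w_teacher)  # teacher's soft outputs used as targets

w_student = np.zeros(d)
lr = 0.1
for _ in range(1000):
    p = sigmoid(X @ w_student)
    # gradient of the cross-entropy between student probabilities and soft labels
    grad = X.T @ (p - soft_labels) / n
    w_student -= lr * grad

# The student's decision boundary should closely match the teacher's on this data.
agreement = np.mean((X @ w_student > 0) == (X @ w_teacher > 0))
print(f"student-teacher agreement on training data: {agreement:.2%}")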