Witold Pedrycz / Shyi-Ming Chen (Eds.)
Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems
The book provides timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned within the general setting of transfer learning: a lightweight student model is trained to reproduce the behavior of a large teacher model.
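To make the teacher-student idea concrete, here is a minimal sketch of the classic soft-target distillation loss (in the style of Hinton et al.): the teacher's logits are softened with a temperature T, and the student is penalized by the KL divergence between the softened teacher and student distributions. All names, logits, and the temperature value below are illustrative, not taken from the book.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer probabilities."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the large teacher
    q = softmax(student_logits, T)  # predictions of the lightweight student
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# The loss vanishes when the student exactly matches the teacher
# and grows as their predictions diverge.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]))
```

In practice this soft-target term is combined with the usual cross-entropy on ground-truth labels, weighted by a mixing coefficient.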
