Knowledge distillation
Also known as: distillation, model distillation
Training a smaller student model to mimic a larger teacher model's outputs (e.g., softened logits) or internal signals (e.g., intermediate activations), preserving most of the teacher's task quality while shrinking model size and inference cost.
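As a concrete illustration, here is a minimal sketch of the classic soft-target distillation loss in PyTorch. The function name, the temperature, and the alpha weighting are illustrative assumptions, not part of this entry.

```python
# Minimal sketch of a knowledge-distillation loss (hypothetical
# names; temperature and alpha are typical hyperparameters).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term with hard-label cross-entropy."""
    # Soften both distributions with the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence from student to teacher soft targets; the T^2
    # factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In training, the teacher runs in inference mode to produce `teacher_logits`, and only the student's parameters receive gradients from this loss.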