Dixit, Shantanu; Akhtar, Md Shad (Advisor)
(IIIT-Delhi, 2023-11-29)
Knowledge distillation is a technique for transferring knowledge from a larger teacher model to a smaller student model. The latest developments in meta-learning-based knowledge distillation emphasize the ...
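The teacher-to-student transfer described above is commonly realized as a soft-label objective: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term. A minimal sketch of that loss follows; the function names and the temperature value are illustrative (this is the standard Hinton-style formulation, not necessarily the method used in the thesis).

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the logits by dividing by the temperature, then normalize.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# A student that already matches the teacher incurs zero loss;
# a mismatched student incurs a positive loss.
zero = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
positive = distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0])
```

In practice this KL term is mixed with the ordinary cross-entropy on ground-truth labels, with a weighting hyperparameter balancing the two.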