Knowledge Distillation in Keras
I have spent a significant amount of time building complex deep learning models that perform brilliantly but are far too heavy for mobile devices. In my experience, Knowledge Distillation is the most effective way to shrink a massive "Teacher" model into a compact "Student" model while keeping accuracy high. In this tutorial, I will walk through how to set up and train a teacher-student distillation pipeline in Keras.
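Before diving into the full walkthrough, here is a minimal sketch of the core idea, assuming TensorFlow's tf.keras and two already-built networks, `teacher_model` and `student_model`, that both output raw logits (the names, the temperature of 3.0, and the alpha of 0.1 are illustrative placeholders, not fixed recommendations). The wrapper overrides `train_step` so the student is trained on a blend of the hard-label loss and a temperature-softened KL divergence against the teacher's predictions.

```python
import tensorflow as tf
from tensorflow import keras


class Distiller(keras.Model):
    """Wraps a frozen teacher and a trainable student for knowledge distillation."""

    def __init__(self, teacher, student, temperature=3.0, alpha=0.1):
        super().__init__()
        self.teacher = teacher
        self.student = student
        self.temperature = temperature  # softens logits; higher = softer targets
        self.alpha = alpha              # weight of the hard-label (ground-truth) loss

    def compile(self, optimizer, metrics, student_loss_fn, distillation_loss_fn):
        super().compile(optimizer=optimizer, metrics=metrics)
        self.student_loss_fn = student_loss_fn
        self.distillation_loss_fn = distillation_loss_fn

    def train_step(self, data):
        x, y = data
        # Teacher predictions are treated as fixed soft targets (no gradient flows back).
        teacher_logits = self.teacher(x, training=False)

        with tf.GradientTape() as tape:
            student_logits = self.student(x, training=True)
            # Hard-label loss against the ground truth.
            student_loss = self.student_loss_fn(y, student_logits)
            # Soft-label loss: KL divergence between temperature-softened distributions,
            # scaled by T^2 to keep gradient magnitudes comparable.
            distillation_loss = self.distillation_loss_fn(
                tf.nn.softmax(teacher_logits / self.temperature, axis=1),
                tf.nn.softmax(student_logits / self.temperature, axis=1),
            ) * (self.temperature ** 2)
            loss = self.alpha * student_loss + (1.0 - self.alpha) * distillation_loss

        # Only the student's weights are updated.
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.student.trainable_variables))

        self.compiled_metrics.update_state(y, student_logits)
        results = {m.name: m.result() for m in self.metrics}
        results.update({"student_loss": student_loss, "distillation_loss": distillation_loss})
        return results

    def call(self, x):
        # At inference time only the compact student is used.
        return self.student(x)


# Example usage (x_train / y_train stand in for whatever dataset the tutorial uses):
distiller = Distiller(teacher=teacher_model, student=student_model, temperature=3.0, alpha=0.1)
distiller.compile(
    optimizer=keras.optimizers.Adam(),
    metrics=[keras.metrics.SparseCategoricalAccuracy()],
    student_loss_fn=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    distillation_loss_fn=keras.losses.KLDivergence(),
)
distiller.fit(x_train, y_train, epochs=3)
```

The key design choice is that the teacher is called with `training=False` and excluded from the gradient tape's update, so only the small student learns, guided by both the labels and the teacher's softened output distribution.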