Explorations in Knowledge Distillation
By Chris Zhu
You don’t need many resources to give back and start mentoring others. Similarly, you don’t need fancy hardware to train your machine learning models; techniques like knowledge distillation can help: https://medium.mage.ai/explorations-in-knowledge-distillation