ShubhamDesai/Pruning_and_Knowledge_Distillation

This project evaluates several model compression techniques: Pruning, Knowledge Distillation (KD), KD combined with Pruning, and Dynamic KD (varying the distillation temperature during training). The compressed models are compared on accuracy, parameter count, FLOPs, model size, training time, average inference time, and CO2 emissions.
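The sketch below illustrates the two core ideas in a PyTorch-style setup. It is not the repository's code: the function names, the linear temperature schedule for Dynamic KD, and the choice of L1 unstructured pruning are illustrative assumptions.

```python
# Minimal sketch of KD with a temperature schedule and magnitude pruning.
# Assumes a PyTorch setup; names and hyperparameters are illustrative only.
import torch
import torch.nn.functional as F
import torch.nn.utils.prune as prune


def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Standard KD loss: softened KL term + hard-label cross-entropy."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL term is scaled by T^2 to keep gradient magnitudes comparable.
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce


def dynamic_temperature(epoch, total_epochs, t_start=8.0, t_end=2.0):
    """One possible 'Dynamic KD' schedule: linearly anneal the temperature."""
    frac = epoch / max(total_epochs - 1, 1)
    return t_start + frac * (t_end - t_start)


def prune_linear_layers(model, amount=0.3):
    """Apply unstructured L1 magnitude pruning to every Linear layer."""
    for module in model.modules():
        if isinstance(module, torch.nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=amount)
            prune.remove(module, "weight")  # bake the mask into the weights
    return model
```

For KD + Pruning, a student could first be trained with `distillation_loss` and then passed through `prune_linear_layers` (optionally followed by fine-tuning); Dynamic KD would recompute the temperature each epoch via `dynamic_temperature`.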
