We evaluated several model compression techniques: pruning, knowledge distillation (KD), KD combined with pruning, and dynamic KD (varying the distillation temperature during training). The compressed models were compared on accuracy, parameter count, FLOPs, model size, training time, average inference time, and CO2 emissions.
Forked from Satyajeet2000/Pruning_and_Knowledge_Distillation.
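For reference, below is a minimal sketch of the core techniques compared in this project, assuming a PyTorch setup. The function names, temperature schedule, pruning amount, and loss weighting are illustrative assumptions, not the exact configuration used in the repository's notebooks.

```python
# Minimal sketch (assumed PyTorch): temperature-scaled knowledge distillation,
# a simple "dynamic KD" temperature schedule, and unstructured magnitude pruning.
# Hyperparameters below are illustrative, not the repository's actual values.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.prune as prune


def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """KD loss: soft-target KL term (scaled by T^2) plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard


def dynamic_temperature(epoch, total_epochs, T_start=8.0, T_end=2.0):
    """One possible dynamic-KD schedule: linearly anneal T over training."""
    frac = epoch / max(total_epochs - 1, 1)
    return T_start + (T_end - T_start) * frac


def apply_magnitude_pruning(model, amount=0.3):
    """Unstructured L1 magnitude pruning on every Conv2d/Linear weight."""
    for module in model.modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            prune.l1_unstructured(module, name="weight", amount=amount)
            prune.remove(module, "weight")  # bake the mask into the weights
    return model


def count_parameters(model):
    """Parameter count, one of the comparison metrics."""
    return sum(p.numel() for p in model.parameters())
```

In this sketch, the dynamic-KD variant would call `dynamic_temperature(epoch, total_epochs)` at the start of each epoch and pass the result as `T` to `distillation_loss`, while the KD + pruning variant would apply `apply_magnitude_pruning` to the student before or after distillation.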