-
A Gift from Knowledge Distillation #366
-
Hi @lansiyuan3
With the current PyPI package (torchdistill==0.3.3), you can run it with this config file using this script. Note that you'll need to tune the hyperparameters, as it's just a sample config (as explained here), because the original paper did not test the method on the ImageNet dataset.
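For reference, here is a minimal sketch of how such a legacy example is typically launched; the script, config, and log paths below are placeholders for illustration, not the exact files linked above:

```bash
# Hypothetical invocation of a legacy torchdistill example with a sample FSP config.
# Substitute the script and config linked in the comment above; these paths are
# assumptions, not confirmed file names.
python3 examples/legacy/image_classification.py \
  --config configs/sample/ilsvrc2012/fsp/sample_fsp_config.yaml \
  --log log/ilsvrc2012/fsp/sample_run.log
```

The hyperparameters to tune would then be the values defined in that sample YAML.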
-
First of all, I used this config file and this script. All the experiments are based on this.
-
Again, you are not using the legacy scripts (examples/legacy/) with the PyPI version of torchdistill v0.3.3. training_box.forward_process is part of the non-legacy implementation, which is not ready to be used right now. I've already emphasized this point at https://github.com/yoshitomo-matsubara/torchdistill/tree/main#important-notice , and you keep ignoring the message.
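As a quick sanity check (a suggestion, not part of the original reply), grepping the script you actually run for the non-legacy call shows whether it is really a legacy example; the path below is a placeholder for your local copy:

```bash
# Any match here means the script calls the non-legacy API and will not work
# with torchdistill==0.3.3; a legacy example script should produce no match.
# The path is a placeholder for whichever script you actually run.
grep -n "forward_process" examples/legacy/image_classification.py
```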
-
I have made some modifications according to your request, and I think this should be correct now. Here are the script and the yaml.
-
No, I think you still use the code you copied from this repository and made some modifications. Following

```bash
pip3 install torch==1.12.1 torchvision==0.13.1
pip3 install torchdistill
git clone https://github.com/yoshitomo-matsubara/torchdistill.git
```

I confirmed that the legacy script works for training both with and without a teacher model on the CIFAR-10 dataset.

Training with teacher
Training without teacher

Why don't you describe exactly what modifications you made? You just keep saying you made modifications, but you never explain what they are, how you made them, or how your code differs from the legacy script in this repo. You didn't even confirm that you used torchdistill v0.3.3. I am not your personal tutor, and if you refuse to do those things, I cannot answer your questions anymore.
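For completeness, a generic way to confirm which torchdistill version is actually installed (a standard pip check, not something quoted from this thread):

```bash
# Should report Version: 0.3.3 if the legacy scripts are expected to work as above.
pip3 show torchdistill | grep -i '^version'
```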
-
Sorry to bother you, but I think I have figured out what the problem is, and I sincerely apologize for my low-level mistake.
-
Hi @yoshitomo-matsubara:
I'm looking for the code related to the paper A Gift from Knowledge Distillation, but I'm not clear on where this paper's method is located in your project, or how you categorize these algorithms in general.