Combine two losses #412
-
Hi @yoshitomo-matsubara , I cannot find a config for combining two losses in your legacy folder. For example, what should I do when I want to train models with both cross-entropy loss and KD loss in the new manner (i.e., following the Important Notice)?
-
Please use the Q&A category for questions. (I changed the category here and on the previous discussion.) There are many legacy config files that use a linear combination of multiple losses, and a KD loss that combines cross entropy and KL divergence is also defined. Note: please read the README carefully for the types of config files.
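For readers unfamiliar with the KD loss being referenced, here is a minimal PyTorch sketch of a loss that linearly combines cross entropy on hard labels with KL divergence on temperature-softened teacher logits (in the spirit of Hinton et al.). This is an illustrative module, not the repository's actual implementation; the class name, `temperature`, and `alpha` weighting are assumptions for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KDLossSketch(nn.Module):
    """Hypothetical sketch: alpha * KL(student || teacher, softened)
    + (1 - alpha) * cross entropy with the ground-truth labels."""

    def __init__(self, temperature: float = 4.0, alpha: float = 0.9):
        super().__init__()
        self.temperature = temperature
        self.alpha = alpha  # weight on the soft (KL) term

    def forward(self, student_logits, teacher_logits, targets):
        # KL divergence between softened student and teacher distributions.
        # The T^2 factor keeps gradient magnitudes comparable across temperatures.
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / self.temperature, dim=1),
            F.softmax(teacher_logits / self.temperature, dim=1),
            reduction='batchmean',
        ) * (self.temperature ** 2)
        # Standard cross entropy against hard labels.
        hard_loss = F.cross_entropy(student_logits, targets)
        return self.alpha * soft_loss + (1.0 - self.alpha) * hard_loss


# Example usage with random logits.
torch.manual_seed(0)
criterion = KDLossSketch(temperature=4.0, alpha=0.9)
student = torch.randn(4, 10)
teacher = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = criterion(student, teacher, labels)
```

The key point for this discussion is the last line of `forward`: any number of losses can be combined this way as a weighted sum, which is exactly what the legacy config files express declaratively.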
-
I get it, thank you! But your examples combine different types of losses (e.g., feature-based ones). I want to combine multiple losses based only on the model outputs, not intermediate features.
-
Also, why do we get a better result by appending a cross-entropy loss in configs/sample/ilsvrc2012/kd/resnet18_from_resnet34.yaml?
Technically, you can do that by extracting the last layer's output as a "feature".
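To make the suggestion above concrete: in PyTorch, treating the last layer's output as a "feature" can be done with a forward hook, so any output-based loss can consume it through the same feature-extraction machinery used for intermediate layers. This is a generic sketch, not the repository's wrapper code; the `features` dict and hook names are assumptions for the example.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: capture a layer's output via a forward hook so it can
# be treated like any other extracted "feature".
features = {}


def save_output(name):
    def hook(module, inputs, output):
        features[name] = output
    return hook


# A toy model; the hook is registered on its final layer.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
model[-1].register_forward_hook(save_output('last_layer'))

x = torch.randn(2, 8)
logits = model(x)
# features['last_layer'] now holds the same tensor as the model's output,
# so an output-only loss can be wired up as if it were a feature-based one.
```

The design point is that the model's final output is just another module output, so a hook-based feature extractor handles it with no special casing.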