Is your feature request related to a problem? Please describe.
TensorRT currently only offers the traditional max and entropy calibrators, but there are already many more recent PTQ methods for int8 ViT quantization, such as AdaRound, QDrop, and FQ-ViT.
Will TensorRT support more recent PTQ methods for int8 ViT in the future? (For reference, the existing calibrator workflow is sketched below.)
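
For context, a minimal sketch of the current int8 PTQ path with the built-in entropy calibrator (`IInt8EntropyCalibrator2`). It assumes a hypothetical `calib_batches` list of preprocessed float32 NCHW arrays and uses pycuda for the device buffer; names other than the TensorRT API are placeholders:

```python
import numpy as np
import tensorrt as trt
import pycuda.driver as cuda
import pycuda.autoinit  # noqa: F401  creates a CUDA context

class ViTEntropyCalibrator(trt.IInt8EntropyCalibrator2):
    """Feeds calibration batches to TensorRT's built-in entropy calibrator."""

    def __init__(self, batches, cache_file="vit_calib.cache"):
        super().__init__()
        self.batches = batches                      # list of float32 NCHW arrays
        self.index = 0
        self.cache_file = cache_file
        self.device_input = cuda.mem_alloc(batches[0].nbytes)

    def get_batch_size(self):
        return self.batches[0].shape[0]

    def get_batch(self, names):
        if self.index >= len(self.batches):
            return None                             # no more data: calibration ends
        batch = np.ascontiguousarray(self.batches[self.index])
        cuda.memcpy_htod(self.device_input, batch)
        self.index += 1
        return [int(self.device_input)]             # device pointer per input tensor

    def read_calibration_cache(self):
        try:
            with open(self.cache_file, "rb") as f:
                return f.read()
        except FileNotFoundError:
            return None

    def write_calibration_cache(self, cache):
        with open(self.cache_file, "wb") as f:
            f.write(cache)

# Attach to the builder config (network parsed from an ONNX ViT beforehand):
#   config.set_flag(trt.BuilderFlag.INT8)
#   config.int8_calibrator = ViTEntropyCalibrator(calib_batches)
```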
Describe the solution you'd like
Describe alternatives you've considered
Additional context
This would be a question for nvidia/TensorRT-Model-Optimizer. TensorRT accepts the quantized models produced by that tool; I am not sure what their roadmap for these methods is.
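
For anyone landing here, the usual flow with that tool is to quantize the PyTorch model first and then export for TensorRT deployment. A rough sketch, assuming ModelOpt's `mtq.quantize` API with its `INT8_DEFAULT_CFG` preset; the model, data loader, and export details are placeholders, so check the ModelOpt docs for current names:

```python
import torch
import modelopt.torch.quantization as mtq

def forward_loop(model):
    # Calibration pass: run representative data through the model so
    # ModelOpt can collect activation statistics for the scales.
    for images in calib_loader:              # calib_loader: your calibration DataLoader
        model(images.cuda())

model = build_vit_model().cuda().eval()      # build_vit_model is a placeholder

# Insert fake-quant nodes and calibrate int8 scales per the chosen config.
model = mtq.quantize(model, mtq.INT8_DEFAULT_CFG, forward_loop)

# Export the quantized model to ONNX, then build the TensorRT engine from it
# (export details may differ; this is only an illustrative step).
torch.onnx.export(model, torch.randn(1, 3, 224, 224).cuda(), "vit_int8.onnx")
```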