Awesome Knowledge Distillation
Updated Nov 27, 2024
A school management software.
PyTorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
DALI: a large Dataset of synchronised Audio, LyrIcs and vocal notes.
Improving Multi-hop Knowledge Base Question Answering by Learning Intermediate Supervision Signals (WSDM 2021).
[ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Chenyu You, Xiaohui Xie, Zhangyang Wang
An online student-teacher portal where teachers upload assignments for their subjects and students download them.
College-based data management system.
PyTorch implementation of "Distilling the Knowledge in a Neural Network"
Deep Neural Network Compression based on Student-Teacher Network
PyTorch implementation of a graph convolutional network (Kipf et al., 2017) with the vanilla teacher-student knowledge distillation architecture (Hinton et al., 2015); a minimal sketch of this distillation loss appears after the list.
Student-teacher interactive platform.
Mobile-first education software for teachers.
Semi-supervised teacher-student framework (a consistency-training sketch appears after the list).
A REST API built with Django REST Framework.
Teaching materials for Procedural Programming Lab
[ICLR 2022 workshop PAIR^2Struct] Sparse Logits Suffice to Fail Knowledge Distillation
Code for our JSTARS paper "Semi-MCNN: A semisupervised multi-CNN ensemble learning method for urban land cover classification using submeter HRRS images"
The main objective of this repository is to become familiar with domain adaptation applied to real-time semantic segmentation networks.
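
For reference, here is a minimal sketch of the vanilla soft-target distillation loss (Hinton et al., 2015) that several of the implementations above build on. The names `kd_loss`, `T`, and `alpha` are illustrative assumptions, not taken from any listed repository.

```python
import torch.nn.functional as F

# Minimal sketch of the Hinton et al. (2015) distillation loss.
# All names here are illustrative, not from any repository above.
def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened teacher
    # and student distributions, scaled by T^2 so gradient magnitudes
    # stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground truth.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the temperature is commonly set somewhere between 2 and 10, with `alpha` weighted toward the soft-target term when the teacher is strong.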
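
Likewise, one common way to instantiate a semi-supervised teacher-student framework is the Mean Teacher scheme (Tarvainen & Valpola, 2017), sketched below under the assumption of an EMA teacher and an MSE consistency loss; the function names are hypothetical, not the API of the repository listed above.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, decay=0.99):
    # The teacher is never trained directly: its weights track an
    # exponential moving average (EMA) of the student's weights.
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.mul_(decay).add_(s_p, alpha=1.0 - decay)

def consistency_loss(student, teacher, unlabeled_x):
    # Push the student's predictions on unlabeled inputs toward the
    # frozen teacher's predictions (MSE on class probabilities).
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(unlabeled_x), dim=1)
    student_probs = F.softmax(student(unlabeled_x), dim=1)
    return F.mse_loss(student_probs, teacher_probs)
```

The teacher is typically initialized as a deep copy of the student (`copy.deepcopy(student)`) with gradients disabled, and `ema_update` is called after every optimizer step.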