Item-side Ranking Regularized Distillation for Recommender System (Information Sciences 2021)



1. Overview

This repository provides the source code of our paper, Item-side Ranking Regularized Distillation for Recommender System, accepted to Information Sciences 2021.

In the paper, we propose the Item-side ranking Regularization (IR) method for ranking distillation in recommender systems. IR exploits item-side ranking information, which prevents the student model from overfitting and enables it to learn the teacher's prediction results more accurately.

(Figure: overview of the proposed IR method)

2. Evaluation

We evaluate the effectiveness of IR with the state-of-the-art ranking distillation method, RRD (CIKM'20).

2.1. Leave-One-Out (LOO) protocol

We provide the leave-one-out evaluation protocol used in the paper. The protocol is as follows:

  • For each test user:
    1. Randomly sample two positive (observed) items.
      • One is held out for testing and the other for validation.
    2. Randomly sample 499 negative (unobserved) items.
    3. Evaluate how well each method ranks the test item above the sampled negative items.
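The sampling procedure above can be sketched as follows. This is a minimal illustration, not code from the repository; the function name `leave_one_out_split` and the dictionary-based data layout are assumptions.

```python
import random

def leave_one_out_split(user_items, num_negatives=499, num_total_items=1000, seed=0):
    """Sketch of the leave-one-out protocol (illustrative, not the repo's code).

    `user_items` maps each user id to the set of observed (positive) item ids.
    Returns, per user: one test item, one validation item, and the
    sampled negative items.
    """
    rng = random.Random(seed)
    all_items = set(range(num_total_items))
    splits = {}
    for user, positives in user_items.items():
        # 1. sample two positives: one for testing, one for validation
        test_item, valid_item = rng.sample(sorted(positives), 2)
        # 2. sample unobserved items as negatives
        candidates = sorted(all_items - positives)
        negatives = rng.sample(candidates, num_negatives)
        splits[user] = {"test": test_item, "valid": valid_item,
                        "negatives": negatives}
    return splits

# toy usage: 2 users, 1000 items in total
data = {0: {1, 2, 3, 4}, 1: {10, 20, 30}}
splits = leave_one_out_split(data)
```

At evaluation time, each method then scores the 500 candidates (the test item plus its 499 negatives) and ranks them (step 3 above).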

2.2. Metrics

We provide three ranking metrics widely adopted in recent papers: HR@N, NDCG@N, and MRR@N. The hit ratio (HR) simply measures whether the test item is present in the top-N list, and is defined as follows:

$$\mathrm{HR}@N = \frac{1}{|\mathcal{U}_{test}|} \sum_{u \in \mathcal{U}_{test}} \delta(p_u \le N)$$

where δ(·) is the indicator function, U_test is the set of test users, and p_u is the ranking position of the test item for user u. On the other hand, the normalized discounted cumulative gain (NDCG) and the mean reciprocal rank (MRR) are ranking position-aware metrics that assign higher scores to hits at upper ranks. NDCG@N and MRR@N are defined as follows:

$$\mathrm{NDCG}@N = \frac{1}{|\mathcal{U}_{test}|} \sum_{u \in \mathcal{U}_{test}} \frac{\delta(p_u \le N)}{\log_2(p_u + 1)}, \qquad \mathrm{MRR}@N = \frac{1}{|\mathcal{U}_{test}|} \sum_{u \in \mathcal{U}_{test}} \frac{\delta(p_u \le N)}{p_u}$$
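The three metrics can be computed directly from the rank of each user's test item. The sketch below is illustrative, assuming 1-based ranks among the 500 candidates; the helper name `loo_metrics` is not from the repository.

```python
import math

def loo_metrics(rank_positions, n=10):
    """Compute HR@N, NDCG@N, and MRR@N under the leave-one-out protocol.

    `rank_positions` holds, for each test user u, the 1-based rank p_u of
    the held-out test item among the 500 candidates (1 test + 499 negatives).
    """
    num_users = len(rank_positions)
    hr = ndcg = mrr = 0.0
    for p_u in rank_positions:
        if p_u <= n:                       # delta(p_u <= N)
            hr += 1.0
            ndcg += 1.0 / math.log2(p_u + 1)
            mrr += 1.0 / p_u
    return hr / num_users, ndcg / num_users, mrr / num_users

# e.g. three test users whose test items ranked 1st, 5th, and 50th
hr, ndcg, mrr = loo_metrics([1, 5, 50], n=10)
```

Each term matches the definitions above: a miss (p_u > N) contributes zero to all three metrics, while a hit is discounted by log2(p_u + 1) for NDCG and by p_u for MRR.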

We also provide the training log and the learning curves of the proposed method; you can find them in the /logs folder and the attached Jupyter notebook.

Please note that the proposed IR method transfers knowledge from the model's predictions. Topology Distillation (KDD'21), a follow-up study of DE that transfers the latent knowledge, is available at https://github.com/SeongKu-Kang/Topology_Distillation_KDD21
