This repository, SELFRec, is based on the library released with the survey on self-supervised learning for recommender systems. For this study, the original library was modified and extended for additional experiments.
numba==0.53.1
numpy==1.20.3
scipy==1.6.2
tensorflow==1.14.0
torch>=1.7.0
- Docker Hub: https://hub.docker.com/repository/docker/dodo9249/thingcn
- Wandb: https://wandb.ai/d9249/OptGCN
- conf contains the detailed parameter files for the models used in this study. (The weight of Weighted Forwarding can be adjusted there.)
- Running run.sh executes all the models used in this study.
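A run.sh along these lines would iterate over the models listed below; the `python main.py` invocation is a hypothetical placeholder for however the repo's entry point is actually called:

```shell
#!/bin/sh
# Run every model covered in this study, one after another.
for model in LightGCN NCL SGL MixGCF SimGCL XSimGCL BUIR; do
    echo "=== training ${model} ==="
    # python main.py "${model}"   # hypothetical entry point; adjust to the repo's CLI
done
```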
Model | Paper | Type | Code |
---|---|---|---|
LightGCN | He et al. LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation, SIGIR'20. | Graph | PyTorch |
NCL | Lin et al. Improving Graph Collaborative Filtering with Neighborhood-enriched Contrastive Learning, WWW'22. | Graph + CL | PyTorch |
SGL | Wu et al. Self-supervised Graph Learning for Recommendation, SIGIR'21. | Graph + CL | TensorFlow & Torch |
MixGCF | Huang et al. MixGCF: An Improved Training Method for Graph Neural Network-based Recommender Systems, KDD'21. | Graph + DA | PyTorch |
SimGCL | Yu et al. Are Graph Augmentations Necessary? Simple Graph Contrastive Learning for Recommendation, SIGIR'22. | Graph + CL | PyTorch |
XSimGCL | Yu et al. XSimGCL: Towards Extremely Simple Graph Contrastive Learning for Recommendation, TKDE'23. | Graph + CL | PyTorch |
BUIR | Lee et al. Bootstrapping User and Item Representations for One-Class Collaborative Filtering, SIGIR'21. | Graph + DA | PyTorch |
- CL is short for contrastive learning (including data augmentation); DA is short for data augmentation only
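As a reference for the graph-based models above, here is a minimal NumPy sketch of LightGCN-style propagation: layer-wise neighbor aggregation with no feature transforms or nonlinearities, with final embeddings averaged over layers. The toy interaction matrix and its sizes are illustrative, not taken from the paper:

```python
import numpy as np

# Toy user-item interaction matrix R (3 users x 4 items); values are illustrative.
R = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 1, 0, 0]], dtype=float)
n_users, n_items = R.shape

# Bipartite adjacency A = [[0, R], [R^T, 0]], symmetrically normalized:
# A_hat = D^{-1/2} A D^{-1/2} (LightGCN's propagation operator).
A = np.block([[np.zeros((n_users, n_users)), R],
              [R.T, np.zeros((n_items, n_items))]])
deg = A.sum(axis=1)
d_inv_sqrt = np.zeros_like(deg)
nonzero = deg > 0
d_inv_sqrt[nonzero] = deg[nonzero] ** -0.5
A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

# Embedding propagation: E^{(l+1)} = A_hat @ E^{(l)}, then average over layers.
rng = np.random.default_rng(0)
emb_size, n_layers = 64, 3          # emb_size=64, layer=3, as in the tables below
E = rng.normal(scale=0.1, size=(n_users + n_items, emb_size))
layer_embs = [E]
for _ in range(n_layers):
    E = A_hat @ E
    layer_embs.append(E)
final_E = np.mean(layer_embs, axis=0)

user_emb, item_emb = final_E[:n_users], final_E[n_users:]
scores = user_emb @ item_emb.T      # predicted user-item preference scores
print(scores.shape)                 # (3, 4)
```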
The general hyperparameter settings are: epochs 300, batch size 2048, embedding size 64, learning rate 0.001, L2 regularization 0.0001.
Model | Hyperparameter settings |
---|---|
LightGCN | layer=3 |
NCL | layer=3, ssl_reg=1e-6, proto_reg=1e-7, tau=0.05, hyper_layers=1, alpha=1.5, num_clusters=2000 |
SGL | layer=3, λ=0.1, ρ=0.1, tau=0.2 |
MixGCF | layer=3, n_negs=64 |
SimGCL | layer=3, λ=0.5, eps=0.1, tau=0.2 |
XSimGCL | layer=3, λ=0.2, eps=0.2, l∗=1, tau=0.15 |
BUIR | layer=3, tau=0.995, drop_rate=0.2 |
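For convenience, the settings above can be collected into plain Python dicts (a sketch for reference only; SELFRec itself reads these values from the files under conf). Key names mirror the table; Greek letters are spelled out:

```python
# General settings shared by all models (from the paragraph above).
GENERAL = {"epoch": 300, "batch_size": 2048, "emb_size": 64,
           "lr": 0.001, "l2_reg": 0.0001}

# Per-model settings, mirroring the hyperparameter table.
MODEL_PARAMS = {
    "LightGCN": {"layer": 3},
    "NCL": {"layer": 3, "ssl_reg": 1e-6, "proto_reg": 1e-7, "tau": 0.05,
            "hyper_layers": 1, "alpha": 1.5, "num_clusters": 2000},
    "SGL": {"layer": 3, "lambda": 0.1, "rho": 0.1, "tau": 0.2},
    "MixGCF": {"layer": 3, "n_negs": 64},
    "SimGCL": {"layer": 3, "lambda": 0.5, "eps": 0.1, "tau": 0.2},
    "XSimGCL": {"layer": 3, "lambda": 0.2, "eps": 0.2, "l_star": 1, "tau": 0.15},
    "BUIR": {"layer": 3, "tau": 0.995, "drop_rate": 0.2},
}

def config_for(model):
    """Merge the general settings with a model's specific hyperparameters."""
    return {**GENERAL, **MODEL_PARAMS[model]}

print(config_for("SimGCL")["eps"])  # 0.1
```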
The results are obtained on the following datasets:
- Yelp2018
- douban-book
- Amazon-Book
- Amazon-kindle
- FilmTrust
- MovieLens-1m
- iFashion
- gowalla
Dataset | Users | Items | Ratings | Scale | Density | Social Users | Social Links | Link Type
---|---|---|---|---|---|---|---|---
Douban | 2,848 | 39,586 | 894,887 | [1, 5] | 0.794% | 2,848 | 35,770 | Trust |
Yelp2018 | 19,539 | 21,266 | 450,884 | implicit | 0.11% | 19,539 | 864,157 | Trust |
Amazon-Book | 52,463 | 91,599 | 2,984,108 | implicit | 0.11% | - | - | - |
Amazon-Kindle | 0 | 0 | 0 | a | 0% | - | - | -
FilmTrust | 0 | 0 | 0 | a | 0% | - | - | - |
iFashion | 0 | 0 | 0 | a | 0% | - | - | - |
MovieLens-1m | 0 | 0 | 0 | a | 0% | - | - | - |
LastFM | 1,892 | 17,632 | 92,834 | implicit | 0.27% | 1,892 | 25,434 | Trust |
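Density in the table is the ratio of observed interactions to all possible user-item pairs. A quick check against the table's figures (numbers taken from the rows above; the table truncates rather than rounds in some rows):

```python
def density(n_users, n_items, n_ratings):
    """Fraction of the user-item matrix that is observed."""
    return n_ratings / (n_users * n_items)

# LastFM row: 1,892 users, 17,632 items, 92,834 interactions.
print(f"{density(1892, 17632, 92834):.2%}")   # 0.28% (table truncates to 0.27%)
# Douban row: 2,848 users, 39,586 items, 894,887 ratings.
print(f"{density(2848, 39586, 894887):.3%}")  # 0.794%
```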
SELFRec is a Python framework for self-supervised recommendation (SSR). It integrates commonly used datasets and metrics and implements many state-of-the-art SSR models. With a lightweight architecture and user-friendly interfaces, it facilitates model implementation and evaluation.
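As an illustration of the evaluation side, here is a minimal Recall@K computation of the kind such frameworks report (a generic sketch, not SELFRec's internal implementation; the item IDs are hypothetical):

```python
def recall_at_k(ranked_items, relevant_items, k):
    """Fraction of a user's relevant items that appear in the top-k ranking."""
    hits = len(set(ranked_items[:k]) & set(relevant_items))
    return hits / len(relevant_items)

# Hypothetical user: the model ranks items 5,2,9,1,7; ground truth is {2, 7, 3}.
print(recall_at_k([5, 2, 9, 1, 7], [2, 7, 3], k=3))  # 1/3: only item 2 is in the top-3
```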
Founder and principal contributor: @Coder-Yu @xiaxin1998
Supported by: @AIhongzhi (A/Prof. Hongzhi Yin, UQ)
This repo is released with our survey paper on self-supervised learning for recommender systems. We organized a tutorial on self-supervised recommendation at WWW'22. Visit the tutorial page for more information.
If you find this repo helpful to your research, please cite our papers.
@article{,
title={Optimization methods for Graph Convolution Networks in Recommendation Systems},
  author={Lee, Sangmin and Kim, Namgi},
journal={},
year={2023}
}
@article{yu2022self,
title={Self-Supervised Learning for Recommender Systems: A Survey},
author={Yu, Junliang and Yin, Hongzhi and Xia, Xin and Chen, Tong and Li, Jundong and Huang, Zi},
journal={arXiv preprint arXiv:2203.15876},
year={2022}
}