Combinatorial enumeration and time-interval contrastive learning for sequential recommendation
DOI:
Author:
Affiliation: School of Computer Science and Technology, Shandong University of Technology
Author biography:
Corresponding author:
CLC Number: TP391.3
Fund Project: National Natural Science Foundation of China (61841602); Natural Science Foundation of Shandong Province (ZR2020MF147)



Abstract:

To address the problem of inadequate self-supervised signal quality in contrastive learning models for sequential recommendation, a combinatorial enumeration and time-interval contrastive learning model was proposed. The model generated augmented sequences that preserved temporal information through time-interval perturbation-based data augmentation. A combinatorial enumeration strategy was introduced to integrate user behavior and time-interval information to the greatest extent, constructing multi-view augmented sequence pairs. The model employed a multi-head attention mechanism to encode user behavior sequences and optimized the self-supervised signals through multi-task joint training, improving overall performance. The proposed model is well suited to scenarios with high data sparsity and uneven interaction behavior, effectively addressing the difficulty of self-supervised signal modeling. Experimental results on three real-world datasets demonstrate that the model outperforms state-of-the-art contrastive learning models in terms of Hit Ratio (HR) and Normalized Discounted Cumulative Gain (NDCG). Specifically, HR@5 and NDCG@5 improve by 5.61% and 8.53%, respectively.
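The two augmentation steps named in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names, the uniform interval-scaling noise, and the three-view setting are all assumptions made for the example.

```python
import random
from itertools import combinations

def time_interval_perturbation(seq, noise_ratio=0.2, seed=None):
    """Perturb the time intervals of an interaction sequence.

    `seq` is a list of (item_id, timestamp) pairs sorted by time. Each
    inter-event interval is scaled by a random factor in
    [1 - noise_ratio, 1 + noise_ratio], so the interaction order (the
    temporal information) is preserved while the absolute gaps change.
    """
    rng = random.Random(seed)
    if len(seq) < 2:
        return list(seq)
    items = [item for item, _ in seq]
    times = [t for _, t in seq]
    new_times = [times[0]]
    for prev, cur in zip(times, times[1:]):
        interval = cur - prev
        factor = 1.0 + rng.uniform(-noise_ratio, noise_ratio)
        new_times.append(new_times[-1] + interval * factor)
    return list(zip(items, new_times))

def enumerate_view_pairs(seq, n_views=3):
    """Combinatorial enumeration of augmented views.

    Generates several perturbed views of one sequence and enumerates
    every unordered pair of views as a candidate positive pair for the
    contrastive objective.
    """
    views = [time_interval_perturbation(seq, seed=k) for k in range(n_views)]
    return list(combinations(views, 2))
```

With `n_views = 3`, `enumerate_view_pairs` yields C(3, 2) = 3 positive pairs per user sequence, which is how combinatorial enumeration multiplies the self-supervised signal relative to the usual single augmented pair.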

History
  • Received: 2024-10-28
  • Revised: 2025-02-28
  • Accepted: 2025-03-05