Author
Listed:
- Huaiwen He
(School of Computer, Zhongshan Institute, University of Electronic Science and Technology of China, Zhongshan 528400, China)
- Xiangdong Yang
(School of Computer, Zhongshan Institute, University of Electronic Science and Technology of China, Zhongshan 528400, China
School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China)
- Feng Huang
(School of Computer, Zhongshan Institute, University of Electronic Science and Technology of China, Zhongshan 528400, China
School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China)
- Feng Yi
(School of Computer, Zhongshan Institute, University of Electronic Science and Technology of China, Zhongshan 528400, China)
- Shangsong Liang
(School of Computer Science and Engineering, Sun Yat-sen University, Guangzhou 510000, China)
Abstract
Capturing long-term dependencies from historical behaviors is key to the success of sequential recommendation; however, existing methods focus on extracting global sequential information while neglecting deep representations of subsequences. Previous research has revealed that restricted inter-item transfer is fundamental to sequential modeling, and that some potential substructures of a sequence can help models learn long-term dependencies more effectively than the whole sequence. To automatically find better subsequences and perform efficient learning, we propose a sequential recommendation model with a gated recurrent unit and Transformers, abbreviated as GAT4Rec, which employs Transformers with parameters shared across layers to model users' historical interaction sequences. The representation learned by the gated recurrent unit serves as the gating signal that identifies the optimal substructure in user sequences. The encoding layer extracts a fused representation of the subsequence and edge information, from which the corresponding recommendations are made. Experimental results on four well-known publicly available datasets demonstrate that our GAT4Rec model outperforms other recommendation models, achieving performance improvements of 5.77%, 1.35%, 11.58%, and 1.79%, respectively, in the normalized discounted cumulative gain metric (NDCG@10).
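The gating mechanism the abstract describes can be illustrated with a deliberately tiny sketch: a GRU-style recurrence scores each interaction, and items whose gate exceeds a threshold form the subsequence passed to the downstream Transformer encoder. This is not the authors' code; the scalar state, the scoring rule, the weights `w_h`/`w_x`, and the threshold are all illustrative assumptions standing in for the learned, vector-valued components of GAT4Rec.

```python
import math

def gru_like_gates(seq, w_h=0.5, w_x=0.5):
    """Run a 1-D GRU-style recurrence over the sequence and return a
    sigmoid gate score in (0, 1) for each item."""
    h, gates = 0.0, []
    for x in seq:
        z = 1.0 / (1.0 + math.exp(-(w_h * h + w_x * x)))  # update gate
        h = (1.0 - z) * h + z * x  # blend hidden state with the new item
        gates.append(z)
    return gates

def select_subsequence(seq, threshold=0.6):
    """Keep only items whose gate score exceeds the threshold; the
    result would then be fed to the shared-parameter Transformer."""
    gates = gru_like_gates(seq)
    return [x for x, g in zip(seq, gates) if g > threshold]

if __name__ == "__main__":
    history = [0.2, 1.5, -0.3, 2.0, 0.1]
    print(select_subsequence(history))
```

In the full model, both the recurrent scorer and the selection rule are trained end to end over embedding vectors rather than scalars; the sketch only shows how a recurrent summary can act as a per-item gating signal.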
Suggested Citation
Huaiwen He & Xiangdong Yang & Feng Huang & Feng Yi & Shangsong Liang, 2024.
"GAT4Rec: Sequential Recommendation with a Gated Recurrent Unit and Transformers,"
Mathematics, MDPI, vol. 12(14), pages 1-23, July.
Handle:
RePEc:gam:jmathe:v:12:y:2024:i:14:p:2189-:d:1433940
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:12:y:2024:i:14:p:2189-:d:1433940. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic, or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.