Printed from https://ideas.repec.org/a/abu/abuabu/v2y2024i1p125-140id14.html

PALM: Personalized Attention-based Language Model for Long-tail Query Understanding in Enterprise Search Systems

Authors

Listed:
  • Qiwen Zhao
  • Zhongwen Zhou
  • Yibang Liu

Abstract

Enterprise search systems face significant challenges in handling long-tail queries, which constitute a substantial portion of search traffic but often receive inadequate attention in traditional systems. This paper introduces PALM (Personalized Attention-based Language Model), a novel framework designed to enhance long-tail query understanding in enterprise search environments. PALM integrates personalization capabilities with an advanced attention mechanism to improve search accuracy for infrequent queries while maintaining high performance on common queries. The framework employs a unique hierarchical architecture that combines user context, query semantics, and organizational knowledge through a sophisticated attention mechanism. The system features an innovative query embedding approach that adapts to individual user contexts while leveraging collective organizational knowledge. Extensive experiments on a large-scale enterprise dataset, comprising over 5 million queries from 50,000 users, demonstrate PALM's superior performance compared to state-of-the-art baselines. The results show significant improvements across multiple metrics, with a 17.5% increase in MAP for ultra-rare queries and a 10.4% overall improvement in NDCG@10. The framework exhibits robust performance across different organizational units and query types, making it particularly valuable for enterprise environments where query patterns are highly diverse and context-dependent. Our ablation studies confirm the effectiveness of each component in the PALM architecture, while case analyses provide insights into the framework's practical applications.
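The abstract describes PALM's core mechanism as an attention layer that combines a query's semantics with user context and organizational knowledge. The paper's equations are not reproduced on this page, so the following is only a minimal illustrative sketch of that idea using standard scaled dot-product attention; every function name, the vector shapes, and the scaling choice are assumptions, not the authors' implementation.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def personalized_attention(query_vec, context_vecs):
    """Attend from a query embedding over a set of context embeddings
    (e.g. user-history and organizational-knowledge vectors, as PALM's
    abstract suggests) and return the attention-weighted combination.

    query_vec:    list of floats, the query embedding
    context_vecs: list of equal-length float lists, the context embeddings
    """
    d = len(query_vec)
    scale = math.sqrt(d)  # standard scaled dot-product attention
    scores = [dot(query_vec, c) / scale for c in context_vecs]
    weights = softmax(scores)
    # Weighted sum of context vectors yields a personalized
    # query representation biased toward relevant contexts.
    return [sum(w * c[i] for w, c in zip(weights, context_vecs))
            for i in range(d)]
```

For example, a query vector aligned with a user's historical context receives a higher attention weight for that context than for an orthogonal organizational vector, pulling the combined representation toward the user's interests.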

Suggested Citation

  • Qiwen Zhao & Zhongwen Zhou & Yibang Liu, 2024. "PALM: Personalized Attention-based Language Model for Long-tail Query Understanding in Enterprise Search Systems," Journal of AI-Powered Medical Innovations (International online ISSN 3078-1930), Open Knowledge, vol. 2(1), pages 125-140.
  • Handle: RePEc:abu:abuabu:v:2:y:2024:i:1:p:125-140:id:14

    Download full text from publisher

    File URL: https://japmi.org/index.php/japmi/article/view/14/14
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:abu:abuabu:v:2:y:2024:i:1:p:125-140:id:14. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Openjournaltheme (email available below). General contact details of provider: https://japmi.org/index.php/japmi/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.