Printed from https://ideas.repec.org/a/hin/jnlmpe/8407713.html

Learning High-Order Semantic Representation for Intent Classification and Slot Filling on Low-Resource Language via Hypergraph

Author

Listed:
  • Xianglong Qi
  • Yang Gao
  • Ruibin Wang
  • Minghua Zhao
  • Shengjia Cui
  • Mohsen Mortazavi
  • Junwei Ma

Abstract

Representation of language is the first and critical task for Natural Language Understanding (NLU) in a dialogue system. Pretraining, embedding models, and fine-tuning for intent classification and slot filling are popular and well-performing approaches but are time-consuming and inefficient for low-resource languages. Concretely, out-of-vocabulary words and transfer across languages are two tough challenges for multilingual pretrained and cross-lingual transfer models. Furthermore, the current frameworks require quality-proven parallel data. To step over these challenges, and unlike existing solutions, we propose a novel approach, the Hypergraph Transfer Encoding Network "HGTransEnNet". The proposed model leverages off-the-shelf high-quality pretrained word embedding models of resource-rich languages to learn the high-order semantic representation of low-resource languages in a transductive clustering manner via hypergraph modeling, which does not need parallel data. The experiments show that the representations learned by "HGTransEnNet" for low-resource languages are more effective than those of state-of-the-art language models, which are pretrained on large-scale multilingual or monolingual corpora, on intent classification and slot-filling tasks on Indonesian and English datasets.
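The abstract describes learning representations for a low-resource language by transductive propagation over a hypergraph, seeded with embeddings from a resource-rich language. The following is a minimal sketch of that general idea only, not the authors' HGTransEnNet: it uses the standard normalized hypergraph operator (Zhou et al.'s hypergraph learning), and the toy incidence matrix, seed embeddings, and damping factor are all hypothetical choices for illustration.

```python
import numpy as np

# Toy setup (hypothetical): 4 words, 2 hyperedges grouping related words.
# Incidence matrix H (nodes x hyperedges): 1 if the word is in the edge.
H = np.array([
    [1, 0],
    [1, 1],
    [0, 1],
    [1, 1],
], dtype=float)

W = np.diag([1.0, 1.0])          # hyperedge weights (uniform here)
Dv = np.diag(H @ np.diag(W))     # weighted node degrees
De = np.diag(H.sum(axis=0))      # hyperedge degrees

# Normalized hypergraph operator: S = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(Dv)))
De_inv = np.diag(1.0 / np.diag(De))
S = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt

# Seed embeddings: rows 0 and 2 anchored to pretrained vectors from a
# resource-rich language; rows 1 and 3 unknown (zeros) and to be filled
# in transductively by propagation over the hypergraph.
X = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

F = X.copy()
alpha = 0.5                      # trade-off: smoothness vs. fit to seeds
for _ in range(50):
    F = alpha * (S @ F) + (1 - alpha) * X

print(F.round(3))                # unknown rows now carry propagated signal
```

Because propagation only needs the hypergraph structure and the seed vectors, no parallel corpus is required, which is the property the abstract emphasizes.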

Suggested Citation

  • Xianglong Qi & Yang Gao & Ruibin Wang & Minghua Zhao & Shengjia Cui & Mohsen Mortazavi & Junwei Ma, 2022. "Learning High-Order Semantic Representation for Intent Classification and Slot Filling on Low-Resource Language via Hypergraph," Mathematical Problems in Engineering, Hindawi, vol. 2022, pages 1-16, September.
  • Handle: RePEc:hin:jnlmpe:8407713
    DOI: 10.1155/2022/8407713

    Download full text from publisher

    File URL: http://downloads.hindawi.com/journals/mpe/2022/8407713.pdf
    Download Restriction: no

    File URL: http://downloads.hindawi.com/journals/mpe/2022/8407713.xml
    Download Restriction: no

    File URL: https://libkey.io/10.1155/2022/8407713?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hin:jnlmpe:8407713. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Mohamed Abdelhakeem (email available below). General contact details of provider: https://www.hindawi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.