
Combining Spatial Optimization and Multi-Agent Temporal Difference Learning for Task Assignment in Uncertain Crowdsourcing

Author

Listed:
  • Yong Sun

    (Nanjing University of Aeronautics and Astronautics
    Chuzhou University)

  • Wenan Tan

    (Nanjing University of Aeronautics and Astronautics)

Abstract

In recent years, spatial crowdsourcing has emerged as an important new framework in which each spatial task requires a set of suitable crowd-workers located near the target locations. Previous studies have focused on spatial task assignment in static crowdsourcing environments. These algorithms may converge to local optima because they neglect the uncertainty inherent in real-world crowdsourcing environments, where workers may join or leave at run time. Moreover, spatial task assignment becomes more complicated when a large number of crowd-workers are present, and this large-scale nature poses a significant challenge to uncertain spatial crowdsourcing. In this paper, we propose a novel algorithm combining spatial optimization and multi-agent temporal difference learning (SMATDL). The combination of grid-based optimization and multi-agent learning achieves higher adaptability and maintains greater efficiency than traditional learning algorithms on large-scale crowdsourcing problems. SMATDL decomposes the uncertain crowdsourcing problem into numerous sub-problems by means of a grid-based optimization approach. To adapt to changes in the large-scale environment, each agent uses temporal difference learning to optimize its own spatial region during online crowdsourcing. As a result, the multiple agents in SMATDL collaboratively learn to accomplish the global assignment problem efficiently. Through extensive experiments, we illustrate the effectiveness and efficiency of the proposed algorithm on the experimental data sets.
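The abstract only sketches the method at a high level. The toy Python script below may help make the grid-plus-temporal-difference idea concrete: one tabular TD learner per grid cell decides how many of its locally available workers to commit to incoming tasks while worker supply fluctuates. Everything here (the state and action encoding, the reward of completed tasks minus an idle-worker penalty, the worker arrival/departure model, and names such as GridCellAgent and simulate) is a hypothetical simplification for illustration, not the authors' SMATDL implementation.

```python
# Minimal sketch: the service area is split into a grid, and each grid cell
# has its own agent that uses one-step temporal-difference learning to decide
# how many workers to assign to tasks under changing worker supply.
# This is an illustrative toy, not the SMATDL algorithm from the paper.

import random
from collections import defaultdict


class GridCellAgent:
    """One tabular TD-learning agent responsible for a single grid cell."""

    def __init__(self, actions=(0, 1, 2), alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)          # Q[(state, action)] -> estimated value
        self.actions = actions               # candidate number of workers to assign
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def act(self, state):
        # epsilon-greedy choice over the small discrete action set
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        # one-step TD update toward reward + discounted best next-state value
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])


def simulate(grid_size=4, episodes=200, steps=50, seed=0):
    """Toy online crowdsourcing loop: workers and tasks arrive and leave at random."""
    random.seed(seed)
    agents = {(r, c): GridCellAgent() for r in range(grid_size) for c in range(grid_size)}
    for _ in range(episodes):
        # uncertain environment: worker supply per cell is reset every episode
        workers = {cell: random.randint(0, 3) for cell in agents}
        for _ in range(steps):
            tasks = {cell: random.randint(0, 2) for cell in agents}
            for cell, agent in agents.items():
                state = (workers[cell], tasks[cell])
                # assign no more workers than are available or needed
                assigned = min(agent.act(state), workers[cell], tasks[cell])
                # reward: completed tasks minus a small penalty for idle workers
                reward = assigned - 0.1 * (workers[cell] - assigned)
                # workers may join or leave between steps (bounded at 0..3)
                workers[cell] = max(0, min(3, workers[cell] + random.choice((-1, 0, 1))))
                next_state = (workers[cell], random.randint(0, 2))
                agent.update(state, assigned, reward, next_state)
    return agents


if __name__ == "__main__":
    simulate()
```

The sketch keeps each cell's learner fully independent; the paper's contribution lies in coordinating these per-region learners so that their local decisions jointly optimize the global assignment, which this toy deliberately omits.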

Suggested Citation

  • Yong Sun & Wenan Tan, 0. "Combining Spatial Optimization and Multi-Agent Temporal Difference Learning for Task Assignment in Uncertain Crowdsourcing," Information Systems Frontiers, Springer, vol. 0, pages 1-19.
  • Handle: RePEc:spr:infosf:v::y::i::d:10.1007_s10796-019-09938-6
    DOI: 10.1007/s10796-019-09938-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10796-019-09938-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10796-019-09938-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yuxiang Zhao & Qinghua Zhu, 2014. "Evaluation on crowdsourcing research: Current status and future direction," Information Systems Frontiers, Springer, vol. 16(3), pages 417-434, July.
    2. Yong Sun & Wenan Tan & Lingxia Li & Weiming Shen & Zhuming Bi & Xiaoming Hu, 2016. "A new method to identify collaborative partners in social service provider networks," Information Systems Frontiers, Springer, vol. 18(3), pages 565-578, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yong Sun & Wenan Tan, 2020. "Combining Spatial Optimization and Multi-Agent Temporal Difference Learning for Task Assignment in Uncertain Crowdsourcing," Information Systems Frontiers, Springer, vol. 22(6), pages 1447-1465, December.
    2. Livio Cricelli & Michele Grimaldi & Silvia Vermicelli, 2022. "Crowdsourcing and open innovation: a systematic literature review, an integrated framework and a research agenda," Review of Managerial Science, Springer, vol. 16(5), pages 1269-1310, July.
    3. Gaganmeet Kaur Awal & K. K. Bharadwaj, 2019. "Leveraging collective intelligence for behavioral prediction in signed social networks through evolutionary approach," Information Systems Frontiers, Springer, vol. 21(2), pages 417-439, April.
    4. Lee, Jung & Seo, DongBack, 2016. "Crowdsourcing not all sourced by the crowd: An observation on the behavior of Wikipedia participants," Technovation, Elsevier, vol. 55, pages 14-21.
    5. Bal, Anjali S. & Weidner, Kelly & Hanna, Richard & Mills, Adam J., 2017. "Crowdsourcing and brand control," Business Horizons, Elsevier, vol. 60(2), pages 219-228.
    6. Roman Lukyanenko & Andrea Wiggins & Holly K. Rosser, 0. "Citizen Science: An Information Quality Research Frontier," Information Systems Frontiers, Springer, vol. 0, pages 1-23.
    7. Carbajo, Ruth & Cabeza, Luisa F., 2018. "Renewable energy research and technologies through responsible research and innovation looking glass: Reflexions, theoretical approaches and contemporary discourses," Applied Energy, Elsevier, vol. 211(C), pages 792-808.
    8. Dan Li & Longying Hu, 2017. "Exploring the effects of reward and competition intensity on participation in crowdsourcing contests," Electronic Markets, Springer;IIM University of St. Gallen, vol. 27(3), pages 199-210, August.
    9. Marta Poblet & Esteban García-Cuesta & Pompeu Casanovas, 0. "Crowdsourcing roles, methods and tools for data-intensive disaster management," Information Systems Frontiers, Springer, vol. 0, pages 1-17.
    10. Regina Lenart-Gansiniec, 2017. "Factors Influencing Decisions about Crowdsourcing in the Public Sector: A Literature Review," Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, Mendel University Press, vol. 65(6), pages 1997-2005.
    11. Xuefeng Zhang & Bengang Gong & Yaqin Cao & Yi Ding & Jiafu Su, 2022. "Investigating participants’ attributes for participant estimation in knowledge-intensive crowdsourcing: a fuzzy DEMATEL based approach," Electronic Commerce Research, Springer, vol. 22(3), pages 811-842, September.
    12. Sultana Lubna Alam & John Campbell, 2017. "Temporal Motivations of Volunteers to Participate in Cultural Crowdsourcing Work," Information Systems Research, INFORMS, vol. 28(4), pages 744-759, December.
    13. Cayetano Medina-Molina & Manuel Rey-Moreno & J. Augusto Felício & Inmaculada Romano Paguillo, 2019. "Participation in crowdfunding among users of collaborative platforms: the role of innovativeness and social capital," Review of Managerial Science, Springer, vol. 13(3), pages 529-543, June.
    14. Tekic, Anja & Alfonzo Pacheco, Diana Vilma, 2024. "Contest design and solvers' engagement behaviour in crowdsourcing: The neo-configurational perspective," Technovation, Elsevier, vol. 132(C).
    15. Debra Howcroft & Birgitta Bergvall-KÃ¥reborn, 2019. "A Typology of Crowdwork Platforms," Work, Employment & Society, British Sociological Association, vol. 33(1), pages 21-38, February.
    16. Olivera Marjanovic & Vijaya Murthy, 2022. "The Emerging Liquid IT Workforce: Theorizing Their Personal Competitive Advantage," Information Systems Frontiers, Springer, vol. 24(6), pages 1775-1793, December.
    17. Regina Lenart-Gansiniec, 2017. "Virtual Knowledge Sharing in Crowdsourcing: Measurement Dilemmas," Journal of Entrepreneurship, Management and Innovation, Fundacja Upowszechniająca Wiedzę i Naukę "Cognitione", vol. 13(3), pages 95-123.
    18. Heidrun Zeug & Gunter Zeug & Conrad Bielski & Gloria Solano-Hermosilla & Robert M’barek, 2017. "Innovative Food Price Collection in Developing Countries. Focus on Crowdsourcing in Africa," JRC Research Reports JRC103294, Joint Research Centre.
    19. Barbosu, Sandra & Gans, Joshua S., 2022. "Storm crowds: Evidence from Zooniverse on crowd contribution design," Research Policy, Elsevier, vol. 51(1).
    20. Evangelos Mourelatos & Manolis Tzagarakis, 2018. "An investigation of factors affecting the visits of online crowdsourcing and labor platforms," Netnomics, Springer, vol. 19(3), pages 95-130, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:infosf:v::y::i::d:10.1007_s10796-019-09938-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.