Printed from https://ideas.repec.org/a/gam/jsusta/v16y2024i13p5607-d1426231.html

Developing and Validating an Instrument for Assessing Learning Sciences Competence of Doctoral Students in Education in China

Author

Listed:
  • Xin Wang

    (Faculty of Education, Shaanxi Normal University, Xi’an 710062, China
    These authors contributed equally to this work.)

  • Baohui Zhang

    (Faculty of Education, Shaanxi Normal University, Xi’an 710062, China
    These authors contributed equally to this work.)

  • Hongying Gao

    (School of International Studies, Shaanxi Normal University, Xi’an 710062, China)

Abstract

Learning sciences competence is a necessary professional competence for educators, manifested in a deep understanding of learning sciences knowledge, positive attitudes, and scientific thinking and skills in teaching practice and research. It is of paramount importance for doctoral students in education to develop competence in the learning sciences: it enhances their ability to teach and conduct research, and guides their educational research and practice toward greater sustainability. To address the shortcomings of current assessment instruments, we constructed a theoretical model for assessing learning sciences competence based on the PISA 2025 framework and Piaget's theory of knowledge, and designed a three-dimensional assessment framework along with an initial instrument. We then employed a "Delphi method based on large language models (LLMs)" to conduct two rounds of expert consultation to test and refine the instrument. In the process, we developed a set of guidelines for engaging AI experts to improve interactions with LLMs, including an invitation letter to the AI experts, the main body of the questionnaire, and a general inquiry about the AI experts' perspectives. In analyzing the Delphi results, we used a "threshold method" to identify and refine questionnaire items that performed sub-optimally, yielding the final instrument for assessing learning sciences competence among doctoral students in education. The instrument covers three dimensions: knowledge of the learning sciences, application of the learning sciences, and attitude towards the learning sciences, with 40 items in total that combine Likert scales and scenario-based questions. The study also examined potential limitations in the instrument's item design, question-type selection, and method application.
The design and development of the assessment instrument provide a valuable reference for the standardized monitoring and sustainable development of the learning sciences competence of doctoral students in education.
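The abstract does not state the paper's exact screening criteria. In Delphi studies, a "threshold method" commonly flags items whose expert-rating mean falls below, or whose coefficient of variation rises above, preset cut-offs. A minimal sketch of that general approach (the thresholds, item names, and ratings below are illustrative assumptions, not values from the study):

```python
from statistics import mean, stdev

def screen_items(ratings, mean_min=3.5, cv_max=0.25):
    """Screen Delphi items against thresholds: keep items whose
    expert-rating mean is at least mean_min and whose coefficient of
    variation (stdev / mean) is at most cv_max; flag the rest for
    revision."""
    keep, revise = [], []
    for item, scores in ratings.items():
        m = mean(scores)
        cv = stdev(scores) / m
        (keep if m >= mean_min and cv <= cv_max else revise).append(item)
    return keep, revise

# Hypothetical expert ratings on a 5-point scale (not data from the study)
ratings = {
    "K1": [4, 5, 4, 5, 4],   # high mean, low spread -> retained
    "A3": [2, 4, 3, 5, 2],   # low mean, high spread -> revised
}
keep, revise = screen_items(ratings)
print(keep, revise)  # -> ['K1'] ['A3']
```

Actual Delphi studies often also consider consensus indicators such as the interquartile range or Kendall's W; which criteria this paper used would need to be checked against the full text.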

Suggested Citation

  • Xin Wang & Baohui Zhang & Hongying Gao, 2024. "Developing and Validating an Instrument for Assessing Learning Sciences Competence of Doctoral Students in Education in China," Sustainability, MDPI, vol. 16(13), pages 1-21, June.
  • Handle: RePEc:gam:jsusta:v:16:y:2024:i:13:p:5607-:d:1426231

    Download full text from publisher

    File URL: https://www.mdpi.com/2071-1050/16/13/5607/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2071-1050/16/13/5607/
    Download Restriction: no

    References listed on IDEAS

1. Davide Parmigiani & Sarah-Louise Jones & Chiara Silvaggio & Elisabetta Nicchia & Asia Ambrosini & Myrna Pario & Andrea Pedevilla & Ilaria Sardi, 2022. "Assessing Global Competence Within Teacher Education Programs. How to Design and Create a Set of Rubrics With a Modified Delphi Method," SAGE Open, vol. 12(4), pages 21582440221, October.

