Printed from https://ideas.repec.org/a/eee/epplan/v34y2011i3p206-216.html

Self-evaluation of assessment programs: A cross-case analysis

Author

Listed:
  • Baartman, Liesbeth K.J.
  • Prins, Frans J.
  • Kirschner, Paul A.
  • van der Vleuten, Cees P.M.

Abstract

The goal of this article is to contribute to the validation of a self-evaluation method, which can be used by schools to evaluate the quality of their Competence Assessment Program (CAP). The outcomes of the self-evaluations of two schools are systematically compared: a novice school with little experience in competence-based education and assessment, and an innovative school with extensive experience. The self-evaluation was based on 12 quality criteria for CAPs, including both validity and reliability, and criteria stressing the importance of the formative function of assessment, such as meaningfulness and educational consequences. In each school, teachers, management, and the examination board participated. Results show that the two schools use different approaches to assure assessment quality. The innovative school seems to be more aware of its own strengths and weaknesses, to have a more positive attitude towards teachers, students, and educational innovations, and to explicitly involve stakeholders (i.e., teachers, students, and the work field) in its assessments. This school also had a more explicit vision of the goal of competence-based education and could design its assessments in accordance with these goals.

Suggested Citation

  • Baartman, Liesbeth K.J. & Prins, Frans J. & Kirschner, Paul A. & van der Vleuten, Cees P.M., 2011. "Self-evaluation of assessment programs: A cross-case analysis," Evaluation and Program Planning, Elsevier, vol. 34(3), pages 206-216, August.
  • Handle: RePEc:eee:epplan:v:34:y:2011:i:3:p:206-216

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718911000243
    Download Restriction: Full text for ScienceDirect subscribers only
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Johnson, Robert L. & Fisher, Steve & Willeke, Marjorie J. & McDaniel, Fred, 2003. "Portfolio assessment in a collaborative program evaluation: the reliability and validity of a family literacy portfolio," Evaluation and Program Planning, Elsevier, vol. 26(4), pages 367-377, November.
    2. Sanne Akkerman & Wilfried Admiraal & Mieke Brekelmans & Heinze Oost, 2008. "Auditing Quality of Research in Social Sciences," Quality & Quantity: International Journal of Methodology, Springer, vol. 42(2), pages 257-274, April.
    3. Gulikers, Judith T.M. & Baartman, Liesbeth K.J. & Biemans, Harm J.A., 2010. "Facilitating evaluations of innovative, competence-based assessments: Creating understanding and involving multiple stakeholders," Evaluation and Program Planning, Elsevier, vol. 33(2), pages 120-127, May.
    4. Roth, Wolff-Michael, 1998. "Situated cognition and assessment of competence in science," Evaluation and Program Planning, Elsevier, vol. 21(2), pages 155-169, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sanne Akkerman & Larike Bronkhorst & Ilya Zitter, 2013. "The complexity of educational design research," Quality & Quantity: International Journal of Methodology, Springer, vol. 47(1), pages 421-439, January.
    2. Jacques de Wet & Daniela Wetzelhütter & Johann Bacher, 2021. "Standardising the reproduction of Schwartz’s two-dimensional value space using multi-dimensional scaling and goodness-of-fit test procedures," Quality & Quantity: International Journal of Methodology, Springer, vol. 55(4), pages 1155-1179, August.
    3. Fiorenzo Franceschini & Domenico Maisano, 2012. "Quality & Quantity journal: a bibliometric snapshot," Quality & Quantity: International Journal of Methodology, Springer, vol. 46(2), pages 573-580, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:34:y:2011:i:3:p:206-216. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.