
Impact of summer programmes on the outcomes of disadvantaged or ‘at risk’ young people: A systematic review

Authors
  • Daniel Muir
  • Cristiana Orlando
  • Becci Newton

Abstract

Review Rationale and Context
Many intervention studies of summer programmes examine their impact on employment and education outcomes; however, there is growing interest in their effect on young people's offending outcomes. Evidence on summer employment programmes shows promise here but has not yet been synthesised. This report fills that evidence gap through a systematic review and meta‐analysis covering both summer education and summer employment programmes, as their contexts and mechanisms are often similar.

Research Objective
The objective is to provide evidence on the extent to which summer programmes impact the outcomes of disadvantaged or ‘at risk’ young people.

Methods
The review employs mixed methods: we synthesise quantitative estimates of the impact of summer programme allocation/participation across the outcome domains through meta‐analysis using the random‐effects model, and we synthesise qualitative information relating to contexts, features, mechanisms and implementation issues through thematic synthesis. Literature searches were largely conducted in January 2023. Databases searched include: Scopus; PsycInfo; ERIC; the YFF‐EGM; EEF's and TASO's toolkits; RAND's summer programmes evidence review; key academic journals; and Google Scholar. The review employed PICOSS eligibility criteria: the population was disadvantaged or ‘at risk’ young people aged 10–25; interventions were either summer education or summer employment programmes; a valid comparison group that did not experience a summer programme was required; studies had to estimate the summer programme's impact on violence and offending, education, employment, socio‐emotional and/or health outcomes; eligible study designs were experimental and quasi‐experimental; and eligible settings were high‐income countries. Other eligibility criteria included publication in English between 2012 and 2022.
Process/qualitative evaluations associated with eligible impact studies, or of UK‐based interventions, were also included; the latter given the interests of the sponsors. We used the standard methodological procedures expected by The Campbell Collaboration. The search identified 68 eligible studies, of which 41 were eligible for meta‐analysis. Forty‐nine studies evaluated 36 summer education programmes, and 19 studies evaluated six summer employment programmes. The number of participants within these studies ranged from fewer than 100 to nearly 300,000. The PICOSS criteria affect the external applicability of the body of evidence – allowances made regarding study design to prioritise evidence on UK‐based interventions limit our ability to assess impact for some interventions. The risk of bias assessment categorised approximately 75% of the impact evaluations as low quality, due to attrition, losses to follow‐up, interventions having low take‐up rates, or allocation that might introduce selection bias. As such, intention‐to‐treat analyses are prioritised. The quality assessment rated 93% of qualitative studies as low quality, often because they did not employ rigorous qualitative methodologies. These results highlight the need to improve the evidence base.

Results and Conclusions
Quantitative synthesis
The quantitative synthesis examined impact estimates across 34 outcomes, through meta‐analysis (22) or in narrative form (12). We summarise below the findings where meta‐analysis was possible, along with the researchers' judgement of the security of the findings (high, moderate or low). This judgement was based on the number and study‐design quality of studies evaluating the outcome; the consistency of findings; the similarity of the specific outcome measures used; and any other issues which might affect our confidence in the summary findings.
Below we summarise the findings from the meta‐analyses conducted to assess the impact of allocation to, or participation in, summer education and employment programmes (findings for other outcomes are discussed in the main body, but meta‐analysis was not performed for these because few studies evaluated them). We only report pooled results across the two programme types where there are no clear differences in findings between summer education and summer employment programmes, so as to avoid attributing an impact to both programme types when it holds for only one. For each outcome we list the outcome measure, the average effect size type (i.e., whether a standardised mean difference (SMD) or a log odds ratio), the programme type the finding relates to, and then the average effect size along with its 95% confidence interval and the interpretation of the finding – that is, whether there appears to be a significant impact and in which direction (positive or negative, clarifying instances where a negative impact is beneficial). In some instances there may be a discrepancy between the 95% confidence interval and whether we determine there to be a significant impact; this is due to the specifics of the process for constructing the effect sizes used in the meta‐analysis. We then list the I2 statistic and the p‐value from the homogeneity test as indications of the presence of heterogeneity. As the sample sizes used in the analyses are often small, and the homogeneity test is known to be under‐powered in small samples, the test may not detect statistically significant heterogeneity that is in fact present. As such, a 90% confidence level threshold should generally be used when interpreting it for the meta‐analyses below. The presence of effect size heterogeneity affects the extent to which the average effect size is applicable to all interventions of that summer programme type.
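The pooled effect sizes, 95% confidence intervals, I2 statistics and homogeneity-test results reported below all come from the standard random-effects model. As a rough illustration of the mechanics only – this is a sketch of the common DerSimonian–Laird estimator with made-up inputs, not the review's own code – the pooling can be expressed as:

```python
import math

def random_effects_meta(effects, ses):
    """DerSimonian-Laird random-effects pooling of study effect sizes.

    effects: per-study effect sizes (e.g. SMDs or log odds ratios)
    ses:     their standard errors
    Returns (pooled effect, 95% CI, I2 in %, Q statistic, degrees of freedom).
    """
    k = len(effects)
    w = [1 / se**2 for se in ses]                       # inverse-variance (fixed-effect) weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = k - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance estimate
    w_star = [1 / (se**2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # % of variation due to heterogeneity
    return pooled, ci, i2, q, df

# Hypothetical inputs: three studies with SMDs 0.1, 0.2, 0.3 and equal standard errors
pooled, ci, i2, q, df = random_effects_meta([0.1, 0.2, 0.3], [0.05, 0.05, 0.05])
```

The Q statistic and its degrees of freedom feed the homogeneity test mentioned above (Q is compared against a chi-squared distribution with k − 1 degrees of freedom), and I2 re-expresses Q as the share of total variation attributable to between-study heterogeneity.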
We also provide an assessment of the relative confidence we have in the generalisability of each overall finding (low, moderate or high) – some of the overall findings are based on a small sample of studies, the studies evaluating the outcome may be of low quality, there may be wide variation in findings among the studies evaluating the outcome, or there may be specific aspects of the impact estimates included or the effect sizes constructed that affect the generalisability of the headline finding. These issues are detailed in full in the main body of the review.

– Engagement with/participation in/enjoyment of education (SMD):
  ∘ Summer education programmes: +0.12 (+0.03, +0.20); positive impact; I2 = 48.76%; p = 0.10; moderate confidence.
– Secondary education attendance (SMD):
  ∘ Summer education programmes: +0.26 (+0.08, +0.44); positive impact; I2 = N/A; p = N/A; low confidence.
  ∘ Summer employment programmes: +0.02 (−0.03, +0.07); no impact; I2 = 69.98%; p = 0.03; low confidence.
– Passing tests (log OR):
  ∘ Summer education programmes: +0.41 (−0.13, +0.96); no impact; I2 = 95.05%; p = 0.00; low confidence.
  ∘ Summer employment programmes: +0.02 (+0.00, +0.04); positive impact; I2 = 0.01%; p = 0.33; low confidence.
– Reading test scores (SMD):
  ∘ Summer education programmes: +0.01 (−0.04, +0.05); no impact; I2 = 0.40%; p = 0.48; high confidence.
– English test scores (SMD):
  ∘ Summer education programmes: +0.07 (+0.00, +0.13); positive impact; I2 = 27.17%; p = 0.33; moderate confidence.
  ∘ Summer employment programmes: −0.03 (−0.05, −0.01); negative impact; I2 = 0.00%; p = 0.76; low confidence.
– Mathematics test scores (SMD):
  ∘ All summer programmes: +0.09 (−0.06, +0.25); no impact; I2 = 94.53%; p = 0.00; high confidence.
  ∘ Summer education programmes: +0.14 (−0.09, +0.36); no impact; I2 = 94.15%; p = 0.00; moderate confidence.
  ∘ Summer employment programmes: +0.00 (−0.04, +0.05); no impact; I2 = 0.04%; p = 0.92; moderate confidence.
– Overall test scores (SMD):
  ∘ Summer employment programmes: −0.01 (−0.08, +0.05); no impact; I2 = 32.39%; p = 0.20; high confidence.
– All test scores (SMD):
  ∘ Summer education programmes: +0.14 (+0.00, +0.27); positive impact; I2 = 91.07%; p = 0.00; moderate confidence.
  ∘ Summer employment programmes: −0.01 (−0.04, +0.01); no impact; I2 = 0.06%; p = 0.73; high confidence.
– Negative behavioural outcomes (log OR):
  ∘ Summer education programmes: −1.55 (−3.14, +0.03); negative impact; I2 = N/A; p = N/A; low confidence.
  ∘ Summer employment programmes: −0.07 (−0.33, +0.18); no impact; I2 = 88.17%; p = 0.00; moderate confidence.
– Progression to HE (log OR):
  ∘ All summer programmes: +0.24 (−0.04, +0.52); no impact; I2 = 97.37%; p = 0.00; low confidence.
  ∘ Summer education programmes: +0.32 (−0.12, +0.76); no impact; I2 = 96.58%; p = 0.00; low confidence.
  ∘ Summer employment programmes: +0.10 (−0.07, +0.26); no impact; I2 = 76.61%; p = 0.02; moderate confidence.
– Complete HE (log OR):
  ∘ Summer education programmes: +0.38 (+0.15, +0.62); positive impact; I2 = 52.52%; p = 0.06; high confidence.
  ∘ Summer employment programmes: +0.07 (−0.19, +0.33); no impact; I2 = 70.54%; p = 0.07; moderate confidence.
– Entry to employment, short‐term (log OR):
  ∘ Summer employment programmes: −0.19 (−0.45, +0.08); no impact; I2 = 87.81%; p = 0.00; low confidence.
– Entry to employment, full period (log OR):
  ∘ Summer employment programmes: −0.15 (−0.35, +0.05); no impact; I2 = 78.88%; p = 0.00; low confidence.
– Likelihood of having a criminal justice outcome (log OR):
  ∘ Summer employment programmes: −0.05 (−0.15, +0.05); no impact; I2 = 0.00%; p = 0.76; low confidence.
– Likelihood of having a drug‐related criminal justice outcome (log OR):
  ∘ Summer employment programmes: +0.16 (−0.57, +0.89); no impact; I2 = 65.97%; p = 0.09; low confidence.
– Likelihood of having a violence‐related criminal justice outcome (log OR):
  ∘ Summer employment programmes: +0.03 (−0.02, +0.08); no impact; I2 = 0.00%; p = 0.22; moderate confidence.
– Likelihood of having a property‐related criminal justice outcome (log OR):
  ∘ Summer employment programmes: +0.09 (−0.17, +0.34); no impact; I2 = 45.01%; p = 0.18; low confidence.
– Number of criminal justice outcomes, during programme (SMD):
  ∘ Summer employment programmes: −0.01 (−0.03, +0.00); no impact; I2 = 2.17%; p = 0.31; low confidence.
– Number of criminal justice outcomes, post‐programme (SMD):
  ∘ Summer employment programmes: −0.01 (−0.03, +0.00); no impact; I2 = 23.57%; p = 0.37; low confidence.
– Number of drug‐related criminal justice outcomes, post‐programme (SMD):
  ∘ Summer employment programmes: −0.01 (−0.06, +0.06); no impact; I2 = 55.19%; p = 0.14; moderate confidence.
– Number of violence‐related criminal justice outcomes, post‐programme (SMD):
  ∘ Summer employment programmes: −0.02 (−0.08, +0.03); no impact; I2 = 44.48%; p = 0.18; low confidence.
– Number of property‐related criminal justice outcomes, post‐programme (SMD):
  ∘ Summer employment programmes: −0.02 (−0.10, +0.05); no impact; I2 = 64.93%; p = 0.09; low confidence.

We re‐express instances of significant impact by programme type where we have moderate or high confidence in the security of findings by translating this to a form used by one of the studies, to aid understanding of the findings. Allocation to a summer education programme results in approximately 60% of individuals moving from never reading for fun to doing so once or twice a month (engagement with/participation in/enjoyment of education), and an increase in the English Grade Point Average of 0.08. Participation in a summer education programme results in an increase in overall Grade Point Average of 0.14 and increases the likelihood of completing higher education by 1.5 times.
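The re-expression of log odds ratios above relies on exponentiation: a pooled log OR converts back to an odds ratio via exp(·). As a quick sketch (not the review's own calculation), the Complete HE finding for summer education programmes converts as follows:

```python
import math

def log_or_to_or(log_or, ci_low, ci_high):
    """Convert a pooled log odds ratio and its 95% CI to the odds-ratio scale."""
    return math.exp(log_or), math.exp(ci_low), math.exp(ci_high)

# Complete HE, summer education programmes: +0.38 (+0.15, +0.62)
or_point, or_low, or_high = log_or_to_or(0.38, 0.15, 0.62)
# or_point ≈ 1.46 — i.e. roughly the "1.5 times" increase reported in the text
```

Because exponentiation is monotonic, the CI endpoints can be transformed directly; an odds ratio of about 1.46 (95% CI roughly 1.16 to 1.86) corresponds to the reported increase in the likelihood of completing higher education.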
Signs are positive for the effectiveness of summer education programmes in achieving some of the education outcomes considered (particularly test scores when pooled across types, completion of higher education, and STEM‐related higher education outcomes), but the evidence on which the overall findings are based is often weak. Summer employment programmes appear to have a limited impact on employment outcomes, with, if anything, a negative impact on the likelihood of entering employment outside of employment related to the programme. The evidence base for the impact of summer employment programmes on young people's violence and offending outcomes is currently limited – where impact is detected it largely takes the form of substantial reductions in criminal justice outcomes, but the variation in findings across and within studies limits our ability to make overarching assertions with confidence. In understanding the effectiveness of summer programmes, the ordering of outcomes also requires consideration – entry into education from a summer employment programme might be beneficial if it leads towards better quality employment in the future and a reduced propensity for criminal justice outcomes.

Qualitative Synthesis
Various shared features among different summer education programmes emerged from the review, allowing us to cluster specific types of these interventions, which in turn aided the structuring of the thematic synthesis. The three distinct clusters of summer education programmes were: catch‐up programmes addressing attainment gaps; raising‐aspirations programmes inspiring young people to pursue the next stage of their education or career; and transition support programmes facilitating smooth transitions between educational levels.
Depending on their aim, summer education programmes tend to provide a combination of: additional instruction in core subjects (e.g., English, mathematics); academic classes, including those to enhance specialist subject knowledge (e.g., STEM‐related); homework help; coaching and mentoring; arts and recreation electives; and social and enrichment activities. Summer employment programmes provide paid work placements or subsidised jobs, typically in entry‐level roles and mostly in the third and public sectors, with some programmes also providing placements in the private sector. They usually include components of pre‐work training and employability skills, coaching and mentoring. A number of mechanisms act as facilitators of, or barriers to, engagement in summer programmes. These include: tailoring the summer programme to each young person and providing individualised attention; the presence of well‐prepared staff who provide effective academic/workplace and socio‐emotional support; incentives of a monetary (e.g., stipends and wages) or non‐monetary (e.g., free transport and meals) nature; recruitment strategies that are effective at identifying, targeting and engaging the participants who can most benefit from the intervention; partnerships with key actors who can help facilitate referrals and recruitment, such as schools, community action and workforce development agencies; format, including providing social activities and opportunities to support the formation of connections with peers; integration into the workplace through pre‐placement engagement, such as orientation days, pre‐work skills training, job fairs, and interactions with employers ahead of the beginning of the summer programme; and skill acquisition, such as improvements in social skills.
The causal processes which lead from engagement in a summer programme to outcomes include: skill acquisition, including academic, social, emotional and life skills; positive relationships with peers, including with older students as mentors in summer education programmes; personalised and positive relationships with staff; location, including accessibility and the creation of familiar environments; creating connections between the summer education programme and students' learning at home to maintain continuity and reinforce learning; and providing purposeful and meaningful work through summer employment programmes (potentially facilitated through the provision of financial and/or non‐financial incentives), which makes participants more likely to see the importance of education in achieving their life goals, leading in turn to raised aspirations. It is important to note that no single element of a summer programme can be identified as generating the causal process for impact; rather, impact results from a combination of elements. Finally, we investigated strengths and weaknesses of summer programmes at both the design and implementation stages. In summer education programmes, design strengths include interactive and alternative learning modes; iterative and progressive content building; incorporating confidence‐building activities; careful lesson planning; and teacher support tailored to each student. Design weaknesses include insufficient funding or poor funding governance (e.g., delays to funding); limited reach into the target population; and inadequate allocation of teacher and pupil groups (i.e., misalignment between the education stage of the pupils and the content taught by staff). Implementation strengths include clear programme delivery guidance and good governance; high quality academic instruction; mentoring support; and strong partnerships.
Implementation weaknesses include insufficient planning and lead‐in time; recruitment challenges; and variability in teaching quality. In summer employment programmes, design strengths include the use of employer orientation materials and supervisor handbooks; careful consideration of programme staff roles; a wide range of job opportunities; and building a network of engaged employers. Design weaknesses are uncertainty over funding and budget agreements; variation in the delivery and quality of training between providers; challenges in recruiting employers; and caseload size and management. Implementation strengths include effective job matching; supportive relationships with supervisors; pre‐work training; and mitigating attrition (e.g., striving to increase take‐up of the intervention among the treatment group). Implementation weaknesses are too few monitors for the number of participants, and challenges around employer availability.

Suggested Citation

  • Daniel Muir & Cristiana Orlando & Becci Newton, 2024. "Impact of summer programmes on the outcomes of disadvantaged or ‘at risk’ young people: A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 20(2), June.
  • Handle: RePEc:wly:camsys:v:20:y:2024:i:2:n:e1406
    DOI: 10.1002/cl2.1406


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Muir & Becci Newton & Cristiana Orlando, 2023. "PROTOCOL: Impact of summer programmes on the outcomes of disadvantaged or at risk young people: A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 19(3), September.
    2. Heller, Sara B., 2022. "When scale and replication work: Learning from summer youth employment experiments," Journal of Public Economics, Elsevier, vol. 209(C).
    3. Jonathan M.V. Davis & Sara B. Heller, 2017. "Rethinking the Benefits of Youth Employment Programs: The Heterogeneous Effects of Summer Jobs," NBER Working Papers 23443, National Bureau of Economic Research, Inc.
    4. Judd B. Kessler & Sarah Tahamont & Alexander Gelber & Adam Isen, 2022. "The Effects of Youth Employment on Crime: Evidence from New York City Lotteries," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(3), pages 710-730, June.
    5. Jonathan M.V. Davis & Sara B. Heller, 2020. "Rethinking the Benefits of Youth Employment Programs: The Heterogeneous Effects of Summer Jobs," The Review of Economics and Statistics, MIT Press, vol. 102(4), pages 664-677, October.
    6. Gaurav Khanna & Carlos Medina & Anant Nyshadham & Jorge Tamayo & Nicolas Torres, 2023. "Formal Employment and Organised Crime: Regression Discontinuity Evidence from Colombia," The Economic Journal, Royal Economic Society, vol. 133(654), pages 2427-2448.
    7. Knutsson, Daniel & Tyrefors, Björn, 2024. "Labor Market Effects of a Youth Summer Employment Program in Sweden," Working Paper Series 1485, Research Institute of Industrial Economics.
    8. Lavecchia, Adam M. & Oreopoulos, Philip & Spencer, Noah, 2024. "The Impact of Comprehensive Student Support on Crime: Evidence from the Pathways to Education Program," IZA Discussion Papers 16724, Institute of Labor Economics (IZA).
    10. Lehner, Lukas & Kasy, Maximilian, 2022. "Employing the unemployed of Marienthal: Evaluation of a guaranteed job program," INET Oxford Working Papers 2022-29, Institute for New Economic Thinking at the Oxford Martin School, University of Oxford.
    11. Denis Fougère & Arthur Heim, 2019. "L'évaluation socioéconomique de l'investissement social," Working Papers hal-03456048, HAL.
    12. Wang, Chuhong & Liu, Xingfei & Yan, Zizhong & Zhao, Yi, 2022. "Higher education expansion and crime: New evidence from China," China Economic Review, Elsevier, vol. 74(C).
    13. Adam Lavecchia & Philip Oreopoulos & Noah Spencer, 2024. "The Impact of Comprehensive Student Support on Crime," Department of Economics Working Papers 2024-01, McMaster University.
    14. Atwell, Meghan Salas & Jeon, Jeesoo & Cho, Youngmin & Coulton, Claudia & Lewis, Eric & Sorensen, Alena, 2023. "Using integrated data to examine the effects of summer youth employment program completion on educational and criminal justice system outcomes: Evidence from Cuyahoga County, Ohio," Evaluation and Program Planning, Elsevier, vol. 99(C).
    16. Goller, Daniel & Harrer, Tamara & Lechner, Michael & Wolff, Joachim, 2021. "Active labour market policies for the long-term unemployed: New evidence from causal machine learning," Economics Working Paper Series 2108, University of St. Gallen, School of Economics and Political Science.
    17. Md. Abdur Rahman Forhad, 2021. "Minimum Dropout Age and Juvenile Crime in the USA," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 47(3), pages 378-405, June.
    18. Barnes, Stephen & Beland, Louis-Philippe & Joshi, Swarup & Willage, Barton, 2022. "Staying out of trouble? Effect of high school career counseling on crime," Economics of Education Review, Elsevier, vol. 91(C).
    19. Amy Ellen Schwartz & Jacob Leos‐Urbel & Joel McMurry & Matthew Wiswall, 2021. "Making summer matter: The impact of youth employment on academic performance," Quantitative Economics, Econometric Society, vol. 12(2), pages 477-504, May.
    20. McEachin, Andrew & Lauen, Douglas Lee & Fuller, Sarah Crittenden & Perera, Rachel M., 2020. "Social returns to private choice? Effects of charter schools on behavioral outcomes, arrests, and civic participation," Economics of Education Review, Elsevier, vol. 76(C).
    21. Tito Boeri & Jan van Ours, 2013. "The Economics of Imperfect Labor Markets: Second Edition," Economics Books, Princeton University Press, edition 1, number 10142.
    22. Aaron Chalfin & Michael LaForest & Jacob Kaplan, 2021. "Can Precision Policing Reduce Gun Violence? Evidence from “Gang Takedowns” in New York City," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(4), pages 1047-1082, September.
