Title: Youth Offender Program Evaluation: Ensuring Quality and Rigor: A Few Lessons Learned from a National Evaluation Study

Authors: Dan Kaczynski (Central Michigan University), Ed Miller (U.S. Army), Melissa A. Kelly (University of Illinois at Chicago)

Reporting the project: challenges in writing and the experience of justifying conclusions

Round I
To disseminate the findings of the Round I evaluation, the evaluation team prepared site visit reports for internal use by DOL staff only. Although the reports contained information aimed at formative project improvement, the evaluators were not permitted to share their reports or other information with either the project staff or the technical assistance team, because the sponsor believed this restriction was necessary to establish a neutral environment. While the sponsor's intent was to strengthen objectivity, the restriction adversely affected the evaluation. Stakeholders were isolated from data and preliminary findings, which hindered formative improvement, produced a sense of disempowerment, and weakened utilization of findings. The lack of data exchange also posed an obstacle to validity: because reports were not shared with the sites, the sites had no opportunity to review and comment on their contents, and factual errors, such as misspelled names, went uncorrected.

Round II
The formative evaluation approach used in Round II involved sharing evaluation reports with key demonstration stakeholders, including DOL, the projects, the evaluators, and the technical assistance specialists, to create a feedback loop for continuous improvement. Evaluators reviewed their findings with project staff during subsequent evaluation visits, and the projects integrated assessment practices into their ongoing operations. This exchange fostered a continuous improvement approach among the projects, which came to use the evaluations as tools for improving operations. Technical assistance visits gave evaluation staff an opportunity to review each project's progress and examine its needs for additional technical assistance. During these visits, technical assistance staff provided project managers with a summary of their observations, including feedback and recommendations.

Round III
The initial evaluation design included documentation of data collection procedures as a strategy for enhancing rigor. In practice, however, the volume of work hindered the preparation of meaningful memos, and because memos were part of the site data, this shortfall also affected data collection. During the extended site visits in Round III, the reporting procedures were modified to require a separate field memo report describing what occurred during each visit and providing a textual snapshot of it. These memos proved useful in extending the findings and shaping the context of data collection: they captured the evaluator's interpretive insights and reflections, as well as clarifications of design issues. This practice supported a more open and transparent disclosure of the design methodology (Anfara, Brown, & Mangione, 2002; Bogdan & Biklen, 2003).
