Tips for Writing a Summative Evaluation Report
There is no “right” or “wrong” way to write a summative evaluation report, but there are good practices. I’ve prepared a list summarizing what I’ve learned about evaluation reports, along with some techniques for writing an effective one. The list is mainly intended to help me, as these are points I found especially pertinent and conducive to a good written report. It is divided by suggested sections of the document.
Executive Summary
- This is the “condensed” version of your entire report. Write this section last, to give yourself time to mentally process all the elements of your evaluation and form a clear picture of the most important points.
- Remember, some will only read the summary — and in many cases these may unfortunately be the key figures in the organization or among your stakeholders — so keep it concise. Preferably no more than two pages.
Introduction
- Remember who the stakeholders are and for whom the report is being written. Tailor the introduction to them, and write only what will interest them.
- Write the purposes for the evaluation. Answer questions like: “What did the evaluation intend to accomplish?” “Why was an evaluation necessary?”
- Write about the program being evaluated. Identify the origins of the program, objectives and goals, internal activities, technology used, successes, shortcomings, any staff members involved, and so on. This ensures readers understand the context surrounding the program.
- Implicitly identify the evaluation model being used. This will be addressed in more detail in the next section.
- Briefly outline here what the rest of the report will cover.
Methodology
- Describe the sampling method(s) used. Who is the target population, and how was the sample randomized (if it was randomized)?
- Describe the evaluation model (goal-based, decision-making, discrepancy, etc.), and why it was chosen.
- Describe the data sources and the instruments used to obtain the data, and explain why they were the right tools for the job. If you gathered qualitative data, describe the interviews, observations, and other methods that produced nominal and ordinal data. If you gathered quantitative data, describe the measurements that produced interval and ratio data. Include the specific measurement instruments in appendices, if necessary.
- Describe the data analysis procedures used, such as statistical calculations and how scores were derived from the data.
- Keep graphs and charts to a minimum, as these will be presented in the next section.
Results
- Outline the objectives one by one, and describe how well the program accomplished each.
- Refer to the instruments used in the results, and make it clear which instruments were used to achieve which results.
- Well-designed, descriptive tables and charts are always welcome. The more visually self-explanatory a table or chart is, the better, because less verbal description is then needed.
- Write the results only after all the data have been collected and organized into the visual displays, or analyzed for content.
- Describe the implications the results have for the targeted stakeholders.
- Make sure both positive and negative results are written. This may include cost/time/productivity benefits or disadvantages. Make sure any personal biases don’t skew the description of the results.
- Verify that the program actually caused the results, and that extraneous, unanticipated factors did not contribute to them.
Conclusions and Recommendations
- This section can serve both as a conclusion and as a place for professional recommendations.
- Ensure that every objective and goal stated in the introduction is addressed.
- Although you made sure not to let your biases skew the results, you still have them. Tactfully acknowledge your own biases in the report, and let the target readers know why your recommendations may differ from another evaluator’s. Justify your recommendations as well as possible, while making your unique perspective clear.