
4 Reasons Why Nobody Reads (Or Uses) Your Evaluation Report: Here's How to Fix It


Finally. After many months of hard work and thousands of dollars spent, you have an evaluation report that confirms how great your programme is. All these findings will be of value to so many people.


Being all excited, you send the report to everyone: the funding agency, all your partners, staff and other relevant stakeholders. You post it on social media and share the information on your LinkedIn page and other online networking platforms. You may even hold a nice press conference and have an official launch.


Then you sit back and wait for everyone to respond to how great the report is. Persons will use the findings to improve their operations, policies will be influenced by the findings, and the donor and the public will continue to support your great programme because the independent evaluation confirmed what you knew all along: that your work makes a real difference.


A week goes by. Nothing. Then several weeks and months roll by, and still there is little or no reaction to the report. You figure that people are busy; they will read it eventually. Right?

Wrong! If it takes that long to receive adequate feedback, the truth is that your evaluation report was dead before the first word was even written. Very few persons read it, and even fewer will ever use the findings.


Below are the reasons why your evaluation report may have landed in the figurative graveyard.


1. Failure to engage stakeholders early on, which...


spells doom for your report because you never garnered enough interest or commitment. You waited until after the evaluation was conducted to approach the relevant stakeholders. This is too late.


You missed the boat, my friend


As soon as an evaluation is conceived, all the concerned parties should be involved as much as possible at each stage. For example, during the planning phase, stakeholders can help determine the intended use of the evaluation findings, the scope of work and the methodology. During the later stages of the evaluation, stakeholders can review interim findings and contribute to the recommendations.


Research has shown that persons are more inclined to support initiatives that they participated in. Naturally, someone who played a part in the report's development will want to at least scan the final product. Humans have a natural curiosity to know how the story ends. So engage persons from the start to increase the likelihood that they stick around for your grand finale: the evaluation report.


Another pitfall is that...


2. The evaluation report was written like an encyclopaedia...


filled with technical jargon, high-flown language and complex terms. Unless your evaluation report is being submitted to an academic journal for publication, simplicity is best.


You know it is bad when even persons from ancient Egypt can't decipher your hieroglyphic report!


Be concise, use clear language and stick to terms that potential readers will easily understand. You may have to balance an evaluation report that satisfies the donor's reporting requirements with one that addresses the needs of the other stakeholders. Which leads to the third reason your report got little traction.


3. Failure to identify your target audience...


which results in not pitching your evaluation report accordingly.



Not knowing your target audience may have deadly consequences


An evaluation rarely serves the needs of one stakeholder. The donor, implementing agency, beneficiaries, policy-makers, non-profit organisations, the government, the general public and others all stand to derive value from an evaluation report. Nevertheless, most evaluation reports are written with just one stakeholder in mind. You guessed it: the donor.


As such, these traditional evaluation reports are usually long, lifeless, lacklustre documents that satisfy the donor's reporting requirements. They tick all the right boxes, but hold little appeal for other stakeholders who have different interests.


For example, programme staff may be interested in the findings that relate to the operational aspects of the programme, while policy-makers are keen to hear more about its impact and effectiveness.


In other words, before the first word of the document is written, you should have already determined who the intended users of the evaluation are and how they will use the findings.


Once this is established, action-oriented reports can be written to serve each of these target groups. An action-oriented report "is intentionally shorter than a traditional formal report and is focused, simple, and geared toward a particular audience" (Hendricks, 1994). Action reports can take different formats: written, verbal or electronic.


Different formats for reporting on evaluation findings


Table reproduced from 'Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings', Centers for Disease Control and Prevention, 2013


The essence of action reports is that they home in on the specific areas of the evaluation that suit the interests of a particular target audience.


For example, the general public is likely more keen to hear how many children benefited from the programme than how the programme was audited twice. As such, the action-oriented report would highlight these outputs and leave out the financial information on operational expenditure.


4. Failure to have a dissemination and marketing strategy


So you avoided the pitfalls mentioned so far. You have engaged the relevant stakeholders and have their commitment. Your document is in simple, clear, easy-to-understand language. Plus, you have several shorter action-oriented reports to suit the different stakeholders. Then why is the response to the report so lukewarm?


If a tree falls in a forest and no one hears, does it make a sound?


The sad truth is that doing the above is not enough to get your evaluation report read (or acted upon, which is even harder). You have to get the message out through a channel and on a frequency that your target audience uses and understands. Perhaps you share the document on LinkedIn, but not everybody is on this professional networking site. Or you exclude social media altogether because it is not your thing, completely ignoring the fact that a segment of your target groups uses social media as their preferred means of communication.


In other words, you have to make a sound and get your message heard by others. You cannot take a haphazard approach to dissemination. If you are to be successful in having your evaluation report read and used, you will need a structured approach.


An approach with a concrete dissemination plan that addresses the following questions:

  • Who is the target audience?

  • What medium will you use to disseminate findings—hard copy print, electronic, presentations, briefings?

  • When is the best time to disseminate the report? Perhaps to coincide with a special event?

  • How, where, and when will findings be used?

  • Who is responsible for dissemination?

  • What resources are available to accomplish the work?

  • What are the follow-up activities after release?

  • How will follow-up activities be monitored?


You should also consider using different reporting formats for your different target audiences. Do you really expect the average community member or the very busy board member to actually read a 200-page traditional comprehensive report? An action-oriented report that addresses the issues that matter to these two stakeholder groups will be more effective. For example, give the board member a dashboard report that highlights the main figures he or she needs to inform decisions at board meetings.


Table reproduced from 'Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings', Centers for Disease Control and Prevention, 2013


Hope the above tips help to resurrect your evaluation reports. Do share any additional tips and your experiences in the Comments section below.


Publications consulted for this article:


Centers for Disease Control and Prevention. Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings. Atlanta, GA: US Dept of Health and Human Services; 2013.


Hendricks M. Making a Splash: Reporting Evaluation Results Effectively. San Francisco, CA: Jossey-Bass; 1994.


Patton MQ. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications; 2008.


