Evaluation Planning
Planning often gets overlooked as people focus on the nitty-gritty of data collection or analysis. Without enough time devoted to proper planning, resources may be over- or underutilized, the wrong outcomes might be assessed, or unfeasible targets set. Spending enough time on this phase is critical to the success of your program or project.
Considerations
What is it? One size does not fit all when it comes to evaluation, nor do things always work out quite as planned. Surveys may work in some scenarios, while focus groups and interviews may be better suited in others. The best you can do is consider as many scenarios as possible and make contingency plans — especially if working internationally. Special consideration should be given to the cultural context in which the evaluation will be conducted, the stakeholders and beneficiaries, the logistics of conducting an evaluation, and the collaborations necessary to get it done. Our consultants can walk through the entire evaluation process and help you think through potential obstacles and ways to address them.
How can we help you?
- Identify cultural issues that may influence your evaluation efforts
- Identify key collaborations needed
- Identify potential logistical issues and prepare contingencies
Related Services
Lessons Learned
Lessons learned are experiences, knowledge, understandings, or outcomes gained from a particular project or program that should be taken into account on future projects or programs.
Monitoring & Evaluation
Evaluation Coaching & Training
Coaches have the experience and expertise to support the learning and tasks of the people who need help.
Evaluation Logic Models
A logic model is a one-page, compelling graphic (your road map) that tells the reader/reviewer exactly what, when, where, why, and how.
Articles and White Papers About Considerations
- What Are You Actually Measuring? Selecting the Right Indicator to Measure Progress
- Why is Cultural Competence Critical to Evaluation
- What Are The Right Questions To Ask Before Investing?
- Pathways of Change: Understanding the Timeline of Theories of Change
Articles and White Papers About Monitoring & Evaluation
- To RCT or Not? Randomized Control Trials in Nonprofit Work
- Whose Job is it to Evaluate?
- The Problem with Relying Solely on Dashboards
- Finalizing Reports: Statements of Differences
Articles and White Papers About Evaluation Goals
- Whose Job is it to Evaluate?
- Delivering Strong M&E Reports
- What Are You Actually Measuring? Selecting the Right Indicator to Measure Progress
- Coaching for Evaluation Priorities
Articles and White Papers About Cost Analysis
- Trust Us!? Building and Maintaining Trust Capital
- How Do We Conduct Feasible, Cost-Effective Data Collection
- Critical vs Tedious Data Collection
- What Are You Actually Measuring? Selecting the Right Indicator to Measure Progress
Articles and White Papers About Logic Models
- Technology Troubleshooting in M&E
- M&E and Technology
- Theories of Change
- Pathways of Change: Understanding the Timeline of Theories of Change
FAQ About Considerations
How many questions should a survey include?
The number of questions depends on an evaluation’s purpose and goals, and the type of questions being asked. Before you launch a survey, it is important to test it with a sample to ensure it is understandable and reasonable in length. Ideally, surveys should take no longer than 15 minutes for respondents to complete.
What should be considered when creating an evaluation plan?
There are many things to keep in mind when creating an evaluation plan, but some considerations are the program/project goals, the evaluation questions that need to be answered, who the key stakeholders are, program/project activities, outputs, outcomes, and any challenges or other factors that may affect the program.
When should an evaluation plan be developed?
During the development of a new program, or when an existing program is being modified for a new population.
What types of evaluation are there?
Goal-based evaluation, outcome evaluation, impact evaluation, cost-effectiveness analysis, and cost-benefit analysis.
Where should lessons learned come from?
Lessons learned should come from multiple sources, not just a single source, so that the information gained can be reinforced and triangulated.