A Guide to Navigating the Evaluation Maze: “A Framework for Evaluation” from the Centers for Disease Control and Prevention (CDC), Part 1

This weekend I found myself navigating the underground tunnel system of a local university on my way to the library. Although it was not my first time, it got me thinking about newcomers: without the signs, they would either run into dead ends or end up walking in circles. Evaluations, too, can go around in circles or run into dead ends. In this post I aim to whet your appetite for the evaluator’s version of signs and guideposts: evaluation models or frameworks.

Some think of them as evaluation road maps or mental models. Such models are usually based on years of experience and research. Following one can spare you costly mistakes.

Today I will briefly introduce the Centers for Disease Control and Prevention’s (CDC) Framework for Evaluation. A thorough presentation is beyond the scope of this post, so please review the references I have included for further study.

A Framework for Evaluation.

Source: Centers for Disease Control and Prevention (CDC), Office of the Associate Director for Program (OADPG)

Step 1: Engage Stakeholders

Stakeholders include everyone linked to or benefiting from your program: for example, participants, program staff, national staff, collaborators, funders, and even evaluators. Identify a small number of key stakeholders and involve them as much as possible throughout the lifespan of the evaluation. Such involvement is crucial because it ensures that stakeholders, especially those from vulnerable populations, are adequately represented. Active and passive involvement strategies include:

  • forming an evaluation committee
  • promoting engagement via
    • face-to-face meetings
    • capacity-building activities
    • teleconferences
    • e-mail or discussion groups
    • simple interviews or surveys of stakeholders
    • letters and newsletters to inform them of evaluation activities and key decisions

The involvement strategies you choose should be tailored to the specific needs of your program and stakeholders. Pay close attention to organizational climate and, of course, timing!

Step 2: Describe the Program

Describing the program is deceptively hard. Various stakeholders may have differing ideas of what the program entails or should entail, and even an individual stakeholder’s perspective can evolve over time. An iterative process helps get everyone on the same page and reveals whether everyone’s intentions for the program reflect its actual goals.

Once program goals are clarified, work backwards to develop a logic model, which is a flow chart demonstrating relationships between program components and the outcomes you are seeking.

Step 3: Focus the Evaluation Design

Focused evaluations are the most useful. Prioritize and focus your evaluation questions in collaboration with your small number of key stakeholders. Consider how best to serve their needs and how to weigh the competing needs of different stakeholders. Then choose the evaluation methods best suited to answering those questions. Seek to balance:

  • efficiency and practicality with
  • the quality and type of data and the level of accuracy needed.

To be Continued…

Sources/Further References:

Centers for Disease Control and Prevention (CDC), Office of the Associate Director for Program (OADPG). (2011). A Framework for Evaluation. Retrieved February 6, 2012, from www.cdc.gov/eval/framework/index.htm. A reliable, easy-to-navigate website hosted by the CDC.

Milstein, B., Wetterhall, S., and the CDC Evaluation Working Group. (2012). A Framework for Program Evaluation: A Gateway to Tools. In J. Nagy & S. B. Fawcett (Eds.), The Community Tool Box. Retrieved February 6, 2012, from http://ctb.ku.edu/en/tablecontents/sub_section_main_1338.aspx. The Community Tool Box is an online tutorial designed especially for community-based nonprofits and hosted by the University of Kansas.

——————

For more resources, see our Library topic Nonprofit Capacity Building.

________________________________________________________________________

Priya Small has extensive experience in collaborative evaluation planning, instrument design, data collection, grant writing and facilitation. Contact her at priyasusansmall@gmail.com. Visit her website at http://www.priyasmall.wordpress.com. See her profile at http://www.linkedin.com/in/priyasmall/