How to Get Out of the Evaluation Report Writing Slump


We have all fallen into a writing slump at one point or another. Though evaluation report writing does not involve the same kind of creativity as writing a novel, report writers can experience the same type of writer’s block. Here are 10 tips, drawn from my experience and reading, that have helped me beat the evaluation report writing slump.

1. Visualize Positive Outcomes
Athletes spend time visualizing themselves performing successfully. Spend a minute visualizing yourself successfully working through that report. Convert this vision to positive self-talk.

2. Just Do It
Tell yourself to stop overthinking the report, and just start writing. I’ve found that the very act of writing and arranging ideas on paper helps beat writer’s block.

3. Break the Project into Smaller Steps
Write your report one evaluation question at a time. Break up your report into smaller sections, and don’t address other sections until you have finished the section you are working on. Make a checklist of sections to complete, and check them off as you go.

4. Focus on Quantity Over Quality
Often writer’s block is caused by perfectionism—trying to get the first draft perfect. For your first draft, focus on quantity over quality. Set a timer and force yourself to write as much as possible in that time period. Then, for subsequent drafts, revise, revise and revise!

5. Find Your Most Productive Time of Day
Determine your most productive time of day. When are you most free of interruptions? When can you focus on your work the best? If possible, arrange your schedule so that you can write during this time.

6. Make It a Habit
Incorporate report writing into your daily routine. Write at the same time every day, whether you feel like writing or not. Writing something every day will help keep you motivated to write.

7. Discuss Report Writing with Stakeholders
Brainstorm ways to involve stakeholders, from sharing completed reports to involving them in drafts. Before data collection, write up a mock results section with the actual numbers missing and ask stakeholders to fill in the blanks according to their expectations. Michael Patton details this strategy in the third edition of his book, Utilization-Focused Evaluation. It helps establish whether expected outcomes match actual evaluation results, and it engages stakeholders in the reporting process.

8. Ask Colleagues for Feedback
Before the actual report is due, set intermediate deadlines for submitting draft sections to colleagues for review. This can help you stay motivated to write and will elicit valuable feedback. Two heads are better than one, and this strategy also helps beat procrastination.

9. Read Other Evaluation Reports
Read other evaluation reports to help get into report-writing mode. This can also give you a better idea of how much detail is necessary in reports. Be careful, though, not to procrastinate by reading other reports instead of writing your own.

10. Practice, Practice, Practice
Even when you don’t have to write reports, stay in the habit of writing by keeping up a professional blog. Read journal articles and textbooks in your field, and collect key points and nuggets of wisdom. Then practice paraphrasing these key points. These can be incorporated in your blog too.

Five Tips on Making Your Evaluation More Systematic


Evaluation experts often define evaluation as a systematic endeavor. Recently I have been considering what this really means. How do we carry out a more systematic evaluation? How do we translate this into practice?

 

Aim for Consistency

Aim for consistency in data collection efforts: data should be collected the same way every time. How can this be practically achieved? Write out instructions for completing survey questionnaires and read them aloud when administering the survey, even though it might seem unnecessary. Individuals can interpret how to answer a question in various ways. For example, including the direction “please check off only one option” avoids problems such as some individuals checking off only the best response while others check off multiple responses.

Also, use the same questions for the survey, semi-structured interview, or focus group each time. This helps with consistency in data analysis across groups.

 

Aim for Replicability

Include detailed instructions for those who will be administering your survey, with the goal that someone else could replicate your evaluation study. Though evaluation is not the same as research, aiming for replicability will make your evaluation efforts more consistent and systematic. I like to think of it as akin to leaving detailed instructions for a friend who will be caring for a temperamental pet or plant: such a pet or plant thrives on consistent care, and a written plan of care helps your friend replicate the care you would give.

Sometimes stakeholders can feel that such instructions insult their intelligence, so it helps to emphasize the need for consistency and to explain why you are doing things the way you have chosen. Inconsistently collected data can be difficult to analyze and may force you to exclude responses, which further complicates analysis. Inconsistency also decreases the validity of the data collection method—that is, we are no longer measuring what we think we are measuring.

 

Involve Stakeholders in Every Stage of the Evaluation, Especially Planning

How do we maintain consistency especially when others are involved in data collection? Involving key program stakeholders in planning the evaluation can increase consistency in data collection efforts. Brainstorm with them ways to collect data consistently. Provide an interactive training in data collection.

 

Draft a Written Plan for Data Collection

A written plan for data collection can help identify pitfalls ahead of time. It also provides a game plan to stick to each time. Once data collection has started, have regular meetings with program stakeholders or staff to discuss the data collection plan and how adherence to the plan can be maintained.

 

Pilot Test Your Data Collection Method

Pilot testing your data collection method can surface potential problems with your tool before full deployment. It also gives program stakeholders, such as clients, a good opportunity to provide input on your method and tool. A pilot-test survey includes questions such as:

  1. Were all the questions easy to understand?
  2. Were all the survey directions clear?
  3. Is there any other feedback you have about the survey process?
  4. How can we further improve this survey?

Tips on Planning for Focus Groups


A focus group is a moderated group discussion that focuses on particular topics of interest. Moderators lead focus groups and usually follow a discussion guide of open-ended questions. Here are some tips for planning focus groups in program evaluation, gleaned from my reading of Richard A. Krueger and Mary Anne Casey’s excellent book, Focus Groups: A Practical Guide for Applied Research (4th edition), and supported by my own experience.

1. Read Krueger and Casey’s book, Focus Groups: A Practical Guide for Applied Research, 4th edition.

This is a well-written, comprehensive book filled with practical tips on planning, conducting, analyzing and reporting on focus groups. This blog post cannot serve as a substitute for reading it, but I hope it piques your interest and inspires you to read the book.

2. Ask yourself whether focus groups are the best method for your evaluation

List, mentally or on paper, the pros and cons of focus groups versus other methods such as written surveys or observations. Consider cost-effectiveness, the type of information you are seeking and the resources actually available. Do you have in-house staff who are qualified to conduct focus groups, or who could be trained to do so? Or can you afford a professional moderator?

3. Draft a written evaluation plan ahead of time

This is a very important step, as it forces us to put our ideas down on paper, spell out steps, think ahead, and ensure that each step is justified. It also avoids last-minute decisions that can affect the robustness of your evaluation. A concrete written evaluation plan can also be shared with colleagues and stakeholders to generate valuable feedback.

4. Decide on types of participants to be included in the focus groups

Ask yourself and stakeholders: Who will give you the information you are looking for? Talk to the gatekeepers of your communities and program stakeholders to best answer this question. For example, do you want a mix of patients and caregivers in the same group, or should you separate the groups into patients and caregivers? Are participants less likely to be candid if the groups are mixed? Present such questions to stakeholders and to participants of your initial focus groups.

5. Get feedback on your focus group discussion guide

Start by asking your stakeholders what questions should be asked during the focus groups. Getting feedback will also help ensure that all your questions are clear and unlikely to be misunderstood, and will help you avoid other common mistakes like cramming too many questions into the discussion guide. The guide should make it easy for group members to join in and open up; strategies include using an ice-breaker question and moving from general to detailed questions.

6. Plan to use other methods to corroborate findings

Findings from focus groups are best verified by other methods such as written surveys and observation. This helps address the concern that group participants may give answers that the moderator or others want to hear (social desirability bias). In non-profit settings, it may be hard to convene a focus group where no one knows each other. This might introduce some bias too, hence the need for other methods to verify your findings from the focus groups.

To Evaluate, Let’s Facilitate: Building a Stronger Evaluation Life-Cycle by Boosting Facilitation Skills


I remember, as a child, dutifully memorizing the life-cycles of frogs and butterflies. Now, as an evaluator, I find myself participating in a work-related life-cycle of sorts too, though no painstaking memorization is needed here:

Evaluation planning → Negotiating the plan → Conducting the evaluation (with ongoing negotiation) → Reporting → Planning future evaluations → … (and the cycle continues)

While recently reading one of my program evaluation books, I was struck by the important role that facilitation plays in every stage of an evaluation’s life cycle. I am not just saying this because I have facilitated in the past.

Some might have a love-hate relationship with facilitation. But we cannot do program evaluation in a vacuum, without working with people, also known as our stakeholders—those who share a vested interest in our evaluations. Wherever people are concerned, facilitation is a valuable skill that we all use consciously or unconsciously.

Until my reading pointed it out, I did not realize that I was unconsciously using facilitation skills, and also participating in facilitation led by a couple of my evaluation stakeholders. So whether we realize it or not, facilitation is important in the process of conducting evaluations.

Here are three reasons why, which I have gleaned from recent perusals of 1) Program Evaluation, by John Owen and Patricia Rogers and 2) John Bryson and Michael Patton’s chapter “Analyzing and Engaging Stakeholders” in Wholey and colleagues’ book Handbook of Practical Program Evaluation.

Facilitation During Evaluation Planning

Facilitation is central to the planning stages of an evaluation. Why? If you’re like me, you’d dislike doing an evaluation knowing that it would never get used. We all want our work to count, to be useful.

One of the ways to help ensure that an evaluation gets used is to involve your primary stakeholders in the planning stages. And what better way to truly involve them than to facilitate ongoing discussions with them while planning your evaluation? As I discussed in my previous post, use these discussions to build a collective vision that guides and motivates as many of those around the table as possible.

Facilitation During Evaluation Negotiation

Once your evaluation plan is ready to go, it’s time to negotiate the plan with your primary stakeholders. Negotiation also takes facilitation skills. Some of us might like the negotiation process more than others.

Either way, you may find it helpful during negotiation to focus on working together toward a revised plan that reduces the level of stress for the key stakeholders and players of your evaluation.

However short or lengthy an evaluation might be, I like to think of my evaluation clients as co-workers. Facilitating your way through a bearable, if not pain-free, negotiation process helps lay the groundwork for successful relationships with evaluation stakeholders and, hopefully, an evaluation that ultimately gets used.

Facilitation During Evaluation Reporting and Dissemination

Using facilitation skills before and during report writing will help engage your stakeholders in the reporting process and increase the likelihood that they will apply and use evaluation results.

The same applies to dissemination, the stage of sharing evaluation results. So how do we facilitate in ways that engage and provoke thought? As you might have answered, one way is to ask questions that draw your audience in.

In his book Focus Groups: A Practical Guide for Applied Research, Richard Krueger recommends that evaluators write reports not with a general audience in mind, but with individuals in mind. Why not apply this to facilitation in the reporting stage?

Let’s tailor our reporting and facilitation according to what we know about the people that comprise our primary stakeholder group. In other words, facilitate with particular persons in mind, not a general audience.

For readers who are novice evaluators, does this article help? For my readers who are more experienced evaluators, how have facilitation skills helped you build a stronger evaluation?

________________________________________________________________

Priya Small has extensive experience in collaborative evaluation planning, instrument design, data collection, grant writing and facilitation. Contact her at priyasusansmall@gmail.com. See her profile at http://www.linkedin.com/in/priyasmall/

Tips on Building Relationships with Evaluation Stakeholders, Part 2


“It is one of the most beautiful compensations of this life that no man can sincerely try to help another without helping himself.”
Ralph Waldo Emerson

This post is part 2 in a series dedicated to the committed stakeholders I’ve worked with over the past year, who continue to inspire me to follow their example of helping others. It picks up where my previous post on tips for building relationships with stakeholders left off, inspired by lessons learned and a recent fundraising letter I received from my alma mater.

Share a vision

Remain positive and communicate a central idea—a vision that will take your program onward and upward. The fundraising letter gradually built up to this vision: “everyone deserves a chance.” Build relationships and talk to participants and staff to help build a mutual vision that resonates and revitalizes.

Encourage participation in the evaluation, regardless of amount or type.

Every bit counts. Key stakeholders can participate in different ways.

Encourage stakeholders to give the gift of time. Seek new ways to involve them. Invite them to share their responses to evaluation-related drafts such as surveys, evaluation plans and especially evaluation reports. Offer evaluation-related training to increase your organization’s capacity. I have found our community partners and volunteers to be incredibly valuable allies in planning evaluations. They have experienced the needs that the programs address. They are experts in their own rights. I am so thankful for their heartfelt advice and the gift of their time.

Time it right.

Isn’t timing everything? The letter arrived in my mailbox during the holiday season, when some may be more inclined to give. Consider signs of readiness to participate in evaluation-related activities. What is the energy level of the group?[1] How enthusiastic are people?

Be willing to listen.

The fundraising letter related a firsthand account of a student’s story. Do your best to hear from program participants directly. Evaluation offers a more objective and systematic way of collecting this information. I appreciate the value of gathering evaluation data from interviews and focus groups, that is, qualitative data: such methods offer a glimpse into participants’ lives and can lead to insights that a survey cannot. This is what I enjoy most about the work I do—getting to know the real people whom we serve.

“Stop, look and listen.”

I’ve been learning recently the need to:

  1. Stop! “Stop myself”—my own preconceived notions, plans and ideas.[1]
  2. Look! Observe body language and the context of people’s comments.[1]
  3. Listen! Stopping and looking helps me to listen to what is really being said and to what is not being said.

Sharing the stories of people we’ve listened to has the potential to impact program decision-making.

These are seven ways that the fundraising letter’s message stayed with me. I hope these tips help us strengthen our relationships with our own evaluation stakeholders.

Many of us desire to give back to our communities in some way. We desire to reach out and touch the future—to play a part in bringing about tangible outcomes that make the world a better place. Let’s nurture that desire to give in the stakeholders of all our evaluations.

[1] Krueger, R. A. (1988). Focus Groups: A Practical Guide for Applied Research.

——————

For more resources, see our Library topic Nonprofit Capacity Building.


Tips on Building Relationships with Evaluation Stakeholders, Part 1


“It is one of the most beautiful compensations of this life that no man can sincerely try to help another without helping himself.”

Ralph Waldo Emerson

This post is dedicated to the committed stakeholders I’ve worked with over the past year, who continue to inspire me to follow their example of helping others.

The season of giving has been upon us again, and some of you have probably been involved in your nonprofit’s own fundraising efforts. Recently I received a fundraising letter from my alma mater that really stayed with me, long after I put it down. The letter reminded me of evaluation and of building relationships with those who have an important stake in our evaluations. Yes, as you have surmised, this post is not about financial giving. It is about building relationships with evaluation stakeholders, a lesson I continue to learn. These relationships are vital to crafting an evaluation that is meaningful and useful to your stakeholders. Here are some tips for building those relationships:

Tip 1: Find and learn from a “champion”.

The letter I received was from a well-known figure who has supported my alma mater in various ways, from speaking at graduation ceremonies to talking with students firsthand to learn about their experiences. Find a like-minded supporter who is a “gatekeeper” of the community—a leader who will champion the evaluation and give the evaluator an entrance into the community. My advice to fellow evaluators, and to those commissioning evaluations: look out for and learn from (or urge your evaluator to learn from) someone whom your stakeholders trust, respect, and talk about.

This might be someone who has had face to face contact with your stakeholders, and who has positively impacted them. Ask this person thoughtful questions to help you learn from their experiences. Some of my best meetings have been when I’ve learnt something new by asking a question. A deeper understanding of the program being evaluated and the people being served establishes a strong base for the evaluation.

Recently, I was inspired by one such person, Dave Purdy, who won the trust of communities through his tireless dedication to the cause. Despite challenges, he drove long distances to get to know and serve people with Parkinson’s. One evaluation stakeholder summed up Purdy’s accomplishments with these simple yet meaningful words: “he’s just a great guy.”

Tip 2: Sit down face to face

There is nothing like a face-to-face talk. In this age of e-mail and social media, I’ve made the mistake of not recognizing the value of face-to-face meetings. But nothing can take the place of sitting down with people for a cup of coffee. Over time, these interactions build familiarity and trust. Especially since evaluation can make some feel threatened (for more information, please see a previous post, How to Address Others’ Fears about Program Evaluation), building trust equals building relationships with stakeholders.

Tip 3: Discuss mutual goals

The author of the fundraising letter shared a striking, specific, personal goal: that every alumnus, from the class of 2012 back to the class of 1920, make a contribution. Use face-to-face time to engage others in:

  • Articulating mutually meaningful goals
  • Thinking through goals
  • Updating stakeholders on evaluation activities and considering progress
  • Revising those goals, as needed

It is especially vital to communicate evaluation goals with those commissioning your evaluation. It is so easy for evaluations to get derailed by competing needs and tangential directions. Revisit goals during every meeting. Strong evaluations are focused, and carefully, mutually thought-out goals and objectives will help focus your evaluation while building relationships with your key stakeholders.

(To be continued…)


How to Set Meaningful Professional Development Goals in Evaluation: Part 2.


A warm thanks to all who’ve reached out since the start of this blog. I began this series of posts thinking of a couple of my readers who consider themselves relatively new to evaluation. But regardless of whether we’re newer, seasoned or somewhere in between, isn’t there always something new to learn? This is why I enjoy evaluation. On that note, let’s continue considering professional development goals. Again, I am writing more for my own learning and development. I do not consider myself an expert. Before we dive in, some of us might be cringing at the thought of adding more to our to-do list. Here are some productivity tips that I’ve found helpful:

How to Increase Productivity and Get through that To-Do list

  • Dreading a task?
    • Get it done early in the morning or whenever your energy levels peak.
    • Find a time (and a place) where you are least likely to be interrupted.
    • Disconnect: place all gadgets in airplane mode.
    • Set a timer to 10 minutes, and commit to just 10 minutes of that task.
  • Keep track of time:
    • Use a spreadsheet to keep account of your time, and
    • Set a timer to avoid getting carried away by your work and losing track of time.
  • Exercise—it can boost energy levels
  • Tackle larger goals like professional development with like-minded others…

Join a Community of Evaluators

The American Evaluation Association (AEA) is a great professional organization that provides many opportunities to get involved, learn from others and network. Many of the links below first came my way via the AEA. Here are a few resources from the AEA:

  • EVAL TALK: AEA’s email listserv. Like all such resources, those who participate learn the most!
  • aea365: The American Evaluation Association’s Tip-a-Day blog, by evaluators and for evaluators. Don’t discount yourself from writing a post; the blog features posts from newer evaluators too, some of whom shared that they wrote posts to continue learning and reflecting about evaluation. Please contact me if you’d like to write for aea365.

Commit to Continuing Your Education

Nearly two decades after he gave me this advice, my father’s words ring true now more than ever: “Money comes and goes. But you can never lose your education.” Here are some free or low-cost options for skill building and continuing education:

  • A website of evaluation-related resources, compiled by Gene Shackman, Applied Sociologist. http://gsociology.icaap.org/methods/
  • The American Evaluation Association. Topics covered by upcoming AEA e-study webinars include:
    1. Correlation/regression analyses and
    2. Evaluation reporting using data dashboards.

  • Consider attending the American Evaluation Association’s 2013 conference, October 14–19 in Washington, DC.
  • Coursera offers access to online university courses. I’m particularly interested in the data analysis courses using the free, open-source R software.

———

For more information about personal development, see the Free Management Library topic Personal Development.

_____________________________________________________________________________

Priya Small has extensive experience in collaborative evaluation planning, instrument design, data collection, grant writing and facilitation. Contact her at priyasusansmall@gmail.com. Visit her website at http://www.priyasmall.wordpress.com. See her profile at http://www.linkedin.com/in/priyasmall/

How to Set Meaningful Professional Development Goals in Evaluation: Part 1.


I’m back after a hectic work period (my apologies for the sparse posts!). I return with a renewed commitment to this blog, aspiring to practice a lesson I’ve recently learnt: do less; plan and reflect more! Today I’d like to tackle the importance of reflection when setting professional development goals. Reflection, at least in my experience, has led to goals, and goals have led to more reflection.

Really? But how easily can goal-setting become more of an obligation and less about reflection? We have all had our share of mandatory project goals and objectives. Am I pushing it, asking us to stop and draft more goals—this time personal ones, for professional development? Let’s revisit the value of the reflective process.

How to Get Where You Are Going: Look Back

Yes, goals help get us where we are going. But how do we set meaningful professional development goals? Recently I found—okay, I’ll admit it—an unused, old graduation gift: a journal. Scrawled inside the front cover was a note, something to this effect: “Write down your experiences. Looking back will help take you where you should go.”

Reflection Helps to Craft Customized Goals

  1. Consider major events in the past: accomplishments/successes and set-backs/conflicts.
  2. Then match these up with your professional strengths and weaknesses: growth opportunities!
  3. Keep building on past experiences as you set professional development goals until, voilà, there it is: a customized goal.

It is so easy to compare ourselves with others and try to make their goals ours. But as we, perhaps unconsciously, set out to outdo the goals of the Joneses of the non-profit world, might we be setting ourselves up for failure? Your professional development goals have to be yours—custom-designed for you and your work context.

Try not to be discouraged by setbacks or limitations. Last month, consolation and insight poured in from discussing a setback with a trusted friend and former co-worker—alright, maybe it also helped that she came bearing gifts: home-made raspberry preserves. I learnt afresh that mistakes, limitations and setbacks help me learn and grow if I can let them go for a while and then turn them around to form tailored professional development goals.

First Stop for Setting Professional Development Goals: CES Competencies for Evaluation Practice

There has been an intriguing debate on EVAL TALK (the American Evaluation Association’s (AEA) listserv) over credentialing for evaluators. But whether or not you decide to get credentialed, check out the Canadian Evaluation Society’s (CES) credentialing requirements. Jean King, Professor in the Department of Educational Policy and Administration at the University of Minnesota and Coordinator of the Evaluation Studies Program, stressed the importance of this resource during a past AEA Thought Leader discussion series. Pay special attention to the CES’s Competencies for Canadian Evaluation Practice; you might also find inspiration at this CES webpage to craft a professional development goal or two.

So here’s a goal for this week:

I’ve been revisiting the CES Competencies and considering how I can hone these skills in my current work. Let’s look at the first competency domain: “Reflective Practice.” A worthy goal! Yet so hard to carry out in the frenzied and furious pace of the non-profit world (I exaggerate a bit). I’ve been guilty of do, do, doing and putting off reflection for later. But wouldn’t you agree that reflective practice is even more important in such an environment? One way I can weave “reflective practice” into my own evaluation work is by keeping a journal.

For example, this week my goal is to:

  1. Read the chapter on engaging stakeholders in Wholey and colleagues’ Handbook of Practical Program Evaluation, and
  2. Apply it to my work by jotting down a lesson learned, coupled with a future objective for my evaluation practice.

Simple, do-able, and measurable. The CES competency check-list also includes:

  1. networking and joining a professional evaluation organization and
  2. taking professional development courses.

Let’s tackle those topics in my next post.

So, any thoughts on professional development and goals? I’d love to hear from you!


How to Simplify Survey Design


Keeping It Simple

Many of us have probably had a past English teacher or a colleague or two warn us about the plague of wordiness. “Keep it simple!” they still admonish. Some of us, though, have trouble practicing that advice. “But I really want to make the best impression. And what about all those great ideas? We’ll fit it all in somehow!”

The other day I read an article exhorting its readers to keep their writing simple. It said something to the effect of, “We already have pages and pages to read, so please spare us! Keep it brief.” The message struck home, and since then I’ve been having a series of “Aha” moments.

Avoid the Pitfall of a Long Questionnaire: Understanding Causes

The “Do-it-all” Questionnaire

Let us apply that principle to evaluation. Say you’re thinking of conducting a survey. The do-gooder in you wants this to be the best questionnaire ever. The temptation is great to add this question and that question, to fill it chock-full of ideas borrowed from colleagues.

Trying Not to Step on Anyone’s Toes

And then there is the importance of involving all the stakeholders. Stakeholder A really wants to see these questions added. Stakeholders B and C want you to add some more. They’re all valuable questions, and this input is vitally necessary. Yet the manager or evaluator might be tempted to take the easy way out and just include everyone’s questions. And that would give you one extra-long survey: one that is trying, super-politely and super-humanly, to juggle too many balls at once.

One Pitfall Leads to Another

We can all probably relate to that with a sigh. We do our best to juggle everything and be super-people. But eventually you and I reach our limit, and sooner or later one of those balls has to drop. Those long surveys might wear out your program participants. And then you end up with overwhelming amounts of data to be entered, cleaned and analyzed at record-breaking speed. That translates into a long evaluation report overflowing with words and statistics that no one has time to decipher. I travelled down this road early in my evaluation career. So how do we avoid this pitfall?

A few initial thoughts:

Communicate and Seek Consensus

Try speaking honestly with your primary stakeholders. Express your appreciation for their commitment and their ideas while presenting the challenge of balancing multiple needs. Achieving this balance and managing the conflict that might ensue are valuable skills for an evaluator to hone. Reassure everyone of the valuable role each person plays. Seek the wisdom and grace to navigate this difficult role, learning from seasoned colleagues with years of experience. Then do your best to guide your group of stakeholders toward consensus or, at the very least, compromise about what the questionnaire should cover.

Vision

Involve your small group of stakeholders in putting a shared goal for the evaluation into words. Start by listing your common goals for the program. Then ask the group to rank them and go from there.

Practice Brevity

Practice writing simple survey questions or, better yet, adapt from others’ work. I’ve begun using e-mail as an opportunity to practice trimming away unnecessary words and content, and I’m practicing this with survey design too. It is satisfying and liberating to fit content on one page rather than two or three or five!

Now we’d really like to hear from you. Do you think a shorter survey is realistic? Are there any challenging trade-offs?

——————

For more resources, see our Library topic Nonprofit Capacity Building.


How to Evaluate on a Budget: DIY or Outsource?


This question has nagged me for a while. It came to the forefront as I recently considered the most efficient way to carry out my responsibilities, spurred on by the 20-80 rule: 80% of our outcomes come from 20% of our efforts. So how do you evaluate on a budget? Should you do it yourself (DIY) or outsource? Let’s begin by considering some of the advantages of each option.

Advantages of the Do-It-Yourself Approach

Firsthand Knowledge of Issues

We are sometimes our own worst critics. But we may have more to offer than we’d imagine. Recently I had the privilege of working with stakeholders who have personally experienced the challenges their program addresses. I emerged from that meeting with a deep appreciation for the collective wisdom of program participants and those who work closest with the people who receive a non-profit’s services. It is also vital to fully coordinate evaluation efforts with existing program operations, especially to plan program evaluation during program planning stages.

Save Money

It goes without saying that you can streamline evaluation-related costs and increase efficiency by using in-house staff who are most familiar with the way a program operates.

Save Time

In-house staff already have relationships with program participants and thus need not go out of their way to build trust. Building trust takes time! Good evaluations are founded on relationships of trust with program participants. If they do not like or trust the person who is administering the evaluation, can we truly expect genuine and forthcoming answers? Also, staff can gather data (via surveys, interviews, etc.) in the course of carrying out their responsibilities. Since they know the ins and outs of a program, program staff’s expertise is vital to the success of any program evaluation.

While in-house staff bring indispensable strengths to an evaluation, please also consider the advantages of using an external evaluator.

Advantages of Using an External Evaluator

Expertise

We’ve all probably had these thoughts at some point or another:

“Evaluation is not rocket science, anyone can do it!”

“What is so hard about designing a survey, distributing it and analyzing the data? Even a high-school student can do it.”

This topic was recently broached within an online American Evaluation Association Thought Leader discussion group. The danger is that evaluation can seem a lot easier than it actually is. But if you look at program evaluator job announcements, you’ll notice specific qualifications. The following is a sample of what you may see for a junior-level position:

  • at least a Master’s in the social sciences or public health, Ph.D. preferred
  • at least 1-2 years of experience conducting program evaluations
  • strong communication and interpersonal skills
  • experience with quantitative data analysis and with data analysis software, such as SPSS

The general consensus seems to include the above qualifications as well as engagement in professional continuing education in the field of program evaluation. (Before you stop reading this post and throw up your hands in despair because you do not have the budget to hire another person, please keep reading. We’ll get to possible solutions to this dilemma soon!)

There have been reams of papers and chapters written on program evaluation, specific models, quantitative and qualitative methods and survey design. Enough to make even the most experienced evaluator feel that there is always so much more to learn! Other evaluation colleagues often call attention to mistakes that well-intentioned novices have made, for example, in sharing evaluation results and data.

Sure, it is important to maintain a level of healthy skepticism: of course, any evaluator will want to make a case for why you should hire a professional to do the job right. But I’d encourage you to also consider how much risk you are willing to take if an evaluation is not done properly. What do you stand to lose? Think about all the ways that misrepresented data could hurt your program. Think about all the opportunities to grow and improve your program that may be lost through a botched evaluation.

Objectivity

Again from firsthand experience, I’ve heard stakeholders closest to the program express appreciation for the objectivity and fresh perspective that an external evaluator (also known as an independent evaluator) brings to the table. Program participants may feel awkward being forthcoming with those actually delivering the services. Due to their existing relationships with program staff, participants may feel subconsciously pressured to give answers that they think staff want to hear.

Ability to Coordinate

Having a skilled evaluator coordinate an evaluation effort will spare you headaches and worry. This may be a better option than merely dividing up responsibilities among maxed-out staff or having to coordinate an extra project on your own.

Possible Solutions

Alright, now we come to the simple solutions. Coordination implies that you do not have to hire the evaluator to do everything, which translates into cost savings. Carter McNamara, Ph.D., astutely applied the 20-80 rule to program evaluation in his article Basic Guide to Program Evaluation. Since 20% of the effort can produce 80% of the outcomes, a good option may be to contract with an external evaluator for just that 20% of effort. I agree with Dr. McNamara that the best responsibilities to outsource are the design of the evaluation and surveys; I’d also add data analysis and reporting.

Data collection can be the most time-consuming and expensive part. Using program staff for these functions may be a good compromise, as long as measures are taken to encourage objectivity, for example, having a volunteer place completed surveys in sealed envelopes.

If you still find the notion of contracting with an independent evaluator daunting, consider the following solutions that I have observed other non-profits use. Once you have clearly defined the scope of a very feasible evaluation project, carefully recruit and choose a well-qualified:

  • graduate student whose rates and schedule may be more flexible
  • pro bono evaluator who may be trying to gain expertise in a new area or may just be interested in giving back
  • stay-at-home parent who may be willing to trade program evaluation expertise for more flexible terms.

You may consider beginning your recruiting efforts at the American Evaluation Association’s Career Center or on AEA’s LinkedIn Group. Here is some further reading on using an external evaluator from the Centers for Disease Control and Prevention.

Evaluating on a budget doesn’t have to be an unattainable dream, and it doesn’t have to be a do-it-yourself disaster. With creative, strategic solutions and careful planning, it can be a practical reality.

