Tips on How to Conduct Interviews for Program Evaluation (part 1)

Interviews are a way to collect useful data for program evaluation. They provide qualitative data, which is text-based (for example, quotes, stories, and descriptions), as opposed to the quantitative, numbers-based data that written surveys (also known as questionnaires) provide. I recently conducted interviews for a program evaluation and gained a fresh appreciation for the following tips:

Tip #1: Decide beforehand whether interviews are the most effective and efficient way of collecting the data you need.

Weigh the pros and cons of interviews:

Pros of interviews:

Interviews may:

  • Provide opportunities to probe for information that you may not otherwise think to ask for in a written questionnaire.
  • Give you information and stories that people may not otherwise share in a written survey.
  • Help you build rapport with interviewees and identify stakeholders who really care about the program and may want to get further involved in the evaluation. Involving stakeholders is key to a successful evaluation. (See my previous post on the CDC program evaluation model.)
  • Help explain trends in quantitative data by answering questions such as “why” and “how.” They can give you a good idea of how programs work and can help you generate a program description, which is critical for every evaluation. Interviews can provide rich data that paint a vivid portrait of your program.
  • Facilitate the expression of opinions and feelings in the interviewees’ unique “voices.” They are a rich source of quotes for future grant proposals.
  • Be less expensive when conducted by phone rather than in person.

Cons of interviews:

  • Interviews are more resource-intensive: it is time-consuming to conduct and participate in interviews, to transcribe them, and to analyze the data.
  • Interviewers must be trained (again, more resource-intensive: think of the time spent planning, designing, and delivering training materials and presentations).
  • Interviewers need to be articulate and able to think quickly on their feet, deciding on the next question to ask while listening and taking notes at the same time.1
  • Samples are usually smaller, so the representativeness of your data is more limited. For example, your data represent only the 14 people interviewed rather than the 140 surveyed, and you may not be able to infer results to a larger population as you could with a questionnaire.

In the end, if you decide that you really need the type of data that interviews provide, interviews can be really worth the extra time and effort!

Tip #2: Carefully design an interview script, follow it even if you are the only interviewer, and train all interviewers. Make sure the script and the training facilitate the following practices among interviewers:

  1. upholding ethical standards of behavior,
  2. building rapport, and
  3. safeguarding the quality of data.

Selected examples follow.

Adhere to ethical procedures such as informed consent

It can be tempting to improvise, thinking that this will make the questions sound less rehearsed. But improvising makes it easy to forget important steps, such as informing participants of the purpose of the interviews and asking whether they are willing to participate (informed consent). Inform participants of the potential risks and benefits of participating in the evaluation. This is especially important when collecting highly confidential health-related data.

Some participants may give you reasons why they cannot participate in the interviews. The interviewer first has to carefully discern whether the interviewee is actually interested in participating; do not assume that everyone has the time or the interest. The interviewer then has to strike a careful balance: addressing any barriers that would prevent an interested interviewee from participating, while maintaining a high standard of professional ethics by respecting the individual’s decision not to participate and avoiding statements that could be perceived as pressuring or coercing the participant.

Do not use leading questions

Do not use leading questions, that is, questions or statements that can unconsciously influence interviewees to give certain answers. Example of a leading question: “What are some of the challenges program participants face in getting to classes?” This presupposes that challenges exist. A neutral alternative: “Do participants face challenges in getting to classes?”

Avoid double-barreled questions

Be extra vigilant to avoid double-barreled questions; they can easily creep in, especially when you ask probing questions spontaneously. Example of a double-barreled question: “Do you either send cards or call your program participants?” Answer: “Yes.” The problem is that the answer does not tell you which of the two options is used.

Consider hiring a professional

Since other considerations also go into upholding ethical conduct, building rapport, and safeguarding data quality, one option for a do-it-yourself evaluation is to collaborate with a professional evaluator to design the interview script and to train your interviewers.

1 Earl Babbie. (2001). The Practice of Social Research, 9th edition. Wadsworth/Thomson Learning.

——————

For more resources, see our Library topic Nonprofit Capacity Building.

___________________________________________________________________________________________________________________

Priya Small has extensive experience in collaborative evaluation planning, instrument design, data collection, grant writing and facilitation. Contact her at priyasusansmall@gmail.com. Visit her website at http://www.priyasmall.wordpress.com. See her profile at http://www.linkedin.com/in/priyasmall/