Qualitative program evaluation methods

J. Mitch Vaterlaus, M.S.
Graduate Extension Assistant
Utah State University

Brian J. Higginbotham, Ph.D.
Associate Professor and Family Life Extension Specialist
Utah State University

Abstract

Evaluation is an important component of refining programs and documenting impacts. Evaluation aids the profession as a whole and assists Extension faculty in meeting promotion requirements. Qualitative methods are commonly used in evaluations in order to explore specific facets of programs and to give voice to participants’ experiences. These methods provide in-depth information that can assist Extension faculty in enhancing the quality of their programs. This review highlights differences between quantitative and qualitative evaluation methods. The elements, processes, and limitations of qualitative evaluation methodology are detailed. In addition, specific guidelines are provided for increasing the trustworthiness of qualitative evaluations.

Keywords

Evaluation, methods, qualitative

Introduction

Extension professionals may not feel they have the time, resources, or expertise to conduct advanced statistical analyses (Higginbotham, Henderson, and Adler-Baeder 2007). There may also be concern that quantitative methodologies will not provide the practical, in-depth information often needed for program improvement. Extension faculty with these concerns should consider the possibilities of qualitative research.

“Qualitative research” is a title that represents a broad family of methods (Bamberger, Rugh, and Mabry 2006; Bogdan and Biklen 1998). It has been defined as the process of “making sense” of data gathered from interviews, on-site observations, documents, etc., then “responsibly presenting what the data reveal” (Caudle 2004, 417). The major difference between qualitative and quantitative approaches lies in their epistemological foundations (Bamberger et al. 2006). In other words, the approaches differ in what constitutes knowledge, how knowledge is acquired, and how it is used. Ragin (1994, 93) explains, “Most quantitative data techniques are data condensers. They condense data in order to see the big picture. Qualitative methods, by contrast, are best understood as data enhancers. When data are enhanced, it is possible to see key aspects of cases more clearly.”

The underlying assumptions of qualitative methods are closely related to Cooperative Extension’s mission of understanding and meeting people’s needs at the local level (U.S. Department of Agriculture 2010). For Extension administrators and faculty, qualitative program evaluations can enhance understanding of their participants’ experiences (Bamberger et al. 2006). This is done through techniques that give voice to participants and articulate their perspectives (Bogdan and Biklen 1998). Qualitative analyses are often used in large-scale, rigorous, and formal program evaluations. However, they can also be used in the pilot studies, small-budget projects, and ad hoc, quick-turnaround endeavors that many Extension faculty undertake (Bamberger et al. 2006; Caudle 2004). This review highlights the following issues for Extension faculty who may be interested in using qualitative methods in program evaluation: the research question, qualitative data collection, qualitative data analysis, quality in qualitative evaluation, and challenges and considerations in qualitative evaluation.

The research question

Research questions are different in quantitative and qualitative methodologies (Corbin and Strauss 2008). Qualitative research questions are used to seek understanding of phenomena about which knowledge is not fully developed, whereas quantitative methods are used to test hypotheses. In qualitative research, the research question leads the evaluator into the data, where the issue can be explored. Qualitative research questions are broader than quantitative research questions but should be specific enough to tell the reader what is being investigated. For example, “What do male participants say about their marital relationships after completing a marriage enrichment course?” The question identifies the topic (marital relationships), the period in time (after program completion), and the perspective of interest (men who participated in a marriage enrichment course). With qualitative research, the perspective of interest can be individuals, families, groups, or organizations (Corbin and Strauss 2008).

Qualitative data collection

Once a research question has been formulated, data can be collected from appropriate sources. A particular strength of qualitative research is the variety of data sources that can be used, including face-to-face interviews, phone interviews, focus groups, videos, observation, diaries, or historical documents (Corbin and Strauss 2008). Interviews are commonly used in qualitative program evaluations (Bamberger et al. 2006).

Qualitative interviewing is typically semi-structured. The interviewer has a focus but is also afforded flexibility (Bamberger et al. 2006). In semi-structured interviews the interviewer generally has a list of questions and discussion prompts, but the order in which they are asked can vary from interview to interview. The interviewer may also ask additional questions and probe beyond the questions on the list (Berg 1998). Some things to consider in collecting data through interviews include obtaining informed consent, building rapport, protecting participants’ confidentiality, and recording and transcribing responses accurately.
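Because question order can vary and probes are added as needed, it can help to treat the interview guide as a checklist of core questions rather than a fixed script. The sketch below is a purely hypothetical illustration in Python; the topics, questions, and probes are invented for this example and are not drawn from any particular program. It simply shows one way to keep core questions and optional probes together so that every topic is eventually covered, whatever order the conversation takes.

```python
# Hypothetical semi-structured interview guide: each core question must be
# covered at some point, but order is flexible and probes are optional.
interview_guide = {
    "relationship_changes": {
        "core": "How, if at all, has your marital relationship changed since the course?",
        "probes": ["Can you give a recent example?", "What do you attribute that change to?"],
    },
    "communication": {
        "core": "How do you and your spouse handle disagreements now?",
        "probes": ["How is that different from before the course?"],
    },
}

def remaining_topics(covered):
    """Return core questions not yet addressed, so no topic is missed
    even when the conversation moves in an unplanned order."""
    return {topic: item["core"]
            for topic, item in interview_guide.items()
            if topic not in covered}

# Example: after the participant has already discussed communication,
# the interviewer checks which core questions remain.
print(remaining_topics(covered={"communication"}))
```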

Qualitative data analysis

Generally, qualitative findings are generated through inductive processes, moving from detailed information to general themes (Bamberger et al. 2006). The most common qualitative analytic technique is thematic analysis. Thematic analysis involves reading and rereading the data, assigning codes to meaningful segments of text, and grouping related codes into broader themes that address the research question.

This type of data analysis requires attention to detail while simultaneously considering the data as a whole. Depending on the number and length of interviews, this process can be very time consuming. There are several variations of thematic approaches (Bogdan and Biklen 1998). There are also other analysis techniques that can be used depending on the type of data collected (see Berg 1998; Corbin and Strauss 2008; Creswell 2007).
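As a purely hypothetical illustration of the mechanics described above, the short Python sketch below tallies coded interview excerpts and groups them under broader themes. The participants, codes, excerpts, and themes are invented for this example; in practice the coding itself is an interpretive task carried out by the analyst, not by software.

```python
from collections import defaultdict

# Hypothetical coded excerpts: (participant ID, code, excerpt) produced while
# reading the transcripts line by line.
coded_excerpts = [
    ("P01", "listening more", "I try to hear her out before I respond."),
    ("P02", "shared decisions", "We sit down together before big purchases now."),
    ("P01", "shared decisions", "We decided on the move as a couple."),
    ("P03", "listening more", "I stopped interrupting so much."),
]

# The analyst groups related codes under broader themes.
themes = {
    "improved communication": ["listening more"],
    "joint decision making": ["shared decisions"],
}

# Collect excerpts under each theme to see how well the data support it.
support = defaultdict(list)
for participant, code, excerpt in coded_excerpts:
    for theme, codes in themes.items():
        if code in codes:
            support[theme].append((participant, excerpt))

for theme, excerpts in support.items():
    participants = {p for p, _ in excerpts}
    print(f"{theme}: {len(excerpts)} excerpts from {len(participants)} participants")
```

Organizing the data this way makes it easier to keep sight of individual excerpts while also judging how widely each theme is supported across participants.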

Quality in qualitative evaluation

The quality of qualitative research rests on how the data are gathered and analyzed (Tracy 2010). “Trustworthiness” is a common term in qualitative research and is closely related to the term “validity” in quantitative research (Marshall and Rossman 2011). This term refers to the credibility, transferability, dependability, and objectivity of the research (Marshall and Rossman 2011; Schwandt 2007). Increasing the trustworthiness of the study increases the likelihood that evaluation results will warrant publication. A few suggestions for increasing trustworthiness include triangulating multiple data sources, asking participants to confirm interpretations (member checking), keeping an audit trail of analytic decisions, and describing the program context in enough detail for readers to judge transferability.

Challenges and considerations in qualitative evaluation

Qualitative evaluation does not come without challenges. The beginning qualitative researcher may feel overwhelmed by the time and expertise required to complete qualitative evaluations (Corbin and Strauss 2008). Many of the procedures and terminologies used within qualitative inquiry are very different from those used in quantitative research (Malterud 2001).

As with any evaluation, Extension faculty must carefully plan how to complete the evaluation in light of their other responsibilities and time constraints. Organization and documentation are particularly important when working with large data sets (e.g., transcripts, recordings, field notes) (Bogdan and Biklen 1998; Caudle 2004). Research procedures should be documented, and accepted best practices should be followed to ensure quality and trustworthiness. Planning the entire process from the outset can also increase the coherence of the design and procedures (Maxwell 2009). The plan should include realistic time frames for conducting interviews, transcribing, coding, and writing.
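As one small, hypothetical example of such documentation, the Python sketch below assumes transcripts are stored as plain-text files in a single folder (the folder name and layout are invented for this illustration) and writes a simple inventory of each file’s name, length, and last-modified date, which can serve as a starting point for an audit trail.

```python
import csv
from datetime import datetime
from pathlib import Path

# Hypothetical project layout: one plain-text transcript per interview.
transcript_dir = Path("evaluation_data/transcripts")

# Write a simple inventory of the transcripts for the project's audit trail.
with open("transcript_inventory.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "words", "last_modified"])
    for path in sorted(transcript_dir.glob("*.txt")):
        text = path.read_text(encoding="utf-8")
        modified = datetime.fromtimestamp(path.stat().st_mtime)
        writer.writerow([path.name, len(text.split()), modified.isoformat()])
```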

Participants may feel uncomfortable with the less-structured nature of qualitative interviews. Consideration should be given in the procedures to building rapport and ensuring participants’ confidentiality. Extension faculty should also identify areas of qualitative inquiry about which they need to read more, or seek mentorship from a more experienced qualitative researcher.

Once data are collected and analyzed, researchers should use caution when discussing implications and generalizing findings. The foundational purposes of qualitative research are different from those of quantitative research. Malterud (2001, 486) explained, “The findings from a qualitative study are not thought of as facts that are applicable to the population at large, but rather as descriptions, notions, or theories applicable within a specified setting.” The sampling technique and the rigor of the data collection influence the scope of the generalizability or transferability of the findings. The results from qualitative studies provide in-depth and rich information that can lead to new hypotheses, theory, and directions in programming. Before presenting or submitting an article based on qualitative data, Extension faculty should consider the scope and purpose of the research to make sure the evaluation will make a meaningful impact on the field (Tracy 2010).

Publishing qualitative results is one way to contribute to the progression of Extension. The trustworthiness of the data is critical because academic journals attempt to publish rigorous findings. Some academic journals do not publish qualitative research, while others publish qualitative research exclusively (e.g., http://qrj.sagepub.com/). The Forum for Family and Consumer Issues and the Journal of Extension regularly publish articles that use qualitative methods. Lists of journals that are receptive to qualitative methods can be found online (see http://www.slu.edu/organizations/qrc/QRjournals.html). Reviewing qualitative articles from these journals can lead to a greater understanding of qualitative procedures and terminologies.

Conclusion

Extension faculty are generally required to publish articles in order to meet promotion and tenure requirements (Schwab 2003). They are also expected to provide quality research-based programming (U.S. Department of Agriculture 2010). It is possible for Extension faculty to accomplish both of these purposes through the evaluation of their programs. Qualitative evaluation may serve as a less intimidating way to contribute to professional literature and meet promotion requirements. It does not require an advanced knowledge of statistics and can be done at a scale and scope to match each agent’s budget, interests, and needs. Furthermore, steps can be taken to ensure the quality of the results and to enhance the trustworthiness of the process. When done well, qualitative research can provide valuable insights that can be used to improve programs locally while also influencing related programming efforts more broadly (see Higginbotham, Henderson, and Adler-Baeder 2007).

References

Bamberger, M., J. Rugh, and L. Mabry. 2006. Real World Evaluation: Working under budget, time, data, and political constraints. Thousand Oaks, CA: Sage.

Berg, B. 1998. Qualitative research methods for the social sciences. Boston: Allyn and Bacon.

Bogdan, R., and S. Biklen. 2007. Qualitative research for education: An introduction to theories and methods. Boston: Pearson.

Caudle, S.L. 2004. “Qualitative data analysis,” in J.S. Wholey, H.P. Hatry, and K.E. Newcomer (eds.) Handbook of practical program evaluation. San Francisco: Jossey-Bass. 417-438.

Corbin, J., and A. Strauss. 2008. Basics of qualitative research. Thousand Oaks, CA: Sage.

Creswell, J. W. 2007. Qualitative inquiry and research design: Choosing among five approaches. Thousand Oaks, CA: Sage.

Higginbotham, B., K. Henderson, and F. Adler-Baeder. 2007. “Using research in marriage and relationship education programming.” Forum for Family and Consumer Issues 12(1). http://ncsu.edu/ffci/publications/2007/v12-n1-2007-spring/higginbotham/fa-4-higginbotham.php

Kaplan, R., and D. Saccuzzo. 2009. Psychological testing: Principles, applications, and issues. Belmont, CA: Wadsworth.

Malterud, K. 2001. “Qualitative research: Standards, challenges, and guidelines.” The Lancet 358(9280): 483-488. doi: 10.1016/S0140-6736(01)05627-6

Marshall, C., and G. Rossman. 2011. Designing qualitative research. Thousand Oaks, CA: Sage.

Maxwell, J.A. 2009. “Designing a qualitative study.” In L. Bickman and D.J. Rog (eds.) Applied Social Research Methods. Thousand Oaks, CA: Sage. 214-253.

Ragin, C. 1994. Constructing social research. Thousand Oaks, CA: Pine Forge Press.

Schwab, C. 2003. “Editor’s Corner: The scholarship of extension and engagement: What does it mean in the promotion and tenure process?” The Forum for Family and Consumer Issues 8(2). http://ncsu.edu/ffci/publications/2003/v8-n2-2003-may/editors-corner.php

Schwandt, T.A. 2007. “Judging interpretations.” New Directions for Evaluation 114:11-25.

Tracy, S.J. 2010. “Qualitative quality: Eight ‘big-tent’ criteria for excellent qualitative research.” Qualitative Inquiry 16:837-851.

United States Department of Agriculture. 2010. Extension. http://www.csrees.usda.gov/qlinks/extension.html
