{"id":42,"date":"2018-10-12T12:01:36","date_gmt":"2018-10-12T12:01:36","guid":{"rendered":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/chapter\/module-1-chapter-3-practice-evaluation-as-evidence\/"},"modified":"2024-11-20T18:25:36","modified_gmt":"2024-11-20T18:25:36","slug":"module-1-chapter-3-practice-evaluation-as-evidence","status":"publish","type":"chapter","link":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/chapter\/module-1-chapter-3-practice-evaluation-as-evidence\/","title":{"rendered":"Module 1 Chapter 3: Practice Evaluation as Evidence"},"content":{"raw":"The right-hand side of the evidence-intervention-evidence figure from Chapter 1 (Figure 1-1) is the focus of this chapter.\r\n\r\n<img class=\"aligncenter size-full wp-image-21\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention.png\" alt=\"\" width=\"650\" height=\"339\" \/>\r\n\r\nIn Chapter 2 we looked at evidence-informed practice decisions. In this chapter, we introduce information about evaluating practice, what other disciplines call data-based or data-driven decision making: using data and evaluation research methods to make social work practice accountable and to inform practice improvement efforts.\r\n\r\nIn this chapter you will learn:\r\n<ul>\r\n \t<li>basic principles related to four evaluation formats (needs assessment, outcome, process, cost-effectiveness)<\/li>\r\n \t<li>distinctions between practice, program, and policy evaluation<\/li>\r\n \t<li>how evaluation and intervention research compare.<\/li>\r\n<\/ul>\r\n<h2><strong>Why Evaluate?<\/strong><\/h2>\r\nIn short, social work professionals engage in evaluation of practice as an accountability issue. 
We are accountable to clients, programs, funders, policy decision-makers, and the profession to ensure that we are delivering the best possible services, that the services we deliver achieve the promised benefits, and that the resources dedicated to our services are well-spent. These obligations were previously covered in our discussion of the standards presented in the Social Work Code of Ethics. Of particular relevance here is Standard 5.02 concerning evaluation and research (p. 27). Social workers are expected to evaluate policies, programs, and practice interventions, as well as facilitate research that contributes to the development of knowledge.\r\n<h2><strong>What is Evaluation?<\/strong><\/h2>\r\nThroughout this book, we examine methods for evaluating intervention efforts. A framework for understanding different approaches to evaluation is helpful, beginning with the nature of the evaluation research questions and exploring how these relate to different forms or approaches to evaluation.\r\n\r\n<strong>Evaluation Questions.\u00a0<\/strong>By now you recognize that research designs and methodologies are driven by the nature of the research questions being asked. This is equally true in the evaluation research arena. 
Here is a sample of the kinds of questions asked in evaluating social work practice at different levels:\r\n<ul>\r\n \t<li>Did client behavior change to a significant degree and in the desired direction?<\/li>\r\n \t<li>Were gains associated with intervention sustained over time?<\/li>\r\n \t<li>Are there unintended negative consequences associated with the intervention?<\/li>\r\n \t<li>To what extent are principles of diversity awareness integrated into practitioner behaviors and practitioner supervision?<\/li>\r\n \t<li>How satisfied are clients with various aspects of the delivered intervention?<\/li>\r\n \t<li>Is the intervention\u2019s cost\/benefit ratio favorable compared to other intervention options?<\/li>\r\n \t<li>Are some people deriving more benefit than others from the intervention?<\/li>\r\n \t<li>Is there a more cost-efficient way to achieve similar gains from the intervention?<\/li>\r\n<\/ul>\r\n<strong>Evaluation Formats<\/strong>. Because evaluation questions differ, social workers employ varied formats for engaging in evaluation. Here is a description of four major forms of evaluation research: needs assessment, outcome evaluation, process evaluation, and cost-effectiveness evaluation.<strong>\u00a0<\/strong>\r\n\r\n<em>Needs assessment. <\/em>The aim of <strong><em>needs assessment\u00a0<\/em><\/strong>is to answer questions related to the scope of a problem or need and where gaps exist in efforts to address the problem or need. For example, school social workers may want to know about the problem of bullying that occurs in a school district. They might engage in a needs assessment to determine the nature and extent of the problem, what is needed to eradicate the problem, and how the problem is being addressed across the district. Where they detect sizeable gaps between need and services provided, social workers can develop targeted responses. 
The needs assessment might also indicate that different responses need to be launched in different circumstances: at the elementary, middle, and high school levels; with parents, teachers, administrators, peers, and mental health professionals in the district; or in different neighborhood schools across the district. Needs assessment is often concerned with the discrepancy between what is needed and what is accessed in services, not only what is offered. As proponents of social justice, social workers are also concerned with identifying and addressing disparities (differential gaps) based on income, race\/ethnicity, gender\/gender identity, sexual orientation, age, national origin, symptom severity, geographical location (e.g., urban, suburban, rural disparities), and other aspects of human diversity. The gap between two sides or groups is sometimes monumental.\r\n\r\n[caption id=\"attachment_41\" align=\"aligncenter\" width=\"397\"]<img class=\"size-full wp-image-36\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms.png\" alt=\"\" width=\"397\" height=\"166\" \/> \u00a9Audrey Begun[\/caption]\r\n\r\n<em>Outcome evaluation. <\/em>Evaluating practice outcomes happens at multiple levels: individual cases, programs, and policy. Social work professionals work with clients or client systems to achieve specific change goals and objectives. For example, this might be reducing a person\u2019s alcohol consumption or tobacco use, a couple having fewer arguments, improving student attendance throughout a school, reducing violence in a community, or breaking a gender- or race-based \u201cglass ceiling\u201d in an institution. Regardless of the level of intervention, social work professionals evaluate the impact of their practices and intervention efforts. 
This type of research activity is called <strong><em>outcome evaluation.\u00a0<\/em><\/strong>When outcome evaluation is directed toward understanding the impact of practices on specific clients or client systems, it is called\u00a0<strong><em>practice evaluation<\/em><\/strong>.\r\n\r\nEvaluating the outcomes of interventions also happens at the aggregate level of programs. Social workers engaged in <strong><em>program evaluation\u00a0<\/em><\/strong>look at the impact of an intervention program on the group of clients or client systems it serves. Rather than providing feedback about an individual client or client system, the feedback concerns multiple clients engaged in the intervention program. For example, social workers might wish to evaluate the extent to which child health goals (outcomes) were achieved with an intervention program for empowering parents to eliminate their young children\u2019s exposure to third-hand smoke. The background for this work is described in an article explaining that third-hand smoke is the residue remaining on skin, clothing, hair, upholstery, carpeting, and other surfaces; it differs from first- or second-hand smoke exposure because the individuals are not exposed by smoking themselves or breathing the smoke someone else produces. Young children come into close contact with contaminated surfaces when being held by caregivers, riding in vehicles, or crawling and toddling around the home where smoking has occurred, leaving residue behind (Begun, Barnhart, Gregoire, &amp; Shepperd, 2014). 
Outcome-oriented program evaluation would be directed toward assessing the impact of an intervention delivered to a group of parents with young children at risk of exposure to third-hand smoke at home, in transportation, from relatives, or in child care settings.\r\n\r\n<img class=\"aligncenter size-full wp-image-37\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS.png\" alt=\"\" width=\"529\" height=\"172\" \/>\r\n\r\n<strong><em>Policy evaluation\u00a0<\/em><\/strong>has a lot in common with program evaluation, because policy is a form of intervention. Policy evaluation data are based on intervention effects experienced by many individuals, neighborhoods, communities, or programs\/institutions taken together, not tracking what happens with one client system or a single program at a time. For example, communities may gather a great deal of evaluation data about how drug overdose deaths are affected by policies that allow first responders, family members, friends, and bystanders to deliver opioid overdose reversal medication (naloxone) when first encountering someone suspected of experiencing an opioid overdose. \u201cAs an antidote to opioid overdoses, naloxone has proven to be a valuable tool in combating overdose deaths and associated morbidity\u201d (Kerensky &amp; Walley, 2017, p. 6). Policy evaluation can answer the question of how much impact such a policy change can make. 
Policy evaluation also answers questions such as: who should be provided with naloxone rescue kits; how naloxone rescue kit prescribing education might alter opioid prescribing behavior; whether different naloxone formulations, doses, and delivery methods provide similar results and how their costs compare; how what happens after an overdose rescue might keep people safe and link them to services to prevent future overdose events; and how local, state, and federal laws affect this policy\u2019s implementation (see Kerensky &amp; Walley, 2017). These factors help determine if the impact of a policy is simply a drop in the bucket or a flood of change.\r\n\r\n<img class=\"aligncenter size-full wp-image-38\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-drop-of-water.png\" alt=\"\" width=\"268\" height=\"172\" \/>\r\n\r\n<em>Process evaluation.\u00a0<\/em><strong><em>Process evaluation\u00a0<\/em><\/strong>is less concerned with questions about outcomes than with questions about how an intervention or program is implemented. Why evaluating process matters is clear if you think about the fidelity examples previously discussed (e.g., the Duluth model for community response to domestic violence). Process evaluation matters in determining what practitioners really do when intervening and what clients or client systems experience during an intervention. It also matters in terms of understanding the \u201cmeans to the end,\u201d beyond simply observing the end results. 
Process evaluation also examines the way an intervention or program is supported by agency administrators, agency activities, and distribution of resources\u2014the context of the intervention\u2014and possible efficiencies or inefficiencies in how an intervention is delivered.\r\n<blockquote><em>\u201cProcess evaluations involve monitoring and measuring variables such as communication flow, decision-making protocols, staff workload, client record keeping, program supports, staff training, and worker-client activities. Indeed, the entire sequence of activities that a program undertakes to achieve benefits for program clients or consumers is open to the scrutiny of process evaluations\u201d (Grinnell &amp; Unrau, 2014, p. 662). <\/em><\/blockquote>\r\nFor example, although child welfare caseworkers recognize how important early identification of young children\u2019s mental health problems and needs is for child development, they encounter difficulties that present significant barriers to doing so effectively (Hoffman et al., 2016). Through process evaluation, the investigators identified barriers that included differences in how workers and parents perceived the children\u2019s behavioral problems, a lack of available age-appropriate mental health services, inconsistencies between their caseworker roles and their training\/preparation to assess and address these problems, and a lack of standardized tools and procedures.\r\n\r\n<img class=\"aligncenter size-full wp-image-39\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram.png\" alt=\"\" width=\"445\" height=\"279\" \/>\r\n\r\n<em>Cost-effectiveness evaluation. <\/em>Cost-related evaluations address the relationship between resources applied through intervention and the benefits derived from that intervention. 
You make these kinds of decisions on a regular basis: is the pleasure derived from a certain food or beverage \u201cworth\u201d the cost in dollars or calories, or maybe the degree of effort involved? While costs are often related to dollars spent, relevant costs might also include a host of other resources\u2014staff time and effort, space, training and credential requirements, other activities being curtailed, and so forth. Benefits might be measured in terms of dollars saved, but are also measured in terms of achieving goals and objectives of the intervention. In a <strong><em>cost-effectiveness evaluation\u00a0<\/em><\/strong>study of Mental Health Courts conducted in Pennsylvania, diversion of individuals with serious mental illness and non-violent offenses into community-based treatment posed no increased risk to the public and reduced jail time (two significant outcomes). Overall, the \u201cdecrease in jail expenditures mostly offset the cost of the treatment services\u201d (Psychiatric Times, 2007, p. 1)\u2014another significant outcome. The intervention\u2019s cost-effectiveness was greatest when offenses were at the level of a felony and for individuals with severe psychiatric disorders. While taxpayers realized cost savings, the picture was complicated by the fact that the budget in which the gains accrued (criminal justice) is separate from the budget in which the costs were incurred (mental health system).\r\n\r\n<img class=\"aligncenter size-full wp-image-40\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality.png\" alt=\"\" width=\"402\" height=\"277\" \/>\r\n<h2><strong>How Evaluation and Intervention Research Compare<\/strong><\/h2>\r\nThe goals, objectives, and methods of <strong><em>evaluation research\u00a0<\/em><\/strong>and <strong><em>intervention research\u00a0<\/em><\/strong>often appear to be very similar. 
In both cases, systematic research procedures are applied to answer questions about an intervention. However, there are important differences between evaluation and intervention research to consider; these differences matter because they have implications for how investigators and evaluators approach the pursuit of evidence.\r\n\r\nDifferences begin with the nature of the research questions being asked. Evaluation researchers pursue specific knowledge, whereas intervention researchers pursue generalizable knowledge. In evaluation, the goal is to inform leader or administrator decisions about a program, or to inform an individual practitioner\u2019s intervention decisions about work with specific clients. The aim of practice or program evaluation is to determine the worth of an intervention to the agency, its clients, and its stakeholders. Intervention researchers, on the other hand, have as their goal the production of knowledge or the advancing of theory for programs and practitioners more generally\u2014not a specific program or practitioner. This difference translates into differences in how the research process is approached in evaluation compared to intervention science. Figure 3-1 depicts the differences in approach, methodology, analysis, and reporting between evaluation and intervention research (LaVelle, 2010).\r\n\r\nFigure 3-1. 
Differences between intervention and evaluation research.\r\n\r\n<img class=\"aligncenter size-full wp-image-41\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research.png\" alt=\"\" width=\"708\" height=\"527\" \/>\r\n<div style=\"float: left;min-height: 120px;width: 99%;margin-bottom: 10px;padding: 10px;background-color: #f1f7fe\">\r\n\r\n<img class=\"size-full wp-image-194 alignleft\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Stop_ThinkV2-150x150-1.png\" alt=\"Stop and Think\" width=\"150\" height=\"150\" \/>Take a moment to complete the following activity.\r\n\r\n[h5p id=\"3\"]\r\n\r\n<\/div>\r\n<h2>Chapter Summary<\/h2>\r\nIn this chapter, you were introduced to why evaluation and empirical evidence are important in social work practice. You also learned about the nature of evaluation questions and how these relate to evaluation research. 
In this chapter, we introduced four different formats for evaluation (needs assessment, outcome evaluation, process evaluation, and cost-effectiveness evaluation), and you learned to distinguish between evaluation and intervention research.","rendered":"<p>The right-hand side of the evidence-intervention-evidence figure from Chapter 1 (Figure 1-1) is the focus of this chapter.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-21\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention.png\" alt=\"\" width=\"650\" height=\"339\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention.png 650w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention-300x156.png 300w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention-65x34.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention-225x117.png 225w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Figure-1-1.-Relationships-between-social-work-evidence-and-intervention-350x183.png 350w\" sizes=\"auto, (max-width: 650px) 100vw, 650px\" \/><\/p>\n<p>In Chapter 2 we looked at evidence-informed practice decisions. 
In this chapter, we introduce information about evaluating practice, what other disciplines call data-based or data-driven decision making: using data and evaluation research methods to make social work practice accountable and to inform practice improvement efforts.<\/p>\n<p>In this chapter you will learn:<\/p>\n<ul>\n<li>basic principles related to four evaluation formats (needs assessment, outcome, process, cost-effectiveness)<\/li>\n<li>distinctions between practice, program, and policy evaluation<\/li>\n<li>how evaluation and intervention research compare.<\/li>\n<\/ul>\n<h2><strong>Why Evaluate?<\/strong><\/h2>\n<p>In short, social work professionals engage in evaluation of practice as an accountability issue. We are accountable to clients, programs, funders, policy decision-makers, and the profession to ensure that we are delivering the best possible services, that the services we deliver achieve the promised benefits, and that the resources dedicated to our services are well-spent. This has previously been covered in our discussions regarding standards presented in the Social Work Code of Ethics. Of particular relevance to this discussion is the Standard 5.02 concerning evaluation and research (p. 27). Social workers are expected to evaluate policies, programs, and practice interventions, as well as facilitate research that contributes to the development of knowledge.<\/p>\n<h2><strong>What is Evaluation?<\/strong><\/h2>\n<p>Throughout this book, we examine methods for evaluating intervention efforts. A framework for understanding different approaches to evaluation is helpful, beginning with the nature of the evaluation research questions and exploring how these relate to different forms or approaches to evaluation.<\/p>\n<p><strong>Evaluation Questions.\u00a0<\/strong>By now you recognize that research designs and methodologies are driven by the nature of the research questions being asked. This is equally true in the evaluation research arena. 
Here is a sample of the kinds of questions asked in evaluating social work practice at different levels:<\/p>\n<ul>\n<li>Did client behavior change to a significant degree and in the desired direction?<\/li>\n<li>Were gains associated with intervention sustained over time?<\/li>\n<li>Are there unintended negative consequences associated with the intervention?<\/li>\n<li>To what extent are principles of diversity awareness integrated into practitioner behaviors and practitioner supervision?<\/li>\n<li>How satisfied are clients with various aspects of the delivered intervention?<\/li>\n<li>Is the intervention\u2019s cost\/benefit ratio favorable compared to other intervention options?<\/li>\n<li>Are some people deriving more benefit than others from the intervention?<\/li>\n<li>Is there a more cost-efficient way to achieve similar gains from the intervention?<\/li>\n<\/ul>\n<p><strong>Evaluation Formats<\/strong>. Because evaluation questions differ, social workers employ varied formats for engaging in evaluation. Here is a description of four major forms of evaluation research: needs assessment, outcome evaluation, process evaluation, and cost-effectiveness evaluation.<strong>\u00a0<\/strong><\/p>\n<p><em>Needs assessment. <\/em>The aim of <strong><em>needs assessment\u00a0<\/em><\/strong>is to answer questions related to the scope of a problem or need and where gaps exist in efforts to address the problem or need. For example, school social workers may want to know about the problem of bullying that occurs in a school district. They might engage in a needs assessment to determine the nature and extent of the problem, what is needed to eradicate the problem, and how the problem is being addressed across the district. Where they detect sizeable gaps between need and services provided, social workers can develop targeted responses. 
The needs assessment might also indicate that different responses need to be launched in different circumstances, such as: elementary, middle, and high school levels; or, parents, teachers, administrators, peers, and mental health professionals in the district; or, different neighborhood schools across the district. Needs assessment is often concerned with the discrepancy between what is needed and what is accessed in services, not only what is offered. As proponent of social justice, social workers are also concerned with identifying and addressing disparities (differential gaps) based on income, race\/ethnicity, gender\/gender identity, sexual orientation, age, national origin, symptom severity, geographical location (e.g., urban, suburban, rural disparities), and other aspects of human diversity. The gap between two sides or groups is sometimes monumental.<\/p>\n<figure id=\"attachment_41\" aria-describedby=\"caption-attachment-41\" style=\"width: 397px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-36\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms.png\" alt=\"\" width=\"397\" height=\"166\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms.png 397w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms-300x125.png 300w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms-65x27.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms-225x94.png 225w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2018\/10\/Geologic-land-forms-350x146.png 350w\" sizes=\"auto, (max-width: 397px) 100vw, 397px\" \/><figcaption id=\"caption-attachment-41\" 
class=\"wp-caption-text\">\u00a9Audrey Begun<\/figcaption><\/figure>\n<p><em>Outcome evaluation. <\/em>Evaluating practice outcomes happens at multiple levels: individual cases, programs, and policy. Social work professionals work with clients or client systems to achieve specific change goals and objectives. For example, this might be reducing a person\u2019s alcohol consumption or tobacco use, a couple having fewer arguments, improving student attendance throughout a school, reducing violence in a community, or breaking a gender or race based \u201cglass ceiling\u201d in an institution. Regardless of the level of intervention, social work professionals evaluate the impact of their practices and intervention efforts. This type of research activity is called <strong><em>outcome evaluation.\u00a0<\/em><\/strong>When outcome evaluation is directed to understanding the impact of practices on specific clients or client systems, it is called\u00a0<strong><em>practice evaluation<\/em><\/strong>.<\/p>\n<p>Evaluating the outcomes of interventions also happens at the aggregate level of programs. Social workers engaged in <strong><em>program evaluation\u00a0<\/em><\/strong>look at the impact of an intervention program on the group of clients or client systems it serves. Rather than providing feedback about an individual client or client system, the feedback concerns multiple clients engaged in the intervention program. For example, social workers might wish to evaluate the extent to which child health goals (outcomes) were achieved with an intervention program for empowering parents to eliminate their young children\u2019s exposure to third-hand smoke. 
The background for this work is described in an article explaining that third hand smoke is the residue remaining on skin, clothing, hair, upholstery, carpeting, and other surfaces; it differs from first- or second-hand smoke exposure because the individuals are not exposed by smoking themselves or breathing the smoke someone else produces. Young children come into close contact with contaminated surfaces when being held by caregivers, riding in vehicles, or crawling and toddling around the home where smoking has occurred, leaving residue behind (Begun, Barnhart, Gregoire, &amp; Shepperd, 2014). Outcome oriented program evaluation would be directed toward assessing the impact of an intervention delivered to a group of parents with young children at risk of exposure to third-hand smoke at home, in transportation, from relatives, or in child care settings.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-37\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS.png\" alt=\"\" width=\"529\" height=\"172\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS.png 529w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS-300x98.png 300w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS-65x21.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS-225x73.png 225w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Stylized-word-art-of-the-word-RESULTS-350x114.png 350w\" sizes=\"auto, (max-width: 529px) 100vw, 529px\" \/><\/p>\n<p><strong><em>Policy 
evaluation\u00a0<\/em><\/strong>has a lot in common with program evaluation, because policy is a form of intervention. Policy evaluation data are based on intervention effects experienced by many individuals, neighborhoods, communities, or programs\/institutions taken together, not tracking what happens with one client system or a single program at a time. For example, communities may gather a great deal of evaluation data about the impact on drug overdose deaths related to policies supporting first-responders, family members, friends, and bystanders being able to deliver opioid overdose reversal medications (naloxone) when first encountering someone suspected of experiencing opioid overdose. \u201cAs an antidote to opioid overdoses, naloxone has proven to be a valuable tool in combating overdose deaths and associated morbidity\u201d (Kerensky &amp; Walley, 2017, p. 6). Policy evaluation can answer the question of how much impact such a policy change can make. Policy evaluation also answers questions such as: who should be provided with naloxone rescue kits; how naloxone rescue kit prescribing education might alter opioid prescribing behavior; whether different naloxone formulations, doses, and delivery methods provide similar results and how do their costs compare; how what happens after overdose rescue might keep people safe and link them to services to prevent future overdose events; and, how local, state, and federal laws affect this policy\u2019s implementation (see Kerensky &amp; Walley, 2017). 
These factors help determine if the impact of a policy is simply a drop in the bucket or a flood of change.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-38\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-drop-of-water.png\" alt=\"\" width=\"268\" height=\"172\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-drop-of-water.png 268w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-drop-of-water-65x42.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-drop-of-water-225x144.png 225w\" sizes=\"auto, (max-width: 268px) 100vw, 268px\" \/><\/p>\n<p><em>Process evaluation.\u00a0<\/em><strong><em>Process evaluation\u00a0<\/em><\/strong>is less concerned with questions about outcomes than with questions about how an intervention or program is implemented. Why evaluating process matters is clear if you think about fidelity examples previously discussed (e.g., the Duluth model for community response to domestic violence). Process evaluation matters in determining what practitioners really do when intervening and what clients or client systems experience during an intervention. It also matters in terms of understanding the \u201cmeans to the end,\u201d beyond simply observing the end results. 
Process evaluation also examines the way an intervention or program is supported by agency administrators, agency activities, and distribution of resources\u2014the context of the intervention\u2014and possible efficiencies or inefficiencies in how an intervention is delivered.<\/p>\n<blockquote><p><em>\u201cProcess evaluations involve monitoring and measuring variables such as communication flow, decision-making protocols, staff workload, client record keeping, program supports, staff training, and worker-client activities. Indeed, the entire sequence of activities that a program undertakes to achieve benefits for program clients or consumers is open to the scrutiny of process evaluations\u201d (Grinell &amp; Unrau, 2014, p. 662). <\/em><\/p><\/blockquote>\n<p>For example, despite child welfare caseworkers\u2019 recognition of the critically important role in child development for early identification of young children\u2019s mental health problems and needs, they also encounter difficulties that present significant barriers to effectively doing so (Hoffman et al., 2016). 
Through process evaluation, the investigators identified barriers that included differences in how workers and parents perceived the children\u2019s behavioral problems, a lack of available age-appropriate mental health services, inconsistencies between caseworker roles and the training\/preparation needed to assess and address these problems, and a lack of standardized tools and procedures.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-39\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram.png\" alt=\"\" width=\"445\" height=\"279\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram.png 445w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram-300x188.png 300w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram-65x41.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram-225x141.png 225w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Photo-of-a-hand-drawing-a-blank-flow-chart-diagram-350x219.png 350w\" sizes=\"auto, (max-width: 445px) 100vw, 445px\" \/><\/p>\n<p><em>Cost-effectiveness evaluation. <\/em>Cost-related evaluations address the relationship between resources applied through intervention and the benefits derived from that intervention. You make these kinds of decisions on a regular basis: is the pleasure derived from a certain food or beverage \u201cworth\u201d the cost in dollars or calories, or maybe the degree of effort involved?
While costs are often related to dollars spent, relevant costs might also include a host of other resources\u2014staff time and effort, space, training and credential requirements, other activities being curtailed, and so forth. Benefits might be measured in terms of dollars saved, but can also be measured in terms of achieving the goals and objectives of the intervention. In a <strong><em>cost-effectiveness evaluation\u00a0<\/em><\/strong>study of Mental Health Courts conducted in Pennsylvania, diverting individuals with serious mental illness and non-violent offenses into community-based treatment posed no increased risk to the public and reduced jail time (two significant outcomes). Overall, the \u201cdecrease in jail expenditures mostly offset the cost of the treatment services\u201d (Psychiatric Times, 2007, p. 1)\u2014another significant outcome. The intervention\u2019s cost-effectiveness was greatest when offenses were at the felony level and for individuals with severe psychiatric disorders.
While taxpayers realized cost savings, the picture was complicated by the fact that the budget where the gains accrued (criminal justice) was separate from the budget where the costs were incurred (the mental health system).<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-40\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality.png\" alt=\"\" width=\"402\" height=\"277\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality.png 402w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality-300x207.png 300w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality-65x45.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality-225x155.png 225w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/The-iron-triangle-of-time-cost-and-quality-350x241.png 350w\" sizes=\"auto, (max-width: 402px) 100vw, 402px\" \/><\/p>\n<h2><strong>How Evaluation and Intervention Research Compare<\/strong><\/h2>\n<p>The goals, objectives, and methods of <strong><em>evaluation research\u00a0<\/em><\/strong>and <strong><em>intervention research\u00a0<\/em><\/strong>often appear to be very similar. In both cases, systematic research procedures are applied to answer questions about an intervention. However, there are important differences between evaluation and intervention research to consider, because they have implications for how investigators and evaluators approach the pursuit of evidence.<\/p>\n<p>Differences begin with the nature of the research questions being asked.
Evaluation researchers pursue specific knowledge, whereas intervention researchers pursue generalizable knowledge. In evaluation, the goal is to inform leader or administrator decisions about a program, or to inform an individual practitioner\u2019s intervention decisions about work with specific clients. The aim of practice or program evaluation is to determine the worth of an intervention to a specific agency, its clients, and its stakeholders. Intervention researchers, on the other hand, have as their goal the production of knowledge or the advancement of theory for programs and practitioners more generally\u2014not a specific program or practitioner. This difference translates into differences in how the research process is approached in evaluation compared to intervention science. Figure 3-1 depicts the differences in approach, methodology, analysis, and reporting between evaluation and intervention research (LaVelle, 2010).<\/p>\n<p>Figure 3-1. Differences between intervention and evaluation research.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-41\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research.png\" alt=\"\" width=\"708\" height=\"527\" srcset=\"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research.png 708w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research-300x223.png 300w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research-65x48.png 65w, https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research-225x167.png 225w, 
https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-content\/uploads\/sites\/185\/2024\/08\/Differences-between-intervention-and-evaluation-research-350x261.png 350w\" sizes=\"auto, (max-width: 708px) 100vw, 708px\" \/><\/p>\n<div style=\"float: left;min-height: 120px;width: 99%;margin-bottom: 10px;padding: 10px;background-color: #f1f7fe\">\n<p><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-194 alignleft\" src=\"https:\/\/pressbooks.ulib.csuohio.edu\/projectmanagement2ndedition\/wp-content\/uploads\/sites\/185\/2024\/08\/Stop_ThinkV2-150x150-1.png\" alt=\"Stop and Think\" width=\"150\" height=\"150\" \/>Take a moment to complete the following activity.<\/p>\n<div class=\"h5p-iframe-wrapper\"><iframe id=\"h5p-iframe-3\" class=\"h5p-iframe\" data-content-id=\"3\" style=\"height:1px\" src=\"about:blank\" frameBorder=\"0\" scrolling=\"no\" title=\"Module 1 Chapter 3 Types of Evaluation\"><\/iframe><\/div>\n<\/div>\n<h2>Chapter Summary<\/h2>\n<p>In this chapter, you were introduced to why evaluation and empirical evidence are important in social work practice. You also learned about the nature of evaluation questions and how these relate to evaluation research. 
We also introduced four formats for evaluation (needs assessment, outcome evaluation, process evaluation, and cost-effectiveness evaluation), and you learned to distinguish between evaluation and intervention research.<\/p>\n","protected":false},"author":3,"menu_order":4,"template":"","meta":{"pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":[],"pb_section_license":""},"chapter-type":[],"contributor":[],"license":[],"class_list":["post-42","chapter","type-chapter","status-publish","hentry"],"part":18,"_links":{"self":[{"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/chapters\/42","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/wp\/v2\/users\/3"}],"version-history":[{"count":4,"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/chapters\/42\/revisions"}],"predecessor-version":[{"id":290,"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/chapters\/42\/revisions\/290"}],"part":[{"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/parts\/18"}],"metadata":[{"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/chapters\/42\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/wp\/v2\/media?parent=42"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/pressbooks\/v2\/chapter-type?post=42"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/pressbooks.ulib.csuohio.edu\/swk627\/wp-json\/wp\/v2\/contributor?post=42"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/pressbooks.ulib
.csuohio.edu\/swk627\/wp-json\/wp\/v2\/license?post=42"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}