The Contribution of Implementation Evaluation to the Field of Public Health
GUEST EDITORIAL — Volume 20 — November 2, 2023
Tamara Vehige Calise, DrPH, MEd1; Antonio J. Gardner, PhD, MS, MCHES2
Suggested citation for this article: Calise TV, Gardner AJ. The Contribution of Implementation Evaluation to the Field of Public Health. Prev Chronic Dis 2023;20:230323. DOI: http://dx.doi.org/10.5888/pcd20.230323.
PEER REVIEWED
Introduction
Chronic diseases such as heart disease, diabetes, and cancer are the leading causes of death and disability in the United States and account for much of the $4.1 trillion the nation spends on health care each year (1). Once overlooked or considered secondary influences on chronic disease, social factors — education, lifestyles, living situations, financial conditions, cultural traditions, and governmental policies, among others — are now acknowledged as major contributors to health, affecting individuals, groups, and communities in positive and negative ways (2). Furthermore, these factors interact, influence, modify, and enable or constrain health interventions and program implementation across settings. Many modern-day efforts work to change conditions, or the context within which health is produced, and may involve multilevel and multidirectional interventions that operate independently and interdependently. Consequently, outcomes are the product of these complex interventions as well as of the contexts in which they unfold.
Judgments about strategies that address these social factors have traditionally been guided by randomized controlled trials (RCTs) — the “gold standard” for identifying effective interventions (3). However, adjusting for the influence of contextual factors as causal effects, or controlling for these conditions in an attempt to reduce bias, is nearly impossible, especially when “real life” changes in unpredictable and variable ways (4). Accordingly, there is concern that complex interventions deemed effective by RCTs may not reduce health inequities and, in fact, could widen them (5). The more we embrace diverse opportunities for ongoing learning and for thoughtful conduct, appraisal, and synthesis of the information used to generate evidence, the more effective we will be in addressing the complexity at the root of effecting public health change (6).
Purpose of the Special Collection
Preventing Chronic Disease (PCD), a peer-reviewed public health journal sponsored by the Centers for Disease Control and Prevention (CDC), promotes dialogue on the implementation and adaptation of evidence and practical experience to address inequities and improve population health (7). This special collection features 9 implementation evaluation (IE) articles, an article type that PCD publishes (8), in which authors describe the implementation and adaptation of interventions addressing a range of chronic disease risk factors in real-world settings.
Dissemination and implementation models, theories, and frameworks are important for understanding context, understanding how interventions work, and generating generalizable knowledge (9). Notable in this special collection are the use of frameworks, the mixed-methods approaches to evaluation, and the alignment of work to theory. Whooten et al (10) and Harden et al (11) used RE-AIM (12) to systematically address the gap between research and practice. Although both sets of authors described physical activity (PA) programs and used elements of RE-AIM as metrics guiding their work, their approaches differed. Whooten et al (10) performed an exploratory, concurrent-nested, mixed-methods evaluation of a preexisting before-school PA program, highlighting adaptability and differences in implementation across the participating schools. Harden et al (11) documented adaptations made to an older adult PA program so that all audiences had access to relevant information to inform decision-making for training, delivery, and participation at the administrator, instructor, and participant levels. The authors presented contextual factors and processes whose effects may appear more distally in chronic disease morbidity and mortality reports. Perry et al (13), who also reported on a PA program (among cancer survivors), used the Interactive Systems Framework for Dissemination and Implementation (14), which shares elements with RE-AIM, to describe the context and processes that helped organizations implement the program.
Other authors contributed to the knowledge base by exploring specific contextual factors such as partnerships, community readiness, and implementation strategies. Calancie et al (15) acknowledged the layers of complexity involved in preventing childhood obesity by describing a coalition approach, driven by the Stakeholder-Driven Community Diffusion theory (16,17), for implementing, assessing, and analyzing collaborative efforts. The authors not only narrate how they assessed changes in coalition members’ knowledge and understanding of the problem and its solutions but also detail how data were used to generate and implement action within their community. Linabarger et al (18) presented another approach to unpacking the role of collaboration in the development and implementation of dental care. The article offers insight into the use of mixed-methods evaluation to assess how the chronic disease and oral health programs of state health departments collaboratively developed and implemented joint projects. The evaluation identified many factors that facilitated collaboration, including investing in relationships, creating a collaborative norm, and meeting and communicating frequently, which could be applicable in other public health areas. Long et al (19) described collaboration and capacity building between academics and school staff. Although more detail would be helpful for understanding implementation and the specific adjustments made in response to evaluation input, the article demonstrated the utility of ongoing data collection and dissemination in ensuring the sustainability of a complex environmental strategy to reduce the sodium content of school lunches in a large district.
Golden et al (20) described community readiness as a contextual factor in implementing a pediatric weight management program in medically underserved areas and shared lessons learned about potential barriers and facilitators in communities that could affect implementation efforts. Leeman et al (21) presented methods used to assess the implementation of quality improvement coaching for improving human papillomavirus vaccination coverage, part of an RCT, in an effort to identify variations, including implementation and contextual factors, across 3 states.
Finally, the article by Maxwell et al (22) is a good example of the differences in uptake, implementation, integration, and sustainability of interventions proven effective in increasing colorectal cancer screening (CCS). The authors examined implementation across 355 clinics partnering with the Colorectal Cancer Control Program and suggested that both technical and financial support, and the ability to integrate 6 of 8 strategies into electronic health records, may be key to implementation. They also indicated that clinics may require even more support and encouragement to add 2 of the evidence-based interventions into their practice. Maxwell and colleagues also reported that one of the evidence-based strategies is uniquely suited to reduce cancer disparities and may be of greater interest to clinics that serve populations with substantial barriers to CCS. These findings, in addition to the other studies published on this project, provide insight on context and may guide clinics in implementing or adapting approaches in their own settings.
Implications for Public Health
Communicating variation in implementation and effectiveness, and understanding the applicability of findings from one context to another, could help decision makers make the best use of their resources to address the major contributors to chronic disease among their specific populations and across their communities. The work described in these articles focuses on processes, implementation within specific contexts, and contextual factors. Although this information is useful in its own right, we highlight 2 aspects that may improve the future reporting of IEs. First, although PCD provides a checklist of approximately 40 items for authors to consider as they prepare their manuscripts (8), the level of detail on context and the consistency in describing the intervention and its implementation varied across the 9 articles. For example, several articles reported implementation adaptations and illustrated various changes, which may help those interested in an intervention understand context in terms of organizational resources, the community environment, and other factors.
Second, there is growing recognition of the importance of establishing guidance for reporting IEs and the interactions between an intervention and its contexts (23). Checklists such as the Consolidated Standards of Reporting Trials (CONSORT) for RCTs and the Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) were developed to help authors report in a consistent, transparent, and complete manner (24,25). Although these are useful contributions, no commonly used best-practice criteria exist for assessing IEs or for determining how context should be considered and reported, which may explain the variation in this series and the appraisal of studies against criteria not suited to this type of evaluation.
Authors reported the limitations of their IEs, a standard dissemination best practice. However, several authors used efficacy and effectiveness study criteria. For example, Harden et al (11) acknowledged the importance of an iterative cycle such as assess, plan, do, evaluate, and report; the desire to disseminate information so that audiences can make informed decisions; and the unpredictable timeline associated with the process. They explained that efficacy trials are not necessary if an adaptation does not threaten outcomes, yet noted that their study was limited because randomization and causation could not be explored.
Conclusion
Describing interventions and context is difficult given the range of possibilities and the level of detail needed for those not directly involved to understand the work (26), but progress has been made. Criteria outlining the intervention and contextual categories, levels, or domains against which authors can judge their work, and which they can discuss when reporting evidence from IEs, could be a great contribution to the field of public health. Consistency and detail will make the evidence more useful to decision makers interested in implementing, adapting, sustaining, transferring, and scaling up interventions suited to address today’s complex public health problems in their respective communities.
Author Information
Corresponding Author: Tamara Vehige Calise, Co-Director Evaluation and Research, JSI Research & Training Institute Inc, Health Services Department, 44 Farnsworth St, Boston, MA 02210 (tcalise@jsi.com).
Author Affiliations: 1JSI Research & Training Institute, Inc, Boston, Massachusetts. 2The University of Alabama, Department of Community Medicine and Population Health, Tuscaloosa, Alabama.
References
- Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion. Health and economic causes of chronic diseases. Accessed October 31, 2023. https://www.cdc.gov/chronicdisease/about/costs/index.htm
- National Academies of Sciences, Engineering, and Medicine; National Academy of Medicine; Flaubert JL, Le Menestrel S, Williams DR, editors. The future of nursing 2020–2030: charting a path to achieve health equity. Washington (DC): National Academies Press; 2021.
- Hoffmann T, Bennett S, Del Mar C. Evidence-based practice across the health professions. 4th edition. Amsterdam (NL): Elsevier; 2013.
- Jacups SP, Bradley C. Is the evidence-based medicine movement counter-productive: are randomised controlled trials the best approach to establish evidence in complex healthcare situations? Public Health Res Pract. 2023;33(1):3312303. PubMed doi:10.17061/phrp3312303
- Jull J, Whitehead M, Petticrew M, Kristjansson E, Gough D, Petkovic J, et al. When is a randomised controlled trial health equity relevant? Development and validation of a conceptual framework. BMJ Open. 2017;7(9):e015815. PubMed doi:10.1136/bmjopen-2016-015815
- Ogilvie D, Adams J, Bauman A, Gregg EW, Panter J, Siegel KR, et al. Using natural experimental studies to guide public health action: turning the evidence-based medicine paradigm on its head. J Epidemiol Community Health. 2020;74(2):203–208. PubMed doi:10.1136/jech-2019-213085
- Centers for Disease Control and Prevention. Preventing Chronic Disease: about the journal. Accessed October 25, 2023. https://www.cdc.gov/pcd/about_the_journal/index.htm
- Centers for Disease Control and Prevention. Preventing Chronic Disease: types of articles. Accessed October 25, 2023. https://www.cdc.gov/pcd/for_authors/types_of_articles.htm
- Kwan BM, McGinnes HL, Ory MG, Estabrooks PA, Waxmonsky JA, Glasgow RE. RE-AIM in the real world: use of the RE-AIM framework for program planning and evaluation in clinical and community settings. Front Public Health. 2019;7:345. PubMed doi:10.3389/fpubh.2019.00345
- Whooten RC, Horan C, Cordes J, Dartley AN, Aguirre A, Taveras EM. Evaluating the implementation of a before-school physical activity program: a mixed-methods approach in Massachusetts, 2018. Prev Chronic Dis. 2020;17:E116. PubMed doi:10.5888/pcd17.190445
- Harden SM, Balis LE, Strayer T III, Wilson ML. Assess, plan, do, evaluate, and report: iterative cycle to remove academic control of a community-based physical activity program. Prev Chronic Dis. 2021;18:E32. PubMed doi:10.5888/pcd18.200513
- Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327. PubMed doi:10.2105/AJPH.89.9.1322
- Perry CK, Campbell LP, Currier J, Farris PE, Wenzel ES, Medysky ME, et al. An evidence-based walking program in Oregon communities: Step It Up! survivors. Prev Chronic Dis. 2020;17:E156. PubMed doi:10.5888/pcd17.200231
- Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3–4):171–181.
- Calancie L, Nappi D, Appel J, Hennessy E, Korn AR, Mitchell J, et al. Implementing and evaluating a stakeholder-driven community diffusion–informed early childhood intervention to prevent obesity, Cuyahoga County, Ohio, 2018–2020. Prev Chronic Dis. 2022;19:E03. PubMed doi:10.5888/pcd19.210181
- Appel JM, Fullerton K, Hennessy E, Korn AR, Tovar A, Allender S, et al. Design and methods of Shape Up Under 5: integration of systems science and community-engaged research to prevent early childhood obesity. PLoS One. 2019;14(8):e0220169. PubMed doi:10.1371/journal.pone.0220169
- Kasman M, Hammond RA, Heuberger B, Mack-Crane A, Purcell R, Economos C, et al. Activating a community: an agent-based model of Romp & Chomp, a whole-of-community childhood obesity intervention. Obesity (Silver Spring). 2019;27(9):1494–1502.
- Linabarger M, Brown M, Patel N. A pilot study of integration of medical and dental care in 6 states. Prev Chronic Dis. 2021;18:E72. PubMed doi:10.5888/pcd18.210027
- Long CR, Rowland B, Gannon M, Faitak B, Smith G, Clampitt J, et al. Reducing sodium content of foods served in Arkansas’s largest school district: evaluation of the Sodium Reduction in Communities Program. Prev Chronic Dis. 2022;19:E55. PubMed doi:10.5888/pcd19.220051
- Golden CA, Hill JL, Heelan KA, Bartee RT, Abbey BM, Malmkar A, et al. A dissemination strategy to identify communities ready to implement a pediatric weight management intervention in medically underserved areas. Prev Chronic Dis. 2021;18:E10. PubMed doi:10.5888/pcd18.200248
- Leeman J, Petermann V, Heisler-MacKinnon J, Bjork A, Brewer NT, Grabert BK, et al. Quality improvement coaching for human papillomavirus vaccination coverage: a process evaluation in 3 states, 2018–2019. Prev Chronic Dis. 2020;17:E120. PubMed doi:10.5888/pcd17.190410
- Maxwell AE, DeGroff A, Hohl SD, Sharma KP, Sun J, Escoffery C, et al. Evaluating uptake of evidence-based interventions in 355 clinics partnering with the Colorectal Cancer Control Program, 2015–2018. Prev Chronic Dis. 2022;19:E26. PubMed doi:10.5888/pcd19.210258
- Craig P, Di Ruggiero E, Frohlich KL, Mykhalovskiy E, White M; Canadian Institutes of Health Research–National Institute for Health Research (NIHR). Taking account of context in population health intervention research: guidance for producers, users and funders of research. Southampton (UK): NIHR Evaluation, Trials, and Studies Coordinating Centre; 2018.
- Ridde V. Need for more and better implementation science in global health. BMJ Glob Health. 2016;1(2):e000115. PubMed doi:10.1136/bmjgh-2016-000115
- Hales S, Lesher-Trevino A, Ford N, Maher D, Ramsay A, Tran N. Reporting guidelines for implementation and operational research. Bull World Health Organ. 2016;94(1):58–64. PubMed doi:10.2471/BLT.15.167585
- Vanderkruik R, McPherson ME. A contextual factors framework to inform implementation and evaluation of public health initiatives. Am J Eval. 2016;38(3):348–359.
The opinions expressed by authors contributing to this journal do not necessarily reflect the opinions of the U.S. Department of Health and Human Services, the Public Health Service, the Centers for Disease Control and Prevention, or the authors’ affiliated institutions.