
How is implementation research applied to advance health in low-income and middle-income countries?
  1. Olakunle Alonge1,
  2. Daniela Cristina Rodriguez1,
  3. Neal Brandes2,
  4. Elvin Geng3,
  5. Ludovic Reveiz4,
  6. David H Peters1
  1. Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
  2. Office of Maternal and Child Health and Nutrition, Bureau for Global Health, United States Agency for International Development, Washington, District of Columbia, USA
  3. Department of Medicine, University of California San Francisco, San Francisco, California, USA
  4. Knowledge Management, Bioethics, and Research Department, Pan American Health Organization, Washington, District of Columbia, USA
  Correspondence to Dr Olakunle Alonge; oalonge1@jhu.edu

Abstract

This paper examines the characteristics of implementation research (IR) efforts in low-income and middle-income countries (LMICs) by describing how key IR principles and concepts have been used in published health research in LMICs between 1998 and 2016, with a focus on how to better apply these principles and concepts to support large-scale impact of health interventions in LMICs. There is a stark discrepancy between the principles of IR and what has been published. Most IR studies have been conducted under conditions where the researchers have considerable influence over implementation and with extra resources, rather than under ‘real world’ conditions. IR researchers tend to focus on research questions that test a proof of concept, such as whether a new intervention is feasible or can improve implementation. They also tend to use traditional fixed research designs, yet the usual conditions for managing programmes demand continuous learning and change. More IR in LMICs should be conducted under usual management conditions, employ a pragmatic research paradigm and address critical implementation issues such as scale-up and sustainability of evidence-informed interventions. This paper describes some positive examples that address these concerns and identifies how better reporting of IR studies in LMICs would include more complete descriptions of the strategies, contexts, concepts, methods and outcomes of IR activities. This would help practitioners, policy-makers and other researchers to better learn how to implement large-scale change in their own settings.

  • implementation
  • delivery
  • science
  • research
  • low and middle income countries
  • literature review

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Summary box

  • Implementation research (IR) in low-income and middle-income countries has mainly focused on evaluation of whether strategies for implementing evidence-informed health interventions can work; little of the IR addresses problems of scale-up and sustainability, which are key issues for health interventions.

  • Most published IR studies are not conducted under routine conditions of management and financing. If IR is to make an impact on policy and practice or inform the scale-up of programmes, more research needs to be conducted under the conditions in which interventions are expected to be implemented.

  • Most IR publications do not describe implementation characteristics completely; future research should more consistently provide complete descriptions of the implementation strategies, report on implementation variables and describe the context in which implementation occurs.

  • IR uses a full range of quantitative, qualitative and mixed methods approaches, but more rigorous and adaptive research designs are needed to address how to scale up and sustain interventions.

Introduction

Implementation research (IR) efforts are not new, but attention is growing,1 particularly for the potential of IR to support evidence-informed interventions needed for achieving the Sustainable Development Goals.2 3 The growing attention to IR has revealed a gap in understanding of how widely IR is used and how its concepts and methods could be applied to achieve widespread health impact in low-income and middle-income countries (LMICs).4 In 2004, a call was made at the WHO Ministerial Summit on Health Research in Mexico for more IR around health systems strengthening strategies and evidence-informed interventions addressing major public health problems.5 The health research landscape in many LMICs has, however, changed little since this call.6

While there are ongoing debates concerning the definitions and boundaries of IR, there is a consensus on certain principles that should apply in IR.2 7–10 For instance, the need to conduct such research under real world conditions, and for it to respond to implementation problems, often in real time, is clear.2 8 9 The need to include inquiry about the context and for a team comprising diverse stakeholders to support implementation is also important.2 8–10 There are also concepts and methods that have been identified as appropriate for studying implementation problems.7–9 11–13 For instance, the use of implementation outcome variables that describe the results of intentional actions to deliver a programme,7 8 and a pragmatic research paradigm that allows flexibility in studying implementation strategies and contextual factors, have been strongly recommended as suitable for IR.8 11–15

There is also a consensus on key characteristics of IR, which we refer to as implementation descriptors, that should be reported in the peer-reviewed literature involving IR (box 1).16–18 These criteria are based on frameworks described under different published reporting guidelines for IR,16–18 but made broader to capture the plurality of concepts, methods and approaches used in global health research. These criteria have been described as important descriptors of IR studies because they provide insight into how implementation works and indicate factors necessary for evaluating the external validity of evidence from IR studies.16

Box 1

Implementation descriptors: what you need to learn from others’ implementation experience

Context and intervention

  • Clear description of the planned intervention, the evidence that justifies its choice and the main implementation strategies.

  • Contextual factors that affect implementation.

  • Clear description of who is implementing the intervention and strategies.

  • Any deviations from the planned intervention design and activities.

Implementation results

  • Measurement of implementation outcome variables, preferably showing changes over time.

  • Clear discussion of the policy and practice implications of the study findings.

In this paper, we examine how IR is being operationalised to advance health in LMICs, drawing attention to how IR principles and methods in health studies are applied in efforts to enhance the potential impact of interventions, which include policies, programmes and individual behaviours. The goal of this paper is to describe how IR and evidence from such studies could be better applied to support large-scale impact of health interventions in LMICs.

Characteristics of IR health studies from LMICs

To characterise IR studies from LMICs, we conducted a systematic review of the relevant literature (box 2), examining the implementation descriptors identified in box 1. The number of IR publications increased substantially over the 18-year review period (1998–2016). However, few papers reported on these descriptors, limiting the lessons that can be learnt from their efforts to enhance implementation in LMICs. For example, only 791 articles (8% of the pool of 10 292 relevant peer-reviewed IR publications) described the evidence-informed interventions and the set of implementation strategies that accompanied them. A majority of the 791 articles described who was doing the implementation (mainly civil society organisations (CSOs)); however, less than 15% described whether there was any deviation from the initially planned intervention, which limits the understanding of how these interventions were adapted to fit their context (table 1). Adaptation is often a necessary step for achieving large-scale impact of interventions in varied settings.

Box 2

Selection of relevant peer-reviewed IR literature

  • We conducted a systematic review of implementation research (IR) literature between January 1998 and December 2016 (see online supplementary file 1 for the full search string, PRISMA flow chart and methodology).

  • We identified relevant peer-reviewed records from the systematic review (see online supplementary file 2 for the bibliography of the relevant IR publications identified). Relevant peer-reviewed IR literature was broadly defined as research or evaluation articles that describe the implementation of an intervention to improve health8 and are set in low-income and middle-income countries.

  • We conducted an additional in-depth review of relevant peer-reviewed records that satisfied three or more of the implementation descriptors described in box 1 (see online supplementary file 3 for links to the relevant IR publications selected for in-depth review); a schematic sketch of this selection rule follows below.
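
To make the selection rule concrete, here is a minimal screening sketch in Python. It assumes a hypothetical records.csv exported from a screening database, with one row per record and a yes/no column per descriptor; the file and column names are illustrative conveniences, not part of the actual review protocol.

```python
import csv

# Implementation descriptors from box 1, coded as hypothetical yes/no
# columns in records.csv (column names are illustrative).
DESCRIPTORS = [
    "intervention_described", "context_described", "implementer_described",
    "deviations_described", "outcomes_measured", "implications_discussed",
]

def select_for_in_depth_review(path="records.csv", threshold=3):
    """Keep records satisfying at least `threshold` of the descriptors."""
    selected = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            score = sum(row.get(d, "").strip().lower() == "yes" for d in DESCRIPTORS)
            if score >= threshold:
                selected.append((row["record_id"], score))
    return selected

if __name__ == "__main__":
    for record_id, score in select_for_in_depth_review():
        print(record_id, score)
```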


Table 1

Prevalence of key implementation descriptors reported in IR studies—context and intervention

Only a small set of articles report more completely on the implementation descriptors (28 studies from our review reported on three or more descriptors and are shown in table 2 and online supplementary file 3).19–46 All but one of the 28 articles reported contextual factors,45 but 15 did so only partially,19 20 28–32 34 35 38–40 43 46 usually focusing on external factors alone (mainly relevant sociopolitical factors that are external to the project settings and implementing organisations, but with implications for the set of implementation strategies) and excluding discussion of internal factors (social factors within the project settings and implementing organisations, including the organisation’s structure and culture). Fourteen articles measured changes in implementation outcome variables,19 20 22–28 30 31 37 38 46 including acceptability (perception among stakeholders that an intervention is agreeable), fidelity (extent to which an intervention was implemented as described), uptake (intention, decision or action to use an intervention) and coverage or reach (degree to which the eligible population for an intervention received the intervention).7 8 Only five articles used an explicit or published implementation research model or theory, such as the Consolidated Framework for Implementation Research (CFIR) and the theory of organisational readiness for change.15 20 28 36 37 42 Twenty-five articles reported who was conducting the implementation,19–31 33–41 44–46 but most focused only on the front-line implementers (eg, community health workers) without describing the overarching institutions supporting activities. Only nine articles reported on any adaptation or deviations from the planned intervention,19–22 30 31 33–35 and it is unclear whether this reflects a reporting issue or a lack of deviations actually taking place. For the reporting criteria emerging from this series,2 47 48 embeddedness (of the IR study in existing programmes and the larger health system) and incentives (of involved stakeholders to conduct the IR study) were reported very infrequently (seven and four articles, respectively). However, more than half of these articles (15) discussed policy and/or practice implications in some way.19–28 32 33 38 42 43
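
As a concrete aid for reading table 2, the following sketch encodes the implementation outcome variables above as a simple coding scheme. The dictionary and function are illustrative conveniences, not the actual extraction instrument used in the review; the definitions are paraphrased from the text.

```python
# Implementation outcome variables used when coding reviewed articles
# (definitions paraphrased from references 7 and 8); the structure
# itself is illustrative, not the review's actual extraction tool.
IMPLEMENTATION_OUTCOMES = {
    "acceptability": "perception among stakeholders that an intervention is agreeable",
    "fidelity": "extent to which an intervention was implemented as described",
    "uptake": "intention, decision or action to use an intervention",
    "coverage": "degree to which the eligible population received the intervention",
}

def code_article(reported_outcomes: set) -> dict:
    """Flag which implementation outcome variables an article reports."""
    return {name: name in reported_outcomes for name in IMPLEMENTATION_OUTCOMES}

# Example: an article reporting fidelity and coverage only.
print(code_article({"fidelity", "coverage"}))
```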

Table 2

In-depth review of IR articles with more complete reporting of implementation descriptors

Application of key IR principles and concepts to IR health studies from LMICs

Key IR principles and concepts can be summarised under six major themes (box 3).1 2 7–9 11–13 These themes reflect the convergence in the literature on the distinguishing features of IR activities that are expected to produce impact at scale, and are relevant when examining IR studies that may yield evidence of large-scale impact.

Box 3

Key principles and concepts of IR activities

  • Implementation research (IR) activities should focus on questions around implementing evidence-informed interventions.

  • IR activities should be conducted under real world conditions, with the types of resources, incentives and operational support they would have under routine situations (where provision of additional resources is itself an implementation strategy, the study should indicate how such provisions will be sustained and integrated into the system).

  • IR studies should provide evidence on ‘how’ and ‘why’ evidence-informed interventions lead to health impact, through the use of implementation outcome variables and a pragmatic research paradigm.

  • IR studies should be fit for purpose, balancing the need to address immediate implementation problems with support for broader and longer term learning.

  • IR activities require multistakeholder collaboration and partnerships.

  • Key characteristics of IR activities should be reported to facilitate learning and action.

Conducting research that focuses on questions about implementing evidence-informed interventions

Questions about implementation are often concerned with how to produce expected results in a particular setting from an intervention that has previously been shown to work elsewhere.1 2 7–9 Such questions may address the fit of an intervention to new settings; how to introduce, scale up or sustain the intervention among broader populations; or real-time operational issues that arise in the process of implementation.7 8 IR studies that focus on scale-up and sustainability are particularly essential for understanding the large-scale impact of health interventions. However, less than 5% of the 791 IR studies reviewed for this paper addressed objectives concerning scale-up (n=32) or sustainability (n=25). The BetterBirth project,38 which focused on implementing a Safe Childbirth Checklist (SCC) to promote safe childbirth practices and reduce maternal and neonatal mortality in Uttar Pradesh, India, is an example of a research project that did address questions around implementing an evidence-informed intervention. Using an iterative learning process, researchers involved in the BetterBirth project adapted the SCC intervention to operational issues that arose in the process of introducing the SCC to health workers in settings different from where it was originally tested. Over two learning phases, they added a peer-to-peer coaching strategy to support the introduction of the checklist, in part to respond to leadership challenges and health workers’ lack of motivation that were identified during an initial implementation phase. They used physician coaches to motivate health facility leaders and provide critical leadership for the implementation, and nurse coaches to motivate behavioural changes among birth attendants in using the checklist. The study was able to demonstrate large increases in the correct use of the SCC intervention over time and provide evidence for a set of implementation strategies to support the scale-up of the SCC.

Whereas most IR studies focus on evaluation, key implementation questions in most settings are concerned with how to scale up or sustain an intervention within a practice area or population.49 Issues around scale-up of interventions and their sustainability occur within complex health systems that are constantly evolving.8 9 The lack of studies addressing scale-up and sustainability may point to the practical difficulty of conducting research that addresses complexity. Indeed, the capacity of public health researchers to address questions examining complex and adaptive processes has been limited until recently.50 There are, however, examples of implementation models for describing and studying scale-up, and innovations for sustaining implementation activities and outcomes, from other fields that are being extended to IR in public health.51–53 For example, Aarons and colleagues describe a model for implementing public sector services that includes four implementation phases (exploration, adoption/preparation, implementation and sustainment), highlighting specific factors that are important at each phase.51 These include factors in the outer context (service environment, interorganisational environment, consumer support/advocacy), the inner context (intraorganisational characteristics, individual adopter characteristics), innovation characteristics and the fit with existing systems.51 There are additional methods related to systems thinking that are also useful for examining questions of implementation within complex systems.50 54
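
To fix ideas, here is a schematic encoding of the phases and factor domains of the Aarons model as summarised above. The data structure and checklist function are purely illustrative summaries, not a published schema from that work.

```python
# Schematic encoding of the four implementation phases and the factor
# domains highlighted by Aarons and colleagues (reference 51); the
# structure is an illustrative summary, not a published schema.
PHASES = ["exploration", "adoption/preparation", "implementation", "sustainment"]

FACTOR_DOMAINS = {
    "outer context": ["service environment", "interorganisational environment",
                      "consumer support/advocacy"],
    "inner context": ["intraorganisational characteristics",
                      "individual adopter characteristics"],
    "innovation": ["innovation characteristics", "fit with existing systems"],
}

def phase_checklist(phase):
    """List (domain, factor) pairs a study team might assess in a phase."""
    assert phase in PHASES
    return [(domain, factor)
            for domain, factors in FACTOR_DOMAINS.items()
            for factor in factors]

# Example: factors to review when planning for sustainment.
for domain, factor in phase_checklist("sustainment"):
    print(f"sustainment | {domain}: {factor}")
```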

Conducting research on implementation of evidence-informed interventions under real world conditions

Whereas what constitutes ‘real-world’ conditions for a set of implementation activities may be debatable, criteria such as whether implementation was led by the usual implementing agencies and without additional funding (apart from the usual budget) or management support during the study1 2 8 may be useful to gauge the extent to which conditions for implementing an intervention depart from the normal routine. Where provision of additional resources is used as an implementation strategy, this should be made clear, and the study should indicate how such provisions will be sustained and integrated into the system. This principle, conducting research on implementation under real-world conditions, is perhaps one of the most distinguishing characteristics of IR studies.2 8 10 Unfortunately, most IR studies are conducted in more controlled settings, which limits the extent to which learning from those studies can be applied to commonly found conditions.2 8–10 Our review of the literature also suggests that this critical approach is lacking in published IR studies in LMICs. In examining management support, funding and who is doing the implementation, only 26% of the IR studies reviewed either received no additional funding or provided plans for sustaining the additional funding used as an implementation strategy, and implementation was identified as being led by the usual implementing agency in only about 44% of the studies (table 3).

Table 3

Application of key IR principles to IR health studies for large-scale impact in LMICs

Some authors have suggested that implementation efforts and relevant IR studies should be led by dedicated implementation teams (including the key personnel enacting the implementation strategy) under routine conditions.10 55 56 Such teams have been described as critical for maintaining implementation activities under real-world practice and for aiding the scale-up and dissemination of such practices to other settings.10 55 56 The formation of teams is also critical for addressing systemic issues and contributing to sustainability. One example is the Population Health and Implementation Training (PHIT) Partnership in Mozambique, which has built a network of institutions to strengthen primary healthcare and is integrated into the Ministry of Health (MOH) at the subnational level.21 Partners include the provincial health government, academic institutions (local and external), NGOs and the research arm of the MOH.21 Each PHIT partner has a specific technical assistance and/or research role to support provincial health authorities in designing and implementing activities. Another example is the proof-of-concept study in Liberia that assessed the integration of family planning and immunisation. In this case, the MOH worked with an implementing partner NGO and USAID to design a project that would generate lessons to inform implementation at scale. The three partners worked together to engage a broader stakeholder group, identify intervention study areas, pilot the intervention and share results (see Cooper in table 2).33

Research under real-world conditions may also incorporate readiness assessments or involve organisational and system-level changes for optimising the benefit and impact of evidence-informed interventions.56 For example, the BetterBirth project recognised a leadership gap at the health facility level in Uttar Pradesh, India, and its implications for the successful rollout of the SCC intervention.38 The project engaged district-level health leadership and used an application-based technology to motivate organisational and system-level changes, providing real-time data feedback to district teams on observations and the availability of essential birth supplies at health facilities.38 Using these data, the district health leadership were able to strengthen the real-time availability of supplies necessary for essential birth practices.41 The PHIT Partnership in Mozambique has integrated targeted operations research studies to address systems bottlenecks on a continuous basis; these are implemented in partnership with local-level managers.21 Examples of their studies include assessments of the effect of introducing shift work to extend outpatient care clinic hours, and quality improvement activities for reducing loss to follow-up among paediatric patients with HIV.21 These strategies involved organisational changes to improve work processes in support of implementing evidence-informed interventions.

IR conducted under real-world conditions also allows for the identification, and potentially the modification, of incentives faced by different actors. A project that expanded community mobilisation for maternal and newborn health through women’s groups in rural Bangladesh faced frequent turnover of group facilitators, due to unexpected constraints on facilitators’ movement or offers of better employment, compounded by recruitment requirements for minimum levels of literacy. Project implementers were aware of how important familiarity with local communities and customs was to effective facilitation, so they adapted the recruitment criteria and processes to be more flexible and responsive to their context.25 Another example is the STRETCH trial in South Africa, an intervention aimed at supporting nurse-initiated and managed antiretroviral therapy. As part of the trial, clinical support to improve quality of care and self-confidence was offered to nurses administering antiretroviral therapy through several mechanisms, including training, telephone and in-person assistance, and management and logistics support. Unfortunately, the clinical support was insufficient, and nurses felt that STRETCH trainers were unable to fill the gap.24

IR studies should provide evidence on ‘how’ and ‘why’ interventions lead to health impact through use of implementation outcome variables and a pragmatic research paradigm

The primary objective of IR studies is often to answer how and why interventions produce their desired effects on health in a given setting.8 9 There is almost always the assumption that some evidence exists a priori on what interventions have worked for a particular health problem in some (often controlled) setting; this is why they are selected as ‘evidence-informed’ interventions. What is often missing, however, is an understanding of the implementation pathway for these interventions: how they are carried out in real-life contexts to address existing health problems and how they may be used to achieve sustainable impact at scale.7 One way to describe the implementation pathway is to link changes in implementation outcome variables and contexts to specific implementation strategies.7 Implementation outcome variables include measures of implementation fidelity, acceptability of interventions by potential beneficiaries, uptake of the interventions, reach or coverage, and cost.7 8 The changes in implementation outcome variables and contextual factors can in turn be linked to measures of programme effectiveness and impact.7 For example, an evaluation of Brazil’s Bolsa Familia programme (a conditional cash transfer programme to reduce poverty and improve health outcomes among poor families) showed that an increase in the reach of the programme (an implementation outcome variable) was associated with increased utilisation of preventive child health services, which in turn was associated with a decline in postneonatal deaths and infant mortality over a 5-year period.27 The study was able to show the implementation pathway leading to impact for the evidence-informed intervention by explicitly linking changes in implementation outcome variables to indicators of programme impact.
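
To illustrate what ‘linking’ the two steps of such a pathway can look like analytically, here is a minimal sketch using entirely synthetic data: programme reach predicting service utilisation, and utilisation predicting infant mortality. The numbers and the simple ordinary-least-squares setup are illustrative only and are not the analysis of reference 27.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic municipality-level data, loosely in the spirit of the
# Bolsa Familia evaluation; all numbers are made up for illustration.
n = 200
reach = rng.uniform(0, 1, n)                                    # programme coverage
utilisation = 0.3 + 0.5 * reach + rng.normal(0, 0.05, n)        # preventive visits
infant_mortality = 40 - 15 * utilisation + rng.normal(0, 2, n)  # deaths per 1000

def ols_slope(x, y):
    """Slope of y regressed on x, with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# The two links of the hypothesised implementation pathway:
print("reach -> utilisation slope:", round(ols_slope(reach, utilisation), 2))
print("utilisation -> mortality slope:", round(ols_slope(utilisation, infant_mortality), 2))
```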

Our review, however, found that the majority of IR studies from LMICs neither described changes in implementation outcome variables nor described how such changes related to implementation strategies and desired outcomes in a given setting (table 3). While such descriptions may not be necessary for all types of IR studies, they are essential for IR studies intended to yield evidence of large-scale impact. Without such links, IR studies become indistinguishable from more traditional efficacy and effectiveness studies,57 and the utility of IR for fulfilling the dual role of generating knowledge while addressing health problems in real time may not be realised.57

Use of a pragmatic research paradigm and relevant study designs, such as implementation-effectiveness hybrid designs (which combine measurement of clinical effectiveness and implementation outcomes), mixed methods designs (which strategically combine qualitative and quantitative methods) and systems science methods (which characterise nonlinear processes within complex phenomena), allows IR researchers to model the iterative nature of implementation efforts without compromising the rigour of such studies.8 11 12 Such research designs are also useful for modelling changes in the intervention as well as in implementation outcome variables and programme effectiveness in the same study. Our review suggested that mixed methods were indeed commonly used in IR health studies in LMICs; more than one-third of the IR papers reviewed used some form of mixed methods. However, when we examined these mixed method IR studies closely, very few described the explicit mixed method design strategies used,58 which has implications for the rigour and validity of findings from such studies. Furthermore, only 14 of the relevant 791 IR papers we reviewed used other designs informed by a pragmatic research paradigm, such as implementation-effectiveness hybrid designs or systems science methods. One example studied adaptations in the implementation of the Seguro Popular de Salud programme in Mexico (a health financing reform package to accelerate universal health coverage and financial risk protection) using a system dynamics approach, to inform a better understanding of real-life variations in the programme’s implementation in response to different incentives.59 While the impact of the reform on reducing catastrophic health expenditure was previously established,60 the application of a system dynamics approach to the evaluation data provided useful lessons on how the reform was tailored to various contextual factors and on the features of organisational and system-level changes that contributed to the impact observed across settings.61
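
As a flavour of what a system dynamics formulation involves, here is a toy stock-and-flow model of programme enrolment under different enrolment incentives. The model structure, parameters and incentive effect are invented for illustration and bear no relation to the actual Seguro Popular analysis.

```python
# Toy stock-and-flow model: one stock (enrolled population) with an
# incentive-dependent inflow and a constant-rate outflow, integrated
# month by month. All parameters are invented for illustration.
POPULATION = 1_000_000
BASE_UPTAKE = 0.02   # monthly enrolment rate under routine incentives
DROPOUT = 0.005      # monthly disenrolment rate

def simulate(incentive, months=120):
    """Return the enrolled-population trajectory over `months` months."""
    enrolled, trajectory = 0.0, []
    for _ in range(months):
        inflow = BASE_UPTAKE * (1 + incentive) * (POPULATION - enrolled)
        outflow = DROPOUT * enrolled
        enrolled += inflow - outflow
        trajectory.append(enrolled)
    return trajectory

low, high = simulate(incentive=0.0), simulate(incentive=1.0)
print(f"coverage after 10 years: {low[-1] / POPULATION:.0%} vs {high[-1] / POPULATION:.0%}")
```

The point of such a model is not the specific numbers but the feedback structure: enrolment slows as the unenrolled pool shrinks, so the same incentive produces different marginal gains at different stages of scale-up.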

IR studies should be fit for purpose, balancing the need to address immediate implementation problems with support for broader and longer term learning

Whereas IR studies conducted under real-world conditions may be more responsive to immediate implementation problems, lessons learnt from such studies may not always be readily generalisable to support broader and longer term learning, a common trade-off in IR studies.61 Suggested approaches for balancing this trade-off include the use of specific research designs (as described above) that allow IR studies to adapt to and study particular real-world conditions while still ensuring that research parameters are studied and analysed objectively.8 11 12 Others have suggested that well-designed studies conducted within multisetting and multicountry contexts help to balance this trade-off and improve the generalisability of IR study findings.56 61 For example, a feasibility study evaluating the potential for establishing a platform to bring together relevant health data from the public and private sectors, to promote data use at the district health level, was initially implemented in five different districts across India, Nigeria and Ethiopia.62 This study captured potential facilitators of and barriers to the introduction of the platform and provided generalisable learning on the lack of standardised processes for data-based decision-making at the district level across all three countries.62 Less than 10% of the studies we reviewed were conducted within multiple settings and multicountry contexts, which may reflect the financial and logistical constraints of undertaking such complex research efforts.4 While it may be difficult, and sometimes inappropriate, to answer relevant IR questions using multiple settings and countries, the need to optimise the trade-off between internal and external validity of IR studies is another reason to use adaptive research designs.8 11 12 56

IR requires multistakeholder collaboration and partnerships

Collaboration and partnerships among multiple stakeholders (such as academics, implementers, policy-makers and donors) across various influence domains (research, programme, policy and funding) are important for any IR enterprise seeking to achieve large-scale impact.1 2 4 5 8 10

Aligning research, programme, policy and funding cycles has been suggested as critical to addressing health problems at scale using IR, especially in LMICs.2 Such alignment is often more easily accomplished when there is collaboration and partnership to facilitate decisions and cross-learning among the actors involved in these various cycles.2 One example is a community mobilisation project of women’s groups in Bangladesh, which made considerable efforts to engage and build partnerships with local government and NGOs, as well as with community stakeholders involved in the women’s groups, such as traditional birth attendants and community health workers.25 These efforts were made in recognition that trust was needed to ensure an effective intervention, but also to eventually address service gaps and influence policymaking.25 Similarly, an intervention to enhance social capital in Nicaragua worked with governmental, non-governmental and community stakeholders at the project’s outset to identify, based on their collective experience, the communities to be targeted in the programme, although the extent of further interaction between the stakeholders was unclear.40 In the integrated family planning and immunisation project in Liberia, project team recommendations were supplemented by additional recommendations from the MOH, suggesting strong engagement from policy-makers.33 The degree of alignment may also reflect the extent of embeddedness of such IR activities within context-specific programmes, practice or policy, and may indicate the policy and practice relevance of findings from such IR activities.34 Some teams develop relationships, either by design or organically over time, that put researchers in a position to support implementers’ roles and activities. For example, Ngana and colleagues point out how, by implementing a participatory action research model to address health reporting challenges, health staff developed problem-solving skills that allowed them to better respond and adapt to systemic challenges.30 Under the STRETCH trial in South Africa, the trial coordinator contributed to improving communications between clinical staff, strengthening relationships between physicians, nurses and other staff that were critical to successful task-shifting.24 However, our review showed that efforts to involve key stakeholders across various domains of influence are still infrequent, with only 16 out of the 28 articles included for in-depth review reporting on the policy or practice implications of their findings (table 3).

Reporting of key characteristics of IR studies

Our review uncovered a lack of consistent reporting of implementation characteristics, including adequate descriptions of the context and the intervention itself, deviations from planned interventions and changes in implementation outcome variables. Several authors have noted that implementation of evidence-informed interventions in health has been hampered by the lack of adequate reporting of IR studies.16–18 As a result, implementers have difficulty in translating published IR findings into practice, and the findings of such research cannot be easily used outside their original settings.16 17 Adequate reporting of implementation strategies has been particularly emphasised in the literature because of its centrality to implementation science.17 Indeed, some authors have referred to the implementation strategy as the ‘intervention’ in IR studies,8 11 and several guidelines have been proposed on how to report implementation strategies and IR studies.16–18 All of these guidelines recommend that an implementation strategy be described in detail, along with the implementation outcome variable that describes the proximal effect of such a strategy.16–18

Other key aspects proposed for reporting include a description of the background (including implementation gaps and the supporting evidence for selecting an intervention), the settings and contexts (including relevant factors within the internal and external environments of the implementing organisations and beneficiaries) and how these factors changed over time to facilitate or hinder implementation activities.16 Such information may be required to inform practitioners on how to adapt the strategy or implementation activities to their own local contexts. It is possible that issues such as journal word limits and writing requirements restrict what is included in publications.16 Reporting of IR studies could be improved if deliberate attention were paid to highlighting key aspects of IR within journal constraints. Our review suggested that articles published in certain journals (eg, Global Health: Science and Practice, PLoS One, Implementation Science) had a higher probability of reporting more completely on the implementation descriptors assessed in this study, suggesting that journal requirements may be a major factor in determining the extent to which key IR characteristics are reported. Our review further suggested that the involvement of multiple stakeholders in an IR study was associated with more complete reporting on the implementation descriptors assessed. For example, articles involving academics, CSOs, government and donors had the highest percentage of reporting on implementation outcome variables (69%), much higher than observed with single-stakeholder or other multistakeholder groups, for example, academics/CSO/government (51%), academics/CSO (50%), academics/government (57%) and academics/donor (67%).
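
The stakeholder comparison above is essentially a grouped proportion; a short sketch of how such rates could be computed from a review extraction sheet is shown below. The data frame contents are invented placeholders, not the review’s actual data.

```python
import pandas as pd

# Hypothetical extraction sheet: one row per article, the stakeholder
# combination involved, and whether implementation outcome variables
# were reported. All rows are invented placeholders.
articles = pd.DataFrame({
    "stakeholders": [
        "academics/CSO/government/donor", "academics/CSO/government",
        "academics/CSO", "academics/government",
        "academics/donor", "academics/CSO/government/donor",
    ],
    "outcomes_reported": [True, True, False, True, True, True],
})

# Proportion of articles reporting implementation outcome variables,
# by stakeholder combination.
rates = (articles.groupby("stakeholders")["outcomes_reported"]
                 .mean()
                 .sort_values(ascending=False))
print(rates.to_string())
```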

Conclusion

IR holds promise for supporting large-scale impact of health interventions in LMICs, but to maximise its potential, more IR work needs to take place under usual management conditions, employ a pragmatic research paradigm and address critical implementation issues such as scale-up and sustainability as part of complex systems. The increasing number of IR publications from LMICs is indicative of the promise and usefulness of IR approaches in addressing critical health problems in the developing world. Better reporting of IR studies in LMICs is needed, more clearly describing the strategies, contexts, concepts, methods and outcomes of IR activities, and more peer-reviewed journals should encourage such reporting. More high-quality peer-reviewed IR publications are needed to reflect the full range of problems, contexts, methods and results, which will serve to enhance learning and implementation of health interventions more widely.

References

  1.
  2.
  3.
  4.
  5.
  6.
  7.
  8.
  9.
  10.
  11.
  12.
  13.
  14.
  15.
  16.
  17.
  18.
  19.
  20.
  21.
  22.
  23.
  24.
  25.
  26.
  27.
  28.
  29.
  30.
  31.
  32.
  33.
  34.
  35.
  36.
  37.
  38.
  39.
  40.
  41.
  42.
  43.
  44.
  45.
  46.
  47.
  48.
  49.
  50.
  51.
  52.
  53.
  54.
  55.
  56.
  57.
  58.
  59.
  60.
  61.
  62.

Footnotes

  • Handling editor Seye Abimbola

  • Contributors OA conceptualised the paper, led the review, was the principal author and managed all revisions of drafts of the paper. DCR contributed to the review and writing drafts of the paper. DHP provided guidance to the review and writing drafts of the paper. NB, EG, LR and DHP revised the paper for intellectual content.

  • Funding The study has been funded by USAID, Alliance for Health Policy and Systems Research, and the World Bank.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement All data relevant to the study are included in the article or uploaded as supplementary information.