Abstract
Introduction:
There is a shortage of individuals trained in using quantitative methods in biomedical research in sub-Saharan Africa (SSA). Improving public health in SSA requires new ways to promote quantitative knowledge and skills among faculty in biomedical research and better-integrated network systems of support.
Methods:
We describe the development, implementation, and evaluation of an innovative faculty training and support program in SSA from December 2017–June 2020, using courses in monitoring and evaluation, data management, and complex surveys as prototypical examples. Indicators were selected to follow the 4 levels outlined in the Kirkpatrick evaluation model: reaction, learning, behavior, and results. We used survey data from faculty fellows and students and reported median change and interquartile ranges (IQR).
Results:
The training program created an international community of 26 faculty members working collaboratively to lead the training of 3 quantitative methods courses. The program increased faculty members’ knowledge of the course content (median increase 17 percentage points [IQR: 0, 20]). Faculty members, in turn, trained 380 students at institutions of higher education in 8 SSA countries (Botswana, Ethiopia, Ghana, Nigeria, Rwanda, South Africa, Tanzania, and Uganda).
Conclusion:
The program relied on collaborative funding from participating institutions and focused on individual capacity-strengthening. In the future, the program will be scaled to include other emerging areas, such as data science; will integrate institutional support and feedback; and will move some of the training and mentoring activities to an online platform. Finally, to ensure that faculty gain improved competence alongside improved confidence, future iterations of the program will include competency evaluations at the start and end of training and will pair fellows who need additional training with those who excelled to co-teach.
INTRODUCTION
There is a dearth of trained quantitative researchers in higher education in sub-Saharan Africa (SSA).1 While evidence suggests an increase in research outputs and health-related publications from authors in SSA, this comes from a small number of countries, and authors from high-income countries still dominate most public health publications that use data from SSA.2–4 This perpetuates power imbalances, as decisions made about defining research questions, what data to collect, and how to analyze these data are made by individuals removed from the communities addressed. Increased epidemiology and public health training have been associated with increased research output.4
Past efforts on capacity-strengthening in public health in SSA emphasized statistical training, developing institutional capacity to conduct health research, and improving fragile health systems.4,5 Other initiatives focused mainly on knowledge generation within universities and research networks, with little attention to designing questions that resonate with national policy or to individual talent development as an initiative’s primary objective.6 Recent initiatives, such as operational research in Rwanda, emphasize training and mentoring researchers to lead research projects from initial idea to publication of research findings.7 Moreover, countries such as South Africa now lead in research investments to improve local research capacity and strengthen international research collaboration and national research systems.3,4,8
Redesigning capacity-strengthening and training initiatives in public health in SSA is necessary and timely to promote globally competitive research and address the region’s public health challenges.1,3,4 These types of initiatives point the way for future faculty training programs and collaborations that build in continuity and sustainability elements for strengthening biostatistical capacity in public health in low- and middle-income countries.
We describe a faculty training program that aimed to support junior faculty in SSA in improving their pedagogical abilities, confidence in teaching, and content knowledge specific to quantitative methods courses and in developing a cross-institutional, cross-country peer network that can support future teaching and research collaborations. By providing didactic and experiential learning, the program was meant to improve faculty’s ability to teach students and implement these courses at their institutions, leading to improved capacity in quantitative methods for their students. In addition, the collaborations aimed to foster research proposed and conducted by the faculty, ultimately improving the quality and quantity of health research led by researchers in SSA. We describe the approach and outcomes and share lessons learned while implementing curricula for 3 courses with faculty from universities across 8 countries (Botswana, Ethiopia, Ghana, Nigeria, Rwanda, South Africa, Tanzania, and Uganda) in SSA.
FACULTY TRAINING PROGRAM DESCRIPTION
From 2017–2020, the McGoldrick Professional Development Program in Public Health offered 3 in-person faculty training courses on quantitative methods for monitoring and evaluation (M&E), methods for data management with software application (DM), and complex survey analysis (CSA). In 2021, the program offered a virtual faculty training course on health data science (HDS). Faculty (including postdoctoral fellows, instructors, and lecturers) who had experience with the course content area and could teach the course at their home institution in sub-Saharan Africa were eligible to apply for the program as faculty fellows. The application required a letter of recommendation from a faculty mentor that needed to include a statement of institutional support to offer the course at their institution during the following academic year.
Initial Training Model
M&E was the first course offered by the professional development program. The initial model consisted of a 1-week in-person workshop in Boston, USA, in December 2017, during which time the fellows met with senior faculty at Harvard T.H. Chan School of Public Health and were introduced to the M&E course syllabus, reviewed complex course content, and received pedagogical training. As part of the pedagogical training, faculty fellows were given opportunities to do a practice teaching session and receive feedback from their colleagues and senior faculty. After this workshop, the course director (who was based at Harvard T.H. Chan School of Public Health) traveled to each of the fellows’ home institutions to assist them in implementing the first round of the course for their students.
Model Shifted to Center in sub-Saharan Africa
After implementation of the M&E course, discussions were held between programmatic staff and faculty fellows about changes that could be made to the model for future courses. For the DM and CSA courses, the model was adapted to be centered in SSA and increase opportunities for international collaboration. Training and implementation occurred in 4 phases over 10 months: an online training platform, an in-person workshop, an inaugural course co-taught by all faculty trainees at the host institution, and home institution implementation supported, when possible, by the faculty fellows from other countries either in-person or virtually.
Through the online training platform, faculty fellows were introduced to course content that included 5 modules for each course. Each module included a prerecorded lecture, a hands-on activity, and a knowledge check. Depending on their preexisting knowledge of the course content, this phase took approximately 10–40 hours to complete. Feedback was provided in real time from the course director. In addition, faculty fellows were introduced to each other and communicated and exchanged ideas and experiences through online discussion boards.
Subsequently, the faculty fellows convened for a week-long in-person workshop in Rwanda (DM in 2019) or Ethiopia (CSA in 2020). During this workshop, activities included practice teaching with peer feedback, codevelopment of course assignments, and learning the more complex aspects of the course content.
Then, the faculty fellows co-taught the course to students at the host institution in Rwanda for DM and Ethiopia for CSA in what we refer to as the “inaugural course.” This phase gave them experience teaching the content in a low-risk environment where they had support from the course director and other fellows. At the end of the week-long inaugural course, the faculty fellows participated in a facilitated group discussion to reflect on the strengths of the course implementation, challenges encountered, and support they anticipated needing to successfully implement the course at their home institutions.
Finally, faculty fellows returned to their home institutions to adapt the course to their local context and then offer it to students at their institution. During this phase, faculty fellows from other institutions traveled to support faculty fellows’ implementation of the course. For example, a fellow from Uganda traveled to Botswana to support the implementation of the DM course.
Model Shifted to Accommodate COVID-19 Pandemic Travel Restrictions
In 2021, for the HDS course, institutional travel restrictions during the COVID-19 pandemic prevented implementation of the previous training models. Therefore, we implemented a modified approach with asynchronous training via the online platform, followed by a week-long virtual workshop, and finally, a week-long inaugural course that was held virtually.
Evaluation Methods
Program evaluation indicators were selected to follow the 4 levels outlined in the Kirkpatrick evaluation model: reaction, learning, behavior, and results.9 For the HDS course, only reaction indicators and programmatic data were collected. We used data from surveys conducted with the fellows before and immediately following the training to measure their reaction and learning, and data collected in May 2020 (4 months–2.5 years after completion of training) to determine behavior and results. To measure results from the students’ perspectives, we conducted a survey in June 2020 with students who took the M&E course in 2018. Data were collected using SurveyCTO and analyzed using Stata version 17 (StataCorp, College Station, TX, USA). Indices for knowledge, perceived knowledge, and confidence were scaled to range from 0 to 100. We reported descriptive statistics for each indicator of interest and then used the Wilcoxon matched-pairs signed-rank test to assess the changes in confidence and knowledge from before to after the in-person training. We used thematic analysis to analyze open-text responses.
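The pre–post comparison described above can be sketched as follows. This is an illustrative example only: the actual analysis was conducted in Stata 17, and the scores below are hypothetical values for a knowledge index scaled 0–100, paired by fellow.

```python
# Illustrative sketch of the Methods analysis (the actual analysis used
# Stata 17): median pre-post change with IQR, plus a Wilcoxon
# matched-pairs signed-rank test on paired fellow scores.
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical knowledge-index scores (0-100 scale) for the same
# fellows surveyed before and immediately after the training.
pre = np.array([60, 55, 70, 65, 50, 75, 60, 68])
post = np.array([77, 75, 72, 85, 70, 90, 80, 88])

# Descriptive statistics: median change and interquartile range
change = post - pre
median_change = np.median(change)
q1, q3 = np.percentile(change, [25, 75])

# Wilcoxon matched-pairs signed-rank test on the paired differences
stat, p = wilcoxon(pre, post)

print(f"Median change: {median_change} percentage points (IQR: {q1}, {q3})")
print(f"Wilcoxon signed-rank p-value: {p:.3f}")
```

The Wilcoxon test is appropriate here because the paired differences come from a small sample and need not be normally distributed.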
Ethical Approval
This evaluation received non-human subjects determination from the Harvard T.H. Chan School of Public Health.
RESULTS
Faculty Fellow Training
A total of 26 faculty fellows representing institutions of higher learning in 8 African countries (Botswana, Ethiopia, Ghana, Nigeria, Rwanda, South Africa, Tanzania, and Uganda) participated in the in-person faculty training program from 2017 to 2020, with 7 to 11 fellows trained each year. Fellows were trained to teach either 1 course (18 fellows), 2 courses (5 fellows), or 3 courses (3 fellows). Fellows included junior researchers (4; 15%), junior faculty (11; 42%), and senior faculty (11; 42%) who had demonstrated interest in public health research and education. Of the fellows, 18 (69%) were male and 8 (31%) were female.
The fellows’ reaction to the training was positive, with 96% saying they would definitely recommend the workshop training to another faculty fellow (Table 1). For fellows in either the DM or CSA course, the median increase in confidence in pedagogy was 29 percentage points (IQR: 20, 44); median increase in confidence in teaching the course content area was also 29 percentage points (IQR: 19, 42); and median increase in fellow’s knowledge was 17 percentage points (IQR: 0, 20). Fellows’ confidence in teaching the course content after the training was not associated with their knowledge level of the course content after training.
Indicators of Training Reaction, Learning, Behavior, and Results Among the Faculty Fellows
As a result of participating in the faculty training program, faculty fellows increased their confidence in pedagogy and teaching the course content, as well as their knowledge.
In open-ended responses after the training, many fellows commented on a desire to continue collaborations after the program had finished.
I would like this cohort of trainees to remain in a network and collaborate on more projects.
I would suggest we keep helping each other out especially with regards to the data collection software.
Several fellows either suggested that the length of the training program be extended or that support be continued after the workshop was completed. Many fellows commented on how much they had learned.
I learned a lot. Even if the primary goal of the training is data management using software application, I learned so much about teaching skill.
Student Training
In subsequent offerings of the courses at fellows’ home institutions, fellows reached 380 students, with in-person support from either the course director or another faculty fellow in 12 of 19 offerings (63%). They adapted the content to fit their context by changing the number and length of sessions, sometimes incorporating the content into an already existing course or removing some content from the curriculum; providing additional instruction time for specific sessions to meet the needs of the student population; and including local examples. When asked what support they needed to implement the course at their university, nearly every fellow requested that one of the other fellows co-teach the course with them. COVID-19 pandemic changes to education affected the implementation of the courses after March 2020. The HDS course did not receive support from the McGoldrick program to have fellows travel to co-teach, and the fellows did not implement the HDS course at their home institutions within the year following the training and workshop.
Faculty fellows from 5 of the 6 institutions offering the M&E course emailed an anonymous follow-up survey to students approximately 1 year after course completion. Of the 150 students contacted, 62 completed the survey (response rate: 41%). Students who took the M&E course found it to be useful and interesting (Table 2). They described many ways in which they incorporated what they learned into their work. Some common themes included conducting and interpreting quantitative analysis, using Stata software, formulating research questions and hypotheses, improving sampling for evaluation, and developing project indicators. Six students said they had not incorporated changes, citing either not having an opportunity to do so or not having sufficiently advanced or practical skills to do so. In the additional comments section of the survey, the most common suggestion on how to improve the course was to extend its length. Other common themes included incorporating more practical examples with in-class time to practice and apply learning and using additional kinds of analytic software.
Survey Results From Students Participating in the First Round of Monitoring and Evaluation Courses Offered at Faculty Fellows’ Home Institutions
Development of Institutional and Cross-Country Collaborations
The professional network built by the program has led to additional collaborations for grant applications between faculty fellows. In one, faculty members from Harvard University and the University of Ghana collaborated to develop a Master of Science in Biostatistics program at the University of Ghana. Others included a data science training program in SSA and a needs assessment to develop a training program in tuberculosis epidemiology and biostatistics in SSA.
Implementation Costs
Costs for each model of implementation varied widely and were largely driven by flights and lodging for the in-person sessions. The time of the program coordinator and faculty was provided in kind and is not included in the following figures. The initial model based in Boston was the most expensive, as each fellow flew from SSA to Boston, at an average cost per fellow of US$3,237. The SSA-centered model only required 2 trainers to fly between continents for the training, while all fellows flew within SSA. Although the in-person training lasted 1 week in the initial model and 2 weeks in the SSA-centered model, total lodging prices were similar, given the different costs between countries. Despite the longer length of training, costs for the SSA-centered model were still lower than for the initial model, at US$1,948 per fellow in the first year and US$2,858 per fellow in the second year. For the final phase of the model, in which fellows taught the course at their home institutions, costs were again significantly lower for the SSA-centered approach because flights were within the continent instead of from Boston (US$2,409 versus US$3,377). The cost of the all-virtual model included only the time of the program coordinator and faculty teaching the course.
DISCUSSION
This faculty training program intentionally evolved to center the program within SSA by providing in-person training at a host institution in SSA and teaching support from within the network of African faculty instead of faculty from the United States. These efforts helped shift intellectual and logistical decision-making to local institutions and further supported the growth of a peer network.
The program has successfully graduated 26 fellows who continue to work collaboratively. The program combines evidence-based training techniques, including online education, group discussions, didactic and experiential learning, and network support.10 The pedagogical skills, improved confidence in teaching, and collaboration/support network are likely to reach beyond the individual courses offered.
Challenges and Opportunities
There were several challenges and opportunities with the program. Because the program focuses on individual faculty, it had little impact on structural barriers to faculty growth and implementation at their home institutions. There were 2 opportunities to affect structural barriers within the current program. First, faculty who traveled to another site for the training were provided a reprieve from administrative tasks at their home institution and were better able to focus on the training. Second, in the application, we required a letter of institutional commitment to support the fellow in offering the course at the home institution. We saw this commitment during implementation in the form of protecting faculty time, providing physical space and computer facilities to offer the course, and providing administrative support for implementation.
Another identified challenge is that some faculty fellows may have had increased confidence in teaching specific course content even if they did not have the knowledge to effectively teach. If these faculty are not adequately supported, this could lead to poor training of students at their home institution.
In our evaluation, we had a low survey response rate from students. If students who gained more from the program were more likely to respond, this may have biased our results to look more favorable.
One additional opportunity for faculty education programs such as this one is to assist in closing gender inequities in higher education. The gender distribution of faculty fellows in this project reflects that of the programs from which they were drawn, but the program missed an opportunity to recruit faculty through a stratified approach that would have ensured gender equity in this training opportunity.11
A final opportunity identified through the evolution of the model to be centered in SSA was an opportunity to combine research and evaluation with training opportunities to reduce costs. The faculty teaching the CSA course were already traveling to Ethiopia for research and able to offer the course without the cost of an additional trip. Future programs could further capitalize on existing research and evaluation programs to host training opportunities among research faculty.
Next Steps
The materials for the 4 courses are now available to researchers and educators. A critical aspect of the implementation model for all courses was teaching support for faculty fellows when they returned to their home institutions. In future iterations, the program is exploring ways to work with home institutions to protect faculty fellows’ time from administrative tasks during this period. Learning from this program and the success of the Demographic and Health Surveys Faculty Fellows Program,12 our program will support faculty fellows trained from the same institution, together with a common senior leader at the institution, to work together to implement their courses.
The program will continue to move forward with an SSA-centered model with further focus on region-specific training and support. In addition, plans for the program include moving some of the training and mentoring activities online to reduce costs and benefit more users in SSA. However, while online learning significantly reduced the costs, we found that it also led to lower implementation of the training at fellows’ home institutions. Fellows in the online course had fewer opportunities to network with each other, resulting in fewer collaborations on research and grants external to the training program. We believe that moving the program entirely online is unlikely to be successful.
To ensure that faculty leave the training with both improved confidence and linked improvement in competence, future program iterations will include additional pre-training for fellows entering without sufficient content knowledge and additional training and support for fellows completing the course without sufficient content knowledge. Finally, fellows will be intentionally paired with others to balance individual strengths and weaknesses to co-teach when the course is offered at their home institution.
While current approaches to individual capacity-strengthening for quantitative methods often rely on brief, 3- to 5-day workshops focused on course content, this evaluation demonstrated that longer, more involved capacity-strengthening programs are likely to be effective. They offer the additional benefits of building a network of educators and researchers and providing time and opportunity for pedagogical training.
CONCLUSION
This faculty fellow training program has demonstrated that it is possible to create an international capacity-strengthening program to support faculty and develop leaders for quantitative methods expertise at institutions of higher learning in SSA. With suitable investments, this faculty training program can provide an adequate supply of highly skilled academics in SSA to deliver training in quantitative topics. Finally, home institution engagement strategies will be important to ensure individual fellows’ learning and future practice to implement the teaching of quantitative research at institutions of higher education in SSA.
Acknowledgments
The McGoldrick Professional Development Program in Public Health operates under the aegis of the Harvard T.H. Chan School of Public Health (HSPH) in collaboration with the Africa Research Implementation Science and Education (ARISE) Network. We thank the McGoldrick Program, HSPH, and the ARISE network for their support. We also thank the faculty fellows who contributed their time to this program; capacity-building would not be possible without your dedication to teaching. We are also indebted to all the faculty and teaching assistants who supported the faculty training, including Christopher Sudfeld, Anna Gage, Oyetundan Adediran, and Jonathan Oyebamiji Babalola. Programs like this also cannot run without administrative support, and for that we are grateful for Priti Thareja, Megan Scott, Mukamisha Donatienne, and Meskerem Teshome. Thank you to the program mentors and curriculum developers: Bethany Hedt-Gauthier, Wafaie Fawzi, Charles Ruranga, Henry Mwambi, and Dana Thomson. Finally, thank you to Lily Schneider for her support in submitting this article.
Funding
This program was supported by funding from Mr. John McGoldrick, ACE-DS through an ACEII World Bank grant, and the Partnership for African Social and Governance Research through grants from Strategic Partnerships for Higher Education Innovation and Reform and the United Kingdom Foreign, Commonwealth and Development Office.
Author contributions
Oleosi Ntshebe: conducted the primary analysis, writing–original draft. Sarah Anoke: developed and led the implementation of the training courses and evaluation data collection. Jesca M. Batidzirai: advised analysis. Chris Guure: advised analysis. Beatrice Muganda: led the theory for the educational evaluation. Marcello Pagano: obtained funding for the program, conceived of the original program design, oversaw all program implementation, developed and led the first training course, mentored co-authors. Muhammed Semakula: advised analysis. Elysia Larson: conducted the primary analysis, writing–original draft, developed and led the implementation of the training courses and evaluation data collection. All authors critically revised the article and approved the final version.
Data availability
Details about the topics covered in each course are available upon request to the corresponding author.
Competing interests
SA, MP, EL have received funding support from the McGoldrick program.
Notes
Peer Reviewed
First Published Online: January 16, 2025.
Cite this article as: Ntshebe O, Anoke S, Batidzirai JM, et al. Building public health quantitative methods capacity and networks in sub-Saharan Africa: an evaluation of a faculty training program. Glob Health Sci Pract. 2025;13(1):e2200570. https://doi.org/10.9745/GHSP-D-22-00570
- Received: August 5, 2023.
- Accepted: December 5, 2024.
- © Ntshebe et al.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly cited. To view a copy of the license, visit https://creativecommons.org/licenses/by/4.0/. When linking to this article, please use the following permanent link: https://doi.org/10.9745/GHSP-D-22-00570