Weak monitoring and evaluation (M&E) systems and limited supply of M&E human resources in Africa signal the need to strengthen M&E capacity.
This exploratory study evaluated the effect of short course training on professionals’ knowledge and skills in the areas of mixed methods research, systematic review and meta-analysis and general principles of M&E.
A partially mixed concurrent dominant status design including quantitative (multilevel modelling and meta-analyses) and qualitative (thematic content analysis) components was employed to evaluate the impact of a 4-day short course training focusing on these areas.
Thirty-five participants attended the training. Participants experienced an increase in knowledge in all three areas; however, the average change in knowledge did not differ across participants’ employment settings. Participants’ self-stated objectives that were SMART and belonged to a higher level of Bloom’s taxonomy were associated with change in knowledge. Based on participants’ comments, the majority intended to apply what they learned to their work; clarity of content delivery was the most liked aspect of the training, and more practical sessions were recommended as a way to improve it.
This study provides preliminary evidence of the potential of short course training as an approach to strengthening M&E capacity in less-developed countries such as Kenya. It underscores the importance of participants’ self-stated objective(s) as an element to consider in enhancing the knowledge, attitudes and skills needed for effective capacity building in M&E.
Effective monitoring and evaluation (M&E) is pivotal for good governance and public resource management because it promotes transparency, accountability and a performance culture (AFDB
The literature on continuing professional development in M&E identifies practice-based learning activities for enhancing professional competence and lately links it to various strategies for lifelong learning (Kuji-Shikatani
Over the last 10 years, Kenya has made significant progress in strengthening M&E capacity. A number of training programmes have been established by universities or colleges, research institutions and development partners. Nine universities (five public and four private) currently offer master’s level training in M&E. International organisations and communities of practice such as
For most M&E professionals, who tend to have limited time to attend formal training, short courses offer a more flexible route to continuing professional development.
The primary purpose of this study was to evaluate the effect of short course training on professionals’ knowledge and skills in mixed methods research (MMR), systematic review and meta-analysis (SRMA) and general principles of M&E.
A team of five trainers from the University of South Florida, University of Nairobi, ICF, MEASURE Evaluation PIMA, Children of God Relief Institute and Africa Capacity Alliance conducted a 4-day short course consisting of three interrelated segments: MMR, SRMA and general principles of M&E. The purpose of the training was to strengthen participants’ practical M&E skills, focusing on MMR, SRMA and statistical data analysis.
The training consisted of 16 modules. Each day (08:30–17:30), four modules were covered, each lasting about 2 h, with a 30-min morning tea break, a 1-h lunch break and a 15-min evening tea break. A typical session consisted of an interactive PowerPoint presentation with numerous practical examples, questions and answers, practice using different analysis software with trainers present for consultation, requests for verbal feedback from participants, sharing of real-life examples (e.g. two doctoral students presenting summaries of their dissertation work, which was based on an MMR design) and reinforcement of key concepts at the end of each session. Moore, Green and Gallis’ conceptual model for planning and assessing continuous learning guided the structuring of the training (Moore, Green & Gallis
This training was unique in several respects. Firstly, it incorporated hands-on use of multiple software packages (RevMan, Stata, R, Excel and CMA) to perform different analyses, thus affording participants the opportunity to practise what they learned before returning to their workplaces. Secondly, rather than waiting until the end of the training, participants provided feedback during it, which informed readjustment of the training to be more relevant to their needs. Thirdly, participant diversity in terms of discipline encouraged interaction, as all shared a common interest in M&E. Fourthly, unlike most trainings, in which trainers all come from the same institution, ours was a multidisciplinary team from different collaborating institutions: a Carnegie African Diaspora Fellowship Program (CADFP) fellow from the University of South Florida in the United States; a CADFP host from the School of Mathematics, University of Nairobi; and M&E experts from capacity development partners in Kenya (ICF, MEASURE Evaluation PIMA, Children of God Relief Institute and Africa Capacity Alliance). The composition of the training team was informed by a study indicating that collaboration between international partners and African institutions or between in-country institutions and organisations keen on ECB was a promising strategy for enhancing M&E capacity in Africa (Tarsilla
To evaluate the effect of the short course on participants’ knowledge and skills in the areas of MMR, SRMA and principles of M&E, a partially mixed concurrent dominant status design was employed.
Two questions were addressed in the
Data for this study were collected as part of a larger project,
The training targeted both pre- and in-service M&E professionals, including early-career faculty and graduate students interested in enhancing their programme evaluation skills and practitioners in health, social and behavioural science research from the public and private sectors. Notably, participation in this study was motivated by participants’ desire to enhance their knowledge and skills in M&E in general and specifically in two areas: MMR and SRMA. About one-third of the participants were funded by their institution. The remaining participants were either self-funded (non-students) or received a waiver (students) from the host institution.
At the end of the training, participants were emailed a link to the evaluation survey and requested to complete it using either their laptops or smartphones. The survey, developed in
Change in knowledge, the outcome of interest, was assessed by having participants rate their level of knowledge about the content covered before and after the training.
The survey included open-ended items requiring participants to state their objective for attending the session, changes in practice they intended to make following the training (or perceived barriers to making the changes), aspects of the training they liked the most, topics they suggested for future training and recommendations on how to improve the training. Responses to these items constituted the qualitative data. Qualitative data were quantitised to aid integration with quantitative data. For example, self-stated objectives were examined for the degree to which they constituted a SMART objective (Doran
To model change in knowledge, we considered the clustering of participants (level-1 unit of analysis) within employment settings (level-2). Acknowledging the nested nature of the data meant that we did not assume the outcome was invariant across employment settings (academic vs. government agency vs. non-governmental organisation). Such an assumption may lead to incorrect conclusions being drawn from the resulting inferential statistics (Raudenbush & Bryk
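To make the unconditional means model concrete: a multilevel model of this kind would typically be fitted with specialised software (e.g. Stata or R, as used in the training). The pure-Python sketch below instead uses the simpler one-way ANOVA method-of-moments estimator on simulated data (the setting names, group sizes and effect sizes are illustrative assumptions, not the study’s data) to show what the between-setting variance τ00, within-setting variance σ² and the intra-class correlation measure.

```python
import random
import statistics as stats

random.seed(42)

# Simulated (hypothetical) change-in-knowledge scores for participants
# nested within three employment settings -- not the study's data.
data = {
    "academic": [random.gauss(1.3, 0.9) for _ in range(12)],
    "government": [random.gauss(1.1, 0.9) for _ in range(12)],
    "ngo": [random.gauss(1.4, 0.9) for _ in range(12)],
}

# Method-of-moments (one-way ANOVA) estimates of the variance components
# of an unconditional means model: sigma2 (within) and tau00 (between).
n = 12                                    # participants per setting (balanced)
k = len(data)                             # number of settings
grand_mean = stats.mean(x for xs in data.values() for x in xs)

ms_within = stats.mean(stats.variance(xs) for xs in data.values())
ms_between = n * sum((stats.mean(xs) - grand_mean) ** 2
                     for xs in data.values()) / (k - 1)

sigma2 = ms_within
tau00 = max(0.0, (ms_between - ms_within) / n)  # truncate negative estimates
icc = tau00 / (tau00 + sigma2)
print(f"sigma2 = {sigma2:.3f}, tau00 = {tau00:.3f}, ICC = {icc:.3f}")
```

A small ICC indicates that most of the variation in change in knowledge lies amongst participants within settings rather than between settings, which is the pattern reported in the results.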
A total of 35 participants from diverse backgrounds participated in the training (43% women; 31% students, 21% faculty members, 41% in-service M&E professionals; 31% from academic institutions, 31% from public institutions and 31% from NGOs). Participants took an average of 24 min (range: 9–56 min) to complete the survey. Although all participants were expected to attend all modules, a few were about 5–10 min late for some modules.
The results of the meta-analysis showed an overall statistically significant increase in mean change in knowledge following the training [standardised mean difference (SMD) = 1.60, 95% CI: 1.03–2.17] (
Effect of short course training on knowledge (
The results of the unconditional means model showed that for each segment, the average change in knowledge did not differ across employment settings (τ00); however, there was significant variation (σ2) amongst participants within employment settings (
Parameter estimates and standard errors for modelling change in knowledge following short course training (
Parameter estimate or factor | MMR | SRMA | M&E |
---|---|---|---|
Unconditional model (no factor included) | | | |
Change in knowledge across settings (τ00) | 0.0185 | 0.1703 | 0.0541 |
Variance within employment setting (σ²) | 0.7764* | 0.6930* | 0.9110* |
Intra-class correlation (ρ) | 0.023 | 0.197 | 0.056 |
Average change in knowledge | 1.408* | 1.271* | 0.891* |
Model with other factors added | | | |
Intercept | -4.255 (2.08) | -1.561 (0.28) | 0.143 (2.08) |
I achieved my objective(s) | -0.309 (0.49) | -0.576 (0.48) | -0.230 (0.26) |
SMART objective | 0.839 (0.34)* | -0.591 (0.42) | -0.484 (0.29) |
Higher level Bloom’s taxonomy ‘objective’ | 1.062 (0.39)* | 0.545 (0.41) | -0.741 (0.36) |
Information was conveyed effectively | 1.479 (0.48) | 1.403 (0.61) | -0.387 (0.37) |
Technology did not hinder learning | 0.212 (0.22) | 0.841 (0.34) | -0.121 (0.13) |
There was opportunity for interaction | -0.917 (0.83) | -0.231 (0.52) | -0.013 (0.36) |
I would recommend the session to peers | 0.565 (0.62) | -1.430 (0.82) | 1.353 (0.30) |
The session was organised | 0.340 (0.40) | -0.177 (0.93) | 0.110 (0.33) |
Content was applicable to my work | 0.813 (0.31)* | 0.225 (0.23) | 0.790 (0.44) |
Pace of the session was just right | 0.208 (0.21) | -0.045 (0.23) | 0.554 (0.22) |
I can teach peers what I learned | -0.113 (0.35) | 1.079 (0.44) | 0.077 (0.17) |
MMR, mixed methods research; SRMA, systematic review and meta-analysis; M&E, monitoring and evaluation. Values for the model with other factors are parameter estimates with standard errors in parentheses.
Intercept refers to setting-level mean change in knowledge score.
* indicates significance at α = 0.05 level.
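The intra-class correlations reported for the unconditional model follow directly from the variance components via ρ = τ00 / (τ00 + σ²), i.e. the share of total variance attributable to employment settings. The short sketch below reproduces the reported values from the reported variance components.

```python
# Variance components reported for the unconditional means model:
# tau00 = between-setting variance, sigma2 = within-setting variance.
components = {
    "MMR": (0.0185, 0.7764),
    "SRMA": (0.1703, 0.6930),
    "M&E": (0.0541, 0.9110),
}

def icc(tau00, sigma2):
    """Intra-class correlation: share of total variance between settings."""
    return tau00 / (tau00 + sigma2)

for segment, (tau00, sigma2) in components.items():
    print(f"{segment}: rho = {icc(tau00, sigma2):.3f}")
```

Rounded to three decimal places, this yields 0.023 (MMR), 0.197 (SRMA) and 0.056 (M&E), matching the reported intra-class correlations.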
When other factors were added to the unconditional model (
The majority of participants who consented to participate in the three sections of the evaluation survey responded to the open-ended questions (MMR 85%, SRMA 96% and M&E 100%).
We assumed that the content of the training was relevant and time-bound within the training lifespan. Thus, determination of whether an objective was ‘SMART’ depended on evidence of being specific, measurable and attainable. Thematic analysis revealed that only a few participants stated objectives which were classified as SMART objectives (MMR 35%, SRMA 26% and M&E 30%). For example, ‘… to
Participants’ self-stated objectives classified as belonging to higher level Bloom’s taxonomy varied (MMR 43%, SRMA 30% and M&E 55%). Examples included ‘To learn MMR skills which in turn will assist me in the supervision of undergraduate and postgraduate academic projects’ (MMR), ‘Gain more skills in SR and MA especially as it applies to M&E’ (SRMA) and ‘To equip myself with the M&E skills that I can use in my career today and in future’ (M&E).
The majority of participants made statements suggesting that they intended to apply what they learned in their work (MMR 67%, SRMA 57% and M&E 83%). For example, MMR (‘I plan to use MMR in conducting programme evaluations and during my PHD,…I’ll apply MMR in my literature review’ and ‘Better equipped to conduct technical reviews of evaluation proposal, reviewing academic pieces of work (theses, abstracts)…better equipped to facilitate technical evaluation methods training..’), SRMA (‘Encourage more students to consider conducting a SR and MA as their thesis or dissertation if this sparks interest or appropriate for their chosen topic’ and ‘I am going to use results from existing systematic reviews more effectively at work … I plan to conduct a more structured literature review for my master’s thesis which is ongoing based on the skills I acquired’) and M&E (‘Use of work plans in my daily office tasks and projects’, ‘I am now in a better position of writing good frameworks for proposals’ and ‘Endeavour to use M&E tools to structure all the M&E activities within the projects I am in charge in the organisation’). Only six participants cited potential barriers, which we broadly categorised as
Although ‘clarity of content delivery’ and ‘applicability of knowledge acquired to work’ emerged as the most liked aspects of the training (i.e. highest average values in
Frequency of themes describing most liked aspects of the short course training.
Theme | MMR (%) | SRMA (%) | M&E (%) | Average (%) |
---|---|---|---|---|
Clarity of content presentation | 41 | 35 | 25 | 34 |
Knowledge acquired is applicable in work | 33 | 30 | 30 | 31 |
Use of technology (software) or tools | 19 | 43 | 25 | 29 |
Intriguing nature of the content presented | 19 | 17 | 45 | 27 |
Interactive session or use of practical examples | 22 | 9 | 10 | 14 |
MMR, mixed methods research; SRMA, systematic review and meta-analysis; M&E, monitoring and evaluation.
For the MMR segment, participants suggested qualitative data analysis (‘More practice in analysing qualitative data’ and ‘Practical on qualitative analysis’), including the use of analysis software (‘software for creating qualitative research themes’ and ‘The analysis of QUAL data’). For the SRMA segment, more practicals using different software, with emphasis on data acquisition (‘How to easily identify the variable to pick for use in SR and MA’ and ‘Critical appraisal of studies’) and interpretation (‘Interpretation of the resultant findings of the two processes’), were cited. For the M&E segment, participants cited ‘Developing M&E work log frame and work plan’, advanced M&E topics (‘Complexity-Aware M&E Approaches: Outcome mapping, Impact Evaluation and Communicating Data for Impact’), big data and how to set up an M&E system.
Comments from participants indicated that the majority felt that the training could be improved by having ‘more practical sessions’ and dedicating ‘more time on practicals’ (i.e. highest average values in
Frequency of themes describing how to improve the short course training.
Theme | MMR (%) | SRMA (%) | M&E (%) | Average (%) |
---|---|---|---|---|
More practical sessions | 24 | 35 | 28 | 29 |
More time for practicals | 28 | 25 | 22 | 25 |
More time (general, no reason specified) | 32 | 15 | 17 | 21 |
More time for wide coverage of new topics | 8 | 20 | 6 | 11 |
Sending reading materials | - | 5 | 12 | 9 |
Customise into basic versus advanced levels | 12 | - | 6 | 9 |
Need for more partnerships to offer training | - | 5 | 6 | 6 |
MMR, mixed methods research; SRMA, systematic review and meta-analysis; M&E, monitoring and evaluation.
Although the items included in the online evaluation survey were of low sensitivity, simple informed consent was sought from each participant prior to responding to questions in each of the three segments of the survey. Participants had to click ‘Yes’ to participate in each segment or ‘No’ to decline participation. Some quotations from participants have been included in this study; however, anonymity is preserved because the quotations cannot be traced to individual respondents.
The purpose of this study was to evaluate the effect of short course training on participants’ knowledge and skills in the areas of MMR, SRMA and general principles of M&E. Overall, we found that short course training improves trainees’ knowledge and skills in these areas. This finding coincides with the findings of prior research highlighting the potential benefits of short courses in improving the knowledge and skills of biomedical researchers and scholars in Africa (Chima, Nkwanyana & Esterhuizen
Quantitative findings of this study suggest that participants’ self-stated objectives classified as SMART objective and belonging to the higher level of Bloom’s taxonomy were associated with significant increase in participants’ knowledge. This finding is corroborated in the qualitative phase in which participants who experienced a large increase in knowledge also tended to state objectives which were considered SMART and belonged to higher level Bloom’s taxonomy.
A clearly stated objective allows both trainee and trainer to determine whether the objective has been achieved by the end of the training. In this study, we found that few participants stated objectives that were considered SMART. Participants tended to use words such as ‘develop’, ‘facilitate’ or ‘support’, which are less descriptive, less specific and difficult to measure. It is worth remembering that the greater the specificity, the greater the measurability. Another problem was the use of multiple verbs. For example, ‘To explore opportunities to increase my skills in SRMA’ could simply be stated as ‘To increase my skills…’ because exploring opportunities is a step towards increasing skills. Similarly, ‘To
Trainers routinely state learning objectives at the beginning of sessions so that learners know what is expected. Findings from this evaluation suggest that asking participants to self-state their objectives for participating in a session is an important element of training evaluation that is seldom undertaken. It can be used to ascertain the degree to which participants’ objectives are met, a useful piece of information because participants’ objectives may be incongruent with the trainers’ stated objectives. We recommend that at the beginning of a training, trainers ask participants to verbally state their objectives for attending and help trainees contrast objectives (i.e. specific, measurable, narrow and concrete statements) with goals (general, broad, intangible and abstract statements). In this study, the majority of participants tended to state goals instead of objectives. Words such as ‘explore’, ‘seek’ and ‘encourage’ should be avoided because they tend to describe strategies instead of objectives. The use of more precise verbs (e.g. ‘list’, ‘identify’, ‘compare and contrast’, ‘state’, ‘describe’ and ‘indicate’), which document action and are open to few interpretations, should be encouraged.
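A crude way to operationalise this advice is a simple verb screen applied to each self-stated objective. The sketch below is purely illustrative: the verb lists and the classification rule are assumptions for demonstration, not the coding scheme used in this study.

```python
# Illustrative verb screen for self-stated objectives (not the study's
# coding scheme): flag objectives whose leading verb is vague versus
# action-oriented, in the spirit of SMART-objective screening.
VAGUE_VERBS = {"explore", "seek", "encourage", "develop", "facilitate", "support"}
ACTION_VERBS = {"list", "identify", "compare", "contrast", "state",
                "describe", "indicate"}

def screen_objective(text):
    """Classify an objective statement by its leading verb."""
    words = text.lower().split()
    if words and words[0] == "to":        # drop a leading infinitive marker
        words = words[1:]
    first_verb = words[0].strip(".,") if words else ""
    if first_verb in ACTION_VERBS:
        return "action-oriented"
    if first_verb in VAGUE_VERBS:
        return "vague"
    return "unclassified"

print(screen_objective("To identify the steps of a systematic review"))
print(screen_objective("To explore opportunities to increase my skills"))
```

In practice such a screen would only be a first pass; judging whether an objective is measurable and attainable still requires human coding, as done in this study.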
A secondary purpose of this study was to explore the extent to which short course training facilitated change in practice at work. Findings from the quantitative phase show that content perceived to be relevant to work is positively associated with increase in knowledge. Similarly, participants’ statements of changes in practice intended following the training (qualitative phase) largely revolved around application of knowledge to work. In sum, this baseline evaluation is intended to inform future recommendations for use of short courses to strengthen M&E capacity in less-developed countries such as Kenya. Short courses have been successfully employed in health-related training (Bayona et al.
Firstly, we acknowledge that the variables included in the quantitative phase constitute a non-exhaustive list of potential predictors of change in knowledge, thus limiting our conclusions. Secondly, participants’ self-reported increase in knowledge may not accurately reflect actual changes in behaviour. However, we included factors that are typically representative of the impact of educational training. Finally, we did not collect baseline data to firmly confirm the impact of short courses; however, plans are underway to send a post-training survey in 6 months. Despite these limitations, this study has several strengths. Firstly, the findings provide empirical, albeit preliminary, evidence related to the impact of short course training, which might inform the design and conduct of future studies. Secondly, with the burgeoning use of the short course approach, this is, to our knowledge, the first evaluation of short course training focusing on MMR, SRMA and principles of M&E. A prior study in South Africa which evaluated the impact of a 4-day short course on the knowledge and skills of biomedical researchers and scholars in biostatistics employed a quantitative approach (Chima et al.
The weak M&E systems and limited supply of M&E human resources in Africa, added to the emerging demands to successfully implement national development plans (e.g. the African Union’s Agenda 2063 and the United Nations’ Sustainable Development Goals), signal the need to strengthen M&E capacity in resource-constrained countries such as Kenya. This mixed methods evaluation provides preliminary evidence of the potential of short course training as an approach to strengthening M&E capacity in less-developed countries such as Kenya. It underscores the importance of having participants state their objectives for attending the training, information that is useful in evaluating the impact of the training. For institutions in less-developed countries that are interested in using the short course strategy to build capacity in M&E, this study offers the following recommendations:
Involve potential participants
Prior to training, ask participants to state their objectives for participating in the training. Given the importance of SMART objectives, trainers should provide participants with a sample SMART objective. As revealed in this study, the majority of participants are unlikely to state a SMART objective, and such guidance would help them craft one. By self-stating their objectives, participants will be able to determine, during the training evaluation, the extent to which their objectives were met. In addition, trainers will be able to ascertain the degree to which those objectives are consistent with the training objectives.
Rather than waiting until the end of the training, request feedback from participants during the meeting and consider adjusting the training in response to the feedback. Feedback can be sought informally during tea break or formally by asking participants to note on a piece of paper ‘what went well’ and ‘what can be improved’ at the end of the day.
Finally, given that participants are likely to be from diverse backgrounds, it is recommended that a multidisciplinary training team be assembled. This would ensure that examples provided are relevant to the participants. For example, in this study, a multidisciplinary training team ensured that examples of problems analysed using analysis software were from a wide range of disciplines, thus encouraging engagement in the training.
The authors acknowledge the technical support received from the faculty and staff at the University of Nairobi, School of Mathematics (SOM). We are particularly grateful to Prof. Patrick Weke, Director of SOM, for his support during the training. The evaluation was made possible by the cooperation of participants who completed the online survey.
This work was supported by a grant from the Institute of International Education’s Carnegie African Diaspora Fellowship Program, funded by the Carnegie Corporation of New York.
The authors declare that they have no financial or personal relationship(s) that may have inappropriately influenced them in writing this article.
H.W. conceived the initial idea for the study and developed the study design, prepared both quantitative and qualitative data and wrote the initial draft of the manuscript. H.W. and N.O. conducted the statistical analysis, incorporating qualitative data. R.O., E.M. and M.N. were principally responsible for literature review, data cleaning, data coding and qualitative data analysis. N.O. reviewed the statistical analyses. All authors contributed significantly to the data interpretation, drafting of the manuscript and revision of the manuscript. All authors gave their approval of the final manuscript for publication.