
How Can We Measure Impact in Research Capacity Strengthening Funding?

Dr Justin Pulford, Senior Lecturer at the Centre for Capacity Research, part of the Liverpool School of Tropical Medicine, discusses his work with practitioners and policymakers to improve methods for measuring the impact of funding for research capacity strengthening.


Research Capacity Strengthening (RCS) has been defined as ‘the ongoing process of empowering individuals, institutions, organisations and nations to: define and prioritise problems systematically; develop and scientifically evaluate appropriate solutions; and share and apply the knowledge generated’.[i] In recent years, international development donors, governments, and research councils have increasingly invested in RCS initiatives in low- and middle-income countries (LMICs) as one means to improve socioeconomic development.

Such investment is founded on the assumptions that:

A) the production and uptake of locally produced research (of appropriate quality, quantity and relevance) will contribute towards socioeconomic development; and

B) existing LMIC research capacity is insufficient to fully realise this contribution.


The scale of the UK’s investment in RCS, whilst not yet reliably calculated, is vast. For example, three current RCS initiatives (the Wellcome Trust-DFID ‘DELTAS Africa Initiative’, UKRI’s ‘Growing Research Capability’ GCRF call and NIHR’s ‘Global Health Research Units/Groups’ call) alone account for close to £500 million.

However, despite this influx of funding, the mechanisms by which RCS interventions ultimately impact on socioeconomic development in LMICs are poorly understood.

This is due, in part, to a lack of investigation. Currently, there are no published impact evaluations of RCS initiatives, and no standardised outcome or impact evaluation frameworks with operational metrics fit for purpose.

So, with no published impact evaluations of RCS initiatives and no agreed standard evaluation framework or metrics, how can we start to measure the impact of such investments?


As a first step to answering this question, a group of us from the Centre for Capacity Research (CCR) at the Liverpool School of Tropical Medicine have been working with funders to develop new guidance on improving RCS evaluation methods. Taking a holistic approach, we have analysed existing evaluation frameworks and catalogued the types of indicators that are currently used to evaluate RCS initiatives, identifying gaps concerning the quality and coverage of those indicators.

Our findings suggest that most frameworks are oriented chiefly towards funders’ own internal performance requirements, rather than the impact generated by the RCS investment, and that they make limited reference to actual theories of research capacity strengthening. The validity of indicators and their potential for bias were rarely considered, and information on the inter-relationships between output or outcome indicators was missing.


Our findings demonstrate, first, the importance of ensuring that a funder’s over-arching theory of change (which describes how the overall scheme will achieve impact) and the theories of change for each funded RCS project within that scheme are aligned. Second, funders of RCS programmes can maximise evaluations of impact by explicitly capturing the RCS ‘ripple benefits’ that inevitably occur between individuals, institutions and societies.


Clearly, there is still more work to be done. Our work to date highlights the problems RCS funders currently face due to the lack of a unifying, evidence-based approach to underpin their RCS efforts. Greater attention to evaluation design, prospective indicator measurement and the systematic linkage of indicators is needed to provide more robust evidence on the outcomes of health RCS, and to enable a much more rigorous, harmonised and effective evaluation of RCS schemes.

[i] Lansang MA, Dennis R (2004). Building capacity in health research in the developing world. Bull World Health Organ. 82:764–70.