Research Process

Since 2007, DARA’s Humanitarian Response Index (HRI) has monitored how OECD/DAC donor governments apply the GHD Principles, combining quantitative data on donor funding and policies with field research in different humanitarian crises to assess the quality of these donors’ humanitarian assistance. The index contains 35 indicators of donor practices, organised into five pillars of practice (see the Research Process chapter).

These indicators are also used to develop a donor group classification, grouping donors according to similarities and differences in their performance on the basis of a multi-dimensional analysis. This analysis helps to avoid oversimplification or misinterpretation of results, and offers more detail on a donor’s strengths and areas for improvement compared to its peers.

Field research for 2011 covered nine crises, including Chad, Colombia, Democratic Republic of the Congo (DRC), Haiti, Kenya and Sudan; together, the crises covered received almost two thirds of international humanitarian assistance funding in 2010 (OCHA FTS 2011). A standard questionnaire asked interviewees for their opinions and perceptions of how well donors are applying good practice, based on their direct experience liaising with the donors who support their work. Over 1,350 questionnaires were collected, of which 877 pertained to the 19 OECD/DAC donors assessed this year (there were insufficient responses for four donor governments) and the remainder to other funders and donors. These data served to build the qualitative indicators of the HRI.

Complementary data on government donor funding from sources such as the UN, the World Bank and other international organisations was used to build the quantitative indicators of the Humanitarian Response Index. This edition of the HRI also includes a set of indicators that assess how donors address gender concerns in humanitarian action (see the chapter Addressing the Gender Challenge).

As indicated in previous HRI reports, there are limitations to the data available for the construction of these indicators, and to the depth of analysis the HRI can provide. The research process, for example, uses financial data from 2010, which means that dramatic cuts to aid budgets by many donors, such as Spain and Ireland, are not reflected in the analysis. Similarly, many of the recent positive steps taken by donors, such as the UK and Australia, to update and improve their humanitarian assistance policy frameworks are not reflected in the data. These changes, both positive and negative, will take time to manifest at the field level, so any findings need to be contextualised.

One key element of the HRI research process is capturing perspectives from the field, in order to understand how donors’ policies and practices are facilitating or impeding effective crisis responses. This year, as part of the validation process, we also followed up with interviews at the headquarters level, and found that the perspectives from the field were largely corroborated by headquarters colleagues. The HRI therefore offers a unique window for donors to gain a broader overview of how they are perceived and where they could do better to support their partners.