In 2008, IOD PARC was contracted to conduct an evaluation of the African Development Bank’s Decentralisation strategy and process. In addition to traditional evaluation methods, the Most Significant Change (MSC) methodology was used. This technique, developed by Rick Davies in the 1990s, involves the collection and systematic participatory interpretation of stories of significant change. This evaluation pioneered the use of MSC in a high-level multilateral organisation and was also the first of its kind to integrate video and blogging as a way to document the process and engage participants.
The following 10-minute introductory video is an interview with the evaluation’s team leaders, Julian Gayfer and Rick Davies, reflecting on the use of MSC in this project.
Most Significant Change (MSC) Technique Learning Note
This Learning Note on the use of the Most Significant Change (MSC) Technique has been developed to provide an overview of the key elements of MSC and how it can add value to monitoring and evaluation systems and approaches in various contexts. It also presents a review of the use of MSC by IOD PARC as one of several evaluation methods in ‘The Independent Evaluation of the Decentralisation Strategy and Process in the AfDB’ in 2008-09.
What is MSC?
The Most Significant Change (MSC) technique is a story-based tool for participatory monitoring and evaluation. MSC was conceived by Rick Davies and Jess Dart in the mid-1990s, and they describe the process as “the collection of significant change (SC) stories emanating from the field level, and the systematic selection of the most significant of these stories by panels of designated stakeholders or staff”. The stories are told by designated project staff and stakeholders and revolve around the changes that have taken place as a result of the intervention. All the stories are then documented and shared with a group of people who discuss them and agree on the most significant (MS) of those stories, on the basis of a series of arguments and criteria defined by the group during the discussions. The selection of the MS story resembles the way newspaper editors pick the stories to feature on the next day’s front page. In this way, the group debates the value of the changes reported and where the focus of the reported project impact should lie.
The MSC methodology is sometimes referred to as monitoring-without-indicators, because it does not make use of indicators that can be tracked, or as the ‘story’ approach, given that the data collected takes the form of stories of what change happened, when and why. MSC also has a strong impact-monitoring aspect, as it focuses on monitoring outcomes and impacts rather than changes at activity and output level. It also places significant emphasis on organisational learning, through the documentation that takes place throughout the process.
Furthermore, the MSC methodology can be applied regularly throughout the life of a project as a way of monitoring its implementation and therefore supplying project managers with ongoing data on the direction of the programme. Programme evaluations can also be informed by the MSC technique since it supplies information at impact and outcome level, allowing the review of programme performance as a whole.
The methodology is an easy and accessible tool for participatory monitoring that does not require any technical skills or the explanation of M&E terms and concepts to those involved, as it relies on the universal format of stories and values. It focuses on learning rather than just accountability, and it has proved an appropriate tool for evaluations and monitoring work requiring an account of the human dimension of the project (rather than just indicators or figures), as well as for learning about the key values of an organisation during the discussions to select the most significant story. It is a good way of identifying unexpected changes (positive or negative), in that the pool of stories is as wide as the experiences the respondents have had vis-à-vis the project. In addition to its straightforward format for data collection, the approach engages participants in the analysis of information, since they must provide arguments when favouring a story in the selection of the most significant one. This aspect can contribute to building the capacity of team members in data analysis and the identification of impact.
Finally, it is a useful methodology for M&E in interventions that have undefined outcomes or shortages in monitoring data.
The MSC methodology in practice
We used the MSC analysis as part of a mix of evaluation methods in ‘The Independent Evaluation of the Decentralisation Strategy and Process in the AfDB’ in 2008-09. The focus of the MSC in this assignment was to gain an insight into AfDB staff perceptions of change in three areas since 2004:
Changes in the relationship between AfDB Headquarters and Field Offices
Changes in the management of the AfDB portfolio
Changes in the country dialogue
We obtained stories of change via individual interviews for each of the three change areas with five members of staff in four AfDB Field Offices (Nigeria, Tanzania, Sierra Leone and Burkina Faso); the stories were then subject to selection by three stakeholder groups (AfDB Field Office staff, National Stakeholders and Donors). The MS stories selected in all four offices were then presented to four stakeholder groups within the AfDB HQ in Tunis, each of which also made its own selection.
The methodology was easy to grasp and explain to participants. AfDB Field Office and HQ teams responded in a positive and open way once the process was explained to them. In general, people were not aware of the technique, but a clear explanation using colloquial speech (as opposed to M&E terms) and the opportunity to answer their questions about the process led to active participation and a sound understanding among all parties.
In various instances, there was a need to clarify issues around confidentiality, especially relating to the use of video and to the filming of story summaries. To allay any concerns, we gave participants the choice to leave their stories unnamed and to decide for themselves whether they wanted their story summary recorded on video. The option of anonymity reassured participants to the point where everyone opted to sign their stories and be filmed, which benefited our research: it is preferable to associate a story with a person and position, as this adds weight and contextual value to the account.
Isabelle Lemaire, from InsightShare, is an expert in participatory video (PV) and joined the evaluation team as the video consultant for the filming and editing of MSC videos.
Being a systematic, transparent and participatory method of collecting and analysing qualitative information about change, MSC provided a good structure to hold discussions with team members, both individually (one-to-one interviews) as well as in groups (when selecting the MS story), at different levels and with a range of stakeholders. In addition, this systematic approach to collecting and analysing qualitative information added rigour to a methodology whose outcomes could otherwise be regarded merely as of anecdotal value.
Few technical skills are required for practitioners to apply the technique satisfactorily. Besides a good understanding of the focus of M&E and the key lines of inquiry, good interpersonal skills are highly valuable for keeping the participatory process enjoyable for those involved. The practitioner should be able to explain the methodology in accessible, plain language and to keep the focus of the inquiry clear and straightforward. Approachability and a sense of humour helped create a relaxed atmosphere in the plenary sessions for the voting and selection of stories. Willingness to adapt the approach to changing circumstances proved invaluable, given the uncertainty of testing a methodology on new ground (learning from experience and unexpected resource constraints as we went along).
By asking a number of individuals, as an open question, for their own view of the most significant changes brought about by an intervention, the pool of information from which data is drawn grows considerably. In this way the method proved very useful in capturing valuable information (including unexpected changes) that could easily have slipped under the ‘evaluation radar’, either because the changes were not obvious items to include in an interview schedule or because they reflected important aspects of the decentralisation process in the eyes of those experiencing it, rather than topics chosen by the evaluators.
From a participatory point of view, the step of the technique that involves one-to-one interviews does not differ, broadly speaking, from many other evaluation methods. However, the step that entails selecting the most significant of the stories can bring the whole team together and engage them in a lively debate, through their familiarisation with the recorded stories and the discussion to choose the most significant one. The way we structured this stage allowed for the active participation of all attendees: we called for two rounds of individual votes, displayed on a screen, and collected the arguments that supported each vote as we went along. The debate shed light on the values and priorities of the team and the organisation. The dynamic formed in the offices during the MSC stream of the evaluation was positive and engaging, even though time-keeping was a significant constraint.
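For readers who want a concrete picture of the two-round voting, the tallying logic can be sketched in a few lines of code. This is only an illustrative sketch with hypothetical story titles and votes; in practice the selection was a facilitated group discussion, with arguments recorded alongside each vote, not a software process:

```python
from collections import Counter

# Hypothetical story titles and individual votes (not real evaluation data).
round_one = [
    "Faster project approvals", "Closer country dialogue",
    "Closer country dialogue", "Improved portfolio data",
    "Closer country dialogue",
]

# First round: tally the votes and shortlist the leading stories,
# which then become the focus of a second round of debate.
shortlist = [story for story, _ in Counter(round_one).most_common(2)]

# Second round: only votes for shortlisted stories are counted.
round_two = [
    "Closer country dialogue", "Closer country dialogue",
    "Faster project approvals", "Closer country dialogue",
    "Closer country dialogue",
]
final_tally = Counter(v for v in round_two if v in shortlist)
most_significant = final_tally.most_common(1)[0][0]
print(most_significant)  # → Closer country dialogue
```

The point of the sketch is simply that each round narrows the field, while the arguments voiced between rounds, which no tally can capture, are where the learning about organisational values happens.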
One of the main challenges with this methodology is securing time from the respondents and programme staff to attend the interviews and the plenary session to select the most significant story. It required good planning in the week prior to our visit and keeping to a tight schedule while on site. For meeting arrangements and the management of staff diaries, it was crucial to have the support of an internal member of staff who ensured all team members were aware of the schedule and of the importance of their individual participation. Endorsement by a leading member of the organisation or programme also facilitated the smooth running of the process.
In settings where there is no culture of open criticism, people may be reluctant to make their views and judgements explicit in front of a group of co-workers and managers, and MSC would therefore be of little value in such an environment.
The integration of the MSC technique with the other inquiry streams within the evaluation was successful in that different samples were used for data collection by the main streams of inquiry, so there was no risk of overburdening key informants. One of the key challenges was articulating the interim findings of those streams as they ran in parallel, as time and tight logistics made it difficult to set up a feedback mechanism from each thread into the others. MSC proved an excellent approach for reflecting on the human dimension of decentralisation and provided illustrative examples of some of the key issues while reinforcing some of the conclusions. The technique was easily tailored to fit the approach and needs of the evaluation while retaining its own essence.
This resource was developed by Juana Espasa, Clemence Billoud and Ronnie MacPherson.