M&E in the practice of sport development

An inventory of the M&E activities of organisations in the field of sport and development cooperation leads to the following conclusions.

Sport organisations find it difficult to set up M&E

Organisations have great difficulty measuring the effect of their activities: “How are you supposed to measure the effect of sport on development? We see smiling faces and people having fun and we feel that it is working, but how do we measure it? We can hardly count the number of smiling faces!”

Limited evaluation of sport development projects

Only a limited number of evaluations are carried out or commissioned by sport development organisations. This hampers both the professionalisation of the sector and the development of a knowledge base about sport as an instrument for development. Several organisations cited limited means as a reason not to carry out evaluations: M&E is often a residual item on the budget, and sometimes it is left out altogether. It is striking that financiers set only limited quality requirements for evaluations. The development of knowledge in the field would stand to gain from more M&E.

Evaluations are mostly quantitative output measurements that contribute little to expanding knowledge

Many organisations evaluate on the basis of quantitative output measurements. The result of such an evaluation is, for example, that 50 trainers were trained over a two-year period, or that 300 girls took part in the sports activities. Even quantitative figures on the scope of projects are often not available. Organisations cite two important reasons for choosing quantitative output measurement: first, it is relatively easy to carry out; second, it generally satisfies donors. However, a quantitative output measurement yields little information about the functioning of the organisation or about the effect of sport as an instrument for development. In contrast, a process measurement (Along what lines was the training of the 50 trainers developed? What went well and what did not?) yields a great deal of information about the organisation’s functioning and the influence of the surrounding context. A process-based approach is therefore preferable.

Sometimes organisations go one step further than a quantitative output measurement and investigate the process and sustainability of the project (Will the intervention have a long-term effect? Can the local partner remain self-supporting in the long run?). The important question of relevance and impact, namely to what extent the sport activity actually contributed to development, is seldom asked, and neither is the question of efficiency. This problem is compounded when sport is used in a project as an instrument to achieve gender equality or to combat HIV/AIDS, for example. In such cases, organisations sometimes do not get beyond counting the number of girls who actively took part in sports activities.

Organisations generally find it difficult to collect and analyse qualitative information in addition to quantitative data. Most organisations find it easy to ascertain the number of qualified trainers, but it is more difficult to obtain insight into the extent to which the quality of the trainers has improved. Another example is the number of people reached by a project in the context of HIV/AIDS education. It is not hard to make a good estimate of this number, but it is much more difficult for organisations to say to what extent the education actually led to a change in behaviour.
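To make the distinction between output counts and harder-to-measure outcomes concrete, the following is a minimal sketch of how an evaluation record could pair the two. It is not drawn from the text, and all class and field names are illustrative assumptions.

```python
# Illustrative sketch: pairing quantitative output counts with qualitative
# outcome indicators in one evaluation record. All names are assumptions.
from dataclasses import dataclass, field


@dataclass
class OutputIndicator:
    description: str   # e.g. "trainers trained over a two-year period"
    target: int
    achieved: int


@dataclass
class OutcomeIndicator:
    description: str   # e.g. "participants reporting changed behaviour"
    method: str        # how evidence is collected (interviews, survey, observation)
    finding: str = ""  # qualitative summary, filled in after data collection


@dataclass
class EvaluationRecord:
    project: str
    outputs: list = field(default_factory=list)   # what was delivered (counts)
    outcomes: list = field(default_factory=list)  # what changed as a result


record = EvaluationRecord(
    project="HIV/AIDS education through sport",
    outputs=[OutputIndicator("trainers trained over a two-year period",
                             target=50, achieved=50)],
    outcomes=[OutcomeIndicator(
        description="participants reporting changed behaviour six months on",
        method="follow-up interviews with a sample of participants",
    )],
)
```

The point of the sketch is simply that the second half of the record, the outcomes, is the part that quantitative output measurement leaves empty.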
Limited attention to planning is a source of problems

M&E starts with the description of the project. In formulating its objectives, an organisation should ask itself: How will I measure the results? How will I find out to what extent the objective has been achieved? Indicators must be formulated to measure the effectiveness of the project. Many sport and development cooperation organisations do not take M&E into account in the start-up phase and only start thinking about it once the project is halfway through or further along. Sometimes project objectives are not clearly formulated, which makes them difficult, if not impossible, to measure. Organisations find it difficult to deal with complex planning systems such as the logical framework, which derives the project objectives from a logical sequence of activities, expected results and contextual factors, as sketched below. Organisations may also have difficulty finding suitable performance indicators.
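As an illustration of the logical-framework idea, the following is a minimal sketch of such a chain of objectives, each level with its indicators and the contextual assumptions on which it depends. The statements, indicators and assumptions are invented for illustration only.

```python
# Illustrative sketch of a logical framework: a chain from activities to the
# overall objective, each level with indicators and contextual assumptions.
logical_framework = {
    "overall_objective": {
        "statement": "Improved health awareness among young people in the region",
        "indicators": ["share of participants able to name key prevention measures"],
        "assumptions": ["local health services remain accessible"],
    },
    "project_purpose": {
        "statement": "Girls participate regularly in sport-based education sessions",
        "indicators": ["number of girls attending at least 80% of sessions"],
        "assumptions": ["families continue to allow participation"],
    },
    "expected_results": {
        "statement": "A pool of qualified local trainers delivers the sessions",
        "indicators": ["number of trainers certified",
                       "trainer quality assessed through observation"],
        "assumptions": ["trained trainers stay with the local partner"],
    },
    "activities": {
        "statement": "Train-the-trainer courses and weekly sports sessions",
        "indicators": ["courses held", "sessions delivered per week"],
        "assumptions": ["facilities and equipment are available"],
    },
}
```

Working through such a chain at the start-up phase forces the questions raised above: each level needs an indicator, and each indicator needs a way of being measured.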
