ESDN Quarterly Reports
The ESDN Quarterly Reports provide in-depth documentation of a selected topic on a quarterly basis. The current ESDN Quarterly Report is displayed below.
ESDN Quarterly Report September 2006
Evaluation and Review of National Sustainable Development Strategies
This ESDN Quarterly Report gives an overview of different approaches in the evaluation and review of National Sustainable Development Strategies (NSDS) in Europe. In so doing, it concentrates on qualitative evaluations and reviews that assess the quality of process-related aspects of SD strategies, such as policy-making processes, policy instruments, implementation procedures, stakeholder involvement, coordination activities, etc. Taking the relevant presentations and discussions at the ESDN Conference 2006 in Salzburg into account, the report provides further details about evaluation and reviewing experiences made in Austria, France, Switzerland and the UK. It summarizes the different approaches by discussing their strengths and weaknesses.
When looking through policy documents, reports and academic literature dealing with SD policy-making, it is striking that terms like evaluation, assessment, review, audit or monitoring are frequently used synonymously, without proper clarification of how they differ from each other. Two examples: an article in the journal Natural Resources Forum states that “a variety of approaches have been used for assessing a country’s NSDS” and that “all these evaluations examine (…)” (George and Kirkpatrick, 2006: 147). A recent report prepared for the OECD (Dalal-Clayton and Bass, 2006) is concerned with “monitoring mechanisms for NSDS” and gives an overview of internal reviews, external auditing, etc. Again, the terms used are neither clarified nor comprehensibly distinguished from each other. Therefore, we suggest starting a reflection process about the meaning of the most commonly used concepts for providing feedback in an SD strategy process, such as:
Some of the key characteristics of the different feedback mechanisms can be described as follows.
On a general level, one can distinguish between process-oriented and performance-oriented approaches to providing feedback in an SD strategy process (Martinuzzi, 2004). While approaches that focus on the processes of SD policy-making (e.g. policy coordination/integration activities, deployment of policy instruments, implementation procedures, stakeholder participation, etc.) apply rather qualitative methods, those that measure the performance of SD strategies against policy objectives often use quantitative SD indicators.
Another important general distinction has to be made with regard to the levels of application of the different approaches:
This Quarterly Report focuses on qualitative evaluations and reviews of NSDS in selected European countries.
NSDS are strategic processes rather than strategy documents: they need to adapt constantly to new situations and challenges. The various feedback mechanisms described above help them do so: by providing feedback to policy-makers, they are “the basis for coherent and self-reflective policy-making in a knowledge-based society” (Steurer and Martinuzzi, 2005). Additionally, evaluations become crucial at a time when various stakeholders increasingly demand insights into the delivery of NSDS objectives and evidence that SD strategies make a difference in policy-making.
The report “A review of monitoring mechanisms for National Sustainable Development Strategies” by Dalal-Clayton and Bass (2006) outlines the following five strategic purposes of “strategy monitoring”:
Overall, all types of feedback mechanisms in SD strategy processes must be seen as part of a broader “Sustainability Management System”. Evaluating, reviewing and monitoring SD strategies are not isolated tasks that judge the quality of the strategy process and/or measure the effectiveness and impacts of individual projects; rather, they are parts of an organised feedback process which should be an integral part of the governance of SD.
A review of NSDS by the European Commission in 2004 revealed that NSDS vary considerably in their content, approach and level of implementation. Generally, the Member States (MS) face a number of common challenges in the design and implementation of their NSDS, such as:
The report “National Strategies for Sustainable Development”, published by the International Institute for SD (IISD) and the German Society for Technical Cooperation (GTZ) in 2004, pointed out that although important progress has been made, the 19 countries analysed in the report were only at “the early stages of learning toward effective strategic and coordinated action for SD” (IISD & GTZ, 2004, ix). One of the key challenges identified in the report is the development of feedback mechanisms, including monitoring, evaluation, learning and adaptation.
In early 2005, the European Environment and Sustainable Development Advisory Council (EEAC) published the study “Sustaining Sustainability” about the state-of-the-art of NSDS in nine EU MS. One general finding is that most countries characterise the SD process as a ‘learning process’. As a consequence, “SD strategies cannot be implemented like a ‘plan’, but need flexible approaches on the government side with at the same time firm and accountable objectives” (Niestroy, 2005, 11). The study identifies several future challenges, among them an in-depth comparative analysis of national priorities, targets and indicators outlined in the NSDS.
Steurer and Martinuzzi (2005) provide a comprehensive overview of strategy formation patterns and experiences with NSDS in Europe. They refer to NSDS as important policy strategy documents available to governments to systematically organise the SD policy-making process and list key characteristics and good practices of NSDS processes. As their analysis shows, virtually all European NSDS processes feature some sort of monitoring and/or evaluation mechanisms, many of them outlined in the strategy documents themselves.
As mentioned in a recent OECD report (2006) on good practices in NSDS, these documents must not be understood as static plans, but should evolve over time as more information becomes available and the implementation process is evaluated and/or reviewed. The OECD report suggests that “learning, adaptation and continual improvement should be characteristics of national strategies. This requires a process to monitor strategy implementation, to report to government bodies and stakeholders, and to feed back information for adjustment and improvements” (OECD, 2006, p. 29). In this context, the guidelines for NSDS developed by various bodies, like the OECD, the UN and the International Institute for Environment and Development (IIED), highlight the need for evaluation in order to enable learning about whether a strategy is on the right path and whether the objectives and targets set out are translated into action.
The topic is also addressed in the renewed EU SDS that includes a section on ‘implementation, monitoring and follow-up’ which states that “a progress report on the implementation of the SD strategy in the EU and the Member States” (EU SDS, p. 26, 33) will have to be submitted every two years.
All NSDS processes foresee some sort of evaluation, review or monitoring mechanism. For an overview of the mechanisms used in Europe, please go to the Country Profiles section of the ESDN website.
This section provides an overview of the practical experiences with qualitative evaluations and reviews that were made in selected European countries.
Peer reviews are most often associated with the Organisation for Economic Cooperation and Development (OECD), which began to use the peer review process in the 1960s. Since then, peer reviews have been at the heart of international cooperation in the OECD, and the method has been adopted by various international organisations, like the EU, the United Nations (UN), the International Monetary Fund (IMF) and the World Trade Organisation (WTO). In a 2002 OECD report, a peer review is described as “the systematic examination and assessment of the performance of a state by other states, with the ultimate goal of helping the reviewed state improve its policy making, adopt best practices, and comply with established standards and principles” (p. 4). Furthermore, it is highlighted that a peer review is conducted on a non-adversarial basis and relies heavily on mutual trust among the states involved as well as their shared confidence in the process. Lehtonen (2005, 177) argues that “peer reviews can be seen as a mechanism attempting to combine the functions of learning and accountability within a single evaluation framework”.
In the context of SD, the European Union addressed the issue with a proposal at the World Summit on SD in Johannesburg in 2002, where it suggested developing a system for promoting the sharing of experience with NSDS between countries. The need for more coherence between the various NSDS in Europe, and for mechanisms to better pool experience and good practices, remains an objective of the EU, especially in the face of the diversity of SD strategy approaches. This idea was taken up by the European Commission in its proposal for a revised EU SDS, where it suggests in the chapter on ‘delivering results’ to “undertake a light peer review process, focusing on specific themes, and in particular, seeking to identify examples of good policies and practices that could be implemented by all” (p. 14). The uptake of peer reviews for NSDS was further specified and concretised in the renewed EU SDS from June 2006. Paragraph 37 states that, “with regard to the national level, the Commission report [the biennial progress report on the implementation of the SD strategy in the EU and the MS, starting in September 2007] will build on Member States’ actions to implement the EU SDS and the results gained from completed Peer Reviews”. Paragraph 42 specifies the voluntary peer reviews of NSDS that “should start in 2006 with a first group of Member States”. The following points are mentioned in paragraph 42 about the execution of peer reviews:
The idea behind the peer reviews of the NSDS within the EU is to identify and share good practices in a process of mutual learning. The peer review of a national strategy is voluntary and will be undertaken upon the initiative of the MS concerned. The process should be a bottom-up exercise with participatory elements – involving stakeholders from all political levels – with no intention to ‘name and shame’. The peer reviews are intended to address all three SD pillars and the peer reviewed country is free to choose to undertake a review of the whole NSDS or focus on one or more specific issues.
The European Commission proposes to follow the peer reviews as an observer, providing methodological help if required and financial incentives during the two-year pilot phase. Regarding methodological help, the Commission published a “Guidebook for Peer Reviews of National Sustainable Development Strategies” in February 2006 that offers practical guidance. It is based on past experiences of evaluating NSDS, interviews with selected governmental and NGO representatives, and the experiences made in the French peer review process (see details below). Regarding financial incentives, the Commission has launched a call for proposals to provide financial support to peer reviewed MS; the call was open until 30 June 2006. The first country to undertake a peer review in the context of the renewed EU SDS is the Netherlands, with Germany and Finland being the most likely peer countries. The Dutch peer review process is intended to start in late 2006.
Another, non-EU country is also planning a peer review process: Norway intends to undertake a peer review as part of the revision of its NSDS and as a review of the country’s environmental policy. Sweden will be the main peer country in this process. No concrete timetable for the peer review process has been agreed upon yet. Norway intends to consult the Guidebook published by the European Commission; however, it will not undertake a full peer review of the whole NSDS but will select certain topics for a special focus.
The guidebook on peer reviews has been produced for the European Commission by an independent project team. It presents an approach to mutual improvement and learning on NSDS that can be applied across all EU MS. It is furthermore intended to provide the essential information needed for undertaking a peer review process in an accessible and easy-to-follow framework. The guidebook is thus essentially a toolbox to support the exchange of good practices between MS and to improve the link between the EU and the national level. A common approach to NSDS peer reviews among EU MS should help to overcome common challenges and support the exchange of experiences, while fully respecting the diversity of national approaches, priorities, goals and targets.
Below is a graph outlining the key steps in the review process as proposed in the Guidebook for Peer Reviews:
At the World Summit on SD in Johannesburg in 2002, the French President, Jacques Chirac, made a commitment that France would organise a peer review process for its NSDS. This political commitment followed a proposal made by the EU to develop a peer review process in order to promote the sharing of experiences. Accordingly, a project was initiated in 2004 by two French ministries, the Ministry of Ecology and SD and the Ministry of Foreign Affairs, with the objective of developing and testing a methodology for the ‘peer review’ of NSDS. The International Institute for Environment and Development (IIED) provided methodological help. Belgium, the UK, Ghana and Mauritius were chosen as peer countries for the peer review process (Brodhag and Talière, 2006). Below is a graph which shows the steps that were used in the French peer review process:
The four main steps of the French peer review process were:
The success of the French peer review process depended on three key issues: first, the strong political commitment to the whole process, especially the commitment made by the President of France; second, a clear objective about the reason for and scope of the peer review process; and third, a positive approach by, and relationship between, the peer countries.
Based on the experiences made in France, there are two key stages for a peer review process:
Due to the growing international interest, an Expert Group Meeting on reviewing NSDS was held at the UN headquarters in New York in October 2005. In this meeting, several of the participating countries expressed their interest in organising a peer review or a similar process of shared learning. Full documentation of the event can be found on the UN homepage.
The main defining characteristic of external evaluations is that they are undertaken by institutions that have no direct responsibility for the development or implementation of the NSDS. This form of evaluation is, therefore, undertaken by external, government-independent evaluators (e.g. research institutes, consultants) from either within the country or from other countries. Several countries have gained experience with external evaluations:
In February 2005, the Austrian Ministry of Agriculture, Forestry, Environment and Water Management (BMLFUW) invited a pre-selected number of institutions to submit proposals for the evaluation of the Austrian NSDS (adopted in 2002). In April 2005, an interdisciplinary group of independent researchers and consultants from Germany and Austria was appointed by the BMLFUW to undertake the evaluation and prepare a final report. The main objective was to evaluate the implementation instruments of the strategy, not, however, the strategy and its policy goals themselves. The requirement to undertake such an evaluation was set out in the NSDS, with the aim of improving the strategy’s impacts and institutional effectiveness.
The actual evaluation was undertaken between April and November 2005. Two bodies guided the evaluation process: for organisational purposes, the external evaluation team was supported by a steering group whose task was to coordinate the various organisational issues of the whole evaluation process. Moreover, a project board, consisting of various SD experts and stakeholders, accompanied the actual evaluation. It provided a forum for ongoing critical analysis and recommendations, which offered important inputs for the participative style of the evaluation process. The evaluation was based on a range of selected criteria, namely consistency, effectiveness, efficiency, appropriateness and transparency.
At the beginning of the evaluation process, the Austrian NSDS as well as the institutions responsible for coordination, implementation and monitoring were compared with the experiences of other OECD countries, based on the 2004 IISD/GTZ study. For the Austrian evaluation process, it was decided to focus on ‘institutions’ (e.g. Committee for a Sustainable Austria, Forum Sustainable Austria, working group of the regional SD coordinators, networks, etc) and ‘instruments’ (e.g. sustainability measures, work programmes, progress reports, indicators, etc.). For both of these, specific analytical criteria were developed. The following steps were most crucial in the Austrian evaluation process:
Two issues were considered as particularly important in the Austrian external evaluation process: First, to have an external point of view when evaluating the NSDS is an important added value. Second, to ensure that this external point of view is accepted at the political and administrative level, it is necessary to involve those actors in the evaluation process who are responsible for the implementation of the NSDS.
An external evaluation of the Swiss NSDS from 2002 was conducted by two independent institutions in 2006. The two government-independent evaluators were private consulting companies, Interface-Institute for Policy Studies and Evaluanda. This external evaluation process was organised in the context of a planned revision of the NSDS in 2007. It comprised an evaluation of the strategy process, content and products (outputs), outcomes and impact aspects. The results of this external evaluation will be published in autumn 2006 and will then be available at the ESDN website.
Internal reviews of NSDS are undertaken by national governments in order to measure progress towards the commitments, targets and objectives that were set out in the strategy document. The review is usually undertaken by government-related bodies, i.e. ministries or other administrative bodies (e.g. National SD Councils), with little or no external input, and delivered in a report which is made publicly available. A number of European countries have gained experience with internal reviews over the last several years, e.g. the UK, Belgium most recently in 2005 (available only in French and Dutch) and 2002, Switzerland in 2004 and Finland in 2003. For further information, please also check the Country Profiles section on the ESDN website. Below, the experiences made with internal reviews in the UK are described more in-depth.
In 1999, the UK published its second NSDS, entitled “A Better Quality of Life”. As part of the implementation process of the strategy, annual progress reviews had to be produced by the Department for Environment, Food and Rural Affairs (DEFRA). The latest of these internal review reports was issued in March 2004. This report was conducted by the SD Unit within DEFRA and was part of a broader review process in the context of the development of a new NSDS in 2005. Therefore, the main objective of the report was not only to cover the key developments that occurred during 2003 (the focus year of the review), but also to take stock and review government action towards SD as well as progress in NSDS implementation since the publication of the strategy.
The internal review process in the UK began in mid-2003 with a process to gather stakeholder views and to organise workshops in order to identify key themes and establish a set of specific objectives for the review. The aim was to be clear what the review should cover and what it should contribute to the process that led to the new NSDS rather than undertaking it because of a prior commitment. The aims of the review were then set out in a consultation document (Jones, n.d.):
The review process involved the use of questionnaires that were sent to government departments and agencies, case studies and data collection. A steering group (consisting of key members of the internal review team, communications staff, the head of the SD Unit in DEFRA, and key statisticians) then commented on a draft report. Another draft report, incorporating the suggestions of the steering group, was later submitted to the ministers for comments and formal approval by the Government. The final internal review report was announced by a ministerial statement in Parliament and published in March 2004. The process for the UK internal review 2004 took about five months. It involved six full-time policy officers and two part-time communication/information officers, as well as requested contributions from government departments across all policy sectors (IIED, 2006).
The internal review covers the following topics:
After the publication of the internal review report, a consultation process was launched in April 2004. It included a website, events on specific issues, regional and local events as well as the training of facilitators for discussion in community groups (Jones, n.d.). This process was part of the development of the new UK NSDS. A draft of this new strategy was discussed in the cabinet in December 2004. The new NSDS was launched in March 2005. The emphasis of the new strategy is on delivery and the continuing involvement of those who deliver SD on the ground (Jones, 2006).
It was important for the UK internal review of the NSDS to involve the views of various stakeholders and those responsible for the delivery of the strategy early in the review process, and to focus on the policy integration aspects of the strategy. However, the ‘traffic light’ assessments used in the review were considered inadequate for measuring strategy progress, mainly because they show progress on individual issues but not across the various topics (Jones, n.d.).
This concluding section offers a table with the pros and cons of the three qualitative evaluation approaches presented, namely peer reviews, external evaluations and internal reviews. The information provided in the table is based on the following three sources:
BMLFUW (Austrian Ministry of Agriculture, Forestry, Environment and Water Management) (2005) Evaluation Study on the Implementation of Austria's Sustainable Development Strategy, Vienna: BMLFUW, English Summary: http://www.nachhaltigkeit.at/strategie/pdf/summary_en_IU503_06-05-29.pdf, German Long Version: http://www.nachhaltigkeit.at/strategie/pdf/Evaluationsbericht_NStrat_Langfassung_06-05-11.pdf.
Brodhag, C. & Talière, S. (2006) “Sustainable Development Strategies: Tools for Policy Coherence”, Natural Resources Forum, 30: 136-145.
Dalal-Clayton, B. & Bass, S. (2006) A Review of Monitoring Mechanisms for National Sustainable Development Strategies, London: IIED, Report prepared for the OECD.
DEFRA (Department for Environment, Food and Rural Affairs) (2004) Achieving a Better Quality of Life: Review of Progress Towards Sustainable Development, UK Government Annual Report 2003, London: DEFRA, http://www.sustainable-development.gov.uk/publications/pdf/ar2003.pdf.
European Commission (2006) A Guidebook for Peer Reviews of National Sustainable Development Strategies, http://ec.europa.eu/environment/pdf/nsds.pdf.
French Ministry of Ecology and Sustainable Development & Ministry of Foreign Affairs (2005) The French Strategy for Sustainable Development: Report on a Peer Review and Shared Learning Process, http://www.ecologie.gouv.fr/article.php3?id_article=4321.
George, C. & Kirkpatrick, C. (2006) “Assessing National Sustainable Development Strategies: Strengthening the Links to Operational Policy”, Natural Resources Forum, 30: 146-156.
IISD (International Institute for Sustainable Development) & GTZ (German Society for Technical Cooperation) (2004) National Strategies for Sustainable Development: Challenges, Approaches and Innovations in Strategic and Co-ordinated Action, http://www.iisd.org/pdf/2004/measure_nat_strategies_sd.pdf.
Jones, B. (2006) “Trying Harder: Developing a New Sustainable Strategy for the UK”, Natural Resources Forum, 30: 124-135.
Jones, B. (n.d.) NSDS: Progress in the UK, available from the ‘National Strategies for Sustainable Development’ (nssd) homepage: http://www.nssd.net/pdf/peer_review/English64.pdf.
Lehtonen, M. (2005) “OECD Environmental Performance Review Programme: Accountability (f)or Learning?”, Evaluation, 11(2): 169-188.
Martinuzzi, A. (2004) “Sustainable Development Evaluations in Europe – Market Analysis, Meta Evaluation and Future Challenges”, Journal of Environmental Assessment Policy and Management, 6 (4): 411-442.
Niestroy, I. (2005) Sustaining Sustainability: A Benchmark Study on National Strategies towards Sustainable Development and the Impact of Councils in Nine EU Member States, EEAC.
OECD (2006) Good Practices in the National Sustainable Development Strategies of OECD Countries, Paris: OECD, http://www.oecd.org/dataoecd/58/42/36655769.pdf.
OECD (2002) Peer Review – A Tool For Cooperation and Change: An Analysis of an OECD Working Method, Paris: OECD (compiled by Fabrizio Pagani), http://www.oecd.org/dataoecd/33/16/1955285.pdf.
Steurer, R. & Martinuzzi, A. (2005) “Towards a New Pattern of Strategy Formation in the Public Sector: First Experiences with National Strategies for Sustainable Development in Europe”, Environment and Planning C: Government and Policy, 23: 455-472.