ESDN Quarterly Reports
The ESDN Quarterly Reports provide in-depth documentation of a selected topic on a quarterly basis. The current ESDN Quarterly Report is displayed below. For previous ESDN Quarterly Reports click here.
ESDN Quarterly Report December 2007
Objectives and Indicators of Sustainable Development in Europe:
An indicator can be defined as “a parameter, or a value derived from parameters, which points to, provides information about, describes the state of a phenomenon/environment/area, with a significance extending beyond that directly associated with a parameter value.” (OECD, 2003)
Generally speaking, indicators have three main functions. Firstly, they reduce the number of measurements necessary to give an exact description of a situation (OECD, 2003). As such, they are indispensable for measuring progress towards achieving set goals (Dalal-Clayton & Krikhaar, 2007) and thus constitute a key tool for evaluating the effectiveness of policies (European Commission, 2005). Secondly, indicators simplify the communication of positive and negative developments to politicians, administrators, the public and others (OECD, 2003). Both functions rely on the main feature of indicators, i.e. to summarize complexity into a manageable amount of meaningful information that can be understood and interpreted easily. In doing so, indicators can, thirdly, provide crucial guidance for policymaking processes (Bossel, 1999; UNCSD, 2001), in particular regarding the better integration of policies horizontally across sectors, and vertically between different levels of government. SDIs can facilitate vertical integration when they are compared and benchmarked across Europe.
The extent to which SDIs fulfil the measuring function is foremost a question of methodological reliability and validity. Because they ought to reveal where we stand on the way to SD, in which areas progress has been made and where further political action is needed (Dalal-Clayton & Krikhaar, 2007), the methodological challenges in developing and applying SDIs are anything but trivial. The extent to which they can fulfil the communication and guidance functions is primarily a question of political willingness to learn and improve policies based on evidence. As we all know, learning is a difficult process, particularly in political arenas in which opposition parties are eager to benefit from a government’s poor performance and weaknesses. The communication and guidance purposes of SDIs were also the subjects of the first ESDN Workshop on SDIs, hosted by the Portuguese Presidency in November 2007 in Cascais, Portugal (for the documentation of the Workshop, click here).
As emphasised above, SDIs are an integral part of SD strategies. Therefore, several SD strategy guiding documents emphasize the importance of SDIs as means of monitoring, learning and continuous improvement in the context of SD policymaking. Table 1 gives an overview of how some key SD guiding documents refer to SDIs.
Table 1: SDIs in SD guiding documents
|Guiding document||Emphasis of indicators|
|UN (1992) Agenda 21||“Countries at the national level and international governmental and non-governmental organizations at the international level should develop the concept of indicators of sustainable development in order to identify such indicators. […]” (para. 40.6) “Relevant organs and organizations of the United Nations system, in cooperation with other international governmental, intergovernmental and non-governmental organizations, should use a suitable set of sustainable development indicators […]” (para. 40.7)|
|OECD (2001) The DAC Guidelines – Strategies for Sustainable Development||“Monitoring and evaluation needs to be based on clear indicators […]” (p. 27)|
|UNDESA (2002) Guidance in preparing a National Sustainable Development Strategy: Managing Sustainable Development in the new millennium||“An important element of the M&E process is the development of indicators - benchmarks or thresholds. These indicators […] should reflect the status and trends of a particular process element or product. Based on these indicators annual reports should be prepared to enable stakeholders see progress made.” (para. 43)|
|IIED (2002) Sustainable Development Strategies: A Resource Book||“Indicator-based assessments are more transparent than that of accounts and narrative assessments and can be compared over time.” (p. 135)|
|OECD (2006) Good Practices in the National Sustainable Development Strategies of OECD Countries||“Indicators can be used to track progress along sustainable paths and provide the foundation for performance targets. They also contribute to policy transparency and accountability in sustainable development strategies.” (p. 27)|
SDIs proliferated following the 1992 Earth Summit in Rio de Janeiro. In particular, chapter 40 of Agenda 21 called for the “development of indicators of sustainable development” at both the national and international level (UN, 1992). Following this call, the OECD in 1994 presented a set of environmental indicators in the so-called ‘Pressure-State-Response’ (PSR) framework. The indicators reflected major environmental preoccupations and challenges in the OECD countries and were classified into (i) indicators of environmental pressures (‘Pressure’), (ii) indicators of environmental conditions (‘State’) and (iii) indicators of societal responses (‘Response’) (OECD, 2003).
Although the PSR framework, which originated in environmental statistics, shows clear limitations when applied to SD, it was nevertheless adapted by various organisations. The UN Commission on Sustainable Development (UNCSD), for instance, used a modified ‘Driving force-State-Response’ (DSR) framework, and the European Environment Agency (EEA) adopted a ‘Driving force-Pressure-State-Impact-Response’ (DPSIR) version. Despite these modifications, two weaknesses of the PSR framework remain: (i) uncertainties regarding the underlying causal linkages the framework implies, and (ii) oversimplification of complex inter-linkages between issues (Pintér et al., 2005; UNDESA, 2006).
In 1996, the UNCSD proposed a set of 134 SDIs within the Driving force-State-Response framework, linked to the thematic chapters of Agenda 21. In connection with this SDI set, the UNCSD launched an international testing programme aimed at advancing the understanding, development and use of SDIs by governments. Twenty-two countries covering all regions of the world participated in the testing programme, including seven EU Member States (Austria, Belgium, Czech Republic, Finland, France, Germany and the UK). In addition, Eurostat and a number of countries not officially participating in the testing were affiliated with the programme. The testing phase ultimately led to a revision of the UNCSD SDI set, resulting in a smaller but more policy-relevant set of SDIs (UNCSD, 2001; Eurostat, 2007).
Another result of the testing programme was the replacement of the DSR framework by one focusing on themes and sub-themes of SD (Pintér et al., 2005; UNCSD, 2001). This new approach was taken up by the European Commission when designing a “framework for indicators based on themes and sub-themes, which are directly linked to EU policy priorities” (European Commission, 2005). As a result, the Commission endorsed a set of 155 indicators (in the form of a hierarchical three-level pyramid), with 98 indicators forming the basis of Eurostat’s first SD monitoring report published in December 2005 (Eurostat, 2005, 2007).
Following the mandate of the renewed EU SDS, Eurostat has undertaken a review of the 2005 EU SDI set in 2006-2007. This review followed three objectives (European Commission, 2007):
The review of the EU SDI set was carried out by Eurostat in close cooperation with the working group on SDIs, composed of both statistical and policy representatives from EU and Member State levels. This working group was established in 2005 in order to further the work of the previous SDI task force and to “exchange and expand best practices to all Member States” (Eurostat, 2007). The revised EU SDI set was published in October 2007 in the annex to the Commission Staff Working Document accompanying the first EU SDS progress report. It represents the state of the art of SDIs at the EU level.
The following empirical parts of this ESDN Quarterly Report are based on a study that was carried out between January and June 2007 by the Department of Economics and Social Sciences at the University of Natural Resources and Applied Life Sciences (BOKU), together with RIMAS (the Research Institute for Managing Sustainability that operates the ESDN Office) at the Vienna University of Economics and Business Administration. The study was commissioned by Eurostat, the Statistical Office of the European Communities (Lot 2 of Eurostat’s tender No 2006/S 148-159080 on the “Improvement of the quality of the Structural and Sustainable Development Indicators”). The complete study can be downloaded here.
The objectives of the study were to
The analysis of objectives and indicators in Lisbon and SD strategies at the EU and national levels was implemented through three tasks:
The empirical findings summarised here focus on the results of the analysis of NSDS objectives and indicators only (task 2). The comparative analysis of the objectives and indicators of the Lisbon Strategy and respective NRPs will be summarised in the upcoming ESDN Quarterly Report March 2008.
At the beginning of the study, a so-called SISDI (Structural Indicators and Sustainable Development Indicators) database was set up. This database enabled us to conduct
A European comparison of SD objectives and indicators requires a common point of reference that helps to highlight similarities and differences across Member States. Here, the key challenges of the renewed EU SDS and the EU SDI set defined in the Communication “Sustainable Development Indicators to monitor the implementation of the EU Sustainable Development Strategy” from February 2005 were used as this common point of reference. The new EU SDI set could not serve as a common point of reference because it was still under development at the time the study was conducted.
In a first step, SDS Coordinators (most of whom are also ESDN Members) were contacted in order to verify or identify policy documents containing SD objectives and indicators. Based on the identified documents, SD objectives were entered into the database and categorised using the hierarchical scheme described in Table 2. Simultaneously, SDIs used in NSDSs as well as in accompanying documents (monitoring, indicator and progress reports) were entered into the database and, if possible, linked to the already listed objectives and actions. In an additional step, SD objectives and indicators were also linked to the key challenges of the EU SDS and the EU SDI set adopted in 2005. In a last step, we also collected information regarding strategy revisions, monitoring methods and trends in the use of indicators. Overall, the results of the study give a comprehensive picture of SD objectives and indicators used across Europe.
Table 2: Classification of NSDS objectives
|Priority category||Classification criterion|
|“Top-level goals”||Priorities explicitly named in the document; top-level goals and measures|
|“High-level priorities”||Priorities or explicit goals clearly nested within explicit priorities or top-level goals|
|“Key issues / Measures”||Measures, guidelines, actions (or related terms) for which indicators are set up|
When interpreting the findings of the study, challenges and limitations, mainly regarding country coverage and the reference documents used for the analysis, have to be kept in mind. First, we covered fewer countries than foreseen, for three main reasons:
As a result, 24 instead of the planned 34 countries were covered by the analysis. Among the 24 countries are the EU Member States that had an NSDS in place (does not apply to Bulgaria, Cyprus, Hungary and Spain) and that was also available in English (does not apply to France, Greece, Luxembourg, Poland and Portugal). However, thanks to the support of the national SDS Coordinators as well as experts from Eurostat, we were able to partly include SD objectives and indicators from France and Greece as well as SD indicators from Luxembourg in the analysis. The study also covered the non-EU countries Iceland, Norway and Switzerland.
For countries with an NSDS available in English, the strategy document and related SD objectives were easy to identify and categorise. The situation for indicators was sometimes less clear because not all countries list their SDIs in their strategy documents, or they have updated them later on. In virtually all countries, the SDI set originally adopted with the NSDS differed from the set used later in progress and indicator reports. As a consequence, we used the most recently published SDI set in the analysis.
Another challenge emerged regarding the actual size of SDI sets. Almost all countries, especially those that used graphs and figures to illustrate their indicators, used indicators that aggregate several independent component indicators. The UK SDI for ‘road freight’, for instance, is composed of data that corresponds to two separate EU SDIs (‘greenhouse gas emissions by transport’ and ‘volume of freight transport and GDP’). To make the study results comparable, the SDIs entered into the database were broken down into their basic component indicators. This implies that the number of indicators in the project’s database sometimes differs from the ‘official’ number of indicators of a country (for the UK, for instance, we entered 147 indicators into the database instead of the 68 ‘official’ indicators).
This section gives an overview of the development of NSDSs in Europe and characterises them in terms of type, focus, structure and objectives.
Key drivers behind the development of SD Strategies in Europe were the 1992 Rio World Summit, the 2001 Gothenburg European Council, the 2002 Johannesburg World Summit, and finally the 2006 EU SD Strategy (for details see Steurer & Martinuzzi, 2007). As Table 3 shows, only a few countries (namely Iceland, Ireland, Switzerland and the UK) developed their first NSDS as early as the mid-1990s in response to the 1992 Rio Earth Summit, with many more following in the early 2000s around the 2002 Johannesburg World Summit. The renewed EU SDS from 2006 appears to have prompted some latecomers to elaborate their first SD Strategy, and others to review their existing approaches. Currently, a majority of EU Member States plan to review their NSDS within the next two years (for details, see the right column in Table 3). The years stated in bold in Table 3 indicate the strategy documents that were included in the empirical analysis (countries not included in the analysis are displayed in italics).
Table 3: Overview of European NSDSs
|Country||First NSDS (year)||Revision(s) of NSDS (years)||Occasion for developing / reviewing the NSDS||Current revisions|
|Belgium||1999 (federal level only)||2004 (federal level only)||-||federal SDS: ongoing;
|Bulgaria||2007 (draft)||-||renewed EU SDS||ongoing|
|Cyprus||2007 (draft)||-||renewed EU SDS||ongoing|
|France||2002||2006||WSSD (2002), renewed EU SDS (2006)||-|
|Hungary||2007 (draft)||-||renewed EU SDS||ongoing|
|Ireland||1997||(2002)*||Rio (1997), WSSD (2002)||ongoing|
|Malta||2006||-||-||scheduled for 2011|
|The Netherlands||2001||2003 (SD Action Plan)||-||ongoing|
|Norway||2002||2004 (SD Action Plan)||WSSD (2002)||ongoing|
|Portugal||2006||-||renewed EU SDS||-|
|Slovakia||2001||2004 (SD Action Plan)*||-||-|
|Spain||2007 (draft)||-||renewed EU SDS||ongoing|
|Sweden||2002||2004, 2006||WSSD (2002)||scheduled for 2010|
|UK||1994||1999, 2005||Rio (1994)||-|
* NSDS Update/Revision, did not replace the original NSDS
** Implementation plan ‘Sustainable Development in Icelandic Society’, no “real” NSDS
*** As the Luxembourgian NSDS was not available in English, only the 2002 SDI set was included in the analysis
**** ‘Slovenia’s Development Strategy’ represents also the Slovenian NSDS and NRP at the same time
As the European Commission’s (2004:11-14) Staff Working Document on NSDSs points out, strategy documents differ widely in various respects. Regarding the approach taken (or type), some documents communicate a bold vision with a few priorities on some dozen pages, while others present a bulk of (often vague) intentions and objectives on more than 200 pages. In order to increase policy coherence, countries structure their strategy documents in broad categories (such as “quality of life” or “living space” in Austria), around key sectors (such as transport, industry, energy, agriculture or employment, inter alia, in Lithuania), or along the three dimensions of SD. Although most SD Strategies cover all three dimensions of SD, emphases differ. Two strategies (Iceland, Italy) show a clear focus on the environmental dimension. Besides commitments regarding the global dimension of SD, which are stated in virtually all NSDSs, some countries emphasize additional dimensions such as culture (Estonia, Lithuania, Slovakia and Slovenia) or governance (Czech Republic, the Netherlands). As Table 4 shows in detail, several countries put a special emphasis on research and education (Czech Republic, Finland, Latvia, Lithuania, the Netherlands, Slovakia, Slovenia, Switzerland), sustainable communities, including spatial development and housing (Denmark, Finland, Ireland, Latvia, Lithuania, Slovenia, Switzerland and the UK), or tourism (Latvia, Lithuania, Malta).
However, the most significant difference between NSDSs is related to the document structure. While a number of strategies show a clear link and hierarchy of objectives and actions/measures, others do not specify how (in particular cross-sectoral) objectives are supported by implementation measures. Instead, actions and measures are specified in independent chapters referring to various policy sectors (such as air, water, forestry, agriculture, industry, transport, energy, etc.).
Table 4: Structure and scope of European NSDSs
|Country||NSDS structure||Number of objectives and actions/measures*||Coverage of the 3 SD dimensions||Additional dimensions / priority areas|
|Austria||clearly hierarchical; 4 ‘fields of action’, each consisting of 5 key objectives; plus an additional main objective (finance)||159 (5/23/131)||equally covered||international|
|Belgium||matrix; 6 themes with 31 ‘actions for SD’||230 (6/31/193)||emphasis of social dimension|
|Czech Republic||hierarchical; 6 priority areas (3 SD pillars + 3 cross-cutting areas)||167 (6/17/144)||equally covered||international; governance; R&D/education|
|Denmark||mixed; 8 key objectives, plus 13 priority areas with actions/measures||200 (21/87/92)||equally covered||international; housing|
|Estonia||clearly hierarchical; 4 ‘goals’ (3 SD pillars + culture), each comprising 3 ‘sub-goals’ and a number of actions||32 (4/12/16)||equally covered||culture|
|Finland||hierarchical; 6 main priority areas||186 (6/26/154)||equally covered||international; R&D/education; sust. communities|
|France||mixed; objectives structured according to the EU SDS; actions described in a separate part of the NSDS||75 (9/50/16)**||equally covered||international; R&D/education|
|Germany||mixed; 21 objectives embedded in the 3 SD pillars + international dimension; 8 additional priority areas||25 (4/21/0)***||equally covered||international|
|Greece||hierarchical; 5 priority areas||56 (5/25/26)****||emphasis of environmental and social dimension||international|
|Iceland||clearly hierarchical; 4 priority areas, each with 3-6 objectives, on average 3 ‘sub-goals’ per objective||72 (4/17/51)||only environmental dimension covered||international|
|Ireland||hierarchical; 7 priority areas, 2 comprising a comprehensive number of objectives and key issues||193 (7/16/170)||emphasis of environmental and economic dimension||international|
|Italy||hierarchical; 4 priority areas, very detailed key issues, often supported by indicators||142 (4/28/110)||only environmental dimension covered|
|Latvia||mixed; 16 priority areas with objectives and key issues; 10 additional main objectives (‘goals’)||319 (26/79/214)||equally covered||R&D/education; housing; tourism|
|Lithuania||mixed; 16 priority areas with objectives and actions; 11 additional main objectives||610 (27/48/535)||equally covered||culture; R&D/education; housing; tourism|
|Malta||hierarchical; 4 priority areas (3 SD pillars + cross-cutting issues)||246 (4/28/214)||equally covered||tourism|
|The Netherlands||mixed; actually two parts: ‘national strategy’ with 12 priority areas, and ‘international strategy’ with 6 priority areas||89 (13/22/54)****||equally covered||governance; R&D/education|
|Norway||hierarchical; 7 priority areas||167 (7/17/143)||equally covered||international; one priority area dedicated to the Sami people|
|Romania||mixed; the NSDS mainly aimed to introduce the SD concept in Romania||NSDS does not clearly specify objectives and actions||emphasis of social and economic dimension|
|Slovakia||mixed; 10 ‘long-term priorities’; additionally 28 ‘strategic objectives’ with actions||277 (11/28/238)||emphasis of social and economic dimension||culture|
|Slovenia||hierarchical; 5 priority areas (only one dedicated to SD)*****||169 (5/19/145)||emphasis of economic dimension||culture; R&D/education|
|Sweden||hierarchical; 4 thematic ‘strategic challenges’ plus further 4 priority areas (relating to implementation)||119 (8/19/92)||emphasis of social dimension|
|Switzerland||clearly hierarchical; 10 priority areas with (on average) 2 objectives each||32 (10/22/0)||emphasis of social and economic dimension||international; R&D/education; sust. communities|
|UK||hierarchical; 6 priority areas (including the 4 ‘shared priorities’ from the UK SD framework)||160 (6/33/121)||equally covered||international; sust. communities|
* total number of objectives and actions; in brackets: top-level goals / high-level priorities / key issues & measures; see Table 2 for details about this classification
** numbers relate to the first part of the NSDS only (“strategic objectives and instruments”); second part (“programmes of action”) not available in English
*** as some priority areas (in particular the ‘key focal points’) are not clearly supported by operational objectives, actions or indicators, only part of the strategy, namely the 21 objectives that clearly refer to indicators, has been entered in the database
**** numbers refer to an English summary of the NSDS only; complete NSDS not available in English
***** ‘Slovenia’s Development Strategy’ accounts for both the NSDS and the NRP, thus clearly focusing on economic issues related to the EU Lisbon Strategy
Table 4 also provides an overview of the total number and hierarchy of objectives stated in SD strategies. The number of objectives ranges from 32 (Estonia) to 610 (Lithuania). As mentioned in the introduction above, some of these objectives are vague, while others are SMART (i.e. Specific, Measurable, Achievable, Relevant and Timed).
Good examples of vague objectives in SD strategies are the following:
Good examples of SMART objectives, or at least of objectives that are specific, measurable and timed (the relevance and achievability of an objective is not self-evident), are the following:
This section presents a cross-country comparison of SD objectives in Europe. It shows to what extent the analysed NSDSs address priority areas (i.e. key challenges and cross-cutting policies) of the renewed EU SDS from 2006. The full lists of SD objectives for 10 countries can be downloaded from the respective country profiles (category ‘basic information’) on the ESDN homepage.
When interpreting the results of this comparison, it is important to keep in mind that a low score only indicates how little an SD Strategy refers to a priority or challenge of SD as it is framed in the EU SDS. An issue with a low score may nevertheless be strongly reflected in the NSDS, only framed differently than in the EU SDS.
Figure 2: European coherence regarding SD objectives (for key challenges and cross-cutting policies)
Note: The colour code used in Figure 2 indicates the degree to which a country’s SD objectives and actions refer to the objectives of the seven key challenges and four cross-cutting policy fields identified in the EU SDS from 2006. The UK’s NSDS, for example, addresses more than two-thirds of the objectives related to the EU SDS key challenge ‘climate change’ (dark-green), between one-third and two-thirds of the EU SDS objectives referring to ‘conservation and management of natural resources’ (light-green), and less than one-third of the objectives and actions specified under the key challenge ‘sustainable transport’ (beige). Key challenges and cross-cutting policies that are not addressed by an NSDS are highlighted in orange.
Before drawing conclusions from this cross-country comparison, we want to give a brief explanation of how national peculiarities influence the results. First, the results for Greece and the Netherlands are based on an English summary of the NSDSs because the complete strategy documents were not available in English. Second, the Romanian NSDS is not a “conventional” strategy, as it mainly aimed to introduce the concept of SD rather than to specify detailed objectives and actions. Third, the relatively weak coherence in the case of Slovenia may be due to the fact that the Slovenian NSDS also serves as its Lisbon NRP. Fourth, as some priority areas of the German NSDS (in particular the ‘key focal points’) are not clearly supported by operational objectives, actions or indicators, only parts of the strategy (which may not comprehensively reflect all German SD priority areas) have been entered in the database.
Figure 2 shows that a majority of countries address the key challenge ‘Conservation and management of natural resources’ both comprehensively and coherently. To a lesser degree, the same applies to ‘Climate change and clean energy’. This indicates that environmental issues are still the major ingredients of SD strategies (see also Figure 4). As regards social issues (in particular the key challenges ‘Public health’, ‘Social inclusion […]’ and ‘Global poverty’), the picture becomes more ambiguous. Strategies emphasising the environmental dimension (such as those of Iceland and Italy) tend to neglect the social dimension of SD, and they consider economic issues only insofar as they affect environmental issues (i.e. when it comes to integrating environmental concerns into economic policies). Countries that comprehensively address the key challenge ‘Social inclusion […]’ have one point in common: their NSDSs were developed by involving various stakeholders from civil society (Austria: stakeholder dialogue; Czech Republic: CSD; Finland: FNCSD; Slovakia: REC). The cross-cutting policy field addressed most comprehensively and coherently across Europe is ‘education and training’ for SD; the one addressed least coherently is ‘research and development’ for SD (see also Figure 4). Overall, several of the countries that address the four cross-cutting policy areas of the EU SDS most coherently (in particular Austria, the Czech Republic, Finland and France) show a similar picture of coherence for the seven SD key challenges.
Figure 3 summarises the findings shown in Figure 2 for the vertical country axis by using the same colour code. It shows that 13 of the 23 NSDSs address all seven key challenges of the renewed EU SDS. Denmark, Finland, the Czech Republic, France and Belgium stand out because they address six out of seven SD key challenges to a high or medium degree. As regards Finland and France, their strong coherence with the EU SDS can also be explained by the fact that they renewed their SD Strategies after the EU SDS was adopted by the European Council in 2006. Figure 3 also shows that several NSDSs (in particular those of Romania, Greece and Switzerland) obviously address SD in different ways than the EU SDS does.
Figure 3: Overview of how countries address the seven EU SDS key challenges
Note: The colour code used in Figure 3 indicates the degree to which a country’s SD objectives and actions refer to the key challenges as identified in the EU SDS from 2006. The Danish NSDS, for example, addresses the objectives of three EU SDS key challenges to a high degree (dark-green), three others to a medium degree (light-green), and only one to a lower degree (beige). On the other hand, the Romanian NSDS addresses only four of the seven EU SDS key challenges to a lower degree, and three others not at all.
Figure 4 summarises the findings presented in Figure 2 for the horizontal axis (depicting key challenges and cross-cutting policy fields) by using the same colour code. In addition to the interpretation given above, one can note here that ‘global poverty’ is the key challenge addressed least coherently across Europe. This finding is remarkable not only because the ‘international dimension’ of SD (in particular North-South relations) has been at the core of the SD concept since the Brundtland Report (WCED, 1987), but also because many countries cover it in extra chapters of their SD Strategies (see Table 4).
Figure 4: Overview of how the seven SD key challenges are addressed by NSDSs
Note: The colour code used in Figure 4 indicates the degree to which the key challenges and cross-cutting policy areas of the renewed EU SDS are addressed by objectives and actions in NSDSs. The key challenge ‘conservation and management of natural resources’, for instance, is addressed by six SD Strategies to a high degree (dark-green), by ten to a medium degree (light-green), by six to a low degree (beige), and by one NSDS not at all.
As mentioned in the introduction, setting SD objectives and measuring progress towards them with indicators are two closely related features typical of all SD Strategies in Europe. Table 5 provides an overview of how SD indicators (SDIs) are employed in SD strategy processes across Europe.
Table 5: Overview of SDIs in European SD Strategy processes
|Country||SDIs in SD strategy processes||Updates / revisions of SDIs||Other indicator processes related to SD||Reporting based on SDIs|
|Austria||SDIs included in 2002 NSDS||SDIs reviewed in 2006; 2007 indicator report based on new SDI set||-||Indicator reports published in 2004, 2006 and 2007|
|Belgium||No SDIs in federal SDS, but included in federal reports on SD||SDI set modified with each federal report||-||Federal reports on SD published in 1999, 2003 and 2005|
|Czech Republic||SDIs included in 2004 NSDS||Progress reports based on a smaller set of SDIs than presented in NSDS||-||Progress reports based on SDIs published in 2006 and 2007|
|Denmark||SDIs included in 2002 NSDS||-||-||Indicator report (headline indicators only) published in 2005|
|Estonia||Preliminary set of SDIs included in 2005 NSDS||-||Separate SDI set published by Statistical Office||Indicator reports published in 2004 and 2006 (based on Stat. Office’s SDIs)|
|Finland||SDIs included in 2006 NSDS||SDIs reviewed in 2006 in conjunction with revision of NSDS||-||no progress or indicator reports published yet|
|France||SDIs included in 2006 NSDS||new SDI set in 2006 NSDS; based on 2005 EU SDIs||-||no progress or indicator reports published yet|
|Germany||SDIs included in 2002 NSDS||Small modifications of SDIs with each progress/indicator report||-||Progress reports (including SDIs) published in 2004 and 2005, indicator report published in 2006|
|Greece||No SDIs included in 2002 NSDS||Elaboration of new SDI set planned||SDIs published in a separate report (2003)||No monitoring of NSDS with SDIs undertaken|
|Iceland||SDIs included in 2002 NSDS||Updated SDI set in 2006 indicator report||-||Indicator report published in 2006|
|Ireland||No SDIs included in 1997 NSDS and 2002 NSDS review||Elaboration of SDI set planned||Indicators, inter alia including SDIs, published by National Economic and Social Committee (NESC) and Central Statistics Office (CSO)||NESC indicators published in 2002; annual CSO indicator reports since 2003|
|Italy||SDIs included in 2002 NSDS||-||-||No progress or indicator reports published yet|
|Latvia||SDIs included in 2002 NSDS||-||Separate SDI set published by Latvian Environment Agency (LEA)||Indicator report published in 2003 (based on LEA indicators)|
|Lithuania||SDIs included in 2003 NSDS||-||-||SDIs published in 2004 ‘Statistical Yearbook’|
|Luxembourg||SDIs included in 1999 NSDS||Indicator reports (2002 and 2006) based on an updated SDI set||-||Indicator reports published in 2002 and 2006|
|Malta||SDIs included in 2006 NSDS||-||-||No progress or indicator reports published yet|
|The Netherlands||SDIs included in 2001 NSDS; no SDIs included in 2003 SD action programme||-||SDI set published by the Dutch Environmental Assessment Agency (MNP)||‘Sustainability Outlook’ published in 2004, includes the list of MNP indicators|
|Norway||Preliminary SDI set included in 2004 SD action plan; revised SDI set published in 2005||-||-||Indicator report published in 2005|
|Romania||Preliminary SDIs included in NSDS||-||-||-|
|Slovakia||SDIs included in NSDS||-||-||No progress or indicator reports published yet|
|Slovenia||No SDIs included in NSDS; SDIs included in annual ‘development reports’||-||-||Annual ‘development reports’ largely based on indicators|
|Sweden||SDIs included in 2006 NSDS||-||-||No progress or indicator reports published yet|
|Switzerland||No SDIs included in 2002 NSDS; SDIs published in 2004||SDIs updated in 2007||-||Indicator reports published in 2004 and 2005|
|UK||SDIs included in 1999 and 2005 NSDSs||2006 indicator report based on updated SDI set||-||Indicator reports published in 2006 and 2007|
As Table 5 shows, a majority of the countries covered here not only monitor their SD performance, but also report on it regularly in monitoring, progress or indicator reports in the course of their SD Strategy cycle.
Regarding the linkage between SD objectives and indicators, two basic approaches to developing SDIs can be distinguished. Under the so-called ‘model-based approach’, SDIs are developed on the basis of an underlying model of SD. The risk of this approach is that the SDI set does not reflect political priorities and may lack political salience. Under the so-called ‘policy-based approach’, SD objectives are defined in political documents, and the respective SDIs are derived from them. Because policies change over time, the corresponding SDI set also has to be revised continuously, which can make it difficult to track long-term trends (Hass, 2006). Only four countries (Austria, Belgium, Norway and Switzerland) explain the approach taken in developing their SDI set; all four use the model-based approach.1 Most other countries seem to follow the policy-based approach, linking their SDIs to SD Strategy objectives. Consequently, some countries (such as Denmark, Germany, Iceland, Italy and Slovenia) feature a strong and direct link between SD objectives and indicators, making it easy to monitor the SD Strategy. Other countries (such as Belgium, Lithuania, Sweden and Switzerland) feature a weaker linkage between objectives and indicators. In countries where different institutions publish different SDI sets (for example, Estonia and Latvia), the integration of SDI monitoring into the SD Strategy cycle could be improved.
1 Austria and Belgium developed their SDIs based on the DPSIR framework (see ‘From Rio to the current EU SDI set’), Norway used a capital-approach, and Switzerland developed its own model based on the Brundtland definition.
As already described above, the main function of indicators is to summarize and communicate complexity with a manageable amount of meaningful information. The size of an indicator set therefore needs to be limited in order to avoid information overload for data users. Because a long list of indicators can be counterproductive for communicating SD trends and guiding policies, only a limited number of indicators is usually selected to describe a broader subject (OECD, 2003; Pintér et al., 2005). Thus, “the strength and weakness of indicators lie in their selection, which facilitates decision making but also opens the door to data manipulation” (Bartelmus, 2007).
Table 6 summarises some key characteristics of SDI sets across Europe. It shows that SDI sets across Europe differ strongly with respect to their size. While some countries have a small set with about 20 (headline) indicators (such as France, Germany and Norway), others use rather comprehensive sets with more than 100 indicators (such as Italy, Latvia, Switzerland and the UK). Some of these countries also use a smaller number of headline indicators for communication purposes. A few countries (Finland, Italy, Slovakia and Slovenia) also use aggregate indices such as the Human Development Index (HDI) or the Ecological Footprint. Austria and the UK also explicitly list so-called ‘best-needed’ indicators, i.e. indicators that still need to be developed (due to methodological issues or a lack of data). For some countries, more than one SDI set can be derived from different documents, often one from the NSDS and another from indicator or progress reports (Czech Republic, Denmark, Estonia and Latvia). In these cases, the SDI sets were aggregated in the course of the analysis (indicators identical in both sets were counted only once) in order to allow cross-country comparisons beyond the level of individual documents (NSDS, indicator report).
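The aggregation rule used here (indicators appearing in both a country's NSDS and its indicator report are counted only once) amounts to a simple set union. A minimal sketch in Python; the indicator names are purely illustrative, not taken from any actual SDI set:

```python
# Hypothetical excerpts from two SDI lists of one country:
# one from its NSDS, one from a later indicator report.
nsds_indicators = {"GHG emissions", "GDP per capita", "Employment rate"}
report_indicators = {"GHG emissions", "Life expectancy"}

# Indicators identical in both sets are counted once: a set union.
aggregated = nsds_indicators | report_indicators
print(len(aggregated))  # 4 unique indicators, not 5
```

This is why, for example, the aggregated counts in Table 6 can be smaller than the sum of the two document-level counts.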
Table 6 : Characteristics of European SDI sets
|Country||Number of SDIs||Headline SDIs||Sources and comments||Composite SD indicators / indices|
|Austria||95||35||2006 monitoring report; some of the indicators are ‘best-needed’ indicators which still need to be developed||-|
|Belgium||45||-||2005 federal report on SD||-|
|Czech Republic||100||-||87 indicators in 2004 NSDS; 37 in 2006 progress report||-|
|Denmark||119||14||102 indicators in 2002 NSDS; 28 in 2005 indicator report||-|
|Estonia||95||-||42 indicators in 2005 NSDS; 60 indicators in 2004 and 2006 indicator reports||-|
|Finland||35||-||2006 NSDS||Environmental Sustainability Index; Human Development Index|
|Germany||28||28||2006 indicator report||-|
|Greece||70||-||2003 indicator report||-|
|Iceland||56||-||2006 indicator report||-|
|Ireland||93||-||2006 indicator report; no official SDI set||-|
|Italy||190||-||2002 NSDS||Ecological Footprint, MIPS (Material Input Per Service unit)|
|Latvia||187||-||98 indicators in the 2002 NSDS; 126 in the 2003 indicator report||-|
|Luxembourg||27||-||2002 and 2006 indicator reports||-|
|The Netherlands||32||-||2004 report ‘Sustainability Outlook’; preliminary SDI set||-|
|Norway||16||-||2005 indicator report||-|
|Romania||13||-||1999 NSDS; no official SDI set||-|
|Slovakia||71||-||2001 NSDS||Human Development Index|
|Slovenia||71||-||2006 development report||Human Development Index|
|Switzerland||163||-||2004 indicator report||-|
|UK||147||27||2006 indicator report; some of the indicators are ‘best-needed’ indicators which still need to be developed||-|
This Section compares the SDI sets used in 24 European countries with the EU SDI set from 2005. The full lists of SDIs can be downloaded for 14 countries in the respective country profiles (category ‘SDI and monitoring’) at the ESDN homepage. When interpreting the results of this comparison, it is important to keep in mind that a low score refers to the way a theme was framed in the EU SDI set: the issue may be strongly reflected in national SDI sets, but framed differently than in the EU SDI set.
Before drawing conclusions from this cross-country comparison, we briefly highlight some methodological challenges behind the data. First, the results for Ireland are based on the 2006 indicator set published by the Central Statistics Office (CSO), which is not an “official” SDI set. Second, for the Netherlands, we included a preliminary SDI set published by the Dutch Environmental Assessment Agency (MNP). Third, the Romanian NSDS is not a “conventional” strategy, as it mainly aimed to introduce the concept of SD rather than specify detailed objectives and indicators. Nevertheless, we derived a set of 13 SDIs from the strategy document and included them in the analysis.
Figure 5 shows the degree to which the SDI sets of 24 European countries address the indicators of the 10 EU SDI framework themes. Since SDI sets differ strongly across countries in terms of both size and themes, and the EU SDI set from 2005 consisted of 166 indicators, it is no surprise that, so far, SDIs are less coherent than SD objectives across Europe (see Figure 2).
Figure 5 : European coherence regarding SD indicators (compared to the 2005 EU SDI framework)
Note: The colour code used in Figure 5 indicates the degree to which a country’s SDI set addresses the themes of the 2005 EU SDI framework. For instance, the SDI set of the Czech Republic addresses more than two-thirds of the indicators of the EU SDI theme ‘economic development’ (dark-green), but it addresses less than one-third of the indicators specified in the theme ‘public health’ (beige). Themes that are not addressed by national SDI sets are highlighted in orange.
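The colour code rests on simple coverage thresholds: more than two-thirds of a theme's indicators addressed counts as high (dark-green), less than one-third as low (beige), none at all as orange, with medium (light-green) in between. A sketch of that classification rule in Python; the exact boundary behaviour at one-third and two-thirds is our reading of the note, not stated in the report:

```python
def coverage_category(addressed: int, total: int) -> str:
    """Map the share of an EU SDI theme's indicators addressed by a
    national SDI set to the report's colour categories. Boundary
    placement at exactly 1/3 and 2/3 is inferred, not specified."""
    if total <= 0:
        raise ValueError("theme must contain at least one indicator")
    share = addressed / total
    if share == 0:
        return "not addressed (orange)"
    if share < 1 / 3:
        return "low (beige)"
    if share <= 2 / 3:
        return "medium (light-green)"
    return "high (dark-green)"

print(coverage_category(8, 10))  # high (dark-green)
print(coverage_category(2, 10))  # low (beige)
```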
The EU SDI framework themes ‘economic development’ and ‘climate change and energy’ are clearly the ones addressed most coherently. ‘Public health’ is another prominent issue in all national SDI sets analysed. In contrast, countries use few or different indicators for the themes ‘good governance’ and ‘global partnership’. This is consistent with the revised EU SDI set from 2007, which no longer contains a headline indicator for ‘good governance’. As Eurostat’s 2007 monitoring report points out, “good governance is a new area for official statistics, which is reflected in the lack of robust and meaningful indicators on this topic” (Eurostat, 2007:268).
For Iceland and, to a lesser extent, Italy, Figure 5 again shows a clear emphasis on environmental issues (as for SD objectives). Thus, the linkage between objectives and indicators of the NSDSs in these two countries is strong. Other countries (for instance Sweden and Switzerland) show quite different emphases regarding SDIs and SD objectives (see Figure 2). This comparison will be discussed in more detail in the ‘Conclusions’ Section.
Figure 6 summarises the findings shown in Figure 5 for the vertical country axis, using the same colour code. Notably, the countries showing relatively high scores in addressing the EU SDI framework themes (Austria, the Czech Republic, Denmark, Switzerland and the UK) all use a comprehensive SDI set. As the EU SDI set from 2005 consists of 166 indicators, it is no surprise that countries with rather small indicator sets (in particular Germany, France and Norway) cannot address the EU SDI framework themes comprehensively.
Figure 6 : Overview of how comprehensively countries cover the EU SDI set
Note: The colour code used in Figure 6 indicates the degree to which national SDI sets refer to the indicators identified in the ten EU SDI framework themes from 2005. For instance, the SDI set of the Czech Republic addresses all EU SDI themes; one to a high (dark-green), six to a medium (light-green) and three to a low degree (beige). On the other hand, the Icelandic SDI set only addresses five of the ten EU SDI themes (to a low extent).
Figure 7 summarises the findings presented in Figure 5 for the horizontal axis (depicting the ten themes of the EU SDI framework), using the same colour code.
Figure 7 : EU SDI themes addressed by national SDI sets
Note: The colour code used in Figure 7 indicates the degree to which the EU SDI framework themes from 2005 are addressed by 24 national SDI sets. The theme ‘economic development’, for instance, is addressed by one SDI set to a high degree (dark-green), by nine to a medium degree (light-green), by twelve to a low degree (beige), and by two SDI sets not at all.
This report shows that SD objectives and indicators are two related key features of SD Strategies across Europe. It confirms that, “when developed and deployed together, they can play a mutually supportive and strengthening role” (Pintér et al., 2005:10). While SDIs increase the rigour and credibility of the SD governance cycle (from strategy formulation via implementation to strategy renewal), objectives in SD Strategies ought to provide a sense of direction for both actual policies and governance processes (including the monitoring of SD with indicators).
Overall, this report shows that SD objectives are more coherent than SDIs, and that the degree of coherence varies not only between countries but also across topics and themes. Interestingly, the coherence of SD objectives and indicators is strongest for environmental issues, in particular ‘climate change and clean energy’ and ‘natural resources’. The global dimension of SD, by contrast, shows the lowest degree of coherence in terms of both SD objectives and indicators. For social issues, the picture is ambiguous: while some social themes of the EU SDS (such as ‘social inclusion’ and ‘ageing society’) are addressed strongly by some countries, others neglect them in terms of objectives and/or indicators. The coherence of economic objectives and indicators will be the focus of the ESDN Quarterly Report March 2008, which will examine the Lisbon Strategy and the respective National Reform Programmes. It seems, then, that in the context of SD strategies, vertical policy integration (i.e. the integration of policies across different tiers of government) is stronger for environmental than for social policies, which underlines that environmental issues still play a dominant role in SD Strategies.
Regarding the link between SD objectives and indicators, most countries (in particular Austria, the Czech Republic, Denmark and the UK) show a similar pattern of coherence with the EU reference points. This also applies to Finland and France, where NSDSs and SDIs were renewed together after the adoption of the EU SDS in 2006 (with the limitation that both countries use a rather small SDI set that, as described above, cannot address the themes of the EU SDI framework as comprehensively as countries with large indicator sets). Although we did not examine the direct link between national SD objectives and indicators for methodological reasons, this finding suggests that these countries feature strong thematic linkages between the two SD Strategy features. Other countries, such as Estonia, Sweden and Switzerland, show different degrees of coherence for SD objectives and indicators. While for Estonia and Switzerland this variation may be due to the fact that their NSDS and SDI documents were elaborated in separate processes, it is surprising for Sweden, whose NSDS itself contains the SDI set.
When trying to understand the degree of coherence of SD objectives and indicators across Europe, a key question is to what degree similarities (and differences) result from top-down and/or bottom-up processes of vertical policy integration. Regarding objectives, it is important to note that, with the exceptions of France, Finland and Malta, all NSDSs were already in force when the renewed EU SDS was adopted by the European Council in June 2006 (for details, see Table 3). Thus, top-down vertical integration of SD objectives can only have played a limited role so far. However, since the EU SDS was developed with strong involvement of the Member States (for details, see the ESDN Quarterly Report June 2006), vertical integration certainly took place bottom-up. In other words, so far the objectives of the EU SDS seem to reflect the priority areas of the Member States more than the other way round.
Since most countries plan to revise their NSDSs in line with the renewed EU SDS, we can expect the coherence of SD objectives across Europe to increase considerably in the near future, driven mainly top-down. Bottom-up and top-down processes of vertical policy integration thus complement each other, not necessarily in parallel but sometimes at different times, owing to differently timed policy or strategy cycles. The situation is similar for SDIs: most countries developed their national SDI sets before the EU SDI set was adopted in 2005 and renewed in 2007. We can therefore expect that many national SDI sets will be revised in the next few years together with the overall SD strategy objectives. Most likely, this renewal will also increase the degree of European coherence regarding SDIs.
Overall, improving the coherence between SD objectives and indicators at the EU and national levels is an important step towards a European answer to unsustainable trends. However, what should not be overlooked is the linkage between SD objectives and SDIs within an SD strategy process. Fostering this linkage, adapting implementation efforts and renewing objectives accordingly is one key purpose of SD strategies in particular, and of what Steurer (2007) calls Strategic Public Management in general.
Bartelmus, P. (2007). Indicators of sustainable development. Encyclopedia of Earth Retrieved November 21, 2007, from http://www.eoearth.org/article/Indicators_of_sustainable_development
Bossel, H. (1999). Indicators for Sustainable Development: Theory, Method, Applications. Winnipeg, Manitoba: IISD.
Dalal-Clayton, B., & Krikhaar, F. (2007). A New Sustainable Development Strategy: An Opportunity Not To Be Missed. Report of a Peer Review of The Netherlands Sustainable Development Strategy (No. A.10). Den Haag: RMNO.
Doran, G. T. (1981). There's a S.M.A.R.T. Way to Write Management Goals and Objectives. Management Review (AMA Forum), November, 35-36.
European Commission. (2004). National Sustainable Development Strategies in the European Union. A first analysis by the European Commission. Commission Staff Working Document.
European Commission. (2005). Communication from Mr. Almunia to the Members of the Commission. Sustainable Development Indicators to monitor the implementation of the EU Sustainable Development Strategy (No. SEC(2005) 161 final). Brussels: Commission of the European Communities.
European Commission. (2007). Accompanying document to the Progress Report on the European Union Sustainable Development Strategy 2007. Commission Staff Working Document (No. SEC(2007) 1416). Brussels: Commission of the European Communities.
Eurostat. (2005). Measuring progress towards a more sustainable Europe. Sustainable development indicators for the European Union. Luxembourg: Office for Official Publications of the European Communities.
Eurostat. (2007). Measuring progress towards a more sustainable Europe. 2007 monitoring report of the EU sustainable development strategy. Luxembourg: Office for Official Publications of the European Communities.
Favell, I. (2004). The Competency Toolkit. Ely, Cambridgeshire: Fenman.
Hass, J. (2006). Challenges in establishing Sustainable Development Indicators (Working Paper prepared for the 2nd meeting of the Joint UNECE/OECD/Eurostat Working Group on Statistics for Sustainable Development). Oslo.
IIED. (2002). Sustainable Development Strategies: A Resource Book. Paris and New York: OECD and UNDP.
Mintzberg, H., Ahlstrand, B., & Lampel, J. (1998). Strategy Safari: A Guided Tour Through the Wilds of Strategic Management. New York: The Free Press.
OECD. (2001). The DAC Guidelines. Strategies for Sustainable Development. Paris: OECD.
OECD. (2003). OECD Environment Indicators. Development, Measurement and Use. Paris: OECD.
Pintér, L., Hardi, P., & Bartelmus, P. (2005). Sustainable Development Indicators. Proposals for a Way Forward. Winnipeg, Manitoba: IISD.
Poister, T. H., & Streib, G. D. (1999). Strategic management in the public sector. Public Productivity and Management Review, 22, 308-325.
Rametsteiner, E., Pülzl, H., Bauer, A., Nussbaumer, E., Weiss, G., Hametner, M., Tiroch, M., & Martinuzzi, A. (2007) Analysis of national sets of indicators used in the National Reform Programmes and Sustainable Development Strategies. Luxembourg: Eurostat & Office for Official Publications of the European Communities.
Schick, A. (1999). Opportunity, strategy, and tactics in reforming public management.
Steurer, R. (2002). Der Wachstumsdiskurs in Wissenschaft und Politik: Von der Wachstumseuphorie über 'Grenzen des Wachstums' zur Nachhaltigkeit („The Economic Growth Controversy in Science and Politics: From ‚Growthmania’ to Sustainable Development, via ‚Limits to Growth’“). Berlin: Verlag für Wissenschaft und Forschung.
Steurer, R. (2007). From Government Strategies to Strategic Public Management: an Exploratory Outlook on the Pursuit of Cross-Sectoral Policy Integration. European Environment, 17, 201-214.
Steurer, R., & Martinuzzi, A. (2005). Towards a new pattern of strategy formation in the public sector: first experiences with national strategies for sustainable development in Europe. Environment and Planning C: Government and Policy, 23, 455-472.
Steurer, R., & Martinuzzi, A. (2007). Editorial: From Environmental Plans to Sustainable Development Strategies. European Environment, 17, 147-151.
UN. (1992). Agenda 21. New York: United Nations.
UNCSD. (2001). Indicators of Sustainable Development: Guidelines and Methodologies.
UNDESA. (2002). Guidance in Preparing a National Sustainable Development Strategy: Managing Sustainable Development in the New Millennium. New York: UN Department of Economic and Social Affairs.
UNDESA. (2006). Global Trends and Status of Indicators of Sustainable Development. New York: UN Department of Economic and Social Affairs.