Purpose

“We commit to engage in systematic follow-up and review of implementation of this Agenda over the next fifteen years. A robust, voluntary, effective, participatory, transparent and integrated follow-up and review framework will make a vital contribution to implementation and will help countries to maximize and track progress in implementing this Agenda in order to ensure that no one is left behind.”

The 2030 Agenda for Sustainable Development (UN 2015)

Follow-up and review is a key aspect of The 2030 Agenda for Sustainable Development. It requires ensuring that the statistical systems, capacities, methodologies and mechanisms are in place to track progress and ensure accountability, with the engagement of citizens, parliaments and other national stakeholders. This is especially critical with regard to the most excluded and marginalized populations, which are often not represented, or under-represented, in current national data collection. The 2030 Agenda also requires follow-up and review processes to be informed by country-led evaluation, and notes the need to build capacity for national data systems and evaluation programmes. The purpose of this section is therefore to provide guidance on approaches and tools for monitoring, reporting and accountability in relation to the implementation of national development plans and strategies.

Guidance

This section provides specific guidance in relation to key aspects of monitoring, reporting and accountability. The guidance addresses four specific aspects:

  1. Indicator development and data collection: to follow the progress of the Inter-agency and Expert Group on SDG Indicators (IAEG-SDGs) and begin working toward identifying nationally-relevant and human rights-sensitive indicators and targets, and establishing baseline data;
  2. Disaggregating data: the commitment to ‘leaving no one behind’ and to tackling inequality and discrimination in the SDGs will require going beyond averages to target efforts towards reaching the most excluded population groups. Doing so requires disaggregation of data by sex, age and other salient socio-economic characteristics, including income/wealth, location, class, ethnicity and disability status;
  3. Monitoring and reporting systems: to work with existing data and metadata reporting systems and to create online systems for information exchanges, including reporting on key indicators and providing opportunities for both horizontal and vertical coordination; and
  4. Review processes and mechanisms: for reviewing progress on nationally and sub-nationally adapted SDGs.

Indicator Development and Data Collection

“The Goals and targets will be followed-up and reviewed using a set of global indicators. These will be complemented by indicators at the regional and national levels which will be developed by member states, in addition to the outcomes of work undertaken for the development of the baselines for those targets where national and global baseline data does not yet exist. The global indicator framework, to be developed by the Inter Agency and Expert Group on SDG Indicators, will be agreed by the UN Statistical Commission by March 2016 and adopted thereafter by the Economic and Social Council and the General Assembly, in line with existing mandates. This framework will be simple yet robust, address all SDGs and targets including for means of implementation, and preserve the political balance, integration and ambition contained therein.”

The 2030 Agenda for Sustainable Development

The most important guidance to be provided at this early point in time with regard to indicator development is to stress the importance of the country’s implementing/coordinating agency establishing a partnership as soon as possible with the agency that currently tracks progress indicators for the national development plan or strategy. In most countries this would require close coordination between the National Statistical Office, other data producers within the National Statistical System (such as line ministries) and the Ministry of Planning or the specialized agency designated to lead the implementation of the national development strategy. This would ideally take place in the public awareness stage (see Section B1). Both partners can then follow the progress of the Inter-agency and Expert Group on SDG Indicators (IAEG-SDGs) and begin working toward identifying nationally-relevant indicators that can be used to track progress toward nationally-adapted SDGs[1]. This type of indicator assessment can, in fact, be initiated from knowledge of the specific SDG targets alone. For example, in the case of Germany, recommendations from the German Council for Sustainable Development (RNE) to the federal ministries have already discussed which indicators are relevant and in need of amendment (see Innovative Case Example in Section B3).

As summarized by the European Sustainable Development Network (ESDN 2015), “In most countries, the National Statistical Offices are responsible for the development and monitoring of SD indicators (e.g. Estonia, France, Finland, Germany, Hungary, Italy, Portugal, Slovenia, Sweden, Switzerland). In other countries, different bodies have this responsibility, for instance, Belgium (Task Force on SD of the Federal Planning Bureau), Cyprus (Inter-Governmental Committee), or Denmark (Environment Protection Agency).”

Where indicator and data gaps are identified, proposals can be made to address them, including establishing baseline data. In countries with limited national statistical capacity, the revision of the National Strategy for the Development of Statistics and the elaboration of five-year or ten-year plans for data collection for the monitoring and evaluation of the SDGs can be undertaken. The UN system can assist in creating a joint programme for the implementation of the data collection plan.
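At its simplest, the gap-identification step described above is a set comparison between the indicators a nationally-adapted target requires and those the national statistical system already produces. A minimal illustrative sketch (all indicator names below are hypothetical placeholders, not actual SDG indicator codes):

```python
# Illustrative indicator gap assessment. Indicator names are hypothetical.
required = {"poverty_rate", "stunting_rate", "water_access", "literacy_rate"}
currently_produced = {"poverty_rate", "literacy_rate"}

# Indicators the target needs but the statistical system does not yet produce:
# these become the candidates for new baseline data collection.
gaps = sorted(required - currently_produced)
print(gaps)
```

In practice such an inventory would be maintained jointly by the statistical office and the planning agency, but the underlying logic is exactly this comparison of "required" against "produced".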

Serious consideration should also be given to going beyond governance as usual and pursuing participatory-based monitoring opportunities (see Innovative Case Example below).

One lesson from the global conversation leading up to the adoption of The 2030 Agenda is that crowd-sourced data can be a powerful complement in advocating for policy change. Building on the MY World survey (see Innovative Case Example below), MY World 2030[2] will seek to build upon the global network of MY World partners and undertake a “people’s baselining” exercise as part of the global rollout of the SDGs.

Through an online, mobile and offline component, MY World 2030 will contribute to efforts to report back on progress by collecting globally comparable data to monitor how people feel their lives are changing. This data could feed into official monitoring efforts both locally and globally and contribute to an enhanced mechanism for effective monitoring and implementation of the goals. A second contribution will be to build dialogue between decision makers such as parliamentarians, local governments, mayors and citizens, with young people in particular to contribute a “people’s perspective” on how to implement the new agenda at different levels. It is envisaged that this dialogue will be aggregated at national, regional and global levels. Volunteers will be a key component for the new phase of offline rollout of the survey in order to enhance people’s engagement with the agenda beyond the collection of data. The demand for this has been demonstrated by the MY Municipality initiative in Macedonia and the continued expansion of U Report globally.

Box

Innovative Case Example: Participatory Monitoring and Data Collection

UNICEF – Peru: “UNICEF Peru, in its paper ‘Community Surveillance Systems for Early Childhood and Development: A participatory approach’, exemplified how community surveillance systems (CSS) in Peru were essential to the growth and development of children and pregnant mothers.” (UNDG 2015, p 19)

UNICEF U-Report: an innovative communication technology developed by UNICEF that “revolutionizes social mobilization, monitoring and response efforts: It equips mobile phone users with the tools to establish and enforce new standards of transparency and accountability in development programs and services” (UNICEF 2012).

Thailand iMonitor: “Thailand described how its iMonitor application for smart phones and other devices is tracking and evaluating public HIV services, as well as creating an opportunity for dialogue with authorities to address challenges.” (UNDG 2015, p 19)

Zambia M-WASH: “Zambia noted the use of M-WASH, a mobile/web-based monitoring, evaluation and reporting system that covers 1.7 million people and advances accountability by making water and sanitation data transparent. The technological component inspires competition among districts by publishing results and maps that demonstrate which districts and provinces are making the most progress towards improved access to water and sanitation.” (UNDG 2015, p 19)

MY World survey: The global MY World survey asked people to choose six out of sixteen key issues most important for themselves and their families. The survey gathered over 8.4 million votes through online, mobile and offline channels; 80% of the votes were collected offline through volunteer effort, and 80% of the voters were under 30 years of age. The survey facilitated dialogue among different stakeholders and increased interest in and momentum for The 2030 Agenda. Its open-source, real-time results were fed into the intergovernmental negotiations of the agenda and were used by many stakeholders for advocacy purposes.

Some countries in special circumstances, such as fragile states, small islands, or least developed countries, might need to evaluate whether the SDG indicator framework is sufficient to capture the specificities of their development needs. If additional indicators are required, countries are encouraged to look at existing commitments, statistical coordination groups and progress monitoring frameworks that might be able to guide their indicator selection process. For instance, the New Deal for Engagement in Fragile States and the Indicators to monitor SC Resolution 1325 might be able to capture the needs and specificities of fragile and conflict-affected countries, and region-specific indicators designed by ESCAP, ECLAC and the AU might be able to provide solutions for small islands, landlocked and least developed countries.

Disaggregating Data

The importance of the disaggregation of data was a critical lesson from the MDG implementation period. In The 2030 Agenda, the disaggregation of data will be one of the mechanisms for realizing the ‘leave no one behind’ principle. This aspect is so important that it forms the basis of SDG Target 17.18: “By 2020, enhance capacity-building support to developing countries, including for least developed countries and small island developing States, to increase significantly the availability of high-quality, timely and reliable data disaggregated by income, gender, age, race, ethnicity, migratory status, disability, geographic location and other characteristics relevant in national contexts.”

Box

MDG Lessons

Monitoring activities need to be sufficient in terms of coverage, disaggregation of data and timeliness

Looking ahead to the post-2015 era, more monitoring and evaluation investments are going to be required at the national as well as the international level to effectively monitor and evaluate the sustainable development goals. In the words of the Secretary-General, “we must significantly scale up support to countries and national statistical offices with critical needs for capacities to produce, collect, disaggregate, analyse and share data crucial to the new agenda” (see A/69/700, para. 142).

UNDG has also recommended that the United Nations development system “intensify support to strengthening of national statistical capacity, greater disaggregation and ‘localization’ of national data and address all data ‘dark spots’, using the distinctiveness of the United Nations global footprint and the capacities and scope of the United Nations system’s joint data coverage”.

Source: UN ECOSOC (2015).

It has been noted that, in the case of the MDGs, progress on the goals focussed on tracking changes in national averages. The focus on averages can mask disparities between groups and exclude population groups that may be among the poorest of the poor or the most vulnerable and marginalized (OHCHR 2015).

Therefore, the guidance imparted in this section for UNCTs is to recommend to Member States that when working with their statistical agencies in the formulation of indicators in support of nationally-relevant SDG targets, it is important to support the ‘data revolution’ by investing in the regular and systematic collection of disaggregated data in accordance with SDG Target 17.18. This might require larger sample sizes, specialized surveys to capture specific marginalized groups, as well as specific training for survey enumerators and recording officers (in the case of administrative records).

Addressing gaps in the production of gender statistics in particular will be critical for tracking progress in achieving the SDGs for women and girls. Moreover, to better capture intersectional inequalities throughout the framework, disaggregation by sex, age and other salient socio-economic characteristics, including income/wealth, location, class, ethnicity and other relevant characteristics will be required.
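To make the point about averages concrete, the sketch below uses invented survey figures (indicator values expressed as percentages) to show how a single national average can mask exactly the group-level gaps that SDG Target 17.18 asks countries to surface:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey records: each observation carries an indicator value
# plus the characteristics the data should be disaggregated by.
records = [
    {"value": 92, "sex": "F", "location": "urban"},
    {"value": 95, "sex": "M", "location": "urban"},
    {"value": 61, "sex": "F", "location": "rural"},
    {"value": 78, "sex": "M", "location": "rural"},
]

def disaggregate(records, dimension):
    """Group indicator values by one characteristic and average each group."""
    groups = defaultdict(list)
    for r in records:
        groups[r[dimension]].append(r["value"])
    return {key: mean(values) for key, values in groups.items()}

national_average = mean(r["value"] for r in records)  # the headline figure
by_location = disaggregate(records, "location")       # reveals the urban/rural gap
by_sex = disaggregate(records, "sex")                 # reveals the gender gap
```

Here the national average of 81.5 looks respectable, while the rural figure (69.5) and the figure for women (76.5) lag well behind; only the disaggregated view makes those left behind visible.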

Monitoring and Reporting Systems

“They [follow-up and review processes] will be open, inclusive, participatory and transparent for all people and will support the reporting by all relevant stakeholders.”

The 2030 Agenda for Sustainable Development

Online indicator information systems already exist in many countries for monitoring and reporting on progress toward the national development plan, strategy and/or MDGs. These systems can be updated to incorporate any new or revised indicators that are identified in the process of adapting the SDGs to national contexts (Section B3) and the indicator assessment described above.

For example, in Mexico a National Coordinating Committee helped put in place an MDG information system in 2011 that provides national and sub-national disaggregations. Approximately 80% of the MDG indicators are updated annually (UNDESA-DSD 2015c).

Ideally, national data repositories should be in line with international statistical definitions and exchange standards, which would facilitate reporting to international statistical mechanisms and dramatically reduce the reporting burden. For instance, some of the statistics produced by Mexico’s INEGI are currently archived in SDMX-XML-based databases, which allow for automatic exchanges with international entities.
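As a rough illustration of what standards-based exchange buys, the sketch below serializes one indicator time series into a machine-readable XML message that a partner system could parse automatically. This is a deliberately simplified stand-in: real SDMX-ML messages follow strict, versioned schemas, and every element, attribute and identifier name here is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Simplified sketch only: actual SDMX-ML uses formal, versioned schemas.
# The element names, attribute names and indicator code are placeholders.
def build_series(indicator_id, observations):
    """Serialize one indicator time series into a minimal XML message."""
    root = ET.Element("DataSet", {"indicator": indicator_id})
    for year, value in sorted(observations.items()):
        ET.SubElement(root, "Obs", {"timePeriod": str(year), "value": str(value)})
    return ET.tostring(root, encoding="unicode")

xml_msg = build_series("EXAMPLE_INDICATOR", {2015: 42.0, 2016: 47.5})
print(xml_msg)
```

The value of the real standard lies precisely in what this sketch omits: agreed codelists, metadata and structure definitions, which let national and international systems exchange data without bespoke translation on either side.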

Other national monitoring and reporting systems are also quite innovative, incorporating multiple ways to view and examine the indicators. The Swiss MONET system is a prime example (see the Innovative Case Example below).

Box

Innovative Case Example: The Swiss MONET Indicator System

The MONET Indicator System is Switzerland’s mechanism for tracking progress towards its sustainable development strategy. It combines several novel ways to view and analyse indicators:

  • All Indicators: a view to all 75 indicators that describe the “current situation and development in Switzerland with regard to the social, economic and environmental aspects of sustainable development.”
  • Global Indicators: a subset of indicators showing “how sustainable interactions between Switzerland and other countries are related to the use and distribution of the environmental, economic and social resources.”
  • Key indicators: a view of progress relating to 17 aggregated indicators.
  • The cockpit: designed so that users can see how a result comes about and can view the individual indicators. To this end, the cockpit provides access to the data and to the detailed description of individual indicators.
  • Klartext card game: a card game with engaging information about Switzerland based on the MONET indicators for sustainable development; a game for the whole family, for 2 to 4 players aged 14 and over, with 161 cards.

MONET is a joint activity of the Federal Statistical Office (FSO), the Federal Office for the Environment (FOEN), The Federal Office for Spatial Development (ARE) and the Swiss Agency for Development and Cooperation (SDC).

Source: FSO (2015)

Monitoring and reporting systems provide a mechanism for both horizontal and vertical coordination. Horizontally, the relationship among seemingly disparate indicators (i.e., issues) can more readily be explored, as in the case of Belize (see Section B3). Vertically, local indicators can aggregate up to sub-national indicators, and similarly, sub-national indicators can aggregate up to national indicators. The growth in the use of online sustainability monitoring and reporting systems at all levels of government is creating new opportunities for the coordination of plans across levels of government, given their transparent and accessible nature.
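The vertical aggregation described above can be sketched as a simple roll-up (place names and figures below are invented for illustration):

```python
# Hypothetical local indicator values, keyed by (sub-national region, locality).
local_values = {
    ("North", "Town A"): 120,
    ("North", "Town B"): 80,
    ("South", "Town C"): 200,
}

def roll_up(local_values):
    """Aggregate local values to sub-national totals, then to a national total."""
    subnational = {}
    for (region, _locality), value in local_values.items():
        subnational[region] = subnational.get(region, 0) + value
    national = sum(subnational.values())
    return subnational, national

subnational_totals, national_total = roll_up(local_values)
```

Note that a plain sum is only appropriate for count-type indicators; rate or percentage indicators would instead require population-weighted averaging at each level.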

It is also suggested that innovative monitoring approaches, including the collection of qualitative data, be developed and implemented in order to assess early outcomes and to learn and adapt interventions and strategies at national, sub-national and even local levels.

Review Processes and Mechanisms

The 2030 Agenda for Sustainable Development provides guidance for reviewing progress toward the SDGs at the national, regional and global levels, building on existing monitoring mechanisms, including the international human rights monitoring mechanisms.

At the national level, The 2030 Agenda for Sustainable Development notes the following:

“We also encourage member states to conduct regular and inclusive reviews of progress at the national and sub-national levels which are country-led and country-driven. Such reviews should draw on contributions from indigenous peoples, civil society, the private sector and other stakeholders, in line with national circumstances, policies and priorities. National parliaments as well as other institutions can also support these processes” (The 2030 Agenda for Sustainable Development).

Good practice examples of national review of progress can be seen in many European countries implementing national sustainable development strategies, where multi-level and multi-stakeholder review processes are given great importance, together with, for instance, national parliaments or existing institutions such as the National SD Councils (ESDN 2015). In a summary of national review practices, the European Sustainable Development Network describes a three-part typology that captures the state of practice across Europe (ESDN 2015):

  1. “Internal Reviews: Some countries have a bi-annual review process that culminates with the publication of a so-called progress report (e.g. Austria, Luxembourg, Latvia, and Lithuania). Some others perform annual reviews or annual progress reports (e.g. France, Slovakia, Slovenia, and Switzerland). Several countries have a less tight schedule that does not display regularity or is represented by a one-off exercise (e.g. Poland, Spain). Germany has a four-year review process cycle. Also, for the Austrian ÖSTRAT (the Austrian joint national strategy addressing both the federal and regional levels), evaluation is intended to be done every four years.”
  2. External Reviews: “Two options are usually employed: Either the responsible institution for the NSDS review process commissions a private consultant (e.g. Switzerland, Finland) or the task is given to independent researchers (e.g. Austria).”
  3. Peer Reviews: “Peer reviews have been conducted in four countries: France (2005), Norway (2007), the Netherlands (2007), and twice in Germany (2009, 2013). The idea behind the peer reviews of NSDSs is to identify and share good practices in a process of mutual learning where, usually, other countries are taken as peers in the process. The peer review of an NSDS is voluntary and is undertaken upon the initiative of the country concerned. The peer reviews are intended to address all three SD pillars and the peer-reviewed country is free to choose to undertake a review of the whole NSDS or focus on one or more specific issues.”

Additionally, countries with a long history and culture of planning also have well-developed review processes for their respective national development plans. In a 2014 review of practices in Latin America and the Caribbean undertaken by the Sustainable Development Planning Network, it was observed that “There are national monitoring systems that track progress towards the goals of the national plan in four-year cycles, attempting to gauge the percentage of progress made over time. A central body such as the planning department oversees the process, engaging stakeholders and the public in the monitoring process at these intervals. In Costa Rica, for example, the National Assessment System operates in the Planning Ministry (Ministerio de Planificación Nacional y Política Económica), which carries out monitoring and evaluation of goals and policies of the plan and of public policies. Furthermore, the legislature and the Comptroller General’s Office give periodic accountability reports (SDplanNet 2015).”

  4. Audit Agencies: A fourth type of national review mechanism can be considered in addition to the three listed above in the European context. Audit departments in many countries currently provide an independent internal review mechanism for governments that covers the full range of government operations and services, and some countries have sustainable development-specific functions within their audit departments. For example, Canada’s Commissioner of the Environment and Sustainable Development resides in the Office of the Auditor General of Canada (OAG 2015). An interesting innovation in audit agencies is the trend toward creating commissioners who act on behalf of future generations. For example, in Wales a ‘Future Generations Commissioner’ was recently established under the innovative ‘The Well-being of Future Generations (Wales) Act’ (see Innovative Case Example below). Hungary was a pioneer in this regard with its efforts in creating an Ombudsperson for Future Generations (World Future Council 2007).
  5. Evaluation of public policy: A number of countries have developed strong evaluation systems to evaluate public policy and inform national decision making. For example, Mexico and Brazil have both used evaluations of social protection systems to confirm the benefits of such systems and inform their expansion. The USA and Canada have each made periodic evaluation of government-funded programmes mandatory in order to provide assurance that such programmes are appropriate, effective and cost-effective, providing a powerful mechanism for follow-up.

Box

Innovative Case Example: Welsh Future Generations Commissioner

On 29 April 2015 ‘The Well-being of Future Generations (Wales) Act’ became law in Wales. The Act “strengthens existing governance arrangements for improving the well-being of Wales to ensure that present needs are met without compromising the ability of future generations to meet their own needs” (Wales 2015a). Specifically, the Act:

  • “Identifies goals to improve the well-being of Wales;
  • Introduces national indicators, that will measure the difference being made to the well-being of Wales;
  • Establishes a Future Generations Commissioner for Wales to act as an advocate for future generations; and
  • Puts local service boards and well-being plans on a statutory basis and simplifies requirements for integrated community planning.”

The Future Generations Commissioner will “be an advocate for future generations who will advise and support Welsh public authorities in carrying out their duties under the Bill (Wales 2015b).”

Toolkit

Data and Indicators

  • National SDG Data Assessments in the Asia and Pacific region (UNDG Asia-Pacific, forthcoming)
  • Data for Development: A Needs Assessment for SDG Monitoring and Statistical Capacity Development (SDSN 2015)
  • UNEPLive (UNEP 2015)

Participatory monitoring systems

  • Peru Community Surveillance Systems for Early Childhood and Development (UNDG 2015)
  • Thailand iMonitor (UNDG 2015)
  • Zambia M-WASH (UNDG 2015)
  • UNICEF U-Report (UNICEF 2012)
  • Most Significant Change (MSC) Technique (Davies and Dart 2005)

Online Monitoring Systems

  • Swiss MONET System (FSO 2015)
  • Mexico MDG Information System (Mexico 2015)

Review processes

  • Internal Review: Belgium (ESDN 2015b)
  • External Review: Finland (ESDN 2015c)
  • Peer Review: German Peer Review (RNE 2013)
  • Audit Offices: The Well-being of Future Generations (Wales) Act (Wales 2015b).
  • Outcome Mapping: Building learning and reflection into development programs  (IDRC 2001)

Human Rights Guidance

  • Human Rights Indicators: A Guide to Measurement and Implementation (OHCHR 2012).
  • Who Will Be Accountable? Human Rights and the Post-2015 Development Agenda (OHCHR 2013).

Gender Mainstreaming Guidance

  • UN Statistical Commission Guide to Minimum Set of Gender Indicators (UN 2013)
  • UN Women Position Paper: monitoring gender equality and the empowerment of women and girls in the 2030 agenda for sustainable development: opportunities and challenges (UN Women 2015)
  • UN Guidelines for Producing Statistics on Violence against Women (UN 2014)

Decent Work Indicators

  • ILO Manual on Decent Work indicators (ILO 2012).

References and Links

Atkisson (2015). Introduction to the VISIS Method: Vision > Indicators > Systems > Innovation > Strategy. Presented at the UNDESA Workshop on Integrated Approaches for Sustainable Development, New York, May 27-29, 2015.

Davies, R. and J. Dart (2005). The Most Significant Change (MSC) Technique: A Guide to Its Use. 

ESDN (2015a). The European context for monitoring and reviewing SDGs: How EU Member States and the European level are approaching the Post-2015 Agenda. European Sustainable Development Network, Quarterly Report.

ESDN (2015b). Belgium Country Profile. European Sustainable Development Network. 

ESDN (2015c). Finland Country Profile. European Sustainable Development Network. 

FSO (2015). MONET Indicator System. Federal Statistical Office (FSO). Government of Switzerland.

IDRC (2001). Outcome Mapping: Building Learning and Reflection into Development Programs. International Development Research Centre: Ottawa.

ILO (2012). Manual on Decent Work Indicators. International Labour Organization. 

Mexico (2015). MDGs in Mexico. Government of Mexico.

OAG (2015). Commissioner of the Environment and Sustainable Development – Office of the Auditor General of Canada.

OHCHR (2012). Human Rights Indicators: A Guide to Measurement and Implementation. 

OHCHR (2013). Who Will Be Accountable? Human Rights and the Post-2015 Development Agenda. 

OHCHR (2015). SDGs Indicator Framework: A Human Rights Approach to Data Disaggregation to Leave No One Behind. United Nations Office of the High Commission on Human Rights.

SDplanNet (2015). Summary of Capacity-building Needs to Advance Sustainable Development Planning and Implementation: Synthesis of Regional Perspectives from Africa, Asia and the Pacific, and Latin-America and the Caribbean. Sustainable Development Planning Network. Available at: www.SDplanNet.org and http://www.iisd.org/publications/summary-capacity-building-needs-advance-sustainable-development-planning-and.

RNE (2013). Sustainability – Made in Germany: The Second Review by a Group of International Peers, Commissioned by the German Federal Chancellery. The German Council for Sustainable Development (RNE).

SDSN (2015). Data for Development: A Needs Assessment for SDG Monitoring and Statistical Capacity Development. Sustainable Development Solutions Network. 

UNDG (2015). Delivering the Post-2015 Development Agenda: Opportunities at the National and Local Levels. 

UNDG Asia-Pacific (forthcoming). National SDG Data Assessments.

UN ECOSOC (2015). Thematic evaluation of monitoring and evaluation of the Millennium Development Goals: lessons learned for the post-2015 era: Report of the Office of Internal Oversight Services. United Nations Economic and Social Council.

UNEP (2015). UNEPLive. United Nations Environment Program.

UNICEF (2012). U-report Application Revolutionizes Social Mobilization, Empowering Ugandan Youth. United Nations Children’s Fund.

UN Statistics Division (2014). UN Guidelines for Producing Statistics on Violence against Women.

UN Statistics Division (2015). UN Statistical Commission: Guide to Minimum Set of Gender Indicators.

UN Women (2015). Position Paper: monitoring gender equality and the empowerment of women and girls in the 2030 agenda for sustainable development: opportunities and challenges.

Wales (2015a). The Well-being of Future Generations (Wales) Act.

Wales (2015b). The Well-being of Future Generations (Wales) Act.

World Future Council (2007). Interview with the Hungarian Ombudsman for Future Generations. World Future Council. 

[1] See the IAEG-SDGs website

[2] See www.myworld2030.org

Related Blogs and Country stories

Silo Fighters Blog

Innovation scaling: It’s not replication. It’s seeing in 3D

By Gina Lucarelli | September 12, 2018

My brother is a mathematician and on family vacations, he talks about data in multi-dimensions. (Commence eyes-glazing over). But as the family genius, he’s probably on to something. Lately, in my own world where I try to scale innovation in the UN to advance sustainable development, I am also thinking in 3D, or, if properly caffeinated,  multi-dimensionally. As new methods, instruments, actors, mutants and data are starting to transform how the UN advances sustainable development, the engaged manager asks: when and how will this scale?  To scale, we need to know what we are aiming for.  This blog explores the idea that innovation scaling is more about connecting experiments than the pursuit of homogeneous replications. Moving on from industrial models of scaling innovation In the social sector, the scaling question makes us nervous because the image of scaling is often a one dimensional, industrial one: let’s replicate the use of this technology, tool or method in a different place and that means we’ve scaled. This gives us social development people pause not only because we can’t ever fully replicate [anything] across multiple moving  elements across economic, social and culture. Even if we could replicate, it would dooms us to measuring scaling by counting the repeated application of one innovation in many places.   Thankfully, people like Gord Tulloch have given us a thoughtful scaling series that questions the idea that scaling social innovation is about replicating single big ideas many times over. [Hint: he says scaling innovation in the public sector is less about copy-pasting big ideas and more about legitimizing and cultivating many “small” solutions and focusing on transforming cultures.]  
Apolitical’s spotlight series on scaling social impact includes a related insightful conclusion: when looking at Bangladesh’s Graduation Approach as one of the few proven ways out of poverty, they suggest that while the personalized solutions work best, they might be replicable, but too bespoke to scale. So if scaling ≠ only replication, how do we strategize for scale? I’ve got a proposal:  what if we frame the innovation scaling question more about doing deep than broad? The scaling question becomes: How will we move from distinct prototypes managed by different teams at the frontier of our work to a coherent, connected use of emergent  experiments in programme operations? Scaling also means moving from fringe to core Scaling innovation in a large organization like the UN has a glorious serendipity to it. Did you hear that we are looking into impact bonds in Armenia? What about the food security predictor in Indonesia? Nice collective intelligence approach in Lesotho. Blockchain is being used for cash transfers in Pakistan and Jordan. Check out the foresight in Mauritius. UNICEF is using Machine learning to track rights enshrined in constitutions. UNHCR is using it to predict migration in Somalia. UNDP is testing out social impact bonds for road safety in Montenegro. These organic innovations are beautiful and varied and keep us learning, but we as a UN system are not yet scaling in 3D. These days, I’ve been talking to people (my brother’s eyes glaze over at this point) about how to see various methods of innovations not as distinct categories of experiments, but rather as connected elements of an emergent way of doing development. Towards a connected kind of 3D.  Yes innovation is more of an evolving set of disruptions than a fixed taxonomy of new methods, but if we narrow our scope for a moment to the subset of innovations which have passed the proof of concept stage, can we start thinking seriously about how they connect? 
[As an important side note, thinking in terms of taxonomies of innovations is not a panacea. Check out @gquagiotto’s slides for a more thorough story on how classification is trouble for public sector innovation: it means we limit our vision and don’t see unexpected futures where they are already among us.] Projectizing innovation without keeping an eye on the links among the new stuff won’t get us far, and might even be counter-productive. Instead, what would it be like if innovations were deployed in an integrated way? A bit like Armenia’s SDG innovation lab, where behavioral insights, innovative finance, crowd-sourced solutions and predictive analytics [among others] are seen as a package deal. I am looking for collaborators to learn more about how all these methods and tools are related. Do they help or hinder each other? Are there lessons that can be learned from one area and applied to others? Should some new technologies and methods not be combined with others?

Nine elements of next practices in development work

A few of us UN experimenters came together in Beirut in July to pool what we know on this. We had a pretty awesome team of mentors and UN innovators from 22 countries. We framed our reflections around the nine elements of innovation which I see as approaching critical mass in the field. This is by no means exhaustive, but it’s a start toward moving these methods from fringe tests led by various teams to core, connected operations. Here are the “nine elements of next practice UN” we are working with:

  1. Tapping into ethnography, citizen science and amped-up participation for collective intelligence, to increase the accuracy, creativity, responsiveness and accountability of investments for sustainable development;
  2. Using art, data, technology, science fiction and participatory foresight methods to overcome short-termism and make sustainable futures tangible;
  3. Complementing household survey methods with real-time data and predictive analytics to see emerging risks and opportunities and to design programmes and policies based on preparedness and prevention;
  4. Moving from “superman dashboards” for decision makers toward helping real people use their own data for empowerment, entrepreneurship and accountability;
  5. Leveraging finance beyond ODA and public budgets by finding ways to attract private capital to sustainable development;
  6. Evolving the way we do things, and even what services we offer, by managing operations through new technologies;
  7. Applying psychology and neuroscience for behavioral insights, to question assumptions, design better campaigns and programmes, and generate evidence of impact when it comes to people’s behavior;
  8. Carving out space for science and technology partnerships within the UN’s sustainable development work;
  9. Improving how we support our national partners in managing privacy and ethical risks.

Moving from “that’s cool” to “aha, it’s all connected”

We need to start thinking of these nine elements as connected. They might reinforce each other, as when a focus on data empowerment gives meaning, context and legitimacy to the use of big data to understand behaviors and online activity. Or they might undermine each other, as when citizen science undermines innovative-finance pay-outs, or behavioral insights help companies get around privacy regulations. Looking for the practical connections, here’s what we’ve got so far: collective intelligence methods that listen to people organically can help determine whether your behavioral campaigns are resonating. Because people’s intel is often more granular than statistics, these methods could also be used to test whether new forms of finance are making an impact on health, education and other development issues.
Small-scale and/or internal experiments in the UN to manage operations with new technology help us understand the next generation of privacy and ethics risks. Experiments in gray zones can then inform future-oriented regulatory frameworks. Keeping a focus on helping people use data for empowerment is a good north star when using new data and predictive analytics, ensuring that cultivating real-time data sources isn’t deepening the digital or data privacy divide. Using foresight methods or predictive analytics can point to signals of where to invest with innovative finance instruments. [Follow Ramya from IFRC innovations for more on this.] So these early connections form a budding conspiracy theory! If you are thinking multi-dimensionally too, or using a few of these methods and see where this line of thinking can be improved, help me draw more lines on the innovation conspiracy board! [Or tell me why I’m barking up the wrong tree... That’s always helpful too.] We’re working on a playbook to codify what we know so far in terms of principles and methods for each of these nine elements. Stay tuned for that... and please do get in touch to throw your own knowledge in!

Silo Fighters Blog

Promise to data: What the SDGs mean for persons with disabilities in China

BY Marielza Oliveira, Elin Bergman | August 29, 2018

China has strong and capable statistical systems; no surprises there. After all, China is known for its ambitious Five-Year Plans, which have shifted focus from economic growth to policy planning, environmental protection and social programmes for its population of 1.4 billion. What’s different and unique about the 13th Five-Year Plan is that it is closely aligned with the 2030 Agenda for Sustainable Development. Even so, China faces a daunting challenge in implementing Agenda 2030. For starters, it has official data for less than 30 percent of the Sustainable Development Goals (SDG) indicators, and much less when considering data that covers vulnerable groups, such as persons with disabilities. With more than 85 million persons with disabilities, China has the largest population with disabilities in the world. The good news is that China keeps a record of persons with disabilities, so the official data sources are up to date. To support the Chinese government’s efforts to improve monitoring of the SDGs addressing persons with disabilities, we at UNFPA, UNESCO, UNRCO, UN Women and WHO came together to test innovative approaches to collecting focused and disaggregated data.

Starting in Qinghai

We selected Qinghai Province in Northwest China as the pilot location to test new ways of collecting data. In Qinghai, persons with disabilities are estimated at five percent of the total population, of whom about 70 percent live in rural areas. About 150,000 people are registered with the Qinghai Disabled Persons’ Federation, the local chapter of the China Disabled Persons’ Federation. It was therefore important for us to look at their administrative data, which are key for crosslinking data from various sectors, including public services data.
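The idea of crosslinking records on a shared disability ID, and then disaggregating an indicator by group, can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the actual pipeline: every field name, ID and figure below is hypothetical.

```python
# Hypothetical registry records (e.g. from a Disabled Persons' Federation roster)
registry = [
    {"disability_id": "D001", "region": "rural", "age": 34},
    {"disability_id": "D002", "region": "urban", "age": 51},
    {"disability_id": "D003", "region": "rural", "age": 8},
]

# Hypothetical service records from another sector (e.g. education enrolment)
services = [
    {"disability_id": "D001", "enrolled": True},
    {"disability_id": "D003", "enrolled": False},
]

def crosslink(registry, services, key="disability_id"):
    """Left-join service records onto registry records via the shared key."""
    by_key = {rec[key]: rec for rec in services}
    return [{**person, **by_key.get(person[key], {})} for person in registry]

def coverage_by_group(records, group="region"):
    """Disaggregate a simple service-coverage indicator by a grouping field."""
    totals, covered = {}, {}
    for rec in records:
        g = rec[group]
        totals[g] = totals.get(g, 0) + 1
        if rec.get("enrolled"):
            covered[g] = covered.get(g, 0) + 1
    return {g: covered.get(g, 0) / totals[g] for g in totals}

linked = crosslink(registry, services)
print(coverage_by_group(linked))  # {'rural': 0.5, 'urban': 0.0}
```

The design point is simply that once sectors share a common identifier, joining their records and computing group-level breakdowns (by region, sex, age and so on) becomes routine; the hard work in practice is agreeing on that identifier and harmonizing standards across sectors.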
To demonstrate how data collection in underdeveloped regions can be operationalized in a smart way, we collected, analyzed and crosslinked all the administrative data of people with a disability ID with the following big data sources:

  1. Data from the national survey of basic services and needs for persons with disabilities, which is developed and updated by the China Disabled Persons’ Federation, the National Bureau of Statistics and local Disabled Persons’ Federations;
  2. Data from public services and various sectors, including health, education, employment, social security, poverty alleviation and community services. This type of data is gathered by crosslinking disability ID data with public services data;
  3. Data from internet-based platforms.

It’s possible to use big data to integrate and crosslink all data from the disability ID system, the administrative data on disability services from the China Disabled Persons’ Federation and the administrative data of public services. By expanding the existing official data with information from other sources, China can not only monitor additional SDG indicators, but also compile additional disaggregated views of SDG progress to monitor specific groups and locations in need of support, while strengthening “real-time” monitoring and analytics. During this process, we engaged the vulnerable groups themselves in the analysis and interpretation of data. For us, knowing what people living with disabilities think and need is key. We carefully examined their views to highlight the SDG indicators that could directly benefit their well-being.

The hindrances of data collection

We experienced a few setbacks throughout the process, but we adopted coping mechanisms to address the issues of data collection and analysis:

  1. Quality control of data. The disability data available from different sectors use very different standards and follow different collection approaches. Moving forward, we propose to check and clean the data using standard disability datasets and a data crosslink approach. We also optimized the timeliness of the data and the mechanisms to update it.
  2. Sharing data among sectors. The key index for persons with disabilities was the disability ID. Data across sectors were crosslinked using key indices such as the disability ID.

What we discovered

The administrative data platform for persons with disabilities was recently updated with the results of the annual nationwide survey of unmet needs and services for persons with disabilities. This platform provides timely data for monitoring the SDGs that address persons with disabilities. Other sectors have developed big data platforms using citizens’ IDs. To continue enhancing the administrative data records, it’s important to collaborate with other stakeholders, such as health care and education departments, to extend the existing data sources. Household surveys can also be used to fill the gaps in official disability statistics. We shared our discoveries with an expert panel, which included representatives from the Chinese government, the National Bureau of Statistics, the China Disabled Persons’ Federation and its Qinghai branch, the Qinghai Department of Commerce, the Institute of Rehabilitation Information/WHO Family International Classifications Collaborating Center China, the China Disability Data Research Institute, Soochow University, Nanjing Special Education Teachers College, UN agencies, as well as Chinese IT giants.

What’s next

The methodology implemented in Qinghai Province can easily be extended to other vulnerable groups, since they face similar challenges. Stakeholders can also adopt similar tactics to develop specific SDG indicators, data collection and analysis to evaluate their progress. As for next steps, the UN country team will continue to research protocols and methods to monitor disability-inclusive SDGs.
We will also develop a knowledge platform in Chinese to promote capacity building for the implementation of Agenda 2030, and conduct an international comparative study of technical approaches to data collection and analysis. Data and internet-based surveys will also be developed to learn more about the needs of persons with disabilities and improve services for them, while using those statistics to make sure that we leave no one behind. What methods are you using to disaggregate the SDGs and ensure data for action with people living with disabilities? If you have some tips, do tell!

Photography: Jonathan Kos-Read. Licensed under Creative Commons.