Sustainable development is becoming a central tenet of international legal frameworks, agreements, recommendations and economic strategies. Under the umbrella of the United Nations 2030 Agenda, 17 Sustainable Development Goals (SDGs) are defined, to be achieved by the end of 2030. International, subregional and local adaptation plans focus on implementing the SDGs within appropriate legal frameworks. Examples include the Paris Agreement, which focuses on collective reductions of CO2 emissions; the Nagoya Protocol, on access and benefit-sharing to protect biodiversity; the vision to halt soil erosion and achieve Land Degradation Neutrality (LDN) by 2030; and the Sendai Framework for Disaster Risk Reduction.
An organized international political discourse around sustainability, supported by evidence from the scientific community, was launched in 1992 at the United Nations Conference on Environment and Development in Rio de Janeiro, where the so-called “Rio Conventions” were adopted and subsequently ratified. It was triggered by the first assessment report of the Intergovernmental Panel on Climate Change (IPCC AR1). Yet despite this long history of science-based political discussion and decision-making, a wide gap remains between reliable scientific data and socioeconomic political strategies. Although most environmental observation and model data are nowadays freely available, and despite dramatic advances in computing, relatively few people have both the resources and the skills to access and interpret these data. Climate science thus risks remaining in the hands of a minority, denying large swaths of the population the benefits of evidence-based planning and decision-making.
Independently of the political dialogue, the scientific communities attempting to understand climate change, biodiversity loss and desertification are experiencing a “Big Data” problem. The volume of scientific data is growing too rapidly for many research units to store and process it locally. The tools of data analytics need to evolve to meet this challenge.
A potential solution to this big data challenge is to run part of the analysis remotely, on servers co-located with the data archives. Today, the common practice is instead to download full archived datasets and then perform the analytics locally. This approach strains data providers’ bandwidth, demands substantial local storage and compute resources, and requires specific programming skills of the user.
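To make the contrast concrete, the conventional local workflow might look like the following minimal sketch, in which the archive URL, file name and variable name are hypothetical placeholders rather than a real data service:

```python
import urllib.request

import xarray as xr  # widely used library for NetCDF climate data

# Hypothetical archive URL and file name, for illustration only.
DATA_URL = "https://example.org/archive/tas_monthly_1950-2020.nc"
LOCAL_FILE = "tas_monthly_1950-2020.nc"

# Step 1: download the full dataset, however large it is,
# consuming the provider's bandwidth and local disk space.
urllib.request.urlretrieve(DATA_URL, LOCAL_FILE)

# Step 2: perform the analysis locally, e.g. a global mean time series.
ds = xr.open_dataset(LOCAL_FILE)
global_mean = ds["tas"].mean(dim=("lat", "lon"))
global_mean.to_netcdf("tas_global_mean.nc")
```

The entire archive crosses the network even though only a small derived product is ultimately needed, which is precisely the pressure on bandwidth, storage and skills described above.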
One idea that environmental data and service providers are developing is to offer analytical services in addition to the raw data. Instead of performing analyses on local computers, scientists would run them on powerful servers close to the data. This setup lets users access data and analytics almost independently of their local computing resources, granting people even in less developed and least developed countries access to reliable scientific information that can be customized to their needs and priorities.
The current implementation of such remote analytical services builds on an international standard called the Web Processing Service (WPS). Such an interoperability standard allows independent, heterogeneous organizations to offer their data and services through a common interface, facilitating access for users and, de facto, creating a loosely federated spatial data infrastructure for climate and environmental data services. Users are then free to compare and combine data and analytics from different providers to create novel, customized numeric data products. This service model has the potential to drastically change the knowledge-building landscape and to enable new collaborations, levelling the playing field for scientists and policy makers everywhere as they develop socioeconomic strategies.
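As a sketch of how such a service is used in practice, the snippet below talks to a WPS endpoint with OWSLib, a Python client library for OGC web services. The server URL, process identifier and input names are hypothetical placeholders; a real provider would publish its own catalogue of processes:

```python
from owslib.wps import WebProcessingService, monitorExecution

# Hypothetical WPS endpoint; each provider publishes its own URL.
wps = WebProcessingService("https://example.org/wps")

# Discover which analytical processes the provider offers.
for process in wps.processes:
    print(process.identifier, "-", process.title)

# Execute a (hypothetical) subsetting process on the server,
# so the computation runs next to the data archive.
inputs = [
    ("dataset", "tas_monthly"),  # placeholder dataset identifier
    ("bbox", "5,45,15,55"),      # placeholder bounding box (lon/lat)
]
execution = wps.execute("subset_bbox", inputs)
monitorExecution(execution)  # poll until the remote job completes

# Only the (small) result is transferred back, not the full archive.
for output in execution.processOutputs:
    print(output.identifier, output.reference)
```

Because every provider exposes the same GetCapabilities, DescribeProcess and Execute operations, the same client code can query and combine services from different organizations, which is what makes the loose federation described above possible.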