OGC to work on Quality of Spatial Data APIs


Post by Ilkka Rinne, originally published in the OGC Blog on 7th Feb 2017 under the title “OGC invests in improving Quality of Service and Quality of Experience”

We live in a world stealthily powered by Web Services and APIs: nearly everything we do on our laptops and mobile devices uses background services to talk over the Internet. These services are especially important for applications providing access to small subsets of information, based on a user’s location, fed from large, remotely stored datasets. Any quality issues in the communication between the applications and their backend services quickly become critical, causing bad user experience for tens of thousands of people.

Systematic improvement of the Quality of Service (QoS) for Web Services, covering factors like availability, capacity, and performance, requires well-defined metrics that yield comparable QoS measurements. Defining these QoS indicators and metrics, as well as declaring the expected service levels for Spatial Data Services, have been identified as priority topics of the newly founded OGC Quality of Service and Experience Domain Working Group (QoSE DWG). The OGC member activity that led to the founding of the new DWG in late 2016 clearly shows that QoS, and the more user-oriented Quality of Experience (QoE), are currently topics of great interest within the OGC.
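As a minimal sketch of how such metrics might be computed in practice, availability and average response time can be derived from a log of timed requests against a service. Note that the probe record, field names, and metric definitions below are illustrative assumptions, not metrics defined by the QoSE DWG:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Probe:
    """One timed request against a service endpoint (hypothetical record)."""
    ok: bool            # did the request return a valid response?
    latency_ms: float   # measured response time in milliseconds

def qos_summary(probes):
    """Derive simple QoS indicators from a series of probes.

    Availability = share of successful probes; performance = mean
    latency of the successful ones. These definitions are illustrative.
    """
    if not probes:
        return {"availability": 0.0, "avg_latency_ms": None}
    successes = [p for p in probes if p.ok]
    return {
        "availability": len(successes) / len(probes),
        "avg_latency_ms": mean(p.latency_ms for p in successes) if successes else None,
    }

# Example: three successful probes and one failure
probes = [Probe(True, 120.0), Probe(True, 80.0), Probe(False, 0.0), Probe(True, 100.0)]
print(qos_summary(probes))  # availability 0.75, average latency 100.0 ms
```

The point of standardising such definitions is that two monitoring tools probing the same service would report comparable numbers.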

In addition to the QoS metrics, the initial list of tasks for the OGC QoSE Domain Working Group includes gathering and defining a list of the essential QoS and QoE terms, and collecting good community practices in evaluating and improving the user experience of OGC Web Services. As an open DWG, the group acts as a forum for discussion and sharing information in QoS and QoE related topics for OGC members. Regular online meetings will be held monthly, and the group intends to meet face-to-face in as many OGC Technical Committee meetings as possible.

Charter members of QoSE DWG include several active OGC members with critical business interests in QoSE. Tom Kralidis, Senior Systems Scientist from the Meteorological Service of Canada, Government of Canada, highlights the importance of QoSE for both the data providers and data users: “Health check monitoring of geospatial services provides value for more than just uptime, focusing on the specific functionality of a given service or API. The work of the QoSE DWG will be of value to both organizations wishing to communicate their quality of service levels as well as monitoring applications wishing to evaluate and measure service quality in an interoperable manner.”

In Europe, the EU INSPIRE Directive and e-Government development are key drivers for QoSE. Danny Vandenbroucke, Research Manager at KU Leuven (SADL), notes: “With the development of a European wide Spatial Data Infrastructure (SDI) steered by the INSPIRE Directive, QoSE has been recognized as a critical factor in the successful integration and usage of INSPIRE web services in e-Government processes. KU Leuven has been involved in the assessment of SDIs throughout Europe since 2002 and the testing and validation of its components, including QoSE, are a very important part of these assessments.”

Natural Resources Canada, Government of Canada (NRCan) is eager to contribute to the QoSE DWG best practices based on their experience. Cindy Mitchell, Lead of Operational Policies and Standards for the Federal Geospatial Platform Initiative, says: “Quality of Service and Experience is fundamental to operational Spatial Data Infrastructures by ensuring services originating from a wide variety of publishers are available, usable, and relevant to applications and their users. We lead several initiatives of interest to the QoSE DWG in OGC, including Spatial Data Infrastructure assessment methodologies and key performance indicators, automated web services harvesting approaches, Federal Geospatial Platform data and service quality assessments, standards validation, and international collaborative projects (Pan-Arctic DEM, WaterML) that ensure data interoperability via standards. NRCan is pleased to collaborate within the QoSE DWG to bring best practices for highly reliable and usable web services to the web.”

Sampo Savolainen, the Managing Director of Spatineo, is thrilled to see the growing OGC interest for QoS: “In Spatineo, our entire business model is based on leveraging standard interfaces for letting our customers measure the quality of Spatial Data Services they are providing and using, and helping them leverage spatial data on the web. OGC activities in this field will make it easier for our customers to provide and find high quality Spatial Data Services.”

Scott Simmons, the Executive Director of the OGC Standards Program, notes that “geospatial web services include some unique characteristics, especially considering that the visual nature of a map rendered to a browser does not necessarily reflect the method of service nor the user interaction with the data. We need metrics tailored to the use case of the service and fair comparisons that target the services, not the IT environment and internet bandwidth in which the services reside.”

Raising customer awareness of QoSE issues, and harmonizing QoSE measurement where it makes sense, were the primary reasons for us at Spatineo to join the OGC. I’m honoured to co-chair the group with Tom Kralidis, and I look forward to active discussion and contributions from the group members.

The next QoSE DWG face-to-face meeting will be held at the upcoming OGC TC in Delft, The Netherlands on Wednesday the 22nd of March 2017. For more up-to-date information, including the mailing lists, work programme and meeting minutes, see the QoSE DWG wiki.

Spatineo leading OGC activities for Quality of Service


We all know that sometimes computer-related things just don’t work: cryptic error messages, programs stuck for an indefinite time, or crashing altogether. Sometimes the culprit can be outside your laptop or mobile phone: the resources it’s trying to fetch are just not there or cannot be accessed as fast as usual. Whatever the reason behind this behaviour is, the ultimate result is the same: the user experience sucks.

To make things work well together, you need to agree on how they are expected to function. The standardisation work of the Open Geospatial Consortium (OGC) has greatly influenced how computers and software from different vendors handle and exchange spatial information. Still, functional standards compliance at the product level alone is not enough to guarantee a fully functional and reliable networked system in the real world: even the best products can be configured and connected sub-optimally, making them inefficient, unreliable and difficult to use. Like any computer systems, spatial data servers and clients also suffer from occasional technical failures and connectivity problems, degrading the quality experienced by the user. The key to systematically improving the experienced Quality of Service is to measure it constantly, using well-defined and consistent metrics.

When we joined the OGC in early 2016, we wanted to bring the Quality of Service issues of OGC spatial data services to the table. At the Opening Plenary of the OGC Technical Committee meeting held in Dublin in June 2016, I presented the initial idea of a new OGC working group for improving how the OGC Web Service standards, such as WMS, WFS, and WCS, could help data providers create and maintain reliable, high-quality spatial data infrastructures. These ideas were well received by the OGC members, and the Quality of Service ad hoc group has since met six times to discuss the focus and type of the proposed new OGC working group and to draft its charter document.

The preparatory phase for the new OGC working group concentrating on Quality of Service and Quality of Experience topics is now about to come to a conclusion, as the OGC released the draft charter document for the OGC Quality of Service and Experience Domain Working Group (QoSE DWG) for a three-week public comment period on 25th October 2016. The plan is to have the OGC Technical Committee vote on approving the final version of the charter, and thus on forming the new working group, during the next OGC meeting in Taichung, Taiwan in December.

Key actions of the proposed OGC QoSE DWG include the following:

  • Make it easier to evaluate and compare the Quality of Service of live spatial data service instances implementing OGC standards.
  • Further the standardisation work within the OGC for declaring the expected level of Quality of Service for OGC compliant services in a formal, machine and human readable way.
  • Create and improve guidance and best practices for improving the Quality of Experience of OGC spatial data services, encouraging data and software providers to publish better and more understandable metadata, layer names, legends, descriptions, and other properties that help users understand and use the services and data more effectively.
  • Encourage piloting of technical means for improving the automatic evaluation and monitoring of the Quality of Service of spatial data services in the OGC Testbeds and other activities of the OGC Interoperability Program, and present the results of these real-world experiments to the OGC TC for further action.
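To make the second point above concrete, a declared service level could be expressed as a small machine-readable document and checked against measured values. The JSON schema, field names, and thresholds below are invented purely for illustration; no such OGC format existed at the time of writing:

```python
import json

# Hypothetical machine-readable QoS declaration (invented schema).
declaration = json.loads("""
{
  "service": "https://example.org/wms",
  "expected": {"availability": 0.99, "max_response_ms": 2000}
}
""")

def meets_declaration(measured, declared):
    """Compare measured QoS indicators against the declared targets."""
    exp = declared["expected"]
    return (measured["availability"] >= exp["availability"]
            and measured["avg_response_ms"] <= exp["max_response_ms"])

# A monitoring tool could evaluate any service the same way:
measured = {"availability": 0.995, "avg_response_ms": 450}
print(meets_declaration(measured, declaration))  # True
```

A shared declaration format like this is what would let monitoring applications evaluate services from different providers in an interoperable manner.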

The OGC Quality of Service ad hoc group will have a face-to-face meeting during the Taichung OGC meeting between 4th – 8th December. If all goes well, this will be the last meeting of the ad hoc group and the kick-off for the first year of the new OGC Quality of Service and Experience DWG. I hope to see you there.


The future is already here

By Sampo Savolainen and Anita Lankinen, originally published in the Nov/Dec 2016 issue of Geo International

Sampo Savolainen and Anita Lankinen look at some of the latest technological advances and analyse how they use spatial data infrastructures, what else can and should be done, and explore what role they will play in our future

Transport agencies, cities and municipalities, environmental agencies, mapping and cadastral authorities – all of these public organisations have been collecting and storing spatial data for many years. However, companies often end up developing their own spatial data infrastructures (SDIs) and creating their core data themselves instead of leveraging this publicly collected data.

Look at self-driving cars. Instead of relying on available information on road infrastructure, car manufacturer Tesla has mostly relied on sensors to guide its Autopilot system. The sensors are placed all around the car to help the car understand its environment, so that it can safely steer itself in most situations.

Every Tesla car, with or without Autopilot, is connected to the cloud. The company is constantly monitoring and collecting data from each car to create and update its maps. In short, the company acquires all its data through drivers of its cars. As a result, Tesla will have its very own map of the world, built from data collected from the hundreds of sensors embedded in each of its cars. Founder Elon Musk calls this a ‘fleet learning network’, where every car learns from the other. Musk referred to an example of a Californian highway, where the lines are badly marked, which, however, doesn’t affect Autopilot data, as the system uses information from Tesla drivers who have used this section of road. This means that the data Tesla is collecting goes far beyond two-dimensional road maps.

Why not use existing maps?

So why do Tesla and other private companies prefer building SDIs from scratch instead of using the data governments have been collecting for years? We have come across the following typical arguments for this choice.

Firstly, governmental data is often missing some particular information that is crucial for the private organisation. For instance, TomTom could not rely on base maps to contain information about allowed turns at intersections, so had to collect this crucial information by driving through every intersection in every country in which this information was missing.

Secondly, the update cycle of suitable public data might not match the needs of private enterprises. For example, it might be dangerous for autonomous cars to rely on old data.

Lastly, spatial data is often recorded differently across countries or administrative areas. This is one of the major issues the European INSPIRE Directive aims to address by developing easily accessible, harmonised data and an interoperable infrastructure for spatial information that supports environmental policy-making. Similar goals are pursued by the members of the Arctic SDI – a collaboration between eight national mapping agencies (Canada, Finland, Iceland, Norway, Russia, Sweden, the USA and Denmark) to provide politicians, governments, policy makers, scientists, private enterprises and citizens in the Arctic with access to harmonised data, digital maps and tools to assist in monitoring and decision-making in the region.

Moreover, in our experience, private companies often simply assume that collaboration with public authorities would end up being too difficult.

But what if there was less friction and greater collaboration, and data sharing between authorities and companies worked both ways? What if, say, Tesla shared some of the data it collects with public authorities? The data could be integrated into a common, reliable data set and shared with more partners. One clear benefit would be for autonomous cars: cars from different manufacturers would be able to coordinate better and exchange information, leading to even safer journeys.

This is just one example of how new technologies would evolve better and faster by sharing more data. It would not be the first time Tesla shared some of its innovations and hard work for the common good, either – back in 2014, it opened its patents for electric vehicle technology to advance the development of sustainable transport. This was not an act of charity either – having more electric cars and charging stations on the market was in their commercial interests.

This makes it even clearer that the quality of public spatial data should be constantly monitored, improved and recorded, and that the data should be accessible using common standards across boundaries. With reliable, good-quality data readily available, our cities and countries will become smarter and more interconnected faster, while collaboration between governments and the private sector to build better services becomes a natural step towards greater efficiency. Data is the foundation of smart cities and our improved tomorrow, so we need to make sure it is available when it is required.

Artificial intelligence

In the past few years, artificial intelligence (AI) technology has been commoditised, and is now used in applications as varied as autonomous cars, Google search, and evaluating and categorising produce. This is largely due to advances in AI research, such as recurrent neural networks, as well as large software companies opening the source code of the technology platforms necessary for massive data analysis.

Soon, we will reach the point where every building collects and analyses data about its inhabitants to better serve their needs, every traffic light constantly analyses traffic and reacts accordingly, and waste management is monitored to allow for optimisation, just like any other system in a city. The amount of data collected will be so massive, and its analysis so complex, that the only way to truly process and make sense of it all will be through AI. This will allow both lower costs, through the automation of previously work-intensive tasks, and truly innovative new products and services.

Cities will need to run several digital applications simultaneously, with SDIs serving as the base on top of which all this data is merged, as location is one of the most intuitive ways to connect the different characteristics of a specific place. It is interesting to see how companies are already enriching their maps with data from the Internet of Things (IoT). Smartphones are among the most used sensors, collecting a wide palette of signals and data and acting as an enormous distributed sensor network.

Google has used the location data of smartphone users to find out which areas in cities are the most crowded. Take a closer look at your city in Google Maps and you will notice that some areas are highlighted with darker shades of brown. This signifies which areas and buildings are hot spots – a tremendous resource for real estate agents and buyers, as well as for tourists who want to visit areas of interest.

Humankind has a great track record in innovation and disruption, but historically we have not been able to fully understand the repercussions of our new inventions beforehand. During the early days of the commercial internet, most companies were unable to see the benefits it could offer them, and reputable papers published articles proclaiming that e-commerce would never happen.

In 1995, Newsweek said: ‘The truth is no online database will replace your daily newspaper.’ Looking at this quote today highlights not only the power of our inventions but also our nearsightedness in understanding their true value. Just as AI already has been, IoT is now being commoditised quickly; the combination of the two will bring unforeseen advancements and implementations. This is the right time for fearless innovators to take action and seize the moment.