
We all know that sometimes computer-related things just don’t work: cryptic error messages, programs hanging indefinitely, or crashing altogether. Sometimes the culprit lies outside your laptop or mobile phone: the resources it’s trying to fetch are simply not there, or cannot be accessed as fast as usual. Whatever the reason behind this behaviour, the ultimate result is the same: the user experience sucks.

To make things work better together, you need to agree on how they are expected to function. The standardisation work of the Open Geospatial Consortium (OGC) has greatly influenced how computers and software from different vendors handle and exchange spatial information. Still, product-level compliance with functional standards alone is not enough to guarantee fully functional and reliable networked systems in the real world: even the best products can be configured and connected sub-optimally, making them inefficient, unreliable and difficult to use. Like any computer systems, spatial data servers and clients also suffer from occasional technical failures and connectivity problems, degrading the quality experienced by the user. The key to systematically improving the experienced Quality of Service is to measure it constantly, using well-defined and consistent metrics.
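To make the idea of consistent measurement a little more concrete, the sketch below shows one minimal way a provider could probe an OGC web service: it requests the WMS GetCapabilities document and records whether the request succeeded and how long it took. The service URL is a placeholder and the metric choices are assumptions made for this example, not anything prescribed by the OGC.

```python
# Minimal sketch of a QoS probe for an OGC web service endpoint.
# The URL below is a placeholder; point it at a real WMS instance.
import time
import urllib.error
import urllib.request

SERVICE_URL = "https://example.org/wms"  # hypothetical endpoint
CAPABILITIES_URL = SERVICE_URL + "?service=WMS&request=GetCapabilities"


def probe(url, timeout=10):
    """Return (available, response_time_seconds) for a single request."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            response.read()  # read the full capabilities document
            available = response.status == 200
    except (urllib.error.URLError, OSError):
        available = False
    return available, time.monotonic() - start


if __name__ == "__main__":
    ok, elapsed = probe(CAPABILITIES_URL)
    print(f"available={ok} response_time={elapsed:.2f}s")
```

Run periodically and stored over time, even results this simple give availability and response-time figures that can be compared across services, which is exactly the kind of consistency the working group aims at.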

When we joined the OGC in early 2016, we wanted to put the Quality of Service issues of OGC spatial data services on the table. At the Opening Plenary of the OGC Technical Committee meeting held in Dublin in June 2016, I presented the initial idea of a new OGC working group for improving how the OGC Web Service standards, such as WMS, WFS, and WCS, could help data providers create and maintain reliable, high-quality spatial data infrastructures. These ideas were well received by the OGC members, and the Quality of Service ad hoc group has since met six times to discuss the focus and type of the proposed new OGC working group and to draft its charter document.

The preparatory phase for the new OGC working group concentrating on Quality of Service and Quality of Experience related topics is now about to come to a conclusion, as the OGC released the draft charter document for the OGC Quality of Service and Experience Domain Working Group (QoSE DWG) for a three-week public comment period on 25th October 2016. The plan is to have the OGC Technical Committee vote on approving the final version of the charter, and thus on forming the new working group, during the next OGC meeting in Taichung, Taiwan in December.

Key issues and planned actions of the proposed OGC QoSE DWG include the following:

  • Make it easier to evaluate and compare the Quality of Service of live spatial data service instances implementing OGC standards.
  • Further the standardisation work within the OGC for declaring the expected level of Quality of Service of OGC-compliant services in a formal, machine- and human-readable way (an illustrative sketch of such a declaration follows this list).
  • Create and improve guidance and best practices for improving the Quality of Experience of OGC spatial data services by encouraging data and software providers to publish better and more understandable metadata, layer names, legends, descriptions and other properties that help the users of these services understand and use the services and data more effectively.
  • Encourage activities for piloting technical means of automatically evaluating and monitoring the Quality of Service of spatial data services in the OGC Testbeds and other activities of the OGC Interoperability Program, and present the results of these real-world experiments to the OGC TC for further action.
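
As a purely illustrative follow-up to the second bullet above, the snippet below sketches what a machine- and human-readable declaration of expected Quality of Service might contain. All field names and values here are invented for this example; no such structure is currently defined by any OGC standard.

```python
# Hypothetical example of a machine- and human-readable QoS declaration that a
# service provider might publish alongside its service metadata. All field
# names and thresholds are invented for illustration; they are not defined by
# any OGC standard.
expected_quality_of_service = {
    "service_url": "https://example.org/wms",      # placeholder endpoint
    "availability_target_percent": 99.0,           # intended share of successful requests
    "max_capabilities_response_seconds": 5.0,      # e.g. for GetCapabilities
    "planned_maintenance_windows": ["Sun 02:00-04:00 UTC"],
    "operator_contact": "ops@example.org",         # placeholder contact address
}
```

A declaration along these lines would be readable by people as it stands, while a monitoring client could compare measured values against the stated targets programmatically.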

The OGC Quality of Service ad hoc group will have a face-to-face meeting during the Taichung OGC meeting, 4th–8th December. If all goes well, this will be the last meeting of the ad hoc group and the kick-off for the first year of the new OGC Quality of Service and Experience DWG. I hope to see you there.

More information: