Spatial web services & data journalism: the Talvivaara case

We had an interesting real-world case of using open environmental data for journalism a couple of weeks ago in Finland. In the early hours of Saturday the 10th of November, Yle, the Finnish public broadcasting company, published a background news item on its site related to the continued pollution leakage at the Talvivaara mining site in Sotkamo, Finland.

In the post “Kaikki Talvivaaran alueesta” (“Everything about the Talvivaara area”) they point to an interactive mashup map of the mining area, including nature conservation areas, mining reservations and so on, aggregated at the Paikkatietoikkuna geoportal of the National Land Survey of Finland.

A few hours later the map had been rendered practically useless by serious performance problems in the background WMS services providing the data.

The map window application at Paikkatietoikkuna makes it possible for any user to aggregate and publish web maps with their preferred selection of visualized geospatial data layers provided by various Finnish governmental organizations. The data layers are served by WMS servers hosted by those organizations; the application only provides an interactive graphical user interface for displaying them as a mashup. In this case the Yle reporters had been able to make an up-to-date, interactive map covering soil types, lakes and rivers, ground water reserves, mining claims and nature conservation areas simply by selecting the layers and publishing a link to the resulting map in their news item.

The data layers in the mashup were provided by the Geological Survey of Finland (soil types), the Finnish Environment Institute (rivers, lakes, ground water reserves and nature conservation areas) and the Finnish Ministry of Employment and the Economy (the mining-related information). The attached report from our Spatineo Monitor clearly shows the increased response times for all the WMS servers providing the selected data layers, starting in the morning of 10 November 2012. At 04 UTC (06 local time) the soil type service was struggling with the first traffic peak, and by 06 UTC the server was unresponsive. The situation only started to improve in the evening, at about 17 UTC.

The one-month time series of one of the services (the soil data service) shows that the average response times on 10 November were considerably above normal for that service:

It seems that journalists are really starting to take advantage of public open geospatial data resources and easily available web map tools like Paikkatietoikkuna, but the data providers are not very well prepared for even fairly minor “slashdot effects” caused by sudden increases in traffic to their services.

We at Spatineo are quite glad to be able to report on things like this based on our continuous monitoring of thousands of spatial web services around the world. It confirms to us that our proactive monitoring strategy is the right one: in most cases we have already been collecting the performance data before our customers experience performance problems in their spatial web services.

OGC to switch to W3C XLink in July 2012

The Open Geospatial Consortium (OGC) will make a backwards-incompatible change to the XML Schema files of a large part of its standards on July 21st, 2012. The change is being made as a global corrigendum to move to the W3C XLink version 1.1 schema instead of the OGC-specific XLink XML Schema implementation. See my previous post for details on the reasons behind this rather large-scale change.

Basically the change is quite a simple one:

  • all existing OGC standards that reference the OGC XLink schema shall be updated to reference the W3C XLink 1.1 schema, and
  • going forward, any new standards work shall only reference the W3C XLink schema.

By far the most used XLink attribute in the OGC schemas is the locator attribute xlink:href, which contains a URI identifying the remote resource that the link points to. In XML Schema documents, the XLink href attribute is usually included in a complex type by adding an attribute group named simpleLink. In schemas using GML this is often done indirectly by using the pre-defined gml:AssociationAttributeGroup:

<complexType name="ReferenceType">
  <annotation>
    <documentation>
    gml:ReferenceType is intended to be used in application schemas directly,
    if a property element shall use a "by-reference only" encoding.
    </documentation>
  </annotation>
  <sequence/>
  <attributeGroup ref="gml:OwnershipAttributeGroup"/>
  <attributeGroup ref="gml:AssociationAttributeGroup"/>
</complexType>

The gml:AssociationAttributeGroup in GML 3.2.1 (before the XLink corrigendum) in turn refers to the simpleLink attribute group defined in the XLink namespace:

<attributeGroup name="AssociationAttributeGroup">
  <annotation>
    <documentation>
    XLink components are the standard method to support hypertext referencing in XML. An XML Schema 
    attribute group, gml:AssociationAttributeGroup, is provided to support the use of Xlinks as 
    the method for indicating the value of a property by reference in a uniform manner in GML.
    </documentation>
  </annotation>
  <attributeGroup ref="xlink:simpleLink"/>
  <attribute name="nilReason" type="gml:NilReasonType"/>
  <attribute ref="gml:remoteSchema">
    <annotation>
      <appinfo>deprecated</appinfo>
    </annotation>
  </attribute>
</attributeGroup>

In non-corrected GML 3.2.1 schema files the XLink namespace is imported from the OGC version of the XLink schema:

<import namespace="http://www.w3.org/1999/xlink" schemaLocation="http://schemas.opengis.net/xlink/1.0.0/xlinks.xsd"/>

In this file the simpleLink attributeGroup is defined like this:

<attribute name="href" type="anyURI"/>
...
<attributeGroup name="simpleLink">
  <attribute name="type" type="string" fixed="simple" form="qualified"/>
  <attribute ref="xlink:href" use="optional"/>
  <attribute ref="xlink:role" use="optional"/>
  <attribute ref="xlink:arcrole" use="optional"/>
  <attribute ref="xlink:title" use="optional"/>
  <attribute ref="xlink:show" use="optional"/>
  <attribute ref="xlink:actuate" use="optional"/>
</attributeGroup>

The thing that will change in July 2012 is that the schema files of all affected OGC standards will be modified to point to the official W3C XLink 1.1 schema available at http://www.w3.org/XML/2008/06/xlink.xsd. The href attribute definition in the W3C XLink schema is only slightly different from the OGC version:

<xs:attribute name="href" type="xlink:hrefType"/>
<xs:simpleType name="hrefType">
  <xs:restriction base="xs:anyURI"/>
</xs:simpleType>
...
<xs:attributeGroup name="simpleAttrs">
  <xs:attribute ref="xlink:type" fixed="simple"/>
  <xs:attribute ref="xlink:href"/>
  <xs:attribute ref="xlink:role"/>
  <xs:attribute ref="xlink:arcrole"/>
  <xs:attribute ref="xlink:title"/>
  <xs:attribute ref="xlink:show"/>
  <xs:attribute ref="xlink:actuate"/>
</xs:attributeGroup>

This means that all XML documents using the xlink:href attribute that are valid against the OGC XLink schema are also valid against the W3C XLink 1.1 schema. However, because the attribute group called “simpleLink” in the OGC schema is called “simpleAttrs” in the W3C schema, the XML Schema files referring to that attribute group will no longer be valid after the change. To fix this, all schema files using the “simpleLink” attribute group will have to be changed to use “simpleAttrs” instead.
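
To illustrate, in the corrected schemas the XLink import points to the W3C XLink 1.1 schema and the attribute group reference changes from xlink:simpleLink to xlink:simpleAttrs. The following is a simplified sketch based on the snippets above rather than a verbatim copy of the corrected GML 3.2.1 schema (the annotations and the deprecated gml:remoteSchema attribute are left out):

<import namespace="http://www.w3.org/1999/xlink" schemaLocation="http://www.w3.org/XML/2008/06/xlink.xsd"/>
...
<attributeGroup name="AssociationAttributeGroup">
  <attributeGroup ref="xlink:simpleAttrs"/>
  <attribute name="nilReason" type="gml:NilReasonType"/>
</attributeGroup>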

This change has to be made simultaneously in as many schema files as possible, because XML validators get confused if they encounter two different schema versions of the same XML namespace. In addition to the OGC’s own schema files, the same change should also be made to any other schemas using the OGC version of the XLink schema available at http://schemas.opengis.net/xlink/1.0.0/xlinks.xsd. To force users to make this change, the OGC Architecture Board has decided to remove the OGC XLink schema file along with the other schema changes.

According to a mailing list post by Carl Reed, the CTO of the OGC, on 12th April 2012, at least the following OGC standards are affected by this change:

  • All versions of Web Map Context
  • All versions of GML since version 2.0.0
  • All profiles of GML since 2.0.0
  • Image CRSs
  • All versions of OpenLS since version 1.1.0
  • All versions of OWS Common since 1.0.0
  • Symbology Encoding 1.0
  • All versions of SLD since 1.0.0
  • All versions of SensorML (including 2.0)
  • All versions of SWE Common
  • Table Join Service
  • All versions of Web Coverage Service
  • Web Feature Service 2.0
  • Web Map Service 1.3
  • WMTS
  • Web Processing Service

There are probably other affected schemas and standards in addition to those on this list, because the schemas are inter-linked. In particular, the different versions of GML are used in many other OGC schemas.

Further quoting the announcement from Carl Reed about the OGC actions to be taken:

The target date for implementing change is the weekend of July 21, 2012.

The process will be:

  • Scan schema repository for import of xlink to find a list of standards that use xlink.
  • Also scan for strings such as Gml:ReferenceType to find other possible places that xlink is required.
  • Whatever schema uses any of XLink schema components will need to replace the schema location. We need to do this for all schemas that import xlink. All these changes will be done to a copy of the existing OGC schema repository.
  • For software developers, they need to patch their products to use the revised OGC schemas.
  • Everyone will need to delete local copies, get a new copy from the OGC schema repository, and use the new schemas. There is also the possibility to use a tool such as the OASIS XML Catalogue to override the required change and to continue using the old XLink.
  • In July, we will then issue one global corrigendum for all the affected standards. Essentially, the current OGC schema repository will be replaced with the schemas that have been changed (and tested). The actual standards documents will not change – only the schemas. OGC policy is that the schemas are normative and that if there are differences between a standards document and a schema, then the schemas are normative.

This is pretty much the approach I expected the OGC to take when I wrote about this in January.

If you are running or developing software dealing with OGC-compliant data or services, you really should check that it will still work with the modified versions of the schema files. You can begin testing your software as soon as the modified OGC schema files are made available in the alternative OGC schema repository. One of the simplest ways to test this is to use an OASIS XML Catalog to temporarily redirect requests for the schema files of the modified standards’ namespaces to the alternative OGC schema locations. If your software supports XML Catalogs, a catalog.xml file with directives something like the following should do the trick (assuming that the modified OGC schemas were made available under the domain alternative.schemas.opengis.net):

<!DOCTYPE catalog
  PUBLIC "-//OASIS//DTD Entity Resolution XML Catalog V1.0//EN"
         "http://www.oasis-open.org/committees/entity/release/1.0/catalog.dtd">
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog"
         prefer="public">
  <rewriteURI uriStartString="http://schemas.opengis.net/gml/"
		rewritePrefix="http://alternative.schemas.opengis.net/gml/" />
  <rewriteURI uriStartString="http://schemas.opengis.net/wfs/"
		rewritePrefix="http://alternative.schemas.opengis.net/wfs/" />
  ....
  [etc for all affected standards]
</catalog>

When an XML validator using this catalog needs to fetch any XML files from URLs beginning with “http://schemas.opengis.net/gml/”, it will try to fetch them from “http://alternative.schemas.opengis.net/gml/” instead. The benefit of this approach is that you can simulate the schema switch-over well before the actual change in July without making any changes to your code or data files.

You can also use an XML Catalog if you find that you must delay the schema changes for your local system. To do this, take local copies of the unmodified OGC schema files and create another set of rewriteURI directives. Assuming that the local schema files are stored under /etc/xml/schemas/original/ogc/:

<!DOCTYPE catalog
  PUBLIC "-//OASIS//DTD Entity Resolution XML Catalog V1.0//EN"
         "http://www.oasis-open.org/committees/entity/release/1.0/catalog.dtd">
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog"
         prefer="public">
  <rewriteURI uriStartString="http://schemas.opengis.net/gml/"
		rewritePrefix="file:///etc/xml/schemas/original/ogc/gml/" />
  <rewriteURI uriStartString="http://schemas.opengis.net/wfs/"
		rewritePrefix="file:///etc/xml/schemas/original/ogc/wfs/" />
  ....
</catalog>

What is an O&M Observation and why should you care?

Observations & Measurements (O&M) is an international standard for modeling observation events and describing their relations to the target spatial objects under observation, the measured properties and measurement procedures, and the data captured as a result of those observation events. It is based on the Geography Markup Language (GML), another standard by the Open Geospatial Consortium (OGC), which provides a common basis for its notation of location-based information.

In addition to the most obvious case of representing records of scientific measurement data, the O&M model is also used for modeling predicted information such as weather forecasts. Because of its general ability to model the perceived values of spatial objects’ properties at specific times, it is a good fit for many kinds of application domains where it is necessary to capture time-based changes in the objects of interest.


The basic O&M observation event concepts.

The O&M conceptual model is published both as the Open Geospatial Consortium (OGC) Abstract Specification Topic 20 and as the ISO standard ISO 19156. The XML implementation of the O&M model is also an OGC standard, “Observations and Measurements – XML Implementation”. The origins of O&M are in the Sensor Web Enablement (SWE) initiative of the OGC. It was needed as a common standardized data model for handling the measurement events occurring in different kinds of sensors, from thermometers inside an industrial process to satellites taking images of the Earth from space. Together with the other SWE framework open standards, such as SensorML and the Sensor Observation Service (SOS), O&M provides a system-independent, Internet-enabled way of exchanging data between the different parts of sensor networks and the other systems using the captured sensor information.
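
To make these concepts concrete, here is a minimal sketch of a single observation encoded according to the O&M XML Implementation: one air temperature measurement made by a weather station. All identifiers and URIs below are made up for illustration, and many of the optional elements of OM_Observation are left out:

<om:OM_Observation gml:id="obs-1"
    xmlns:om="http://www.opengis.net/om/2.0"
    xmlns:gml="http://www.opengis.net/gml/3.2"
    xmlns:xlink="http://www.w3.org/1999/xlink"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <!-- the time the measured value applies to the target -->
  <om:phenomenonTime>
    <gml:TimeInstant gml:id="obs-1-time">
      <gml:timePosition>2012-11-10T06:00:00Z</gml:timePosition>
    </gml:TimeInstant>
  </om:phenomenonTime>
  <!-- the time the result became available, here the same instant -->
  <om:resultTime xlink:href="#obs-1-time"/>
  <!-- the procedure (sensor or process) that produced the result -->
  <om:procedure xlink:href="http://example.org/sensors/thermometer-42"/>
  <!-- the measured property -->
  <om:observedProperty xlink:href="http://example.org/properties/air-temperature"/>
  <!-- the spatial object the observation is about -->
  <om:featureOfInterest xlink:href="http://example.org/stations/backyard-station-1"/>
  <!-- the captured value -->
  <om:result xsi:type="gml:MeasureType" uom="Cel">-2.5</om:result>
</om:OM_Observation>

Note how the procedure, the observed property and the feature of interest are all given by reference using xlink:href attributes rather than encoded inline.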

Even though the O&M model was originally created for modeling measurement acts that have already happened, there is no technical difficulty in using the same model for describing acts of estimating the values of some properties of spatial objects at some point in the future. After all, even the most precise measurements are still only estimates of the actual properties of the target objects, limited by the methods and tools used, as well as by our capability to interpret the measurement results. The captured data itself is often very similar in both the measurement and the prediction case, so it makes sense to try to store and deliver those data sets using the same basic concepts.

One of the things that makes the O&M model interesting right now is the increasing affordability of IP-based environmental sensors: these days almost anyone can afford to buy a basic weather observation station, place it in their backyard, and plug it into the Internet to share the data with others. Or buy an Internet-connected web camera. This also means that it is becoming possible for anyone to gather and refine detailed environmental information about the world around us, both locally and globally. What used to be the playground of big, closed national and international institutes and governmental offices is now opening up to ordinary citizens as well. Of course this also means, as with everything based on the Internet, that as the amount of information and the heterogeneity of the sources producing it grow, the quality range of the available information inevitably becomes wider.

The Sensor Web movement is so promising that even the organizations that used to deploy and maintain their own sensor networks, with proprietary data and control interfaces built for their specific software and hardware systems, are moving towards these open standards. Even though they might not put their data publicly on the Internet, they definitely want to take advantage of IP-based networks for communication, and they would love to be able to easily switch between sensor equipment boxes made by different vendors in a plug-and-play fashion. The extra network traffic caused by higher-level communication protocols and more verbose content encodings is less and less of an issue in this ever more broadband world of ours.

Still, it would be nice if the increasing amounts of sensor data collected by publicly funded organizations were also made available to the public, wouldn’t it? In many cases the data already are available to someone who knows where to ask. Sometimes they are even freely available on the Internet, like the various public web cams, but mostly they are still accessible only to professionals. This is bound to change gradually, however, as international legislation aiming at opening and harmonizing data, such as the INSPIRE directive in the EU, is implemented around the world. The O&M concepts form the basis of the EU-wide harmonized INSPIRE data models for meteorological, climatological and air quality information, as well as for physical oceanographic measurement data and for the structure and capabilities of the various environmental observation networks. This basically means that in the near future ordinary citizens will be able to access the environmental data provided by government bodies using pretty much the same protocols and data formats that they are used to when accessing their neighbor’s off-the-shelf sensor equipment. Ain’t that cool?

On behalf of our customer, the Finnish Meteorological Institute, I am currently involved in the international expert teams creating the data specifications and writing the guidelines for some of the O&M-based INSPIRE data sets. We are currently finalizing our work on the guideline documents, but the actual work of making the INSPIRE spatial data infrastructure a reality goes on, of course. Fortunately there are deadlines: the initial batch of view and download services for these INSPIRE data sets should be publicly available in May 2013, and even the last bits should be in fully INSPIRE-compliant shape by 2020.