Observations & Measurements (O&M) is an international standard for modeling observation events and describing their relations to the target spatial objects under observation, the measured properties, the measurement procedure, and the data captured as a result of those observation events. It’s based on Geography Markup Language (GML), another Open Geospatial Consortium (OGC) standard, which provides a common base for its notation of location-based information.

In addition to the most obvious case of representing records of scientific measurement data, the O&M model is also used for modeling predicted information like weather forecasts. Because of its general ability to model perceived values of spatial objects’ properties at specific times, it’s a good fit for many application domains where it’s necessary to capture time-based changes in objects of interest.


Figure: The basic O&M observation event concepts.

The O&M conceptual model is published both as the Open Geospatial Consortium (OGC) Abstract Specification Topic 20 and as the ISO standard ISO 19156. The XML implementation of the O&M model is also an OGC standard, “Observations and Measurements – XML Implementation“. The origins of O&M are in the Sensor Web Enablement (SWE) initiative of the OGC: it was needed as the common standardized data model for handling the measurement events occurring in all kinds of sensors, from thermometers inside an industrial process to satellites taking images of the Earth from space. Together with other SWE framework open standards like SensorML and the Sensor Observation Service (SOS), O&M provides a system-independent, Internet-enabled way of exchanging data between different parts of sensor networks and other systems that use the captured sensor information.

Even though the O&M model was originally created for modeling measurement acts that have already happened, there is no technical difficulty in using the same model for describing acts of estimating the values of some properties of spatial objects at some point in the future. After all, even the most precise measurements are still only estimations of the actual properties of the target objects, limited by the method and the tools used, as well as by our capabilities of interpreting the measurement results. The captured data itself is often very similar in both the measurement and the prediction cases, so it makes sense to store and deliver those data sets using the same basic concepts.
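To make this concrete, here is a minimal sketch of the idea in Python. The field names mirror the core properties of the O&M observation concept (feature of interest, observed property, procedure, phenomenon time, result time, result); the station name, procedure descriptions, and values are made up purely for illustration and are not taken from any real data set.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Observation:
    """Sketch of the core O&M observation properties."""
    feature_of_interest: str   # the spatial object under observation
    observed_property: str     # the property whose value is estimated
    procedure: str             # the measurement or forecasting process
    phenomenon_time: datetime  # when the value applies to the feature
    result_time: datetime      # when the result became available
    result: float              # the captured or predicted value

# A measurement: the result becomes available at (or after) the
# moment the value applies to the target object...
measurement = Observation(
    feature_of_interest="weather station in my backyard",
    observed_property="air temperature (degC)",
    procedure="thermometer reading",
    phenomenon_time=datetime(2012, 11, 1, 12, 0, tzinfo=timezone.utc),
    result_time=datetime(2012, 11, 1, 12, 1, tzinfo=timezone.utc),
    result=4.2,
)

# ...while a forecast uses the very same structure, just with a
# phenomenon time later than the time the result was produced.
forecast = Observation(
    feature_of_interest="weather station in my backyard",
    observed_property="air temperature (degC)",
    procedure="weather prediction model run",
    phenomenon_time=datetime(2012, 11, 2, 12, 0, tzinfo=timezone.utc),
    result_time=datetime(2012, 11, 1, 6, 0, tzinfo=timezone.utc),
    result=3.5,
)

print(forecast.phenomenon_time > forecast.result_time)
```

The only structural difference between the two cases is the ordering of the two time stamps, which is exactly why the same model and encodings can carry both kinds of data.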

One of the things that makes the O&M model interesting right now is the increasing affordability of IP-based environmental sensors: these days almost anyone can afford to buy a basic weather observation station, place it in their backyard, and plug it into the Internet for sharing the data with others. Or buy an Internet-connected web camera. This also means that it’s becoming possible for anyone to gather and refine detailed environmental information about the world around us, both locally and globally. What used to be the playground of big, closed national and international institutes and governmental offices is now opening up to ordinary citizens as well. Of course this also means, as with everything based on the Internet, that as the amount of information and the heterogeneity of the sources producing it grow, the quality range of the available information inevitably becomes wider too.

The Sensor Web movement is so promising that even the organizations that used to deploy and maintain their own sensor networks, with proprietary data and control interfaces built for their specific software and hardware systems, are moving towards these open standards. Even though they might not put their data publicly on the Internet, they definitely want to take advantage of IP-based networks for communication, and they’d love to be able to easily swap between sensor equipment boxes made by different vendors in plug-and-play fashion. The extra network traffic caused by higher-level communication protocols and more verbose content encoding is less and less of an issue in this ever more broadband world of ours.

Still, it would be nice if the increasing amounts of sensor data collected by publicly funded organizations were also made available to the public, wouldn’t it? In many cases it already is available for someone who knows where to ask. Sometimes it’s even freely available on the Internet, like the various public web cams, but mostly it’s still accessible only to professionals. This is bound to change gradually, however, as international legislation aiming at opening and harmonizing data, like the EU INSPIRE directive in Europe, is implemented around the world. The O&M concepts form the basis of the EU-wide harmonized INSPIRE data models for meteorological, climatological and air quality information, as well as for physical oceanographic measurement data and for the structure and capabilities of the various environmental observation networks. This basically means that in the near future ordinary citizens will be able to access the environmental data provided by government officials using pretty much the same protocols and data formats that they’re used to when accessing their neighbor’s off-the-shelf sensor equipment. Ain’t that cool?

I’m currently involved in the international expert teams, on behalf of our customer the Finnish Meteorological Institute, creating the data specifications and writing guidelines for some of the O&M based INSPIRE data sets. We’re currently finalizing our work on the guidelines documents, but the actual work to make the INSPIRE spatial data infrastructure a reality goes on, of course. Fortunately there are deadlines: an initial set of the view and download services for these INSPIRE data sets should be publicly available in May 2013, and even the last bits should be in fully INSPIRE-compliant shape by 2020.