Better understanding and interoperability with data models

Ilkka Rinne, Interoperability Architect at Spatineo, believes that data models are the key to mutual understanding and interoperability in data flows. Well-designed data models crystallise and unify the key data content of the operating environment, and enable data to be communicated efficiently and meaningfully between different organisations and end-users. It pays off to invest in data models. In today’s world, there are countless information systems, but they can all be made to work together using common data models and interface integrations.

Data models are mutual agreements

Data models are largely agreements on what data is relevant in a given application domain and how to describe it for data exchange or storage. Data models are typically designed on three levels of abstraction, each ensuring interoperability from a different perspective: conceptual, logical and physical.

A conceptual data model defines common terms for the chosen concepts, their definitions and their relations within an application domain. A solid conceptual model improves the understanding of common data content among application domain experts, information system developers and the organisation as a whole, and creates a common language for discussing it. Concept analysis and conceptual models can also be used to make well-informed and documented decisions about the boundaries of the data content of different information systems. A conceptual model at the right level describes a common understanding of the subject matter in natural language.
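As a toy illustration, the sketch below lists a few concepts, their natural-language definitions and their relations for a hypothetical building domain. All names and definitions here are invented for this example, not taken from any real conceptual model.

```python
# A minimal, hypothetical conceptual model for a "building" domain:
# shared terms, their natural-language definitions, and relations between them.
concepts = {
    "Building": "A permanent construction with a roof and walls",
    "Address": "A structured reference that locates a building",
    "Owner": "A person or organisation holding title to a building",
}
relations = [
    ("Building", "has", "Address"),
    ("Building", "is owned by", "Owner"),
]

# Print the relations as plain statements of the shared vocabulary.
for subject, verb, obj in relations:
    print(f"{subject} {verb} {obj}")
```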

A logical-level data model describes the data content of the concepts of a conceptual model in a technical and sufficiently detailed manner to allow lossless and unambiguous storage and transfer of data from one information system to another. In a logical data model, the data contents are described in a system-independent way, but with sufficient precision to leave as little room for interpretation as possible at the implementation level.
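Continuing the toy example, a logical-level description of the hypothetical Building concept might pin down attribute names, types, units and optionality without committing to any database or file format. The class and fields below are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import Optional

# A minimal sketch of a logical-level model for the hypothetical Building
# concept: explicit names, types, units and optionality, but no storage details.
@dataclass
class Building:
    building_id: str                    # mandatory, unique within the dataset
    address: str                        # mandatory postal address
    height_m: Optional[float] = None    # optional height, in metres
    floor_count: Optional[int] = None   # optional number of floors above ground
```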

A logical data model must be implemented in all the different data processing and transfer environments through which the data flows. These system-dependent implementations are called physical data models. Examples of physical data models include database structures, data transfer and storage formats, and software data structures. They enable the concrete flow and processing of the information content described by logical-level data models. In physical data models, the descriptions of data structures are highly technical and depend on the information system and the programming languages used. Physical data models often focus on the technical efficiency of data access and processing, and are driven by the common practices, capabilities and constraints of the technical environment in use. For example, the data content of logical data model classes can be slightly rearranged and regrouped in physical data models, and even duplicated in multiple locations, in order to maximise the efficiency of data processing.
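The sketch below shows one possible physical realisation of the same hypothetical Building class as a relational table. The column types and the index are chosen for the target system, exactly the kind of efficiency concern that belongs to the physical level; the table and column names are assumptions for this example.

```python
import sqlite3

# One possible physical data model for the hypothetical logical Building
# class: an SQLite table plus an index chosen for query efficiency.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE building (
        building_id TEXT PRIMARY KEY,
        address     TEXT NOT NULL,
        height_m    REAL,     -- height in metres, may be NULL
        floor_count INTEGER   -- floors above ground, may be NULL
    )
""")
# Indexing the address column is a purely physical-level optimisation:
# it changes nothing in the logical model, only how fast lookups run.
conn.execute("CREATE INDEX idx_building_address ON building (address)")
conn.execute(
    "INSERT INTO building VALUES (?, ?, ?, ?)",
    ("b-001", "Example Street 1, Helsinki", 12.5, 3),
)
conn.commit()
```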

Building blocks for efficient data models and spatial data

But how do data models relate to spatial data? Data modelling for location data has long leveraged existing software and data model components, common building blocks and, above all, standards. These standards enable semantic and technical interoperability between many different systems. For example, the Open Geospatial Consortium (OGC) has been building a range of standards for companies and communities producing and using geospatial information since the mid-1990s.

Widely used OGC standards that improve the interoperability of data content across application domains include the Geography Markup Language (GML), CityGML for urban models, the Geoscience Markup Language (GeoSciML), and Observations and Measurements. OGC standards related to the interfaces and transfer of spatial data include OGC API Features, OGC API Processes, Web Map Service (WMS), Web Feature Service (WFS), GeoPackage, NetCDF and GeoTIFF. Many of the key OGC standards have also been published as ISO standards.
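As a small taste of these standards in practice, the sketch below reads features from an OGC API Features service as GeoJSON. The base URL and collection name are hypothetical placeholders, not a real service.

```python
import requests

# A minimal sketch of querying an OGC API Features service.
# BASE and the "buildings" collection id are placeholders for this example.
BASE = "https://example.com/ogcapi"
resp = requests.get(
    f"{BASE}/collections/buildings/items",
    params={"limit": 10, "bbox": "24.8,60.1,25.1,60.3"},  # area of interest (WGS 84)
    headers={"Accept": "application/geo+json"},
    timeout=30,
)
resp.raise_for_status()
for feature in resp.json().get("features", []):
    print(feature.get("id"), feature.get("properties"))
```

Because both the interface and the payload follow open standards, the same few lines work against any conforming service.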

Creating good standards and building blocks lowers the threshold for building effective data solutions that leverage geospatial data. While there are many different types of standards, the unifying factor is the goal of encapsulating best practices and approaches in the application domain so that data can be described and transferred between different data systems and components as effortlessly as possible. The best standards capture the common knowledge and understanding built up over years or decades within a community, described in clear requirements and rules that ensure the data produced can be used as widely and for as long as possible. Using today’s standards does not require years of experience with geospatial data: even newcomers to the field can start designing, for example, cloud-based systems with geospatial data at their core.
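For example, a single feature encoded as GeoJSON (standardised as IETF RFC 7946) can be read by practically any modern geospatial tool or cloud service without custom parsing. The feature content below is invented for illustration.

```python
import json

# A minimal GeoJSON (RFC 7946) document: one point feature with properties.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [24.9384, 60.1699]},  # lon, lat
    "properties": {"building_id": "b-001", "floor_count": 3},
}
collection = {"type": "FeatureCollection", "features": [feature]}
print(json.dumps(collection, indent=2))
```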

How to access data models and what to do with them?

Data models can be seen as tools in the same way as technology choices. Start your project by defining the needs of your users as well as the capabilities and limitations of your technical environments, and choosing the right tools becomes easier. Most data models are built to meet specific objectives, and if your objectives are aligned with them, pre-built data models and standards are probably your best choice.

OGC and ISO are good places to start exploring what to add to your technology arsenal. We at Spatineo can also help you move forward on your data journey, whatever your applications and needs. Contact us for more information!

But what can you do with data models? There is no ready answer to this; in practice, the sky’s the limit. An organisation or company capable of building on data models can create information systems that deliver tangible benefits for internal or external needs.



