Better understanding and interoperability with data models - Spatineo

Ilkka Rinne, Interoperability Architect at Spatineo, believes that data models are the key to mutual understanding and interoperability in data flows. Well-designed data models crystallise and unify the key data content of the operating environment, and enable data to be communicated efficiently and meaningfully between different organisations and end-users. It pays off to invest in data models: in today's world there are countless information systems, but they can all be made to work together using common data models and interface integrations.

Data models are mutual agreements

Data models are largely agreements on what data is relevant in a given application domain and how to describe it for data exchange or storage. Data models are typically designed on three levels of abstraction, ensuring interoperability from different perspectives: conceptual, logical and physical data models. A conceptual model defines common terms for the chosen concepts, their definitions and their relations within an application domain. A solid conceptual model improves the understanding of common data content among application domain experts, information system developers and the organisation as a whole, and creates a common language for discussing it. Concept analysis and conceptual models can also be used to make well-informed and documented decisions about the boundaries of the data content of different information systems. A conceptual model at the right level describes a common understanding of the subject matter using natural language.

A logical level data model describes the data content of the concepts of a conceptual model in a technical and sufficiently detailed manner to allow for the lossless and unambiguous storage and transfer of data from one information system to another. In a logical data model, the data contents are described in a system-independent way, but with sufficient precision to leave as little room for interpretation as possible at the implementation level.

A logical data model must be implemented in all the different data processing and transfer environments through which the data flows. These system-dependent implementations are called physical data models. Examples of physical data models are database structures, data transfer and storage formats and software data structures. They enable the concrete data flow and processing of the information content described by logical level data models. In physical data models, the descriptions of data structures are highly technical and depend on the information system and the used programming languages. Physical data models often focus on the technical efficiency of data access and processing. They are driven by the common practices, capabilities and constraints of the used technical environment. For example, the data content of logical data model classes in physical data models can be slightly rearranged and regrouped, and even duplicated to multiple locations, in order to maximise the efficiency of data processing.
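As an illustrative sketch of the three levels, consider a single "Road" concept. The class, attribute and table names below are invented for this example and not taken from any standard or real model:

```python
import sqlite3
from dataclasses import dataclass

# Conceptual level (natural language):
#   "A road is a named stretch of transport infrastructure with a
#    surface type and a geometric centreline."

# Logical level: the data content described in a system-independent,
# unambiguous way, but without committing to any storage technology.
@dataclass
class Road:
    road_id: str
    name: str
    surface: str          # e.g. "asphalt", "gravel"
    centreline_wkt: str   # geometry as Well-Known Text

# Physical level: one possible implementation as a relational table.
# A real physical model might regroup or duplicate this content
# (e.g. into a spatial index) purely for efficiency of access.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE road (
        road_id        TEXT PRIMARY KEY,
        name           TEXT NOT NULL,
        surface        TEXT,
        centreline_wkt TEXT
    )
""")

r = Road("r1", "Mannerheimintie", "asphalt", "LINESTRING(0 0, 1 1)")
conn.execute(
    "INSERT INTO road VALUES (?, ?, ?, ?)",
    (r.road_id, r.name, r.surface, r.centreline_wkt),
)
(name,) = conn.execute(
    "SELECT name FROM road WHERE road_id = 'r1'"
).fetchone()
print(name)  # Mannerheimintie
```

The point of the sketch is the separation of concerns: the dataclass could equally well be implemented as a GML schema, a JSON structure or a different database layout without changing the logical model it realises.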

Building blocks for efficient data models and spatial data

But how do data models relate to spatial data? Data modelling for location data has long leveraged existing software and data model components, common building blocks and, above all, standards. These standards enable semantic and technical interoperability between many different systems. For example, the Open Geospatial Consortium (OGC) has been building a range of standards for companies and communities producing and using geospatial information since the mid-1990s.

Widely used OGC standards aim to improve the interoperability of data content from different application domains, including Geography Markup Language (GML), CityGML for urban models, Geoscience Markup Language (GeoSciML), and Observations and Measurements. OGC standards related to the interfaces and transfer of spatial data include OGC API Features, OGC API Processes, Web Map Service (WMS), Web Feature Service (WFS), GeoPackage, NetCDF, and GeoTIFF. Many of the key OGC standards have also been published as ISO standards.
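As a small illustration of the kind of payload such standards make interoperable: OGC API Features serves features over HTTP paths like `/collections/{collectionId}/items`, with GeoJSON as the default encoding. The feature content below is invented for the example; only the overall GeoJSON structure follows the specification:

```python
import json

# A minimal GeoJSON Feature, the default encoding of OGC API Features
# responses. The id, property names and coordinates are made up here.
feature_json = """
{
  "type": "Feature",
  "id": "building-42",
  "geometry": {"type": "Point", "coordinates": [24.94, 60.17]},
  "properties": {"name": "Central Station", "floors": 3}
}
"""

feature = json.loads(feature_json)

# A client would typically retrieve this document from a path such as
#   GET {base_url}/collections/buildings/items/building-42
lon, lat = feature["geometry"]["coordinates"]
print(feature["properties"]["name"], lon, lat)
```

Because any conforming server and client agree on this structure, the same parsing code works against data from completely different organisations, which is precisely the interoperability the standards are designed to deliver.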

Creating effective standards and building blocks lowers the threshold to start building effective data solutions that leverage geospatial data. While there are many different types of standards, the unifying factor is the goal of encapsulating best practices and approaches in the application domain so that data can be described and transferred between different data systems and components as effortlessly as possible. The best standards contain the common knowledge and understanding built up over years or decades within a community, described in clear requirements and rules that ensure that the data produced can be used as widely and for as long as possible. Using today's standards does not require years of experience with geospatial data; even newcomers to the field can start designing, for example, cloud-based systems with geospatial data at their core.

How to access data models, and what to do with them?

Data models can be seen as tools in the same way as technology choices. Start your project by defining the needs of your users as well as the capabilities and limitations of your technical environments, and the choice of the right tools becomes easier. Most data models are built to meet specific objectives, and if your objectives are aligned with them then pre-built data models and standards are probably the best choices.

OGC and ISO are good starting points for exploring what you can add to your technology arsenal. We at Spatineo can also help you onward on your data journey, whatever your applications and needs. Contact us for more information!

But what can you do with data models? There is no ready answer to this, and in practice, the sky’s the limit. An organisation or company capable of building on data models can create information systems that create tangible benefits for internal or external needs.
