Better understanding and interoperability with data models - Spatineo

Ilkka Rinne, Interoperability Architect at Spatineo, believes that data models are the key to mutual understanding and interoperability in data flows. Well-designed data models crystallise and unify the key data content of the operating environment, and enable data to be communicated efficiently and meaningfully between different organisations and end-users. It pays off to invest in data models. In today’s world, there are countless information systems, but they can all be made to work together using common data models and interface integrations.

Data models are mutual agreements

Data models are largely agreements on what data is relevant in a given application domain and how to describe it for data exchange or storage. Data models are typically designed on three levels of abstraction, each ensuring interoperability from a different perspective: conceptual, logical and physical. A conceptual model defines common terms for the chosen concepts, their definitions and relations within an application domain. A solid conceptual model improves the understanding of common data content among application domain experts, information system developers and the organisation as a whole, and creates a common language for discussing it. Concept analysis and conceptual models can also be used to make well-informed and documented decisions about the boundaries of the data content of different information systems. A conceptual model at the right level describes a common understanding of the subject matter using natural language.

A logical level data model describes the data content of the concepts of a conceptual model in a technical and sufficiently detailed manner to allow for the lossless and unambiguous storage and transfer of data from one information system to another. In a logical data model, the data contents are described in a system-independent way, but with sufficient precision to leave as little room for interpretation as possible at the implementation level.
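As an illustration of what "sufficient precision, system-independent" can mean in practice, the sketch below expresses a hypothetical Building feature type with explicit attribute names, types and optionality. The type and attribute names are invented for this example and do not come from any particular standard; the same content could equally be written in UML or JSON Schema.

```python
from dataclasses import dataclass
from typing import Optional

# A hypothetical logical-level definition of a "Building" feature type:
# every attribute has a fixed name, a fixed type, and explicit optionality,
# leaving little room for interpretation at the implementation level.
@dataclass
class Building:
    building_id: str                          # unique identifier, mandatory
    height_m: float                           # height above ground, metres
    construction_year: Optional[int] = None   # may be unknown

b = Building(building_id="B-001", height_m=12.5)
print(b.construction_year)  # -> None (explicitly allowed to be unknown)
```

Because the units, types and optionality are pinned down, two systems exchanging Building records can do so losslessly regardless of how each one stores the data internally.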

A logical data model must be implemented in all the different data processing and transfer environments through which the data flows. These system-dependent implementations are called physical data models. Examples of physical data models are database structures, data transfer and storage formats and software data structures. They enable the concrete data flow and processing of the information content described by logical level data models. In physical data models, the descriptions of data structures are highly technical and depend on the information system and the used programming languages. Physical data models often focus on the technical efficiency of data access and processing. They are driven by the common practices, capabilities and constraints of the used technical environment. For example, the data content of logical data model classes in physical data models can be slightly rearranged and regrouped, and even duplicated to multiple locations, in order to maximise the efficiency of data processing.
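One possible physical realisation of a hypothetical Building feature type is a relational database table. The sketch below uses SQLite and invented column names; note how a derived centroid geometry is duplicated into the same table as text, a typical physical-level rearrangement made purely for processing efficiency.

```python
import sqlite3

# A physical data model: one concrete, system-dependent realisation
# of a hypothetical "Building" feature type as an SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE building (
        building_id       TEXT PRIMARY KEY,
        height_m          REAL NOT NULL,
        construction_year INTEGER,   -- nullable: may be unknown
        centroid_wkt      TEXT       -- derived geometry duplicated here
    )                                -- for faster spatial lookups
""")
conn.execute(
    "INSERT INTO building VALUES (?, ?, ?, ?)",
    ("B-001", 12.5, 1987, "POINT(24.94 60.17)"),
)
row = conn.execute(
    "SELECT height_m FROM building WHERE building_id = 'B-001'"
).fetchone()
print(row[0])  # -> 12.5
```

A different physical model of the same logical content, say a file-based exchange format, could group and encode the attributes quite differently without changing their meaning.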

Building blocks for efficient data models and spatial data

But how do data models relate to spatial data? Data modelling and location data have leveraged existing software and data model components, common building blocks and especially standards for a long time. These standards enable semantic and technical interoperability between many different systems. For example, the Open Geospatial Consortium (OGC) has been building a range of standards for companies and communities producing and using geospatial information since the mid-1990s.

Widely used OGC standards aim to improve the interoperability of data content from different application domains, including Geography Markup Language (GML), CityGML for urban models, Geoscience Markup Language (GeoSciML), and Observations and Measurements. OGC standards related to the interfaces and transfer of spatial data include OGC API Features, OGC API Processes, Web Map Service (WMS), Web Feature Service (WFS), GeoPackage, NetCDF, and GeoTIFF. Many of the key OGC standards have also been published as ISO standards.
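To make the idea of a standardised transfer encoding concrete, the sketch below builds a minimal GeoJSON Feature (GeoJSON is specified in IETF RFC 7946 and is the default encoding of OGC API Features responses). The property values are illustrative, not real data.

```python
import json

# A minimal GeoJSON Feature: a standardised, system-independent
# encoding for transferring a vector feature between systems.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [24.94, 60.17]},
    "properties": {"name": "Helsinki", "population": 658000},
}

# Serialise for transfer; any GeoJSON-aware client can decode it back.
encoded = json.dumps(feature)
decoded = json.loads(encoded)
print(decoded["geometry"]["type"])  # -> Point
```

Because both producer and consumer follow the same specification, the geometry and attributes survive the round trip without any bilateral agreement between the two systems.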

Creating effective standards and building blocks lowers the threshold to start building effective data solutions that leverage geospatial data. While there are many different types of standards, the unifying factor is the goal of encapsulating best practices and approaches in the application domain so that data can be described and transferred between different data systems and components as effortlessly as possible. The best standards contain the common knowledge and understanding built up over years or decades within a community, described in clear requirements and rules that ensure that the data produced can be used as widely and for as long as possible. Using today’s standards does not require years of experience with geospatial data: even newcomers to the field can start designing, for example, cloud-based systems with geospatial data at their core.

How to access data models, and what to do with them?

Data models can be seen as tools in the same way as technology choices. Start your project by defining the needs of your users as well as the capabilities and limitations of your technical environments, and the choice of the right tools becomes easier. Most data models are built to meet specific objectives, and if your objectives are aligned with them then pre-built data models and standards are probably the best choices.

OGC and ISO are good starting points for exploring what to choose for your technology arsenal. We at Spatineo can also help you onward on your data journey, whatever your applications and needs. Contact us for more information!

But what can you do with data models? There is no ready answer to this, and in practice, the sky’s the limit. An organisation or company capable of building on data models can create information systems that create tangible benefits for internal or external needs.
