Ilkka Rinne, Interoperability Architect at Spatineo, believes that data models are the key to mutual understanding and interoperability in data flows. Well-designed data models crystallise and unify the key data content of the operating environment, and enable data to be communicated efficiently and meaningfully between different organisations and end-users. It pays off to invest in data models. In today’s world, there are countless information systems, but they can all be made to work together using common data models and interface integrations.
Data models are mutual agreements
Data models are largely agreements on what data is relevant in a given application domain and how to describe it for data exchange or storage. Data models are typically designed on three levels of abstraction, ensuring interoperability from different perspectives: conceptual, logical and physical data models. A conceptual model defines common terms for the chosen concepts, their definitions and relations within an application domain. A solid conceptual model improves the understanding of common data content among application domain experts, information system developers and the organisation as a whole, and creates a common language for discussing it. Concept analysis and conceptual models can also be used to make well-informed and documented decisions about the boundaries of the data content of different information systems. A conceptual model at the right level describes a common understanding of the subject matter using natural language.
A logical level data model describes the data content of the concepts of a conceptual model in a technical and sufficiently detailed manner to allow for the lossless and unambiguous storage and transfer of data from one information system to another. In a logical data model, the data contents are described in a system-independent way, but with sufficient precision to leave as little room for interpretation as possible at the implementation level.
A logical data model must be implemented in all the different data processing and transfer environments through which the data flows. These system-dependent implementations are called physical data models. Examples of physical data models are database structures, data transfer and storage formats and software data structures. They enable the concrete data flow and processing of the information content described by logical level data models. In physical data models, the descriptions of data structures are highly technical and depend on the information system and the used programming languages. Physical data models often focus on the technical efficiency of data access and processing. They are driven by the common practices, capabilities and constraints of the used technical environment. For example, the data content of logical data model classes in physical data models can be slightly rearranged and regrouped, and even duplicated to multiple locations, in order to maximise the efficiency of data processing.
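The three levels can be made concrete with a small sketch. The example below is illustrative only: the `WeatherObservation` class and its fields are invented for this article, not taken from any real standard. The conceptual level appears as a natural-language definition, the logical level as a system-independent class with explicit types and units, and the physical level as one possible encoding (JSON) chosen for transfer.

```python
import json
from dataclasses import dataclass, asdict

# Conceptual level (natural language):
# "A weather observation is an air temperature, in degrees Celsius,
#  measured at a named station at a specific point in time."

# Logical level: system-independent data content with explicit
# types and units, but no commitment to any storage technology.
@dataclass
class WeatherObservation:
    station_id: str           # identifier of the observing station
    observed_at: str          # ISO 8601 timestamp
    air_temperature_c: float  # degrees Celsius

# Physical level: one concrete encoding (JSON for data transfer).
# A relational table or a binary format would be other physical
# models of the same logical content.
obs = WeatherObservation("FI-HEL-001", "2023-01-15T12:00:00Z", -4.2)
payload = json.dumps(asdict(obs))

# The lossless, unambiguous round trip is exactly what a good
# logical model is meant to guarantee between systems.
restored = WeatherObservation(**json.loads(payload))
assert restored == obs
```

A database schema optimised for query speed might split or duplicate these fields, as described above, yet it would still be a physical model of the same logical `WeatherObservation`.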
Building blocks for efficient data models and spatial data
But how do data models relate to spatial data? Data modelling and geospatial applications have long leveraged existing software and data model components, common building blocks and especially standards. These standards enable semantic and technical interoperability between many different systems. For example, the Open Geospatial Consortium (OGC) has been building a range of standards for companies and communities producing and using geospatial information since the mid-1990s.
Widely used OGC standards aim to improve the interoperability of data content from different application domains, including Geography Markup Language (GML), CityGML for urban models, Geoscience Markup Language (GeoSciML), and Observations and Measurements. OGC standards related to the interfaces and transfer of spatial data include OGC API Features, OGC API Processes, Web Map Service (WMS), Web Feature Service (WFS), GeoPackage, NetCDF, and GeoTIFF. Many of the key OGC standards have also been published as ISO standards.
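To show what such interface standards buy you in practice: an OGC API Features server returns individual features as GeoJSON from paths like `/collections/{collectionId}/items/{featureId}`, so any client that understands GeoJSON can consume the data. The sketch below parses one such response; the collection name, feature id and property values are made up for illustration, only the overall GeoJSON Feature shape comes from the standard.

```python
import json

# A minimal GeoJSON Feature, shaped like a single item a server
# might return for GET /collections/buildings/items/42.
# The "buildings" collection and the property values are invented
# for this example.
feature_json = """
{
  "type": "Feature",
  "id": "42",
  "geometry": {"type": "Point", "coordinates": [24.9384, 60.1699]},
  "properties": {"name": "Example building", "height_m": 21.5}
}
"""

feature = json.loads(feature_json)

# Because the structure is standardised, geometry and attributes can
# be read the same way regardless of which server produced the data.
lon, lat = feature["geometry"]["coordinates"]
name = feature["properties"]["name"]
print(f"{name} at lon={lon}, lat={lat}")
```

The same uniformity is what lets off-the-shelf tools such as web map clients or GIS packages display data from any conformant server without custom integration code.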
Creating effective standards and building blocks lowers the threshold to start building effective data solutions that leverage geospatial data. While there are many different types of standards, the unifying factor is the goal of encapsulating best practices and approaches in the application domain so that data can be described and transferred between different data systems and components as effortlessly as possible. The best standards contain the common knowledge and understanding built up over years or decades within a community, described in clear requirements and rules that ensure that the data produced can be used as widely and as long as possible. Using today’s standards does not require years of experience with geospatial data: even newcomers to the field can start designing, for example, cloud-based systems with geospatial data at their core.
How to find data models, and what to do with them?
Data models can be seen as tools in the same way as technology choices. Start your project by defining the needs of your users as well as the capabilities and limitations of your technical environments, and the choice of the right tools becomes easier. Most data models are built to meet specific objectives, and if your objectives are aligned with them then pre-built data models and standards are probably the best choices.
OGC and ISO are good places to start exploring what you can choose for your technology arsenal. We here at Spatineo are also able to help you onward with your data journey in your applications and needs. Contact us for more information!
But what can you do with data models? There is no ready answer to this, and in practice, the sky’s the limit. An organisation or company capable of building on data models can develop information systems that create tangible benefits for internal or external needs.