The interoperability of information systems
The importance of interoperability is a logical consequence of the digital transformation under way in every economic sector. The proliferation of applications we use daily, in our personal and professional lives, necessarily increases the exchange of information between heterogeneous systems. New norms and standards emerge regularly, certainly bringing new solutions but also a host of new projects to put them into practice.
Faced with this, CIOs have the heavy task of managing the urbanization (the enterprise architecture) of their information system as well as possible, so that it effectively accompanies and supports the missions and strategy of their company.
Definition of interoperability
The following definition is the one agreed upon by the AFUL (French Association of Free Software Users) and Wikipedia. It served as the basis for the RGI, the General Interoperability Repository in which the French state defines the norms and standards governing communication between applications managed by ministries, administrations and local authorities:
“Interoperability is the ability of a product or system, whose interfaces are fully known, to operate with other existing or future products or systems without access or implementation restrictions.”
In the case of information systems interoperability, this definition covers several aspects, such as the possibility for an application to interact with several database management systems (DBMS) using the ODBC standard, for example, or the standardization of programming interfaces allowing heterogeneous systems to communicate via the Internet.
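The ODBC idea can be sketched with Python's DB-API 2.0, an analogous standardized database interface (the table name and data here are invented for illustration; SQLite stands in for any DBMS, since an ODBC driver such as `pyodbc` would be used the same way):

```python
import os
import sqlite3
import tempfile

def fetch_products(connect, dsn):
    """Run the same query through any DB-API 2.0 compliant driver.

    `connect` could be sqlite3.connect, pyodbc.connect, psycopg2.connect...
    The standardized interface is what makes this function portable
    across heterogeneous DBMS.
    """
    conn = connect(dsn)
    try:
        cur = conn.cursor()
        cur.execute("SELECT name FROM products ORDER BY name")
        return [row[0] for row in cur.fetchall()]
    finally:
        conn.close()

# Set up a throwaway SQLite database as a stand-in for any DBMS.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE products (name TEXT)")
conn.executemany("INSERT INTO products VALUES (?)", [("chair",), ("desk",)])
conn.commit()
conn.close()

print(fetch_products(sqlite3.connect, path))  # ['chair', 'desk']
```

The application code never needs to know which DBMS sits behind the driver; that is precisely the kind of decoupling ODBC provides.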
There are also several levels of interoperability depending on the layer at which the communication takes place.
The three levels of interoperability
Technical interoperability
As its name indicates, this level constitutes the technical part of the exchanges. It covers the hardware used to connect systems, but also the types of interface involved (APIs, connectors, etc.), the format of the data exchanged and the protocols used for these exchanges. Technical interoperability describes the ability of different technologies to exchange data on the basis of well-defined interface standards. This level is supported by approaches such as MDA (Model Driven Architecture), which starts from a business model independent of any computerization, transforms it into a platform-independent model, and finally transforms the latter into a platform-specific model for the concrete implementation of the system in specific or general languages such as Java, C# or Python.
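The role of an agreed data format at this level can be sketched in a few lines (the message fields are invented for illustration; JSON stands in for whatever format the two systems agreed on):

```python
import json

# System A (say, Java-based) produces a message in the agreed format...
message = {"order_id": 42, "items": ["chair", "desk"], "currency": "EUR"}
wire = json.dumps(message)  # what actually travels over the network

# ...and system B (Python, C#, anything) can reconstruct it, because
# both sides agreed on the JSON standard, not on a language or platform.
received = json.loads(wire)
print(received["order_id"])  # 42
```

The systems on each end can be entirely heterogeneous; only the wire format and the protocol carrying it need to be shared.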
Semantic interoperability
This level of interoperability ensures that the meaning of the information exchanged is understood by any other application that accesses the data. This requires a shared repository in which the meaning of each piece of information is described. Semantic interoperability represents information as a model containing concepts linked by relationships, modelling a body of knowledge on a given subject: this is called an ontology.
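A minimal sketch of such an ontology, with concepts and relations invented for illustration, can be written as a set of (subject, relation, object) triples:

```python
# A tiny ontology as (subject, relation, object) triples — the shared
# repository that gives each piece of information an agreed meaning.
triples = {
    ("Invoice", "is_a", "Document"),
    ("Invoice", "has_part", "LineItem"),
    ("LineItem", "refers_to", "Product"),
}

def related(concept):
    """All concepts directly linked to `concept`, whatever the relation."""
    return sorted(obj for subj, rel, obj in triples if subj == concept)

print(related("Invoice"))  # ['Document', 'LineItem']
```

Real-world ontologies use dedicated standards such as RDF and OWL, but the principle is the same: meaning is captured as concepts plus the relationships between them.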
Syntactic interoperability
The purpose of this layer is to encode the data into symbols that other systems can process. The difference between semantic and syntactic interoperability is similar to the difference between the content of a text and its form. Syntactic interoperability concerns the way data is coded and formatted. It ultimately leads to the notion of an open system, i.e. a system that can accommodate the heterogeneity of the components involved in the exchange. Syntactic interoperability is supported by mechanisms such as the Unified Modeling Language (UML), the Business Process Model and Notation (BPMN), the Extensible Markup Language (XML) and the Interface Definition Language (IDL).
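The form/content distinction can be illustrated by encoding the same record (invented for this example) in two different syntaxes, XML and JSON, and recovering identical content from either:

```python
import json
import xml.etree.ElementTree as ET

# The same "content" (a product record) under two different "forms".
record = {"name": "chair", "price": "25"}

as_json = json.dumps(record, sort_keys=True)

root = ET.Element("product")
for key, value in record.items():
    ET.SubElement(root, key).text = value
as_xml = ET.tostring(root, encoding="unicode")

# A receiver that knows the syntax can recover the identical content.
recovered = {child.tag: child.text for child in ET.fromstring(as_xml)}
print(recovered == record)  # True
```

Two systems are syntactically interoperable as soon as they agree on one such encoding; what the fields actually mean is the concern of the semantic level above.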
Norms and standards
The importance of standardization
Interoperability is a concept rooted in communication. In order to exchange information, trade and prosper, humans first created languages that allowed them to interact with other tribes, to understand and to be understood. Today there are many languages, some more widely used than others and regarded as de facto standards, such as English. Humans also had to develop standards to agree on practices shared with other communities, such as the cutting of stones for the construction of large buildings.
We can, as Jacob Holmblad tells us in this editorial, see the importance of interoperability through the biblical myth of the Tower of Babel. A vain man wanted to erect an edifice that would reach the heavens, thereby competing with God. To halt the construction, God created multiple languages so that the builders could no longer communicate with each other. By undermining their interoperability, he made the project fail.
The big vendors’ game
Increasing globalization, the growing complexity of interactions and the multiplication of so-called “communicating” systems tighten interoperability requirements and drive the emergence of new, more robust and reliable standards.
This also forces the major vendors to open up their systems and to offer gateways to their competitors so as not to find themselves isolated from, or even excluded from, their ecosystem. They readily join standardization bodies and working groups such as ECMA, ETSI, OASIS, OMA, IEEE, ISO or W3C, to name but a few, in order to have a say in the debates and to push their own technology as the standard.
The bad students
Every rule has its exceptions. Some vendors have chosen not to open up their systems, remaining on what is known as a proprietary, manufacturer-exclusive system. This is a strategic choice that lets them keep a captive clientele, totally dependent on the vendor and on the resources it devotes to developing new functionality or providing assistance. Interfacing these applications with others is also complicated, unless reverse engineering techniques are used to develop specific connectors. These vendors have no choice but to “pamper” their customers to keep them from giving in to the siren call of open systems.
The future of interoperability
As we have just seen, interoperability is now at the heart of all systems, in all sectors, with crucial issues at stake in some cases (health, logistics, transport, etc.). For companies, it has become necessary, even vital, to master this notion and to put it into practice through a well thought-out urbanization of the information system, clear processes and open tools supporting norms and standards.
Even technologies that were previously closed, such as blockchains, have understood the need to open up in order to achieve wide adoption, as Philippe Limantour’s article shows.