Standards and Interoperability

Author: Colin Robbins

In this blog series on interoperability, I have discussed the different levels of interoperability, the security issues related to interoperability, and the role of the Zero Trust approach.  For this final blog of the series, I discuss the role of standards.

Standards are a good thing, but not all standards are good

To achieve interoperability, the communicating systems either need to agree on a common format for data exchange, or a gateway needs to be deployed to convert from one format to another.

So, at a base level, standards are good – they enable you to agree a common format for data exchange.

Too Many Standards

The first challenge is that there are a lot of standards to choose from.  If you are not careful, to achieve interoperability you need to adopt lots of them, which pushes complexity onto each system.  Images are a classic example: BMP, GIF, JPEG, WEBP, AVIF.  To achieve interoperability, you either need to agree to implement them all, or choose one and have something convert between them.
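The "something that converts between them" can be sketched in a few lines.  The illustration below is hypothetical: it uses CSV-to-JSON as a stand-in for the image formats above (which would need an image library), and the function name and field names are invented for the example.

```python
import csv
import io
import json

def gateway_csv_to_json(csv_text: str) -> str:
    """Hypothetical gateway: convert CSV records to JSON so a
    JSON-only system can interoperate with a CSV-only one."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

csv_data = "callsign,lat,lon\nALPHA,51.5,-0.1\n"
print(gateway_csv_to_json(csv_data))
# Each system keeps its own single format; the gateway absorbs the
# complexity of supporting both.
```

The design trade-off is exactly the one in the text: each endpoint stays simple, at the cost of deploying and trusting the converter in the middle.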

Why are there so many standards?  Technologies evolve, techniques improve.  JPEG worked well for a while, but then WEBP came along, offering better compression for web sites displayed on mobile phones, further improved by AVIF.  So now your poor little web browser on your phone needs to support them all to achieve interoperability, which means it’s not so little anymore – in fact, many reports suggest your web browser software is now more complex than the underlying operating system.

Another common solution is to define something specific for the system in question – commonly mocked as “there were too many standards to choose from, so we created our own”.  As we shall see, this is not actually such a bad thing!

Complexity

The second challenge is that standards can be complex.  Consider the MIP4 data format, defined by the Multilateral Interoperability Programme for Command and Control systems: the list of supported data types is comprehensive, covering over 40 separate schemas and sub-schemas.

When a standard becomes complex, implementers will often take the pragmatic decision to implement a subset.  However, when it comes to interoperability, unless both systems use the same subset, you risk semantic interoperability failures – one system may make an incorrect assumption based on the absence of a certain data field that the other does not support.  Worse, where a system only implements a subset of the standard, users may overload a supported element of the standard to cover the absence (for example, providing a grid reference in a data field marked home address) – storing up problems when interoperating with other systems that do not understand the improvised semantics.
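The overloading failure above can be made concrete.  This is a hypothetical illustration – the schema, field names, and systems are invented for the example – of a user on a subset implementation stuffing a grid reference into an address field, and a full implementation reading it literally.

```python
# Hypothetical shared schema, and the subset System A actually implements.
FULL_SCHEMA = {"name", "home_address", "grid_reference"}
SYSTEM_A_SUBSET = {"name", "home_address"}  # grid_reference not supported

# A user on System A, lacking the proper field, overloads another one:
# a grid reference is recorded where a postal address belongs.
record_from_a = {"name": "Unit 7", "home_address": "SU 1234 5678"}

def interpret_on_b(record: dict) -> str:
    # System B implements the full schema and assumes home_address
    # carries a postal address, as the standard intends.
    return f"Post to: {record['home_address']}"

print(interpret_on_b(record_from_a))
# The record is schema-valid on both sides, yet System B now "posts"
# to a grid reference -- a semantic interoperability failure.
```

Note that no validator catches this: both records are structurally compliant, and the failure only appears in the meaning attached to the field.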

Security

Where systems cannot agree on a common format, gateways are deployed to convert from one format to another to achieve interoperability.  This poses a security dilemma.

  • The implied data transformation is good: it implements NCSC Cyber Security Design Principle 2, “Make compromise difficult”;

  • The implied data transformation is bad: it breaks any security controls based on digital signatures (such as end-to-end approaches).
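Why transformation breaks signatures can be shown in a few lines.  This is a minimal sketch, using Python's standard-library HMAC as a stand-in for a real digital signature scheme; the key and message are invented for the example.

```python
import hashlib
import hmac
import json

KEY = b"shared-secret"  # stand-in for real signing key material

def sign(data: bytes) -> str:
    # HMAC-SHA256 over the exact message bytes, standing in for a
    # digital signature: any change to the bytes changes the result.
    return hmac.new(KEY, data, hashlib.sha256).hexdigest()

# Sender signs the message bytes end-to-end.
message = json.dumps({"to": "HQ", "body": "report"}).encode()
signature = sign(message)

# A gateway "converts" the format -- even a harmless re-serialisation
# (different whitespace and key order) changes the underlying bytes...
converted = json.dumps(json.loads(message), indent=2, sort_keys=True).encode()

# ...so the original end-to-end signature no longer verifies.
print(sign(converted) == signature)  # False
```

The content is semantically identical before and after conversion, but the signature covers bytes, not meaning – so the gateway must either re-sign (becoming a trusted party) or the end-to-end control is lost.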

So the interoperability gateway inadvertently becomes an implied part of a Zero Trust security solution, as discussed in Blog 3.  With this comes the need to perform data validation, which is where the standards-complexity dilemma bites.

  • The more complex a standard is, the harder it is to gain assurance that the data is schema-compliant and malware-free;

  • Standards that are easy to assure are, by implication, simple, and thus less likely to meet full interoperability goals (as discussed above in relation to data subsets).

Conclusion

Standards are good, but you need to choose your standards wisely if you are to achieve secure interoperability.  A poor standard choice (or subset implementation) will likely hinder interoperability.  Consequently, the choice of standards needs to be factored into your overall Secure by Design process.

This blog series has looked at interoperability, concluding that interoperability is not something that can be taken for granted by adopting standards; nor can secure interoperability be taken for granted by adopting an end-to-end approach or Zero Trust.  Interoperability and specifically secure interoperability only happen when a structured and methodical process is applied to ensure a successful business interoperability outcome.


The Interoperability Series

This is part 4 of a 4-part series on interoperability.  Explore the full story:

  1. Interoperability – Are we there yet?

  2. Security and Interoperability – a Conflict

  3. Zero Trust and Interoperability

  4. Standards and Interoperability


About the author

Colin Robbins is a Principal Security Consultant, leading customer-funded research activities in secure interoperability and information exchange. He has specific technical interests in the Single Information Environment and Data Centric Security, as well as the processes of security, such as Secure by Design and Information Security Management Systems (ISMS). He is a Fellow of CIISec, and a former NCSC certified Security and Information Risk Adviser (Lead CCP).
