#22 Why does the data economy require standardization?

We are all aware of the significant role of APIs in modern business and IT architecture. APIs changed the way systems are built and enabled the flexibility and agility to match customer and business needs faster than before. APIs have existed in their current form for roughly two decades, but the turning point in the growth of the API economy occurred around 2011. By 2016, the estimated growth was shown in multiple graphs by Gartner and others.


(Source: https://nordicapis.com/tracking-the-growth-of-the-api-economy/)


Now, in 2021, the number of APIs in one API catalog alone is over 24,000. Growth has largely followed the predictions. Let's also keep in mind that catalogs mostly list only public APIs. Millions more are partner APIs and private APIs, which are not listed anywhere. The total number of web APIs is enormous and growing. Why has this happened?


REST APIs became the de facto standard, and the industry started to revolve around the one approach defined in the doctoral dissertation of Roy Fielding. REST was easy, lightweight, and fast to implement. That is probably one reason for its success.


A de facto standard for defining APIs in a machine-readable format


I was lucky to live through the growth phase of the API economy: developing API management software, writing a book (see below), and consulting for dozens of companies. I see a lot of similarities between current data economy development and the API economy.


Previously, SOAP APIs were the "winning" approach. SOAP had good, solid standards developed over time, and tooling was built to match those standards. This in turn enabled scaling and the sharing of unified solutions. All of this was missing in the REST API world. REST was a jungle: there was no shared understanding of how to define a REST API until Swagger came along in 2011.


That was a turning point in the API economy. Swagger provided a standard, machine-readable method for describing APIs and their functionality. That in turn boosted the development of API tooling. And as developing APIs became easier and easier, the number of APIs skyrocketed. The same API description also enabled automated documentation with built-in testing capabilities, as well as generating tests to verify API functionality. In short, the whole stack needed to develop and offer APIs efficiently started to build up around the de facto standard.
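To make the idea concrete, here is a minimal sketch of what "one machine-readable description drives the whole stack" means in practice. The spec below is a hypothetical, heavily simplified stand-in for a real Swagger/OpenAPI document (which is far richer); the point is only that the same data structure can feed both a documentation generator and a test generator.

```python
# A minimal, hypothetical API description in the spirit of Swagger/OpenAPI.
# Real OpenAPI documents are much richer; this only illustrates that a single
# machine-readable description can drive several different tools.
spec = {
    "title": "Example Pet API",
    "paths": {
        "/pets": {"get": {"summary": "List pets", "status": 200}},
        "/pets/{id}": {"get": {"summary": "Get one pet", "status": 200}},
    },
}

def generate_docs(spec):
    """Render human-readable documentation from the machine-readable spec."""
    lines = [spec["title"]]
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            lines.append(f"{method.upper()} {path} - {op['summary']}")
    return "\n".join(lines)

def generate_tests(spec):
    """Derive simple smoke-test cases: (method, path, expected status)."""
    return [
        (method.upper(), path, op["status"])
        for path, methods in spec["paths"].items()
        for method, op in methods.items()
    ]

print(generate_docs(spec))
print(generate_tests(spec))
```

Documentation sites, client SDK generators, mock servers, and API gateways all work on this same principle: they consume the one shared description instead of each tool inventing its own format.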


Eventually, in January 2016, Swagger had become so important to global IT development and business that the specification was taken under the wings of the Linux Foundation as the OpenAPI Specification.


A de facto standard for describing a data commodity

The lack of standards is slowing down the growth of the data economy. How? Well, let's think about it. If we had an "Open Data Commodity" spec, similar to the OpenAPI spec, the discoverability of data commodities would increase, the productization and servitization of data would be faster, and the tooling around data products and data as a service would most likely follow the path witnessed in the API economy.



Having the standard would enable competition among marketplaces, since one lock-in would be removed (the proprietary metadata format). Another significant benefit would be the ability to compare two or more products more easily. Finally, APIs are already compared and taken into use automatically by automated systems, because a machine-readable specification exists. Needless to say, code reuse in tool development would also become possible if the standard existed.
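As a thought experiment, here is a sketch of how a shared metadata format could make data products comparable. To be clear: no "Open Data Commodity" spec exists yet, so the field names, the example products, and their values below are entirely hypothetical illustrations, not a proposal for the actual format.

```python
# Hypothetical sketch: if every marketplace published data products in one
# shared, machine-readable metadata format, a buyer (or a tool) could compare
# offerings field by field without scraping proprietary catalogs.
# All field names and values here are invented for illustration.
product_a = {
    "name": "City Traffic Counts",
    "update_frequency": "hourly",
    "license": "CC-BY-4.0",
    "price_eur_month": 99,
}
product_b = {
    "name": "Regional Traffic Feed",
    "update_frequency": "daily",
    "license": "proprietary",
    "price_eur_month": 149,
}

def compare(products, fields):
    """Build a field-by-field comparison table from shared metadata."""
    return {field: [p.get(field) for p in products] for field in fields}

table = compare(
    [product_a, product_b],
    ["update_frequency", "license", "price_eur_month"],
)
print(table)
```

This is exactly the role the OpenAPI description plays for APIs today: once the metadata is standardized and machine-readable, comparison, discovery, and even automated onboarding become tooling problems rather than manual research.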


Discovering, comparing, and taking data commodities into use can be expected to follow the same path, and to go even further.


We need the Open Data Commodity Spec to enable the growth of the data economy and the scaling of its development.