As I look back on the breakthroughs that define major transitions in computing, I see that they occur by overcoming constraints. It has been a long journey from the early days of computing to where we are today. In trying to envision where we are going, it often helps to look at the path that got us here and the patterns that drove IT innovation. Artificial Intelligence (AI) was discussed in the 1970s (I know, I studied it then), but in practical terms it is only now coming into commercial use. Why didn’t AI take off in the 1970s? Constraints.
Raising the Bar
In the early days, programming computers took an incredible amount of effort to make anything happen. Sure, there was little processor speed, memory, or networking compared to today, but the biggest constraint was the detailed level at which programmers needed to work to create a program. The earliest machines were physically hard-wired to build a program. The level was raised through operating systems, then high-level languages, then code generators, then the additions of middleware and databases, and onward to SOA. (I have left out many additional items that could go in this list.) The level at which programmers work continues to be a constraint that is addressed as newer technologies, such as APIs, provide asset constructs that programmers can consume to build new offerings. Today, applications can be built far more quickly using higher-level constructs. Attacking this constraint continues.
Moore’s law observes that transistor density, and with it overall processing power, doubles roughly every two years. You can debate whether it will continue or whether it exactly doubles, but the point is that computers keep getting faster. Similarly, memory becomes less expensive and available in larger quantities, networking bandwidth improves, and storage gets cheaper and denser. We have seen major cycles in the IT industry based on breakthroughs in technology. Much of the late 1990s was spent on Y2K fixes: memory and storage constraints had led programmers to store 2-digit years, and once that constraint was removed, the turn of the century forced the rewriting of millions of programs to use 4-digit years. (None of us reading this will be around in the year 9999 to deal with it the next time!)
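The 2-digit-year constraint, and the kind of "windowing" fix many remediation teams applied, can be sketched in a few lines. This is a minimal illustration, not any specific project's code; the pivot year of 50 is an assumed example, and real projects chose pivots suited to their own data:

```python
def expand_year(two_digit_year: int, pivot: int = 50) -> int:
    """Expand a 2-digit year using a sliding window (a common Y2K fix).

    Years at or above the pivot are interpreted as 19xx; years below it
    as 20xx. The pivot of 50 is an illustrative assumption.
    """
    if not 0 <= two_digit_year <= 99:
        raise ValueError("expected a 2-digit year")
    return 1900 + two_digit_year if two_digit_year >= pivot else 2000 + two_digit_year

# A year stored as '99' to save two bytes is ambiguous without a convention:
print(expand_year(99))  # -> 1999
print(expand_year(5))   # -> 2005
```

The windowing approach avoided rewriting stored data, but it only defers the ambiguity; fully expanding to 4-digit years was the durable fix.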
The answer to my earlier question, why AI did not take off in the 1970s, is technological constraints. We knew what we wanted to do with AI, but bringing enough compute power to bear was too costly, and network speeds were insufficient to access all the necessary data.
Technology will continue to improve, with breakthroughs such as 5G happening now and quantum computing emerging. Each technology barrier broken allows tremendous new capabilities.
The Pendulum
With apologies to Edgar Allan Poe, the pendulum I refer to here is the swinging from centralized to decentralized IT and back, again and again. While not as potentially harmful as the one in Poe’s story, it is driven by constraints, and each swing causes major shifts in IT at major expense.
The constraints that cause this swinging are speed and management. Decentralization is driven by a desire for more speed: centrally controlled IT is not delivering fast enough for the business! So, allowing decentralized teams to rapidly develop new offerings provides the desired speed. But the decentralized teams do not want to take on the backups, maintenance, upgrades, patches, and so on. The lack of management and control is then recognized as a serious concern (e.g., security, availability, data duplication), which results in re-centralization. An example is client-server computing in the 1980s, followed by server consolidation. Each swing of the pendulum seemed less severe than the one before as we learned from prior mistakes, but the swings still occurred. Business APIs and two-speed IT are our latest attempt to reach equilibrium. Will this finally settle the pendulum? It should if done well, but it is too early to declare success.
Removing the Digital Transformation Constraint
What is constraining the success of Digital Transformation? According to several analysts, the answer is “integration”. Businesses executing a digital transformation are taking advantage of many new strategies and technologies, including Cloud, Microservice architectures, IoT, AI, Social, Mobile, Ecosystem relationships, new business models, and emerging technologies. This myriad of assets, both owned and accessed, stresses the ability to pull all the components together rapidly to build new digital solutions. For this reason, Gartner stated, “Integration has become an obstacle to success because traditional, centralized and systematic integration approaches cannot cope with the volume and pace of business innovation.”1
IDC wrote, “By 2021, driven by LoB needs, 70% of CIOs will deliver ‘agile connectivity’ via APIs and architectures that interconnect digital solutions from cloud vendors, system developers, startups and others.”2 IDC listed “Connectivity” first on its worldwide CIO agenda.
Integration is the constraint that must be removed to enable digital transformation. Solving the integration challenge requires a combination of people, process, technology, and architecture. I addressed these topics in a two-part blog: “Digital Transformation Requires Integration Modernization” and “Integration Modernization Requires Good Parenting”. As referenced by Gartner and discussed in those blogs, we need to expand the population that performs integration tasks. Relying on a centralized, highly skilled group of integration experts to do everything does not scale. The answer is to distribute the effort to a larger group and provide the appropriate level of integration tooling for each task and each skill level. This means that more than one capability is required. Multiple forms of integration are needed for different integration types: API, Events, Application, Messaging, and Files are all involved. And yes, sometimes a highly skilled resource will still be required. But the point is to enable application builders, equipped with the right tools and processes, to perform the simpler integration tasks themselves, without needing to call on an integration expert.
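The idea of moving simple integration work to application builders can be sketched as follows: integration experts build a small shared runtime once, and an application builder supplies only a declarative mapping rather than writing transformation code. All names below are hypothetical illustrations, not any product's actual API:

```python
# A minimal sketch of "integration for non-specialists": the application
# builder's entire contribution is a declarative field mapping; the shared
# runtime (built once by integration experts) applies it to records.

def apply_mapping(record: dict, mapping: dict) -> dict:
    """Produce a target record by renaming source fields per the mapping.

    `mapping` maps target field names to source field names; source fields
    absent from the record are simply skipped.
    """
    return {target: record[source]
            for target, source in mapping.items()
            if source in record}

# Hypothetical example: the app builder's "integration" is this config, not code.
crm_to_billing = {
    "customer_id": "id",
    "customer_name": "fullName",
    "email_address": "email",
}

source_record = {"id": 42, "fullName": "Ada Lopez",
                 "email": "ada@example.com", "internalFlag": "x"}
print(apply_mapping(source_record, crm_to_billing))
# -> {'customer_id': 42, 'customer_name': 'Ada Lopez', 'email_address': 'ada@example.com'}
```

The design point is the division of labor: the expert-built runtime encapsulates the hard parts (connectivity, error handling, security), while the simpler per-application work is reduced to configuration that a non-specialist can own.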
Recognizing the need for agile integration is the first step. IBM has introduced the IBM Cloud™ Integration Platform, which offers a single, unified platform for integration. The Cloud Integration Platform provides the flexibility to use the various integration capabilities as needed, and in combination, to solve your integration challenges. Capabilities can be offered to non-integration specialists so the expanded team can take on appropriate tasks, offloading the simpler integration work from the highly skilled experts. This scales the resources to meet integration demand and removes the integration constraint that could otherwise get in the way of digital transformation.
To understand more about IBM’s thoughts on Digital Transformation and the API Economy visit the IBM API Economy website. IBM API Connect is IBM’s complete foundation to Create, Secure, Manage, Test, and Monitor APIs. You can find more information about IBM API Connect at the API Connect website. And you can also experience a trial version of API Connect.
If you have questions, please let me know. Connect with me through comments here or via Twitter @Arglick to continue the discussion.
1 Gartner – Modernizing Integration Strategies and Infrastructure Primer for 2018
2 IDC FutureScape: Worldwide CIO Agenda 2019