Sometimes it’s the journey that teaches you a lot about your destination.
– Drake
Software driven labour productivity (or lack thereof) has been under the lens for a very long time. Through the various generations of software technologies, there have been consistent attempts to provide relief to human fingers as they trudge wearily over keyboards.
The simplest form of relief started with the concept of macros: pre-configured sequences of actions that can be triggered when required. The facility to define macros is usually provided by the user interface technology of the application software, so each application tool provides its own macro facility. Platforms such as Microsoft Windows and Apple macOS have generic macro definition and execution layers that can span user interfaces (UI) built natively on those platforms.
There were also various disjointed user interface technologies, such as Microsoft DDE and HLLAPI-based terminal emulators, that were used in various applications. Screen-scrapers, DDE, and HLLAPI were used to create macros that helped humans accomplish a sequence of tasks in “hands-free” mode, operating user interface (UI) screens as if a human were performing those sets of actions. Tools such as Microsoft Excel were built with extremely powerful macro and script execution capabilities.
This was followed by technologies such as AutoHotkey and AutoIt, which were meant for Microsoft Windows based user interfaces. There are probably dozens of tools and apps in this genre. The advantage of AutoIt, AutoHotkey, and similar software was that they provided powerful scripting facilities that could be deployed across multiple Windows applications present on a desktop session. However, all of these technologies were standalone and could not deliver full-scale automation of the complex business processes that humans had to perform in those days.
So in the early 2000s came two products that addressed this directly. One was OpenSpan, funded by In-Q-Tel (the CIA’s venture arm) and acquired by Pegasystems in 2016; the other was Inventys Fusion, from the company Inventys that I founded in Singapore along with Balaraman and Shiva Kumar. Somewhere in between, we renamed our product “Intermix”, but later we reverted to “Fusion”. There was a third company, Blue Prism, based in the United Kingdom, but I am not clear whether they had a full-fledged downloadable and installable commercial off-the-shelf (COTS) product, or customised “consulting-ware” that was incrementally refined until it became a COTS product many years later.
My reasons for creating Inventys Fusion were twofold. The first relates to an incident nearly twenty years ago, when I was working at BEA Systems (makers of WebLogic) on a consulting engagement with the Group CIO of a very large telco, while our product sales folk were trying to sell him WebLogic Integration (WLI). He wasn’t at all impressed with the grand ideas we were pitching about WLI and other contemporary integration mechanisms. He then made a statement that deeply resonated with me. He said that they (the telco) had made significant investments in dozens and dozens of software systems, and each of them on its own worked brilliantly well. His problem was that things were slowing down because of the inefficiency caused by humans having to move data between the screens of these various apps. All he wanted, he said, was “… a way for data to jump from the screen of one system and enter the screen of another system”. That was the seed for what became Inventys Fusion, after I resigned from BEA and started Inventys.
The second reason we created Fusion was a philosophical disagreement I had with the Portal Server technologies that many companies (including BEA and IBM) were selling in the early 2000s. A “portal server”, by my logic, should have been a technology that could assemble a single consolidated view from pre-existing user interface screens. This was not the case with the commercial portal servers in fashion in those days. If you had many apps with UIs and you wanted to create a composite portal out of them, you would practically have to trash those existing apps and create a new “mega” app on the portal server, defining “portlets” (little rectangles and squares within the browser window) to represent the component apps. The whole idea of rewriting perfectly functional web apps just to make them fit into this new thing called “portals” seemed wrong to me.
Our main competition at that time had none of what we used to call “enterprise-strength features”. Their product had to be distributed using Microsoft’s enterprise administration functionalities, transporting the MSI desktop installer to each machine during upgrade cycles. They did not have a native identity management facility and required customers to spend extra dollars to buy and install a separate identity management system. They also did not have any form of Citrix connectivity in the early days. In 2006 they lost a deal to us in which one of the deciding factors was Citrix integration. It was perhaps that loss which prompted them to later add Citrix integration features.
We (Inventys) were actually “hardcore” enterprise server-side technology creators. But we cast aside any fundamentalist, bigoted notion that everything has to be “server-side only”, and explored user interface integration technologies as something that would complement conventional EAI (enterprise application integration) techniques.
Enterprises had hundreds of reasonably well-performing software systems whose only flaw was that they did not interoperate with each other. Traditional server-side middleware vendors such as IBM, Oracle, and BEA, flanked by “consulting firms” and system integrators (SIs), were taking advantage of the situation, virtually forcing enterprises to embark on expensive transformation projects requiring complete tear-down and replacement of existing systems. Most of these legacy systems were doing a good job in their respective functions. As a result of the lack of interoperability, however, complex business processes that required workflows spanning multiple applications were not easy to achieve. If all the instances that required application integration were stacked in descending order of the value of that integration, the resulting graph would look like this:
Each vertical bar represents an instance of an integration between two or more existing enterprise applications. The height of the bar depicts the value of having that particular software integration so that the business processes can function without human intervention. In most enterprises, this kind of data will form a “long-tail” graph. There will be a few instances (stacked on the left-hand side) of integration that yield a very high value and a long tail of low-value integrations. For the sake of statistical analysis, you can assume that the numbers have been normalised such that the cost of implementing each vertical bar (an instance of an integration between some systems) is constant.
Since all vendors’ and consultants’ “expert” advice is to rip out old systems and build new ones, the logical decision businesses take is to implement these transformational projects only for the highest-value situations, that is, the instances on the left-hand side of the graph. The other instances of application integration, forming the long tail of the graph, are left to be handled by humans.
This human intervention is what I have previously described as software driven labour, in the context of complex processes and flow composition. This is how software driven labour is connected to organisations’ decision to opt for human intervention over expensive, transformational software integration.
From around the late 1990s, large companies figured out that labour arbitrage was a convenient way to tackle software driven labour. The idea has been to hire people in low-wage countries to perform these human interventions in business processes for a fraction of what they would cost in the home locations of these enterprises. Thus began the “golden era” of Business Process Outsourcing (BPO) and Shared Services Outsourcing (SSO).
To its credit, the BPO industry has attempted to change their game and establish a more permanent position in the value-chain of the enterprises that they serve. I will discuss the value-chain positioning of BPOs in a separate post.
When we took Inventys Fusion to market, our purpose was to address application integration for the long tail of the integration value curve. Mathematically, the area under the curve in the graph shown above depicts the total value of the integration efforts; it is the integral of the curve. The area under the initial part of the curve is large, so enterprises concentrate their budgets on the integration issues that the blue area represents. However, the concept of the long tail teaches us that the area under the tail is often the same as, or sometimes greater than, the area under the fat head of the curve. In application integration terms, the red area is equal to or greater than the blue area, which implies that the value of integrating and automating all of the instances under the red part of the curve would be equal to or greater than the value of the integrations under the blue part. Yet, using conventional integration approaches, the cost of implementing the integrations in the blue part would be a fraction of the cost of implementing all the integrations in the red part (the long tail), and hence those affected processes are relegated to being performed as software driven labour.
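The head-versus-tail comparison can be made concrete with a small numeric sketch. The 1/i value distribution, the total of 1,000 candidate integrations, and the cut-off of 20 "high-value" instances are assumptions chosen purely for illustration, not data from any real enterprise:

```python
# Hypothetical illustration of the long-tail argument above.
# Assume 1,000 candidate integrations whose value follows a Zipf-like
# distribution: the i-th most valuable integration is worth 1/i units.
n = 1000
values = [1.0 / i for i in range(1, n + 1)]

head = sum(values[:20])   # the 20 high-value integrations (the "blue" area)
tail = sum(values[20:])   # the remaining 980 (the "red" long tail)

print(f"head (top 20 integrations): {head:.2f}")
print(f"tail (remaining {n - 20}): {tail:.2f}")

# Under this assumed distribution the tail's total value exceeds the head's,
# even though no single tail integration would justify a transformation
# project on its own. With a constant cost per integration, automating the
# tail costs 980/20 = 49 times as much using conventional approaches.
```

This is exactly the normalisation described earlier: each bar costs the same to implement, so the head is cheap to cover but the tail, despite holding comparable total value, is prohibitively expensive with conventional integration.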
With Inventys Fusion, the aim was to create a new kind of integration technology that would make the integrations in the red part of the curve (the long tail) very inexpensive to implement, without replacing or transforming existing (“legacy”) systems. While I cannot speak for OpenSpan, I suspect the promoters of that company may have been driven by the same ideal.
To summarise the journey thus far:
- Software driven labour has been on the rise due to the inability of legacy server-side enterprise software to interoperate.
- Previous generations of screen macro utilities were unable to provide solutions that could orchestrate data and control flow across heterogeneous user interface screens.
- The emergence of more sophisticated screen flow orchestrator products such as Inventys Fusion and OpenSpan provided a much needed generational upgrade to front-end integration techniques.
We built this product and technology on many years of experience with, and understanding of, process integration issues in enterprises. But in order to provide it to the organisations that desperately needed such solutions, we still had to package it correctly. By ‘package’ I do not mean gift-wrapping or cardboard cartons, or even software installer tools; I am referring to “marketing” and “messaging”. The people employed in the organisations that were suffering from the lack of process integration, and who had hitherto relied on sub-optimal methods to alleviate the issue, still needed marketing machinery to be deployed on them. They were, and are, busy people who manage complex enterprise systems that cannot interoperate and smoothly execute complex business processes without human intervention. They therefore need metaphors and marketing copy to help them fully understand the capabilities of new products introduced by unknown companies. The well-known high-end enterprise software product companies have armies of employees whose only job is to create and implement market positioning strategies for their products and services, so that these busy executives on the buyer side can fully comprehend and appreciate the value of what is on offer.
From 2005 to 2010 we searched for various terms that could describe and categorise our technology. “Enterprise mashup” was a term in fashion during those days, and we thought we could adopt and extend that phrase: “An enterprise mashup is the integration of heterogeneous digital data and applications from multiple sources for business purposes.”[1] Inventys Fusion’s key feature was its ability to integrate with legacy applications via their user interfaces and to dynamically orchestrate data flows between the various user interface screens. One of the early uses of Fusion was in customer care contact centres, where agents handling voice calls would use a “single view of customer” screen generated by Fusion, which would search for a given customer in multiple legacy applications and extract and present the data in one screen, resulting in significant time and effort savings for the agent. Fusion’s dynamic and bi-directional data orchestration with the legacy applications meant that changes or updates made on the new “single view” screen would automatically be propagated to the respective legacy systems. At that time I thought that “Enterprise Mashup” was a suitable phrase to co-opt for Fusion. The term had already been coined in the context of server-side mashups; Fusion now extended it to user-interface-driven mashups.
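The “single view of customer” pattern can be sketched in a few lines. This is a minimal, purely illustrative model: the backend names, fields, and the `SingleCustomerView` class are all hypothetical stand-ins (plain dicts in place of real legacy systems), whereas a Fusion-style product drove the actual UI screens of those systems:

```python
# Mock "legacy systems", each holding a partial view of the same customer.
billing = {"C042": {"name": "A. Tan", "plan": "Gold"}}
crm     = {"C042": {"name": "A. Tan", "phone": "555-0199"}}

class SingleCustomerView:
    """Aggregates one customer's data from several backends and propagates
    edits back to whichever backend carries each field (the bi-directional
    orchestration described above)."""

    def __init__(self, backends):
        self.backends = backends  # list of dict-like legacy stores

    def get(self, customer_id):
        # Merge every backend's record into one consolidated "screen".
        merged = {}
        for store in self.backends:
            merged.update(store.get(customer_id, {}))
        return merged

    def update(self, customer_id, field, value):
        # Write the edit back to every backend that owns this field,
        # keeping the legacy systems in sync with the composite view.
        for store in self.backends:
            if field in store.get(customer_id, {}):
                store[customer_id][field] = value

view = SingleCustomerView([billing, crm])
print(view.get("C042"))          # one consolidated record for the agent
view.update("C042", "name", "Alice Tan")
print(billing["C042"]["name"])   # the edit reached the billing system
print(crm["C042"]["name"])       # ...and the CRM
```

The agent-facing payoff is in `get` (one screen instead of several); the bi-directional part is in `update`, where a single edit fans out to all systems that hold that field.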
Our competitor at that time, OpenSpan, would also fit in the same category since that product also did much the same as Fusion. Our advantage was that we had native support for identity management, automatic sign-on, and Citrix integration way before any other product. The OpenSpan guys came up with the phrase “Surface Integration”. I really liked that term; it resonated well with me, and for a long time I was jealous of them for having created such an apt phrase. During that time many other phrases also appeared: “integration at the glass”, “swivel-chair integration”, to name a couple.
The problem of process integration had been known for over a decade (since the early to mid 1990s); it was a well-known ailment. By the mid 2000s a couple of companies had discovered or invented novel ways to address it by interoperating the user interface screens of legacy applications: a medicine was now available. We then needed a way to get the patients to take the medicine.
In the technology world, the process of getting patients to understand and consequently accept the medicine that would alleviate their problem is described by the well-respected and established theory of the “technology adoption curve”. These days every successful software entrepreneur and their VC (venture capitalist) have sacks of wisdom to impart on an even larger topic called “product–market fit”, which deals with non-technology products as well. I will discuss all of these, in the context of process integration, in my next article.
[1] https://searchbusinessanalytics.techtarget.com/definition/enterprise-mashup, accessed 21 Nov 2021