IBM and the Cognitive Era in Computing

A few weeks ago there was a conference in the San Francisco Bay Area on Cognitive Computing. Several vendors made presentations, and, as I thought about them, I realized that a significant portion of the software market has changed rather dramatically.

I am used to thinking of new markets as conforming to Geoffrey Moore’s Technology Adoption Cycle, where new startups are created in response to new technological developments and their products are adopted by Innovators who are willing to take significant chances to discover exciting new stuff. A bit later, the new companies get a little larger and begin selling to Early Adopters, companies that aggressively pursue new technology, and so forth. At some point the little companies are bought up by larger companies, stumble across Moore’s well-known chasm, and the Majority begins to adopt the new stuff from larger vendors who make it respectable. This model has described new software markets for several decades.

There is a sense in which the current BPMS market followed Moore’s curve, with small companies developing early BPMS software and gradually merging with larger companies, etc. But there was always something not quite right about the BPMS market, partly because it was built, conceptually, on top of the existing workflow approach, and partly because it kept getting sidetracked by still other new technologies, like business rules products, and then business intelligence products, and then case and process mining products.

As I listened to discussions of the new Cognitive Software tools, which are clearly dominated by IBM’s Cognitive offering, I realized just how different this new software market is going to be. First off, there really aren’t any new technologies on offer. Most of these technologies were new in the Eighties, when we called them Expert Systems, or Natural Language, or neural nets, or case-based reasoning tools. No really new concepts have been added – the main difference that I can see is that computers have grown bigger and faster and can now execute applications fast enough to be commercially useful.

But something else is different too. As with BPM, we aren’t looking at specific new technologies, but are, instead, looking at many different technologies that are being combined. Thus we have decision management and analytics and natural languages combined with big data and massive parallel processing, the internet of things and cloud computing. Combining and integrating very complex enterprise technologies isn’t the kind of thing that little startups are good at. On the other hand, this is the kind of thing that IBM is very good at.

It’s as if we are looking at a new kind of market – a market that arrives 10-20 years after the initial technologies are introduced, when new generations of computers have arisen to support these existing technologies, and where the challenge is to combine several of them into a new, powerful package.

There is, in addition, another twist. These new packages are very complex, and they depend on the specific knowledge used in particular industries. Thus, one can’t easily develop generic software tools, but, rather, has to develop software tools tailored for banks, for telecoms, for oil companies, or for auto manufacturers. This is an approach that IBM has come to specialize in. IBM’s software division offers generic BPM tools, for example, but IBM’s consulting division offers a BPM environment for banks, and another for insurance companies, and still another for oil companies, and so on.

In addition, developing large, complex, specialized tools like this would be nearly impossible if one were limited to selling them to only those companies sophisticated enough to configure and maintain them. Luckily, that’s where the cloud comes in. IBM no longer has to hand over a package and hope a company can install it and then make it work. Instead, it can let a company access the package in the cloud, where IBM has created the application and can continue to evolve and maintain it. The cloud changes the whole nature of the sales and adoption cycle and eliminates many of the local constraints that used to make it so expensive for companies to adopt new technologies.

Thus, it seems to me that computing is going through a kind of perfect storm, in reverse. The future is going to involve international companies buying very complex software applications that combine several cutting-edge technologies with very large databases that contain knowledge of specific industries. Few of even the largest companies would be able to do this on their own, but a company like IBM is perfectly positioned to provide this as a service in the cloud. This isn’t a game in which small software startups can play. It isn’t even a game at which most mid-sized software companies can succeed. Perhaps Fujitsu can compete with IBM, and perhaps a few other enterprise software companies can join in this game, but mid-sized and small software companies won’t even make it to the starting line.

What I realized, as I thought about how the cognitive computing market is going to be so perfect for IBM, is that the BPM market was an earlier version of a market that rapidly evolved in the same way. Initially there were lots of small companies involved, but as organizations realized how beneficial it was to have software development and execution environments that could combine a number of very sophisticated technologies, it became obvious that only a few large vendors could master and offer this new generation of large BPM platforms.

And what proved true in BPM after 15 years is now true of the emerging Cognitive Computing market even as it is just beginning. Analysts thinking about software markets are going to have to develop some new ideas if they are going to understand what comes next.
