From its infancy, the computer industry has been characterised as a restless, fast-moving, ever-changing environment: young, exciting, vibrant and not a place for the faint-hearted to play. But from the perspective of the 1980s, let alone the 1970s, you ain’t seen nothing yet. At the end of the 1970s, people began to think that the industry would settle down and mature in the 1980s: IBM Corp was king, the race had been won, and all that remained was to spot who was going to come a distant second in a race that was already over. In 1981, IBM announced the original IBM Personal Computer and within a couple of years had created a completely new world order, albeit one in which IBM seemed to be even more dominant. One by one, the second-line minimakers fell by the wayside and only Digital Equipment Corp continued to prosper.

Botched

But by 1984, IBM was already showing evidence that it had become fat and lazy, and for those that cared to probe deep enough, the seeds of its long-delayed destruction had already been sown. It was making much too much money out of the clapped-out XT to enhance the thing at the rate that technology – courtesy of Intel Corp – dictated, and it was not only critically late with the AT but botched the launch. As the clonemakers, led by Compaq Computer Corp, began to grow and prosper, and the concept of a single desktop computer standard for which the whole software industry could write began to gather strength, a handful of visionaries realised that users who had been liberated from the vendor-proprietary scourge would soon clamour for the same freedom in their mid-range and mainframe computing, and that, inadequate as it was, Unix was the only readily available catalyst for such a trend. IBM recognised the potential threat, but complacently assumed that its effortless domination of the industry would see the pretender off. The mainframe businesses of Honeywell Inc, Sperry Corp and Burroughs Corp were doomed anyway, but IBM users were an elite that wore their Blue badges next to their hearts: they would never deviate from their adherence to the gospel according to Armonk, nor from the privilege of paying way over the odds for their mainstream computing in order to keep the gods of IBM in the manner to which they were accustomed. By 1991, it was already too late for IBM to do anything to stem the tide, and the orthodoxy was that open systems powered by Unix would dominate the mainstream commercial computing world. Yet here we are in 1993, and all the certainties of the turn of the decade are being questioned.
In the 1970s, critical mass for a minimaker was seen at around $500m a year: today, having witnessed the vanishing of Prime Computer Inc and the slow fade of Control Data Corp, people talk openly about $5,000m a year not being big enough to save a company from oblivion. One false step or botched transition, and Compaq Computer Corp or Sun Microsystems Inc is on the slippery slope to the eternal bonfire. Is DEC on its way to becoming a $20,000m a year company in 1995, or is it the next Unisys Corp? The rock of IBM as the anchor of the industry has been blasted to smithereens – but far from a single desktop and a single data centre standard moving in to take its place, for the rest of the 1990s we face a computing world in which even the most recently established certainties are under unprecedented challenge. Apart from IBM’s MVS and OS/400, only two proprietary standards have any strength left in them – Apple Computer Inc’s Macintosh System and Digital Equipment Corp’s VMS. And Apple enthusiastically, DEC tentatively, are both moving to open their fortresses to the barbarians at the gates. Apple is taking the super-high-risk strategy of not only moving Macintosh System to an IBM chip architecture and giving IBM the right to sell it in competition, but also moving to make it possible to run the entire Mac software base on Unix machines, while considering putting a version of Macintosh System on the successors to the original IBM Personal Computer.

Hidden agenda

Is it mad? The answer is that it is a high-risk strategy, but Apple feels that it is almost certainly doomed if it tries to remain proprietary. With much less conviction, DEC is making the same kinds of noises about OpenVMS, although no-one has seriously bitten yet. The impending dominance of Unix looks much less certain now, threatened not only by Microsoft Corp’s NT, but also potentially by whatever hidden agenda Novell Inc may have in its impending takeover of Unix System Laboratories Inc. And with the promise of another great Windows leap forward in Chicago, or Windows 4, is there a big market for NT, or will it be left to slug it out with OS/2, which will never be the runaway success that IBM hoped for, but is not going to curl up and die any time soon? Is there room for Open Macintosh, NeXTstep and UnixWare on the desktop – or is the whole world as we know it going to be swept away by Taligent? Is everybody going to end up running today’s applications in emulation mode, under an operating system and on a hardware architecture for which they were never designed? Intel’s desktop dominance is far less absolute than it appears today: all the evidence is that the inherent advantages of RISC will make it harder and harder for Intel to keep up with its complex instruction set architecture, and the teething problems with the Pentium already seem more serious than those with the early 80486 and 80386. Intel faces a walk across the Grand Canyon on a tightrope as it evolves a strategy for migrating all that iAPX-86 software to RISC without losing the game to one or other of the established contenders. And among Alpha, PowerPC, Precision Architecture, the R-series and Sparc, which are the winners, and which the next Clipper, 88000 or NS32000? Care to put money on it? As we face a world in which, increasingly, it seems anything will run on anything (although of course it won’t, really, in any great volumes), mention of Taligent heralds the most frightening revolution of all.
Taligent is of course only one of a string of object-oriented operating environments under development, and the winner is highly likely to come not from one of the majors but from one or other of a string of start-up companies of which few people have yet even heard. But the object revolution – almost certainly allied to the parallel processing revolution – threatens to sweep away almost the entirety of today’s software industry. The astonishing pace at which multimedia technologies are taking hold in the US, with a whole string of new developments announced every week, is going to demand a phenomenal amount of new software, and only object-oriented techniques, which must inevitably turn programming into a process of buying tested objects off the shelf and linking them to create the desired application, offer any possibility of plugging the yawning software gap. And software engineering techniques will then not be much use: the inherent inefficiencies of object-oriented programming dictate that objects will have to be written in the same way as the most advanced video games, by people capable of following a program right down to the machine code level. On that thesis, those investing for their retirement should think about selling all traditional software and software engineering companies, all traditional hardware companies, and all leading-edge microprocessor makers that cannot make a convincing case for having an architecture onto which object environments can be effortlessly mapped, and buying instead a portfolio of parallel processing and object-oriented start-ups. That is the way it looks today. But remembering how different the whole computer world looked in 1990, let alone back in 1980, only a fool would be confident that parallel processing and object-orientation will not turn out to be the artificial intelligence, the robotics, the desk-top publishing of the 1990s.
Here at Computergram, all we can do is strive to keep subscribers up to date with the developments shaping the computer world of the 1990s, and provide the evidence on which informed projections can be made.