The company that stands to gain or lose the most from Netscape’s ability to maintain leadership in both Internet software and its graphical user interface is not Netscape at all. It is IBM. It is easy to say, as IBM’s chairman has declared, that the battle for the desktop is no longer significant, and that if it is indeed over, well, so what? Easy, but completely wrong. More than ever, the graphical user interface will shape client-server applications, whether the desktop machine is a whole personal computer or a thin client dependent on servers for its ability to link users to applications. In theory, any graphical interface can be made to talk to any server application. In practice, the close coupling between client software and server software enables the company that controls the client to influence the software and hardware of the server, too. In an environment with pedestrian mainframe applications and green screen terminals, the terminal has no personality. It makes practically no contribution to the character of the end user’s work, and it certainly cannot be personalized by the end user. The glass house is the whole system, so the vendor that controls the glass house controls everything.
The glass house
When that same environment is upgraded so that personal computers or Network Stations rather than green screens are on desks, the rules change. The end user has – depending on whether the device is a personal computer or thin client – some degree of control over the workplace. This control may be modest and, in terms of its impact on server-based applications, irrelevant. It may be limited to aspects of a screen’s appearance or layout, for instance. But it does empower the end user and, in cases where a more capable terminal enables an end user to use local tools at will, ask for on-line help or otherwise shape the work process, it attains real importance. If, in addition, the graphical client is found to boost productivity, its influence will no longer be confined to the end user’s limited realm. At that point, the importance of the client becomes great enough to exert real influence on server and glass house applications. As long as personal computers merely emulate green screens, the power in the glass house will of course prevail. So it may not have been immediately obvious to large organisations whose personal computers were hobbled by the early limits of client-side technology just how much conditions would eventually change. Sure, since the advent of the personal computer there have been plenty of forecasters who said the personal computer would become influential, but in the absence of obvious change, many information processing managers as well as their key suppliers saw little fundamental shift in power away from the glass house, except in departments that actually used personal computers for major tasks. Where the dominant applications remained those hosted on a mainframe, the presence of personal computers did little to alter the shape of computing. But now we can see that wherever a personal computer or thin client becomes an important part of the application, the power starts moving to the client end of the wire.
And if local servers are then interposed between the glass house and end users, the supplier of the server software begins to grow stronger for much the same reason that client software vendors influence server choices. If a perimeter server is a Windows NT machine, chosen at first only because it was the cheap option, the server’s other software may well come from Microsoft. After a while, in situations where the server can really pitch in and do some of the work, it will become important for more central systems and even the glass house system itself to be adjusted so that the perimeter server can be more effective. So, wherever perimeter servers run Windows NT, NT’s personality will sway the way a mainframe is balanced and, possibly, affect strategic choices, such as the selection of mainframe database management software or disk subsystems. Right now, at typical mainframe sites, that may not seem any more likely than the transfer of influence to the client end of the wire did when personal computers were first used as green screen emulators. But, we suspect, this time the users are going to spot the change a lot faster. Also, in the case of servers (and in contrast to client terminals), when it becomes clear that an application and its associated database can be supported on a non-mainframe server more cheaply and just as reliably, it is going to be moved. IBM can no longer count on legacy iron and legacy code to anchor it to enterprise customers. NT servers and the BackOffice suite of programs are not yet in the mainframe class when it comes to resiliency. Neither are Unix systems, although the better ones are certainly closer. But both NT and Unix environments provide considerably more functionality and conformance to standards than OS/390 or OS/400 when it comes to Internet technologies such as Web page serving and electronic mail.
So, wherever Internet technologies become primary and legacy software is relegated to a background role, traditional systems are put at a disadvantage – possibly a significant one. The domino effect – choices of clients influencing the selection of perimeter servers, and perimeter servers affecting central systems – does not have to be absolute to have a telling impact on IBM. All it takes is for customers to decide that new applications should go on non-mainframe (or non-AS/400) machines, even though older applications may remain where they are for years to come. IBM will experience slow growth or no growth in its base of proprietary systems, yet it will have to keep cutting the cost of those machines to stay in line with the pace of progress in growing markets. Clearly, this process is already under way, but the growing pressure on IBM’s mainframe business – offset in the most recent quarter by an upturn in AS/400 shipments – is only a hint of the situation that could unfold if IBM’s proprietary systems fail to mesh with the servers and ultimately the clients that surround them. Moreover, technologies perfected far from the mainframe will invade the glass house at an ever-increasing pace. Storage devices and controllers, communications subsystems and other peripherals from the far larger markets for small servers and their clients will either be adapted for use in central sites, or their work will be removed from core systems and moved outward.
Lack of vision
As is the case with processors, the broader market in ancillary devices has enabled competitors to take away unprecedented portions of IBM’s former domain. Also as with central processors, recent market developments are only the forerunners of major changes to come. IBM is late with cross-system storage products, and it must pay a price for its errors until it can offer machinery that serves the changing requirements of its customers. Perhaps surprisingly, this is the case even though the cost of hardware is a diminishing part of total information processing budgets. The reason IBM’s markets are sliding away is not simply the price of Big Blue’s products but the lack of vision that has led to their improper positioning. IBM has greatly improved the value of products for legacy requirements, but these products may not fit as well into the future strategic needs of customers as alternatives offered by IBM’s rivals. Even when IBM says the right things about its direction – extending its system software with Unix and Internet features, for example – it still appears to view computing as a top-down process. But that is no longer the case. Computing today is driven more by clients than by considerations emanating from the glass house.