OPINION: Middleware: A Brief History
osOpinion.com May 28, 2002
Why is control of middleware seen as such a key issue? Can middleware generate significant revenue? To understand the issues involved, we must go back to the elder days of
computing.

Before the introduction of the IBM (NYSE: IBM) PC, the personal computer market was fragmented among incompatible 8-bit machines, each with its own operating system. All of the hardware exposed similar functionality to software developers: parallel port for printing, serial port for modem, keyboard for input and text display for output. It would have made sense to write a common layer that sat above the different operating systems and exposed the same API (application programming interface), so developers would have had an easier time porting their applications.

But this idea went nowhere. The reason was performance: On an Apple (Nasdaq: AAPL) II or any other 8-bit machine of the day, memory and processor cycles were too scarce for applications to give up any of either to an extra layer of software.
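As a sketch of what such a common layer might have looked like, here is a hypothetical C header; every name in it is invented for illustration, since no such standard layer ever existed:

    /* portlayer.h -- hypothetical common API, implemented once per machine */

    /* Text display: the same calls whether the machine is from Apple,
       Tandy or Commodore */
    void screen_clear(void);
    void screen_write(const char *text);

    /* Printing over the parallel port, modem I/O over the serial port */
    void printer_write(const char *text);
    int  serial_read_byte(void);

    /* Keyboard input: returns the next key pressed */
    int keyboard_read_key(void);

An application coded against this header could have been rebuilt for any machine with an implementation of the layer, but every call would pass through an extra level of indirection, precisely the overhead those 8-bit machines could not spare.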
Along came the IBM PC and DOS. Microsoft marketed DOS to developers with a middleware argument. It was apparent that 16-bit microprocessors, and the Intel (Nasdaq: INTC) 8086 family in particular, were the coming thing, but it was not yet clear which 16-bit machines would win out. So, Microsoft presented DOS as a middleware layer, between the application and the BIOS and raw hardware, that software developers could code to. Microsoft then would do the work of porting DOS to whatever 16-bit systems became popular.

One of the other operating systems that IBM sold for the original IBM PC, the UCSD p-system, was explicitly a middleware system that compiled binaries to intermediate "p-code" that was then interpreted by the operating system at run-time.
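The run-time interpretation the p-system relied on can be sketched as a small dispatch loop in C. This is a toy instruction set invented for illustration, not the real p-machine's:

    #include <stdio.h>

    /* Toy "p-code" opcodes; the real p-machine had a much richer set */
    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    /* Interpret one compiled program, instruction by instruction */
    static void run(const int *code) {
        int stack[64], sp = 0, pc = 0;
        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++];         break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[--sp]);      break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void) {
        /* "Compiled" program: push 2, push 3, add, print -> prints 5 */
        const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }

The same p-code binary runs anywhere an interpreter exists, at the price of interpreting every instruction, the same performance trade-off discussed above.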
With some prodding by Microsoft, the market for 16-bit computers quickly narrowed to just the IBM PC and its clones, all with the same BIOS interfaces and hardware. Those machines soon had enough memory and processing power that the performance issues of middleware became non-issues.

In the late 1980s, however, a different problem arose that prevented middleware from gaining traction. That problem was "feature lag." As the PC platform gained popularity, it became more heterogeneous, with graphics cards, joysticks, CD-ROM drives and sound cards appearing in some hardware configurations. This situation complicated things for both middleware designers and software developers targeting middleware. Applications need conditional code to handle optional hardware, as the sketch below shows. Much worse, however, is the fact that when a new class of hardware appears, there is a delay before the middleware layer is altered to support it. Application writers who avoid middleware and write directly to the underlying system don't have this problem.
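The conditional-code burden looks roughly like this; the probe and playback functions are stand-ins invented for illustration:

    #include <stdio.h>

    /* Hypothetical sketch of an application coping with an optional sound
       card; a real probe would poke at I/O ports rather than return 0. */
    static int  detect_sound_card(void) { return 0; }
    static void sound_card_play(int id) { printf("card: effect %d\n", id); }
    static void pc_speaker_beep(void)   { printf("speaker: beep\n"); }

    static int have_sound;

    static void play_effect(int id) {
        if (have_sound)
            sound_card_play(id);  /* use the optional card when present */
        else
            pc_speaker_beep();    /* fall back to hardware every PC has */
    }

    int main(void) {
        have_sound = detect_sound_card();  /* checked once at startup */
        play_effect(1);
        return 0;
    }

Every optional device adds another flag and another branch like this, and a brand-new class of hardware cannot be handled at all until somebody writes the code for it.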
Unfortunately, the direct coding approach can be a lot of work. For example, when DOS application developers wanted to go beyond basic text, they had to include per-application drivers for the various graphics printers.
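A typical shape for those bundled drivers was a dispatch table inside each application; this sketch is hypothetical, with made-up printer entries:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical sketch: each application shipped its own driver table
       and selected an entry from the user's configuration. */
    typedef struct {
        const char *name;
        void (*print_line)(const char *text);  /* emits model-specific codes */
    } printer_driver;

    static void epson_print(const char *t) { printf("[Epson escape codes] %s\n", t); }
    static void laser_print(const char *t) { printf("[laser control codes] %s\n", t); }

    static const printer_driver drivers[] = {
        { "epson", epson_print },
        { "laser", laser_print },
    };

    int main(void) {
        const char *configured = "epson";  /* picked in the app's setup screen */
        for (size_t i = 0; i < sizeof drivers / sizeof drivers[0]; i++)
            if (strcmp(drivers[i].name, configured) == 0)
                drivers[i].print_line("Hello from a DOS word processor");
        return 0;
    }

Every application duplicated a table like this, and every new printer model meant shipping updated drivers for every application that wanted to support it.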
This meant that the DOS/BIOS model was not ideal from Microsoft's point of view. While the executable format was DOS-specific, and such functions as memory management were exported by DOS, BIOS calls were made directly to the firmware and remained unchanged if an application was ported to another PC operating system. Therefore, so-called "DOS" applications were only partially bound to DOS (keep in mind that the original IBM PC had no hard drive but instead booted from a floppy disk, so it was entirely reasonable that a user might use different operating systems for different tasks).

Then Microsoft came along with Windows, which solved all of those problems. Windows began as a middleware layer on top of DOS/BIOS, smoothing out differences between hardware that was functionally equivalent but technically different. For application writers, Windows was a huge advance over DOS, just as DOS had been a huge advance over the fragmented 8-bit world. It also became ubiquitous, allowing it to overcome the feature lag problem: Today, no new piece of hardware is released without a manufacturer-provided Windows driver, and new functionality in the BIOS/firmware is coordinated with new releases of Windows.
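That smoothing-out can be pictured as a single interface with per-device implementations supplied by the manufacturer. This is a loose sketch of the idea in C, not the actual Windows driver model:

    #include <stdio.h>

    /* One display interface; applications never see past it */
    typedef struct {
        const char *name;
        void (*set_pixel)(int x, int y, int color);
    } display_driver;

    /* Two technically different cards, functionally equivalent to apps */
    static void card_a_pixel(int x, int y, int c) { printf("A (%d,%d)=%d\n", x, y, c); }
    static void card_b_pixel(int x, int y, int c) { printf("B (%d,%d)=%d\n", x, y, c); }

    static display_driver card_a = { "vendor A", card_a_pixel };
    static display_driver card_b = { "vendor B", card_b_pixel };

    int main(void) {
        display_driver *active = &card_a;  /* the OS selects the driver */
        active->set_pixel(10, 20, 7);      /* application code is unchanged */
        active = &card_b;                  /* ...even with different hardware */
        active->set_pixel(10, 20, 7);
        return 0;
    }

Because the manufacturer, not the application writer, supplies the implementation, new hardware works with existing applications the day it ships.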
Since the mid-1990s, middleware has attempted a comeback. In fact, there are two classes of middleware: the traditional kind that exposes an API for applications, such as Java, and a newer kind.

What are the chances that these new systems will become strategic advantages for the companies that produce them? Stay tuned for Part 2.
Author's background: Adam Barr worked at Microsoft for over 10 years before leaving in April 2000. His book about his time there, Proudly Serving My Corporate Masters, was published in December 2000. He lives in Redmond, Washington. Adam can be reached at adamba@gte.net.