Rethinking Open Systems
David Chappell, August 1995
Open systems technologies, those not tied to any single
vendor, have been the holy grail for many users. With a near-religious
fervor, some organizations have spent the last dozen years crusading
for this belief. To be free from a single vendor, goes the cry,
is to live in the best of all possible worlds.
But is this really true? Are open systems the great boon they promised
to be? Or have they in fact been less effective than their supporters
hoped? And when they are effective, what exactly are the conditions
that allow the creation of successful open technologies?
Defining an Open System
Answering these questions first requires defining exactly
what it means to be open. This is easy if you're a vendor, since
whenever a vendor says "open," it means the same thing: its
products. This definition isn't quite what most
users have in mind, however. To users, an open technology commonly
has the following attributes:
it's not controlled by any one vendor;
the basic functionality is the same regardless of which vendor
the technology is purchased from;
the technology allows interoperability and/or portability
among products from multiple vendors.
It's this last point that most lights up the eyes of users. If several
different vendors all sell essentially the same technology, then
competition among them will lead to lower prices. High-tech products
can become commodities like wheat, which, the argument goes, will
in turn benefit users. (It's worth pointing out that this in no way
benefits vendors--quite the opposite. Unsurprisingly, then, despite
protests to the contrary, vendors have often done their best to
avoid implementing truly open systems.)
Opening the World
The roots of the open systems movement date from the late
1970s. At the time, each hardware vendor had its own proprietary
processors, operating systems, network architecture, and more. When
a user bought a system, that user was entering a long-term relationship
with that vendor in many areas, a relationship that was very difficult
to break. The vendor was well aware of how hard it would be to switch
and so priced its products accordingly.
The situation today is very different. The industry has stratified,
and while many of the old-line monolithic vendors still survive,
they are largely living off past glory, i.e., their large installed
base (further evidence of how difficult it is for users to break
away). For a new system, it's quite possible that the processor,
operating system, and networking software all come from different
vendors. Indeed, the most profitable companies in the industry today
are those that dominate some dimension of this new partitioned world,
firms like Intel in microprocessors, Microsoft in PC operating systems,
and Novell in network operating systems. The days when a single
vendor could effectively build and sell completely proprietary systems
are fading fast.
In some ways, the current world is exactly what was hoped for by
the original proponents of open systems. Users are free from the
dominance of a single vendor. But to a large degree, this freedom
is not the result of open systems. Instead, it comes from something
that looks a great deal like a group of monopolies or, put more
politely, technology franchises.
What Do Users Want?
Contrary to popular opinion, users never wanted open systems.
What they were after were the benefits that open systems could provide,
chiefly competition among vendors and the lower prices that would
ensue. And after years of being beholden to a single vendor for
everything, many organizations wanted freedom, too: the ability
to buy elsewhere easily if a vendor's service, support, or general
attitude was substandard.
But consider: in what market have we seen the greatest competition,
the fiercest price wars, and the fastest increase in value received
per dollar spent? The answer is obvious: the world of IBM-compatible
personal computers. But by the definition above, there's just no
way to argue that this competition was made possible by open systems.
Instead, it's the result of a de facto standard for building a system,
a standard that's ultimately based on two franchises: Intel processors
and a Microsoft operating system. The key things that users wanted
from open systems are present here: fierce competition, lots of
choices, and the absolute commoditization of hardware. (A side effect
is that the owners of those franchises get very rich, but this is
a concern only for their competitors, not their customers.)
The thing that's missing in this market, and an attribute that many
open systems advocates say is essential, is complete vendor independence.
Everybody who buys a PC is dependent on Intel and Microsoft, as
both the recent Pentium brouhaha and the eager anticipation of Windows 95
make clear. But while this certainly offends many open systems purists,
it doesn't seem to have stopped them from taking advantage of the
very real benefits these dominant technologies provide to their
users.
The Beat Goes On
And those benefits show no signs of stopping. Just as Windows played
an essential role in commoditizing PCs, so might Microsoft's Windows
NT lead to the commoditization of server hardware. Unix, ostensibly
an open technology, has gone a long way in this direction by providing
a quite similar environment on hardware from multiple vendors. Those
vendors have always made sure, however, that the user and programming
interfaces, administrative tools, and other details were just different
enough from those of their competitors to provide product differentiation.
In practice, this has meant that moving from, say, Hewlett-Packard's
HP-UX to IBM's AIX has entailed some pain. Since both systems run
only on a single vendor's hardware, this was good for that vendor.
Windows NT has the potential to change this. Unlike the various
flavors of Unix, Windows NT really is the same on multiple systems:
user interface, programming interfaces, etc. Moving an application
from NT on one vendor's hardware to NT on another vendor's hardware
requires at most a recompile. This is the openness that Unix has
always promised but never quite delivered. Just as the ubiquity
of Windows reduced the market for PC hardware to a dog-eat-dog competitive
level, Windows NT may well produce the same effect for server hardware.
Users get what they want in this scenario, even though open systems
are nowhere to be seen.
Innovation and Open Systems
Why haven't open systems met users' expectations? One reason
is that in dynamic, rapidly changing technologies, they just take
too long to develop. Since most standards are produced by committees
of some kind, they lack the ability to respond quickly to a changing
market. And since many of those committees reach decisions by consensus,
the end result can sometimes reflect internal politics more than
market realities. A single agile vendor can often do a much better
job of establishing and advancing a franchise technology. Whatever
criticisms can be made of product delays from companies like Microsoft,
they pale in comparison to the glacial pace of many standards committees.
Since the business problems that users need to solve don't wait
for a committee to finish deliberation, those users can't, either.
There are many examples of these kinds of delays. One of the more
well known occurred in the development of version 2 of the Simple
Network Management Protocol (SNMP). The small group that was primarily
responsible for creating the original SNMP standard attempted to
repeat the process that had produced their initial success, working
essentially on their own to develop version 2. When the fruits of
their labors were presented as a fait accompli to the relevant working
group in the Internet Engineering Task Force (IETF), they were deemed
to have been developed in an insufficiently open manner. The result
was a significant delay that produced little in the way of technical
improvements.
Making Open Systems Work
Open systems are not the panacea they were once thought to be, and
they aren't the only path to achieving users' true goals. It's absolutely
true, however, that some open systems technologies have made the
world a much better place. Especially good examples of this can
be found in networking, where standards for LANs, WANs, and more
have, despite delays in their development, proven effective.
Based on the experience so far, it's possible to list some of the
criteria for creating workable, successful open technologies. They
include:
a reasonably stable technology: where technologies are very
dynamic or where basic issues in a technology are not yet well understood,
trying to define standards is unlikely to succeed. As mentioned
above, most standards committees are too ponderous to really be
effective in a fast-moving area. And if there's no real grasp of
fundamentals, it's easy to end up doing research by committee, something
that's proven to have a very low success rate.
design by experts: the members of the committee must truly
have expertise in the area. Too often, the quest for openness is
interpreted to mean open enrollment in the committee. If at least
the dominant developers aren't engineers with experience in the
area, it's unlikely that the results will be usable.
avoiding slavery to consensus: whatever the political pressures,
decisions cannot be made by strict consensus. Achieving complete
agreement is very difficult and, more important, takes too long.
Producing a perfectly harmonized standard is meaningless if it's
completed too late.
technical, not political criteria: standards produced by
agreements among competing vendors often don't work. Vendors have
every incentive to fight for proprietary advantage or, failing that,
to agree on a very loose specification, one that all but requires
proprietary extensions. The most effective standards have been produced
by groups of engineers, usually from different companies, but working
together as individuals, not as representatives of their employers.
a devotion to prototypes: nobody's design is right the very
first time. Being willing to build actual implementations of early
attempts and to learn from them is a crucial step in producing an
effective final product.
common code: for software-based standards, having a common
code base has proven very effective. Allowing vendors to license
or otherwise acquire the same implementation and then port it to their
systems, both shortens the time to market and makes interoperability
and portability problems less likely.
Perhaps the best example of a standards organization that has embodied
these criteria is the IETF. More recently, though, it has been
the victim of its own success, as its importance in the market
has made it more difficult for members not to represent their employers'
positions. Another successful example is the Open Software Foundation
(OSF) during the original development of the Distributed Computing
Environment (DCE). Unfortunately, OSF has changed its development
model to one that requires complete agreement among competing vendors.
Unsurprisingly, this has led to a slowdown in planned enhancements
to DCE.
A counterexample, an organization that embodies few of these criteria,
is the Object Management Group (OMG), creators of the Common Object
Request Broker Architecture (CORBA). The group is developing standards
for very dynamic technologies, and while the participants include
many smart, experienced engineers, the important decisions are largely
made by consensus among company representatives. As a result, the
OMG standards are riddled with holes, places where vendors are all
but required to add proprietary extensions. And while OSF offered
to provide a common code base for CORBA, the offer was rejected.
Given this, it's not surprising that the current crop of CORBA products
offers limited portability across vendor implementations and very
little interoperability. With the nature of the OMG process, it's
hard to imagine any other result.
Conclusions
Especially in networking, open technologies can work. Often,
though, the chief benefits of openness--competition and lower prices--can
be better achieved through single-vendor technology franchises,
particularly with dynamic new technologies. And when open systems
are a viable option, their history seems to indicate that only certain
kinds of processes work to create them. For users, a clear understanding
of their true goals, coupled with an agnostic, pragmatic perspective,
will produce better results than blind faith in the creed of open
systems.
This article appeared in a slightly different form in the August 1995 issue of Business Communications Review.