CyberInsecurity: The Cost of Monopoly
How the Dominance of Microsoft's Products Poses a Risk to Security
Table of Contents
- 1. Author Listing
- 2. Introduction by Computer & Communications Industry Association (CCIA)
- 3. CyberInsecurity Report
- 4. Biographies of Authors
Authors of the report
Daniel Geer, Sc.D -- Chief Technical Officer, @Stake
Charles P. Pfleeger, Ph.D -- Master Security Architect, Exodus Communications, Inc.
Bruce Schneier -- Founder, Chief Technical Officer, Counterpane Internet Security
John S. Quarterman -- Founder, InternetPerils, Matrix NetSystems, Inc.
Perry Metzger -- Independent Consultant
Rebecca Bace -- CEO, Infidel
Peter Gutmann -- Researcher, Department of Computer Science, University of Auckland
Introduction by Computer & Communications Industry Association
No software is perfect. This much is known from academia and every-day experience. Yet our industry knows how to design and deploy software so as to minimize security risks. However, when other goals are deemed more important than security, the consequences can be dangerous for software users and society at large.
Microsoft's efforts to design its software in ever more complex ways, so as to illegally shut out others' efforts to interoperate or compete with its products, have succeeded. The monopoly product we all now rely on is thus both used by nearly everyone and riddled with flaws. A special burden rests upon Microsoft because of the ubiquity of its product, and we all need to be aware of the dangers that result from reliance upon such a widely used and essential product.
CCIA warned of the security dangers posed by software monopolies during the US antitrust proceeding against Microsoft in the mid and late 1990s. We later urged the European Union to take measures to avoid a software "monoculture" that each day becomes more susceptible to computer viruses, Trojan horses and other digital pathogens.
Our conclusions have now been confirmed and amplified by the appearance of this important report by leading authorities in the field of cybersecurity: Dan Geer, Rebecca Bace, Peter Gutmann, Perry Metzger, John S. Quarterman, Charles Pfleeger, and Bruce Schneier.
CCIA and the report's authors have arrived at their conclusions independently. Indeed, the views of the authors are their views and theirs alone. However, the growing consensus within the computer security community and industry at large is striking, and has become obvious: the presence of this single, dominant operating system in the hands of nearly all end users is inherently dangerous. The increased migration of that same operating system into the server world increases the danger even more. CCIA is pleased to have served as a catalyst and a publisher of the ideas of these distinguished authorities.
Over the years, Microsoft has deliberately added more and more features into its operating system in such a way that no end user could easily remove them. Yet, in so doing, the world's PC operating system monopoly has created unacceptable levels of complexity to its software, in direct contradiction of the most basic tenets of computer security.
Microsoft, as the US trial record and experience has shown, has added these complex chunks of code to its operating system not because such programming complexity is necessary, but because it all but guarantees that computer makers, users and consumers will use Microsoft products rather than a competitor's.
These competition-related security problems have been with us, and getting worse, for years. The recent spate of virus attacks on the Internet is one more sign that we must recognize the danger we are in. The report CyberInsecurity -- The Cost of Monopoly is a wake-up call that government and industry need to hear.
September 24, 2003
CYBERINSECURITY: THE COST OF MONOPOLY
HOW THE DOMINANCE OF MICROSOFT'S PRODUCTS POSES A RISK TO SECURITY
Computing is crucial to the infrastructure of advanced countries. Yet, as fast as the world's computing infrastructure is growing, security vulnerabilities within it are growing faster still. The security situation is deteriorating, and that deterioration compounds when nearly all computers in the hands of end users rely on a single operating system subject to the same vulnerabilities the world over.
Most of the world's computers run Microsoft's operating systems; thus most of the world's computers are vulnerable to the same viruses and worms at the same time. The only way to stop this is to avoid monoculture in computer operating systems, for reasons just as sound and obvious as those for avoiding monoculture in farming. Microsoft exacerbates this problem via a wide range of practices that lock users into its platform.
The impact on security of this lock-in is real and endangers society. Because Microsoft's near-monopoly status itself magnifies security risk, it is essential that society become less dependent on a single operating system from a single vendor if our critical infrastructure is not to be disrupted in a single blow. The goal must be to break the monoculture. Efforts by Microsoft to improve security will fail if their side effect is to increase user-level lock-in. Microsoft must not be allowed to impose new restrictions on its customers -- imposed in the way only a monopoly can do -- and then claim that such exercise of monopoly power is somehow a solution to the security problems inherent in its products. The prevalence of security flaws in Microsoft's products is an effect of monopoly power; it must not be allowed to become a reinforcer.
Governments must set an example with their own internal policies and with the regulations they impose on industries critical to their societies. They must confront the security effects of monopoly and acknowledge that competition policy is entangled with security policy from this point forward.
The threats to international security posed by Windows are significant, and must be addressed quickly. We discuss here in turn the problem in principle, Microsoft and its actions in relation to those principles, and the social and economic implications for risk management and policy. The points to be made are enumerated at the outset of each section, and then discussed.
1. THE PROBLEM IN PRINCIPLE
To sum up this section:
- Our society's infrastructure can no longer function without computers and networks.
- The sum of the world's networked computers is a rapidly increasing force multiplier.
- A monoculture of networked computers is a convenient and susceptible reservoir of platforms from which to launch attacks; these attacks can and do cascade.
- This susceptibility cannot be mitigated without addressing the issue of that monoculture.
- Risk diversification is a primary defense against aggregated risk when that risk cannot otherwise be addressed; monocultures create aggregated risk like nothing else.
- The growth in risk is chiefly amongst unsophisticated users and is accelerating.
- Uncorrected market failures can create and perpetuate societal threat; the existence of societal threat may indicate the need for corrective intervention.
Computing is essential to industrialized societies. As time passes, all societal functions become more deeply dependent on it: power infrastructure, food distribution, air traffic control, emergency services, banking, telecommunications, and virtually every other large scale endeavor is today coordinated and controlled by networked computers.
Attacking national infrastructures is also done with computers -- often hijacked computers. Threats to computing infrastructures thus explicitly and inherently risk harm to the societies that depend on them, in proportion to that dependence. A prior history of catastrophe is not required to make such a finding. We should not have to wait until people die to address risks of the scale and scope discussed here.
Regardless of where or how it is used, computing increases the capabilities and the power of those who use it. In strategic or military terminology -- which here means just what it sounds like -- computing is a "force multiplier": it magnifies the power of those who use it, for good or ill. The best estimates of the number of network-connected computers show an increase of 50% per year on a worldwide basis. By most general measures, what you can buy for the same amount of money doubles every eighteen months ("Moore's Law"). With a conservative estimate of a four-year lifetime for a computer -- in other words, consumers replace computers every four years on average -- the total computing power on the Internet therefore increases by a factor of 2.7 per annum (doubling roughly every ten months). If a constant fraction of computers is under threat of misuse, then the force available to misusers will likewise double every ten months.
In other words, the power available to misusers -- computer hackers, in popular parlance -- is rising both because the power they can buy per dollar grows and because the total number of networked computers grows too. Note also that this analysis does not even include attacks enabled by storage capacity, whose price-performance doubles twice as fast as that of CPUs (every nine months rather than every eighteen).
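The compound-growth arithmetic above can be sketched in a few lines. This is a back-of-envelope illustration only, using just the two rates stated in the text (a 50% annual rise in networked machines and Moore's-law doubling every eighteen months); the helper names below are our own, and the report's 2.7x-per-annum figure presumably also reflects its four-year replacement-cycle assumption.

```python
import math

def annual_factor_from_doubling(months: float) -> float:
    """Per-year growth factor for a quantity that doubles every `months` months."""
    return 2 ** (12 / months)

def doubling_months(annual_factor: float) -> float:
    """Months needed to double at a given per-year growth factor."""
    return 12 * math.log(2) / math.log(annual_factor)

# Moore's law: price-performance doubles every 18 months (~1.59x per year).
moore = annual_factor_from_doubling(18)

# The installed base of networked computers grows ~50% per year,
# i.e. it doubles on its own roughly every 20.5 months.
hosts = 1.5

# Naively compounding the two rates gives the combined annual growth in
# total networked computing power (~2.4x; the report's 2.7x additionally
# reflects its replacement-cycle assumption).
combined = moore * hosts
print(round(moore, 2), round(doubling_months(hosts), 1), round(combined, 2))
# -> 1.59 20.5 2.38
```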
Internetworked computing power makes communication feasible. Communication is of such high value that it has been the focus of much study and much conjecture and not just recently. For one-way broadcast communication, the value of the network itself rises proportionally to N, the potential number of listeners ("Sarnoff's Law"). By way of example, advertisers pay for television time in rough proportion to the number of people viewing a given program.
For two-way interactive communications -- such as between fax machines or personal email -- the value of the network rises proportionally to N^2, the square of the potential number of users ("Metcalfe's Law"). Thus, if the number of people on email doubles in a given year, the number of possible communications rises by a factor of four.
Growth in communications value rises even more when people can organize in groups, so that any arbitrary group of people can communicate with one another. Web pages, electronic mailing lists and online newsgroups are good examples of such communications. In these cases, the value of the network rises proportionally to 2^N, since the potential number of groups grows exponentially in N ("Reed's Law").
Assume for now that the Internet is somewhere between the Metcalfe model, where communications vary according to the square of the number of participants (N^2), and the Reed model, where communications vary according to two raised to the Nth power (2^N).
If we make this assumption, then the potential value of communications that the Internet enables will rise somewhere between 1.5^2 ≈ 2.3 and 2^1.5 ≈ 2.8 times per annum. These laws are likely not precisely accurate. Nonetheless, their wide acceptance and historic record show that they are good indicators of the importance of communication technology.
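The arithmetic of the three "laws" above can be checked with a short sketch; the function names are ours, purely for illustration.

```python
def sarnoff(n: int) -> int:
    """One-way broadcast: network value grows as N."""
    return n

def metcalfe(n: int) -> int:
    """Two-way pairwise communication: value grows as N^2."""
    return n ** 2

def reed(n: int) -> int:
    """Group-forming networks: value grows as 2^N."""
    return 2 ** n

# Doubling the user base quadruples the Metcalfe value, as in the text.
assert metcalfe(200) / metcalfe(100) == 4

# A 50% annual rise in N multiplies value by 1.5^2 = 2.25 under Metcalfe
# (the report rounds to 2.3) and, as the report computes it, by
# 2^1.5 ~ 2.83 (~2.8) under Reed.
print(round(1.5 ** 2, 2), round(2 ** 1.5, 2))
# -> 2.25 2.83
```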
To extend this simple mathematical model one final step, we have assumed so far that all communications are good, and assigned to the value of the network a positive number. Nonetheless, it is obvious that not all communications (over computer networks, at least) are positive. Hackers, crackers, terrorists and garden-variety criminals use the network to defraud, spy and generally wreak havoc on a continual basis. To these communications we assign a negative value.
The fraction of communications that has positive value is one crucial measure, and the absolute number of negative communications is another. Both are dependent on the number of networked devices in total. This growth in the number of networked devices, however, is almost entirely at the "edges" of networked computing -- the desktop, the workstation, the home, the embedded system, the automated apparatus. In other words, the growth in "N" is not in the core infrastructure of the Internet where highly trained specialists watch over costly equipment with an eye towards preventing and responding to attacks. Growth, rather, is occurring mostly among ordinary consumers and non-technical personnel who are the most vulnerable to illegal intrusions, viruses, Trojan horse programs and the like. This growth at the periphery, furthermore, is accelerating as mobile, wireless devices come into their own and bring with them still more vulnerabilities.
Viruses, worms, Trojan horses and the like permit malicious attackers to seize control of large numbers of computers at the edge of the network. Malicious attackers do not, in other words, have to invest in these computers themselves -- they have only to exploit the vulnerabilities in other people's investments.
Barring physical events such as those of 9/11, an attack on computing is a set of communications that takes advantage of latent flaws already present in those computers' software. Given enough knowledge of how a piece of software works, an attacker can force it to do things for which it was never designed. Such abuse can take many forms; a naturalist would say that attacks are a broad genus with many species. Within this genus, species include everything from denial of service, to escalation of authority, to diversion of funds or data, and so on. As in nature, some species are more common than others.
Similarly, not all attacks are created equal. An annoying message that pops up once a year on screen to tell a computer user that he has been infected by Virus XYZ is no more than that: an annoyance. Other exploitations cost society many, many dollars in lost data, lost productivity, and projects destroyed by data crashes. Examples are many and familiar, including the well-known ILOVEYOU, NIMDA, and Slammer attacks, not to mention the takeover of users' machines for spamming, porn distribution, and so forth.
Still other vulnerabilities, though exploited every day and costing society substantial sums of time and money, seldom appear in the popular press. According to London-based computer security firm mi2g Ltd., malicious software inflicted as much as $107 billion in global economic damage this year. It estimates that the SoBig worm, which helped make August the costliest month in terms of economic damage, was responsible for nearly $30 billion in damage alone. [1]
For an attack to be a genuine societal-scale threat, either the target must be unique and indispensable -- a military or government computer, an authoritative time service, the computer handling emergency response (911) calls, airport flight control, say -- or the attack must be one which, once triggered, cascades uncontrollably from one machine to the next. The NIMDA and Slammer worms that attacked millions of Windows-based computers were examples of such "cascade failure" -- they spread from one computer to another at high rates. Why? Because these worms did not have to guess much about their targets: nearly all computers have the same vulnerabilities.
Unique, valuable targets are identifiable so we, as a society, can concentrate force around them. Given enough people and training (a tall order to be sure), it is possible to protect the unique and core assets. Advanced societies have largely made these investments, and unmitigated failures do not generally occur in these systems.
Not so outside this core: as a practical and perhaps obvious fact, the risk of cascade failure rises at the edges of the network, where end users are far more likely to be deceived by a clever virus writer or a random intruder. To put the problem in military terms, we are most vulnerable when the ratio of available operational skill to available force multiplication is lowest, and thus effective control is weakest. Low available skill coupled with high potential force multiplication is a fair description of what is today accumulating on the periphery of the computing infrastructure of every advanced nation. In plainer terms, the power on the average desktop rises very fast while the spread of computers to new places ensures that the average skill of the user goes down. The average user is not, does not want to be, and should not need to be a computer security expert, any more than an airplane passenger wants to or should need to be an expert in aerodynamics or piloting. This very lack of sophistication among end users leaves our society exposed to a threat that is growing more prevalent and more sophisticated.
Regardless of the topic -- computing versus electric power generation versus air defense -- survivability is all about preparing for failure so as to survive it. Survivability, whether as a concept or as a measure, is built on two pillars: replicated provisioning and diversified risk. Replicated ("redundant") provisioning ensures that any entity's activities can be duplicated by some other entity; high-availability database systems are such an example in computing, just as backup generators are in electric power. The ability of redundant systems to protect against random faults is cost-effective and well documented.
By contrast, redundancy has little ability to protect against cascade failure; having more computers with the same vulnerabilities cannot help if an attack can reach them all. Protection from cascade failure is instead the province of risk diversification -- that is, using more than one kind of computer or device, more than one brand of operating system -- which in turn ensures that attacks will be limited in their effectiveness. This fundamental principle assures that, like farmers who grow more than one crop, those of us who depend on computers will not see them all fail when the next blight hits. Such diversification is widely accepted in almost every sector of society, from finance to agriculture to telecommunications. In the broadest sense, economic diversification is as much the hallmark of free societies as monopoly is the hallmark of central planning. Governments in free market societies have intervened in market failures -- preemptively where failure would be intolerable and responsively where failure had become self-evident.
In free market economies as in life, some failure is essential; the "creative destruction" of markets builds more than it breaks. Wise governments are those able to distinguish that which must be tolerated as it cannot be changed from that which must be changed as it cannot be tolerated. The reapportionment of risk and responsibility through regulatory intervention embodies that wisdom in action. If governments are going to be responsible for the survivability of our technological infrastructure, then whatever governments do will have to take Microsoft's dominance into consideration.
2. MICROSOFT
To sum up this section:
- Microsoft is a near-monopoly controlling the overwhelming majority of systems.
- Microsoft has a high level of user-level lock-in; there are strong disincentives to switching operating systems.
- This inability of consumers to find alternatives to Microsoft products is exacerbated by tight integration between applications and operating systems, and that integration is a long-standing practice.
- Microsoft's operating systems are notable for their incredible complexity, and complexity is the first enemy of security.
- The near universal deployment of Microsoft operating systems is highly conducive to cascade failure; these cascades have already been shown to disable critical infrastructure.
- After a threshold of complexity is exceeded, fixing one flaw will tend to create new flaws; Microsoft has crossed that threshold.
- Even non-Microsoft systems can and do suffer when Microsoft systems are infected.
- Security has become a strategic concern at Microsoft but security must not be permitted to become a tool of further monopolization.
Microsoft's near-monopoly dominance of computing is obvious beyond the findings of any court. That dominance peaks in the periphery of the computing infrastructure of all industrial societies. According to IDC, Microsoft Windows represented 94 percent of the consumer client software sold in the United States in 2002. [2] Online researcher OneStat.com estimates that Microsoft Windows' market share exceeds 97 percent. [3] Its Internet Explorer and Office Suite applications hold similar control of their respective markets. The tight integration of Microsoft application programs with Microsoft operating system services is a principal driver of that dominance and, at the same time, a principal driver of insecurity. The "tight integration" is this: inter-module interfaces so complex, undocumented, and inaccessible as to (1) permit Microsoft to change them at will, and thus (2) preclude others from using them so as to compete.
Tight integration of applications and operating system achieves user lock-in by way of application lock-in. It works. The absence of published, stable exchange interfaces necessary to enable exchange of data, documents, structures, etc., enlists such data, documents, or structures as enforcers of application lock-in. Add in the "network effects," such as the need to communicate with others running Microsoft Office, and you dissuade even those who wish to leave from doing so. If everyone else can only use Office then so must you.
Tight integration, whether of applications with operating systems or just applications with each other, violates the core teaching of software engineering, namely that loosely coupled interfaces make maintenance easier and life-cycle costs lower. Academic and commercial studies supporting this principle are numerous and long-standing.
Microsoft well knows this; Microsoft was an early and aggressive promoter of modular programming practices within its own development efforts. What it does, however, is expressly curtail modular programming and loose coupling in the interfaces it offers to others. For whatever reason, Microsoft has put aside its otherwise good practices wherever doing so makes individual modules hard to replace. This explains the rancor over Prof. Ed Felten's Internet Explorer removal gadget, just as it explains Microsoft's recent decision to embed the IE browser so deeply into its operating system that it is dropping support for IE on the Macintosh platform. Integration of this sort is about lock-in: integration too tight to easily reverse, buttressed by network effects that discourage even trying to resist.
This integration is not the norm and it is not essential. Limiting the discussion to the ubiquitous browser, it is clear that Mozilla on Linux and Safari on Macintosh are counter-examples: tight integration has no technical necessity. Apple's use of Safari is particularly interesting because it gets Apple all the same benefits that Microsoft gets from IE (including component reuse of the HTML rendering widget), but it's just a generic library, easy to replace. [4] The point is that Microsoft has performed additional, unnecessary engineering on its products, with the result of making components hard to pull out and thus raising the barrier to entry for competition. Examples of clean interfaces are much older than Microsoft: the original UNIX was very clean, and before that Multics and Dijkstra's 1968 "THE" system showed what could be done. In other words, even when Microsoft was very much smaller and very much easier to change, these ideas were known and proven; what we have before us today is therefore not inadvertent -- it is on plan.
This tight integration is a core component of Microsoft's monopoly power. It feeds that power, and its effectiveness is a measure of that power. This integration strategy also creates risk, if for no other reason than that modules that must interoperate with outside modules naturally receive a greater share of security design attention than those expected to speak only to friends. As proof by demonstration, Microsoft's design-level commitment to identical library structures for clients and servers, running on protocols made explicitly difficult for others to speak (such as Microsoft Exchange), creates insecurity: that is precisely the raw material of cascade failure -- a universal and identical platform asserted to be safe rather than shown in practice to be safe. That Microsoft is a monopoly makes such an outcome the default outcome.
The natural strategy for a monopoly is user-level lock-in and Microsoft has adopted this strategy. Even if convenience and automaticity for the low-skill/no-skill user were formally evaluated to be a praiseworthy social benefit, there is no denying the latent costs of that social benefit: lock-in, complexity, and inherent risk.
One must assume that security flaws in Microsoft products are unintentional, that security flaws simply represent a fraction of all quality flaws. On that assumption, the quality control literature yields insight.
The central enemy of reliability is complexity. Complex systems tend not to be entirely understood by anyone. If no one can understand more than a fraction of a complex system, then no one can predict all the ways that system could be compromised by an attacker. Prevention of insecure operating modes in complex systems is difficult to do well and impossible to do cheaply: the defender has to counter all possible attacks; the attacker only has to find one unblocked means of attack. As complexity grows, it becomes ever more natural simply to assert that a system or product is secure, as it becomes less and less possible to actually provide security in the face of complexity.
Microsoft's corporate drive to maximize an automated, convenient user-level experience is hard to achieve -- some would say un-achievable -- except at the cost of serious internal complexity. That complexity necessarily peaks wherever the ratio of required convenience to available skill peaks, viz., in the massive periphery of the computing infrastructure. Software complexity is difficult to measure, but software quality control experts often describe it as proportional to the square of code volume. One need look no further than Microsoft's own figures: Windows NT code volume rose 35% per year (implying that its complexity rose roughly 80% per year), while Internet Explorer code volume rose 220% per year (implying that its complexity rose roughly 380% per year). Consensus estimates of accumulated code volume peg Microsoft operating systems at 4-6x competitor systems, and hence at 15-35x competitor systems in complexity-based quality costs. Microsoft's accumulated code volume and rate of code volume growth are indisputably industry outliers, and they concentrate complexity in the periphery of the computing infrastructure. Because it is complexity that drives the creation of security flaws, the default assumption must be that Microsoft's products have 15-35x as many flaws as other operating systems. [5]
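The complexity heuristic in the passage above (complexity proportional to the square of code volume) can be replayed numerically; `complexity_ratio` is an illustrative name of ours, not an established metric.

```python
def complexity_ratio(volume_ratio: float) -> float:
    """Heuristic: relative complexity ~ (relative code volume) squared."""
    return volume_ratio ** 2

# 35%/year code-volume growth implies ~82%/year complexity growth
# (the report rounds to 80%).
nt_complexity_growth = complexity_ratio(1.35) - 1

# 4-6x the code volume of competitors implies 16-36x the
# complexity-based quality costs (the report quotes 15-35x).
low, high = complexity_ratio(4), complexity_ratio(6)
print(round(nt_complexity_growth, 2), low, high)
# -> 0.82 16 36
```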
One cannot expect government regulation to cap code size -- such a proposal would deserve the derision Microsoft would heap upon it. But regulators would do well to understand that code "bloat" matters most within modules and that Microsoft's strategy of tight integration makes effective module size grow because those tightly integrated components merge into one. It is likely that if module sizes were compared across the industry that the outlier status of Microsoft's code-size-related security problems would be even more evident than the total code volume figures indicate.
Above some threshold level of code complexity, fixing a known flaw is likely to introduce a new, unknown flaw; the law of diminishing returns therefore eventually rules. The general quality control literature teaches this, and it has long been received wisdom in software development (Lehman & Belady at IBM [6], and later in many papers and at least one book). The tight integration of Microsoft operating systems with Microsoft application products, and of those products with each other, comes at a cost in complexity and a cost in code volume. Patches create new flaws as a regular occurrence, confirming that Microsoft's interdependent product base is above the critical threshold where repairs create problems. Some end users understand this, and delay deployment of patches until testing can confirm that the criticality of the problems fixed is not eclipsed by the criticality of the problems created. With mandatory patches arriving at the rate of one every six days (39 through 16 September), few users indeed can keep up.
Two different subsets of users effectively bow out of the patching game: the incapable many (end users who have limited understanding of -- and limited desire to understand -- the technology even when it is working correctly) and the critical-infrastructure few (for whom reliability is such a vital requirement that casual patching is unthinkable). Unpatched lethal flaws thus accumulate in the user community. (The Slammer worm fully demonstrated that point -- the problem and the patch were six months old when Slammer hit.) [7] Monopoly market dominance is thus only part of the risk story -- market dominance coupled with accumulating exploitable flaw density yields a fuller picture.
Not only is nearly every networked computer sufficiently alike that what vulnerability one has, another has too, but the absolute number of known-to-be-exploitable vulnerabilities rises over time. Attackers of the most consummate skill already batch vulnerabilities together to ensure cascade failure. (The NIMDA virus fully demonstrated that point -- it used any of five separate vulnerabilities to propagate itself.)
Microsoft has a history of shipping software at the earliest conceivable moment. Given its market dominance, within days if not hours the installed base of any released Microsoft software, however ill-conceived or ill-implemented, was too large to dislodge or ignore. No more. Of late, Microsoft has indeed been willing to delay product shipment for security reasons. While it is too early to tell whether this will actually result in a healthier installed base, it is an admission that the security flaw density was a greater threat to the company than the revenue delay from slipping ship dates. It is also an admission that Microsoft holds monopoly power -- it, and it alone, no longer needs to ship on time. That this coincides with Microsoft's recent attempts to switch to annual support contracts, smoothing out its revenue streams, is at the least opportunistic if not tactical.
On the horizon, we see the so-called Trusted Computing Platform Alliance (TCPA) [8] and the "Palladium" or "NGSCB" architecture for "trusted computing." In the long term, the allure of trusted computing can hardly be overstated, and there can be no more critical duty of governments than to ensure that the spread of trusted computers does not blithely create yet more opportunities for lock-in. Given Microsoft's tendencies, one can foresee a Trusted Outlook that will refuse to talk to anything but a Trusted Exchange Server, with Palladium's strong cryptographic mechanisms enforcing that limitation. There can be no greater user-level lock-in than that, and it will cover both local and distributed applications, all in the name of keeping the user safe from viruses and junk. In other words, security will be the claimed goal of mechanisms that achieve unprecedented user-level lock-in. This confirms the relevance of evaluating the effect of user-level lock-in on security.
3. IMPACT ON PUBLIC PROTECTION
To sum up this section:
- Without change, Microsoft's history predicts its future.
- We must take conscious steps to counter the security threat of Microsoft's monopoly dominance of computing.
- Unless Microsoft's applications and interfaces are available on non-Microsoft platforms it will be impossible to defeat user lock-in.
- Governments by their own example must ensure that nothing they deem important is dependent on a monoculture of IT platforms; the further up the tree you get the more this dictum must be observed.
- Competition policy is tangled with security policy from this point on.
Microsoft and regulators come to this point with a considerable history of flouted regulation behind them, a history that seems unnecessary to recount other than to stipulate that it must bear on the solution, or history will repeat itself.
Yes, Microsoft has the power to introduce features unilaterally, and one might even say that the current security situation is sufficiently dire that Microsoft at the head of a command structure is therefore somehow desirable. Yet were it not for Microsoft's commanding position, the economics would certainly be different: either independent, competitive, mainstream software development industries would arise (because the barriers to entry would be lower), or today's locked-in Microsoft users would no longer pay prices that only a monopoly can extract. For many organizations, the only thing keeping them with Microsoft in the front office is Office. If Microsoft were forced to support Office on, say, Linux, then organizations would save substantial monies better spent on innovation. If Microsoft were forced to interoperate, innovators and innovation could not be locked out because users could not be locked in.
Both short-term impact mitigation and long term competition policy must recognize this analysis. In the short term, governments must decide in unambiguous ways whether they are able to meaningfully modify the strategies and tactics of Microsoft's already-in-place monopoly.
If governments do not dismantle the monopoly but choose instead to modify the practices of the monopoly they must concede that that route will, like freedom, require eternal vigilance. Appropriate support for addressing the security-related pathologies of monopoly would doubtless include the introduction of effective, accessible rights of action in a court of law wherever security flaws lead to harm to the end-user. In extreme cases, the consequences of poor security may be broad, diffuse, and directly constitute an imposition of costs on the user community due to the unfitness of the product. Under those circumstances, such failures should surely be deemed "per se" offenses upon their first appearance on the network.
Where risk cannot be mitigated it can be transferred via insurance and similar contracts. As demonstrated in previous sections, the accumulation of risk in critical infrastructure and in government is growing faster than linear, i.e., faster than mere counts of computers or networks. As such, any mandated risk transfer must also grow faster than linear whether those risk transfer payments are a priori, such as for bonding and insurance, or a posteriori, such as for penalties. If risk transfer payments are to be risk sensitive, the price and probability of failure are what matter and thus monopoly status is centrally relevant. For governments and other critical infrastructures, the price of failure determines the size of the risk transfer. Where a software monoculture exists -- in other words, a computing environment made up of Windows and almost nothing else -- what remains operational in the event of wholesale failure of that monoculture determines the size of the risk transfer. Where that monoculture is maintained and enforced by lock-in, as it is with Windows today, responsibility for failure lies with the entity doing the locking-in -- in other words, with Microsoft. It is important that this cost be made clear now, rather than waiting until after a catastrophe.
The idea of breaking Microsoft into an operating system company and an applications company is of little value -- one would merely inherit two monopolies rather than one, and the monocultural, locked-in nature of the user base would still nourish risk. Instead, Microsoft should be required to support a long list of applications (Microsoft Office, Internet Explorer, plus its server applications and development tools) on a long list of platforms. Microsoft should be forbidden to release Office for any one platform, such as Windows, until it releases Linux and Mac OS X versions of the same tools that are widely considered to have feature parity, compatibility, and so forth. Alternatively, Microsoft should be required to document and standardize its Exchange protocols, among other APIs, so that alternatives to its applications could independently exist.
Better still, split Microsoft Office into its components -- noting that each release of Office adds new things to the "bundle": first Access, then Outlook, then Publisher. Even utilities, such as the grammar checker or clip art manager, might pose less risk of compromise, and of subsequent OS compromise, if their interfaces were open (and subject to public scrutiny, analysis, and validation). Note that one of the earlier buffer overflow exploits involved the phone dialer program, an ordinarily benign and uninteresting utility that could have been embedded within dial-up networking, Internet Explorer, Outlook, and any other program that offered an Internet link.
The rigorous, independent evaluations to which these otherwise tightly integrated interfaces would thus be exposed would go a long way towards hardening them while permitting meaningful competition to arise. Microsoft will doubtless counter that its ability to "innovate" would thus be compromised, but in the big-picture sense everyone else would gain room to innovate that they cannot now enjoy. Where governments conclude that they are unable to meaningfully modify the strategies and tactics of the already-in-place Microsoft monopoly, they must declare a market failure and take steps to enforce, by regulation and by their own example, risk diversification within those computing plants whose work product they value.
Specifically, governments must not permit critical or infrastructural sectors of their economies to implement the monoculture path, and that includes government's own use of computing. Governments, and perhaps only governments, are in leadership positions to affect how infrastructures develop. By enforcing diversity of platform to thereby blunt the monoculture risk, governments will reap a side benefit of increased market reliance on interoperability, which is the only foundation for effective incremental competition and the only weapon against end-user lock-in. A requirement that no operating system be more than 50% of the installed base in a critical industry or in a government would moot monoculture risk. Other branches to the risk diversification tree can be foliated to a considerable degree, but the trunk of that tree on which they hang is a total prohibition of monoculture coupled to a requirement of standards-based interoperability.
These comments are specific to Microsoft, but they would apply to any entity with similar dominance under current circumstances. Indeed, similar moments of truth have occurred, though for different reasons, with IBM and AT&T. The focus is on Microsoft simply because the clear and present danger can no longer be ignored. While appropriate remedies require significant debate, these three alone would engender substantial, lasting improvement if Microsoft were vigorously forced to:
- Publish interface specifications to major functional components of its code, both Windows and Office.
- Foster development of alternative sources of functionality through an approach comparable to the highly successful "plug and play" technology for hardware components.
- Work with consortia of hardware and software vendors to define specifications and interfaces for future developments, in a way similar to the Internet Society's RFC process to define new protocols for the Internet.
Daniel Geer, Sc.D - Dr. Geer is Chief Technical Officer of @Stake, in Cambridge, Mass. Dr. Geer has a long history in network security and distributed computing management as an entrepreneur, author, scientist, consultant, teacher, and architect. He has provided high-level strategy in all manner of digital security and on promising areas of security research to industry leaders including Digital Equipment Corporation, OpenVision Technologies, Open Market, and CertCo. He has written extensively on large-scale security issues such as risk management, applications of cryptography, and Web security for The Digital Commerce Society, the Securities Industry Middleware Council, the Internet Security Conference, and the USENIX Association, for whom he founded several conferences.
Dr. Geer has testified before Congress on multiple occasions and has served on various relevant advisory committees to the Federal Trade Commission, the National Science Foundation, the National Research Council, the Commonwealth of Massachusetts, the Department of Defense, the National Institute of Justice, and the Institute for Information Infrastructure Protection.
Dr. Geer holds several security patents, an Sc.D. in Biostatistics from Harvard University's School of Public Health and an S.B. in Electrical Engineering and Computer Science from MIT.
Charles P. Pfleeger, Ph.D - Dr. Pfleeger is a Master Security Architect in the Professional Services group of Exodus Communications, Inc. From 1992 to 1995 he was Director of European Operations for Trusted Information Systems, Inc. (TIS) and head of its European office in London. He was a member of the author group of the U.S. Federal security evaluation criteria and a co-author of the evaluation criteria for trusted virtual machine architectures. He led activities in secure networking, security analysis in hardware design, secure system architecture, and research into assured service. Prior to joining TIS in 1988, he was a professor in the Computer Science Department of the University of Tennessee. Dr. Pfleeger has lectured throughout the world and published numerous papers and books. His book Security in Computing (the third edition will be available from Prentice Hall in 2002) is the standard college textbook in computer security. He is the author of other books and articles on technical computer security and computer science topics.
He holds a Ph.D. degree in computer science from The Pennsylvania State University and a B.A. with honors in mathematics from Ohio Wesleyan University.
Bruce Schneier - Internationally renowned security expert Bruce Schneier has authored six books--including BEYOND FEAR and SECRETS AND LIES--as well as the Blowfish and Twofish encryption algorithms. Mr. Schneier has appeared on numerous television and radio programs, has testified before Congress, and is a frequent writer and lecturer on issues surrounding security and privacy.
Mr. Schneier is responsible for maintaining Counterpane's technical lead in world-class information security technology and its practical and effective implementation. Mr. Schneier's security experience makes him uniquely qualified to shape the direction of the company's research endeavors, as well as to act as a spokesperson to the business community on security issues and solutions.
Mr. Schneier holds an MS degree in computer science from American University and a BS degree in physics from the University of Rochester.
John S. Quarterman - John S. Quarterman is founder of InternetPerils, an Internet risk management company. Previously, he was Founder and Chief Technology Officer of Matrix NetSystems Inc., the first company to map and track global traffic across the Internet. Mr. Quarterman has almost thirty years' experience with network issues dating as far back as 1974, when he first used ARPANET, the Internet's predecessor, at Harvard University. He subsequently worked on ARPANET Unix software for Bolt, Beranek and Newman, the original prime contractor for the network.
Mr. Quarterman has consulted for a wide range of companies and organizations, including AT&T, HP, IBM, MCI and Nortel, among others. Twice elected to the board of directors of USENIX, he was instrumental in the board's decision to provide funding for UUNet, one of the first two commercial Internet service providers. A published author, he has written for Communications of the ACM, Forbes, First Monday and Computerworld, among others. He has appeared in articles written by others in the New York Times, the San Jose Mercury News, The Economist, The Washington Post, Wired and others too numerous to mention.
Perry Metzger - Perry Metzger is managing partner of Metzger, Dowdeswell & Co LLC, a New York-based computer security and infrastructure consulting firm. Prior to this, Mr. Metzger founded and served as CEO of Wasabi Systems, Inc., a startup specializing in operating system software for embedded platforms. Previously Mr. Metzger served as President of Piermont Information Systems Inc., a New York-based computer security consulting firm he founded in 1994. Piermont's clients included prominent international banks and brokerages, money management companies, public relations firms and advertising agencies.
Before founding Piermont, Mr. Metzger was involved in a variety of innovative technological projects, including highly parallel computer systems, automated equities trading systems, automated systems management software, and the implementation of one of the world's first firewall systems. Mr. Metzger is highly active in the work of the Internet's standardization body, the IETF. He was instrumental in the design and standardization of several major Internet security protocols, including IPSEC, for which he served as co-author of several of the initial standards documents.
Becky Bace - Becky Bace is an internationally recognized expert in network security and intrusion detection. A 2003 recipient of Information Security Magazine's Women of Vision Award, she is recognized as one of the most influential women in Information Security today. Ms. Bace has worked in security since the 1980s, leading the first major intrusion detection research program at the National Security Agency, where she received the Distinguished Leadership Award, serving as the Deputy Security Officer for the Computing Division of the Los Alamos National Laboratory, and, since 1997, working as a strategic consultant.
She is currently President and CEO of Infidel, Inc., a security consulting firm. Ms. Bace's publication credits include the books Intrusion Detection (Macmillan, 2000) and A Guide to Forensic Testimony: The Art and Practice of Presenting Testimony as An Expert Technical Witness (Addison-Wesley, October 2002).
She received a B.S., Engineering/Computer Science from the University of the State of New York, and an M.E.S., Digital Systems Engineering, from Loyola College.
Peter Gutmann - Peter Gutmann is a researcher in the Department of Computer Science at the University of Auckland working on design and analysis of cryptographic security architectures. He helped write the popular PGP encryption package and has authored a number of papers on security and encryption including the X.509 Style Guide for certificates.
Over the years, Mr. Gutmann has uncovered numerous security flaws in various computing products, including problems with the encryption used in an early version of the Netscape browser and, later, Internet Explorer. He has also uncovered flaws in previous versions of Norton's Diskreet disk encryption, the Windows 95 password file system and the smart-card fare system used by Auckland's largest public transportation organization.
Gutmann is the author of the much-used, open-source cryptlib security toolkit.
1 "Government Issue," David Zeiler, The Baltimore Sun/SunSpot.net. September 18, 2003
2 "Wal-Mart sells more Linux wares online," Matt Hines, News.com. August 21, 2003.
3 "Microsoft's Windows OS global market share is more than 97% according to OneStat.com," OneStat.com press release. September 10, 2002.
4 "Apple Releases its own browser," Joe Wilcox, News.com, January 7, 2003.
5 Microsoft seems at least aware of the problem. See: http://www.wired.com/wired/archive/3.09/myhrvold.html.
6 L.A. Belady and M.M. Lehman, "A Model of Large Program Development," IBM Systems Journal 15(3), p.225--252 (1976).
7 "Slammer worm brings patch mgmt. issues to the fore," Audrey Rasmussen, Network World Fusion, Feb. 5, 2003.
8 See: http://www.trustedcomputing.org/home
by Jeffrey A. Eisenach and Thomas M. Lenard
Progress On Point
Periodic Commentaries on the Policy Debate
Release 7.4, April 2000
The Microsoft case is a legitimate and important topic for political debate. Have the antitrust laws outlived their usefulness? Should they be enforced in the high-tech sector of the economy? Is Microsoft a good candidate for such enforcement? Have Microsoft’s actions violated the law and/or harmed consumers? Most importantly, if Microsoft has violated the law, what can or should be done about it?
In our view, it is quite clear that Microsoft has violated the law and harmed consumers. Further, we believe that one type of remedy -- a "competitive" structural remedy that would create four companies from the current one and so restore competition to the market for operating systems -- is clearly preferable to other alternatives. In this paper, we summarize the factual evidence and legal analysis that lead us to conclude a remedy is desirable, and describe briefly the remedy we have concluded would best serve consumers.
While we believe these issues are all worthy of debate and discussion, such discussion can only be constructive if it acknowledges the voluminous factual and legal record that has already been established during the course of the trial. Some of Microsoft’s defenders apparently view the trial record as unimportant -- or even biased. Rather than argue the facts, or the law, they have cast aspersions on the ideological leanings (too liberal?) or even the ethical standards (politically motivated?) of those involved in prosecuting the case.
For those who might be inclined to accept such arguments, it is important to remember that the Microsoft case has been prosecuted by an Assistant Attorney General for Antitrust, Joel Klein, who was confirmed by the Senate on a vote of 88-12 -- with all 12 of those opposing his nomination being liberal Democrats concerned that he would be too "pro-market" in his approach. Speaking in favor of Klein's nomination, former Judiciary Committee Chairman Senator Strom Thurmond (R-SC) defended his pro-market approach, calling him "within the mainstream of antitrust law and doctrine."
Further, the trial has been presided over by a Reagan-appointed judge, Thomas Penfield Jackson, who is not known for having anti-business views. Even as staunch a critic of the Microsoft case as The Wall Street Journal’s editorial page said "it was hard to find much wrong with Judge Jackson’s rendition of the ‘facts’."1
Rather than casting about for conspiracy theories, everyone interested in this matter would do well to focus on the facts, the law -- and the choice now before the courts, which is not whether, but how, to remedy the damage being caused by the Microsoft monopoly.
Judge Jackson’s 205-page "Findings of Fact"2 convincingly establishes three facts that are crucial to understanding this case:
First, Microsoft possesses monopoly power in the market for Personal Computer (PC) operating systems;
Second, Microsoft engaged in a wide-ranging effort to protect its operating system monopoly, utilizing a full array of exclusionary practices; and
Third, Microsoft’s actions were harmful to innovation and to consumers.
The Microsoft Monopoly: Judge Jackson’s Findings leave no serious doubt that Microsoft is a monopoly -- that is, that it possesses market power in the market for Intel-compatible operating systems. Judge Jackson bases this conclusion on three factors:
Viewed together, three main facts indicate that Microsoft enjoys monopoly power. First, Microsoft’s share of the market for Intel-compatible PC operating systems is extremely large and stable. Second, Microsoft’s dominant market share is protected by a high entry barrier. Third, and largely as a result of that barrier, Microsoft’s customers lack a commercially viable alternative to Windows. Findings ¶ 34.
While some Microsoft defenders have argued that new developments in the computer marketplace have eroded Microsoft's monopoly power, they fail to acknowledge that Judge Jackson specifically addressed such developments, including the Linux operating system; the growing popularity of hand-held information appliances, such as Palm computers; and the growth of Web-based applications, but found no evidence to indicate that any of them would erode Microsoft’s market dominance for the foreseeable future. Findings ¶¶ 48-50, 22-26.
On the question of monopoly power, Jackson's finding is consistent with virtually all the available data, as well as the public and private statements of such industry leaders as Microsoft's own chairman, Bill Gates. To be credible, contrary arguments should either provide new information or suggest some flaw in Judge Jackson's reasoning. All of the arguments we have seen, however, do nothing more than repeat speculation about how technological change will soon make Microsoft's monopoly irrelevant -- speculation conclusively and persuasively rejected by the Court.
Microsoft's Conduct: The fact of Microsoft's monopoly is important not because having a monopoly is in and of itself illegal, but because only firms that possess such power are able to engage in certain activities that are harmful to consumers. Since Microsoft has been established to have market power, the next question is whether Microsoft actually engaged in such behaviors. Judge Jackson finds that it did.
Judge Jackson finds that Microsoft was especially concerned about technologies, such as Netscape’s Navigator browser, that could support platform-independent computing and thereby erode Microsoft’s position. In response to the Netscape threat, Microsoft undertook a broad array of anticompetitive practices to increase the market share of its Internet Explorer.
Microsoft…paid huge sums of money, and sacrificed many millions more in lost revenue every year, in order to induce firms to take actions that would help increase Internet Explorer’s share of browser usage at Navigator’s expense. Findings ¶ 139.
Note that Judge Jackson's finding with respect to Microsoft and Netscape is not limited to the question of "technological tying" -- i.e. whether Microsoft could legally bundle its browser with its operating system. Instead, Jackson identifies a broad pattern of activities for which Microsoft advanced no credible efficiency rationale, but which can easily be understood as being designed to harm competition. For example, Judge Jackson found that Microsoft was able to use its Windows license as leverage in disputes with original equipment manufacturers (OEMs), such as Compaq, over which browser would be featured on their products.
Microsoft sent Compaq a letter. . . stating its intention to terminate Compaq's license for Windows 95 if Compaq did not restore the MSN and Internet Explorer icons to their original positions. Compaq's executives opined that their firm could not continue in business for long without a license for Windows, so in June Compaq restored the MSN and IE icons to the Presario desktop. Findings ¶ 206.
The Findings of Fact also establish that Microsoft's anticompetitive conduct was not limited to its battle with Netscape, but instead went well beyond the so-called "browser wars." When Intel, for example, began developing software that would go directly to the equipment manufacturers and bypass Windows, Microsoft Chairman Bill Gates went straight to the top.
Gates told Grove that he had a fundamental problem with Intel using revenues from its microprocessor business to fund the development and distribution of free platform level software. . . . Faced with Gates' threat, Intel agreed to stop. . . . Findings ¶ 102.
Similarly, Microsoft attempted to use the leverage provided by the Windows monopoly to persuade IBM to stop competing in the market for applications software.
When IBM refused to abate the promotion of those of its own products that competed with Windows and Office, Microsoft punished the IBM PC Company with higher prices, a late license for Windows 95, and the withholding of technical and marketing support. Findings ¶ 116.
In addition to these examples, the Findings of Fact also establish that Microsoft threatened or otherwise engaged in anticompetitive conduct on numerous other occasions, involving such major companies as Apple, AOL, Intuit, Real Networks and Sun Microsystems.
In summary, far from the cry of Microsoft's defenders that the company is being punished for being more efficient than its competitors, or for "building a better mousetrap," the facts establish that it engaged in a broad, persistent pattern of behavior for which there is no plausible explanation other than an intention to deprive consumers of the benefits of competition. Ignoring these facts, as Microsoft's defenders consistently do, cannot make them go away.
Microsoft and Consumers: Microsoft's defenders are also wont to suggest that Judge Jackson has ignored the issue of consumer harm. To the contrary, the Findings of Fact identify numerous instances in which Microsoft’s anticompetitive conduct restricted consumer choice, deterred innovation and had a chilling effect on the entire industry.
Although Microsoft's campaign to capture the OEM channel succeeded, it required a massive and multifarious investment by Microsoft; it also stifled innovation by OEMs that might have made Windows PC operating systems easier to use and more attractive to consumers. Findings ¶ 241.
Microsoft also engaged in a concerted series of actions designed to protect the applications barrier to entry, and hence its monopoly power, from a variety of middleware threats, including Netscape's Web browser and Sun's implementation of Java. Many of these actions have harmed consumers in ways that are immediate and easily discernible. They have also caused less direct, but nevertheless serious and far-reaching, consumer harm by distorting competition. Findings ¶ 409.
Through its conduct toward Netscape, IBM, Compaq, Intel, and others, Microsoft has demonstrated that it will use its prodigious market power and immense profits to harm any firm that insists on pursuing initiatives that could intensify competition against one of Microsoft's core products. . . . The ultimate result is that some innovations that would truly benefit consumers never occur for the sole reason that they do not coincide with Microsoft's self-interest. Findings ¶ 412.
The Findings of Fact demonstrate beyond any doubt that Microsoft's conduct had its intended effect of raising the costs to consumers of using products that Microsoft deemed dangerous to its monopoly, and of reducing the benefits to consumers of the innovation that would have taken place in the absence of Microsoft's illegal conduct. This is precisely the sort of consumer harm the antitrust laws seek to mitigate.
The Sherman Antitrust Act is the cornerstone of antitrust policy in the United States. Based on his Findings of Fact, Judge Jackson issued "Conclusions of Law"3 in which he determined that:
Microsoft maintained its monopoly power by anticompetitive means and attempted to monopolize the Web browser market, both in violation of section 2. Microsoft also violated section 1 of the Sherman Act by unlawfully tying its Web browser to its operating system. The facts found do not support the conclusion, however, that the effect of Microsoft’s marketing arrangements with other companies constituted unlawful exclusive dealing under criteria established by leading decisions under section 1. Conclusions p. 2.
In other words, Judge Jackson found Microsoft guilty of monopolization under Section 2 of the Sherman Act, both because it used illegal means to maintain its operating system monopoly and because it used illegal means to attempt to establish a monopoly in the market for Web browsers. He also found Microsoft guilty under Section 1 of the Act for illegally tying the Internet Explorer browser to the Windows operating system. However, he exonerated Microsoft on the charge of exclusive dealing under Section 1.
Jackson's Conclusions of Law detail the basis for each conclusion. On the charge of illegally maintaining its operating system monopoly, he finds that:
Microsoft strove over a period of approximately four years to prevent middleware technologies from fostering the development of enough full-featured cross-platform applications to erode the applications barrier. . . . . Microsoft succeeded . . . . Because Microsoft achieved this goal through exclusionary acts that lacked procompetitive justification, the Court deems Microsoft's conduct the maintenance of monopoly power by anticompetitive means. Conclusions p. 9.
Jackson specifically finds that there was no legitimate economic purpose for Microsoft's illegal conduct.
Microsoft fails to advance any legitimate business objectives that actually explain the full extent of this significant exclusionary conduct. Conclusions p. 11.
Because the full extent of Microsoft's exclusionary initiatives in the [Internet Access Provider] channel can only be explained by the desire to hinder competition on the merits in the relevant market, those initiatives must be labeled anticompetitive. Conclusions p. 16.
There are no valid reasons to justify the full extent of Microsoft's exclusionary behavior in the [Internet Access Provider] channel. Conclusions p. 15.
He also considers and specifically rejects Microsoft's contention that its activities were nothing more than the rough and tumble of the competitive process, redounding ultimately to the benefit of consumers:
These actions cannot be described as competition on the merits, and they did not benefit consumers. Conclusions p. 19.
To the contrary, Jackson concludes that Microsoft's actions were the antithesis of competition on the merits and, in the broadest sense, constitute predatory behavior that is illegal under Section 2 of the Sherman Act.
Microsoft placed an oppressive thumb on the scale of competitive fortune, thereby effectively guaranteeing its continued dominance in the relevant market. More broadly, Microsoft's anticompetitive actions trammeled the competitive process through which the computer software industry generally stimulates innovation and conduces to the optimum benefit of consumers. Conclusions p. 20.
Microsoft's campaign must be termed predatory. Since the Court has already found that Microsoft possesses monopoly power . . . the predatory nature of the firm's conduct compels the Court to hold Microsoft liable under Section 2 of the Sherman Act. Conclusions p. 21.
In sum, Judge Jackson's Conclusions of Law are damning to Microsoft and its conduct. After considering each of Microsoft's arguments to the contrary, he demonstrates that Microsoft's conduct, taken as a whole and in its entirety, is both illegal under the Sherman Act and harmful to consumers, whom the Act is designed to protect.
Given the Court's Findings and Conclusions of Law, it is a virtual certainty that Microsoft will be subject to remedial action of some form. The available remedies fall into two broad categories, conduct remedies and structural remedies.
Microsoft's defenders have generally focused their commentary on the prospect of conduct remedies, which would place restrictions on Microsoft's future behavior. While we are not prepared to exclude the possibility that some form of conduct remedy could be beneficial, the ones proposed thus far would appear to do more harm than good.
Given the range of illegitimate behavior documented by the court, and the complexity of the software industry, a meaningful conduct remedy would require a lengthy list of conduct restrictions and requirements. The imposition of such a remedy on Microsoft would be burdensome for the company and difficult, if not impossible, for the government to enforce. The real danger, however, is that a conduct remedy would lead the decree court and the Department of Justice to function as de facto regulatory agencies, monitoring the operations of a firm with 30,000 employees producing dozens of technologically sophisticated products. Because enforcement of conduct restrictions would involve ongoing oversight of virtually all of Microsoft’s operations, including new product introductions, it could interfere with Microsoft’s ability to develop new products and compete. And, because Microsoft has dealings throughout the software industry, oversight of Microsoft by the decree court might well lead to indirect oversight of other firms as well. In sum, there are legitimate concerns about conduct remedies in the Microsoft case.
A well-designed structural remedy, on the other hand, is subject to none of the concerns described above. We have proposed a "competitive remedy" that would replace the current monopoly with a competitive market structure. Specifically, it would separate Microsoft's operating system products from the rest of the company's product lines, and then create three equivalent "Windows companies."4 Each of the new Windows companies would have full ownership of all the relevant intellectual property, and would be allocated an equal share of employees, contracts and other resources to go with the intellectual property.
The competitive remedy we propose would immediately replace the existing operating system monopoly with a competitive market. In so doing, it would eliminate the need for ongoing regulation and dramatically reduce the potential for subsequent litigation.
Furthermore, we believe it is highly likely that the competitive remedy would result in far more rapid innovation in computer operating systems than we have witnessed over the course of the past decade, for the simplest of reasons: Competitive firms have an incentive to innovate in order to win business away from their competitors; monopolists do not.
Microsoft's defenders have offered several arguments in opposition to such a remedy, two of which are worthy of rebuttal. The first is that this solution is unworkable because the task of dividing up a complex firm like Microsoft is too difficult, or the costs too great. We disagree. Indeed, the task of dividing up a firm like Microsoft, which has virtually no tangible assets and whose 30,000 employees are mostly young, mobile and well-off, is vastly easier and less costly than dividing up a firm like, say, AT&T circa 1984. We believe the breakup we propose could be carried out quickly and at relatively low cost, and have seen no plausible evidence to the contrary.
The second argument has to do with standardization. The idea is that we need a monopoly like Microsoft to provide a standard for operating systems and, in the absence of such a monopoly, we would have "fragmentation" and resulting incompatibility. In Dr. Lenard's longer paper on the remedies issue he shows that this argument fails at several levels. Specifically, all of the new firms would have extremely strong incentives to maintain compatibility with the existing Windows installed base and, going forward, with each other. Moreover, those who advance this thesis have the burden of showing why in this particular instance economic efficiency can only be obtained through a monopoly, whereas in all other markets competition produces an efficient balance between standardization, on the one hand, and specialization and diversity, on the other. We have not seen such a showing made, nor do we believe one is possible.
The Findings of Fact and Conclusions of Law handed down by Judge Jackson address each significant argument Microsoft has made in its own defense -- and find them wanting. Microsoft has a monopoly, has engaged in anticompetitive behaviors, has harmed consumers and has violated the law. Those who would argue otherwise have an obligation to rebut Jackson's factual and legal conclusions on their substance. We have yet to see such a rebuttal. The conspiracy theories that have been offered in place of substantive argument are unsupported by any evidence, and seem incredible on their face.
The concern that a conduct remedy in the Microsoft case could lead to increased government involvement in the software marketplace is not without merit. A structural remedy, on the other hand, would end the Microsoft monopoly, end the threat of government regulation and obviate the need for further litigation now and for many years to come.
1The Wall Street Journal, November 23, 1999, p. A22.
2Findings of Fact in U.S. v. Microsoft Corporation, Civil Action No. 98-1232 (TPJ) and State of New York, ex rel. Attorney General Eliot Spitzer et al., v. Microsoft Corporation, Civil Action No. 98-1233 (TPJ), November 5, 1999 (hereafter, "Findings of Fact" or "Findings").
3Conclusions of Law in U.S. v. Microsoft Corporation, Civil Action No. 98-1232 (TPJ) and State of New York, ex rel. Attorney General Eliot Spitzer et al., v. Microsoft Corporation, Civil Action No. 98-1233 (TPJ), April 3, 2000 (hereafter, "Conclusions of Law" or "Conclusions").
4Thomas M. Lenard, Creating Competition in the Market for Operating Systems: A Structural Remedy for Microsoft, The Progress & Freedom Foundation, January 2000.
Jeffrey A. Eisenach is President and Cofounder of the Progress & Freedom Foundation. Thomas M. Lenard is Vice President for Research. The views expressed here are the authors' and do not necessarily reflect those of The Progress & Freedom Foundation, its Board, Officers or Staff.