Copyright © 2004, Heal Consulting

While it's true that all platforms have their vulnerabilities, the problem facing Microsoft is one of attitude and culture. Security for millions of computer users will never improve without a change in Microsoft's attitude toward writing secure software.

An acquaintance of mine once worked at Microsoft; he left to go back to school and intends to return. He had some really interesting things to say about Microsoft, Open Source, and security:

  1. He doubted the security of open source systems. The first thing that happens when you open up the source, he said, is that people can find all the exploits that are there. To him that was an obvious security risk.

  2. All programmers at Microsoft have access to the XP source code -- even interns.

  3. When a new programmer is hired, he immediately starts reading the XP source code. He claimed that one such person wrote a program in half an hour that shut down Microsoft's internal network for half a day. This was taken as proof of the wisdom of keeping the software closed, and also of the incredible talent Microsoft hires. Really, that's what he said.

  4. Programmers at Microsoft make no decisions about what to write or which features should come first. All coding decisions are made by a project manager; the programmer just implements the manager's decisions. There was something about a "triangle approach to management" that seemed to me to be another way of saying "top down".

  5. The only reason Windows is under such attack is because it's the most popular. "90% of all PCs run Windows", he said. If Linux™ ever gets to be the most popular, it will be the most attacked. But two-thirds of all web servers run Apache, which is open, and most of those run Linux or some other open operating system, I said. He retorted that there is a bigger challenge in attacking Windows because it's a riddle: you don't have the source code, so it's more of a challenge and wins you more status points for attacking it. That's what he said.

  6. He said, suppose you're in a war and you have two tanks. You put one in a cave and the other you leave in the open, with all its specs published. Which one gets destroyed first?

I took the time to explain to him that security flaws and general bugginess are two sides of the same coin; a security flaw is just an exploitable bug. That didn't faze him. I went back even further.

To prove an algorithm's correctness you must see its code. No amount of handwaving or test data can validate a secret algorithm. Algorithms that are exposed to everyone are improved more rapidly. You don't wait for the special cases to reveal flaws; you search the code for flaws and eliminate them -- by folding the special cases into the general case, if possible.
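As a toy illustration of that last point (my own sketch, not something from the conversation): a special-case branch is a common place for an exploitable bug to hide, because it quietly skips a check the general path performs. Folding the special case into the general case makes the bug impossible, not just unlikely. The function names here are hypothetical.

```python
def copy_prefix_buggy(data: bytes, n: int) -> bytes:
    """Copy the first n bytes of data into a buffer meant to hold 16 bytes."""
    buf = bytearray(16)
    if n == len(data):
        # "Fast path" special case: skips the bounds check, so the
        # buffer silently grows past its intended 16-byte size.
        buf[:n] = data
    else:
        m = min(n, len(data), 16)
        buf[:m] = data[:m]
    return bytes(buf)


def copy_prefix_fixed(data: bytes, n: int) -> bytes:
    """Same task, with the special case folded into the general case."""
    buf = bytearray(16)
    m = min(n, len(data), 16)  # one bounds check covers every input
    buf[:m] = data[:m]
    return bytes(buf)
```

Anyone reading the second version can verify the invariant at a glance; in the first, the flaw only shows up on the input that takes the special branch.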

It was difficult getting through to him, but I didn't understand why until the conversation was almost over. It's because of one last thing he said, as if telling a wide-eyed idealist all about the real world: at Microsoft, the important thing is the feature list. Features sell software. Security doesn't sell software. Security is not pushed at Microsoft because a project manager outside the security group has no financial incentive to implement it. I understood then: security is just another feature to Microsoft.

Browsing Microsoft's Security How-To section on writing secure programs, I notice that all of the articles tout Microsoft's support for particular security-related features. I can't find any reference to security being the developer's responsibility.

I realized why my friend was so intransigent: it wasn't just that he didn't believe my points about security, but that they interfered with his mindset that you have to make money off software. It's a mindset that pervades the company, from top to bottom, and they don't question it.

By the way, the answer to the "tank in a cave" argument is simple: software is not hardware. If someone breaks your code, you look at what they did and fix it. Software kept closed in a cave may be better protected, but it will never improve.

Security is not a feature you add to a product. It's not even a process, or an attitude, or whatever else you thought I was going to say. No, security is an emotion. Computers don't have emotions, people do. Security, to a programmer writing code, is having confidence that his code is correct. To be correct, it must be shown to everyone, including the bad guys.

I don't have anything against the profit motive for creating software. I'd just like the folks in Redmond to design security into their programs by opening their source code. I doubt it will ever happen, but we can always hope.

This work is published under the Creative Commons License. You may incorporate all or part as long as you also provide the URL "" (either linked or as text), or make other arrangements with the copyright holder.

Microsoft, XP, and Windows are probably trademarks. I'll look into it.