To adequately answer that question we must first look at what exactly an operating system is.

Most people know an operating system as that thing that lets you run other programs.  Wikipedia defines an OS as "software, consisting of programs and data, that runs on computers and manages the computer hardware and provides common services for efficient execution of various application software."

I like to think of an operating system as a facilitator between machine and other programs/users.

How Stuff Works has a pretty good article about How Operating Systems Work, but I'd like to focus more on the why.  Operating systems perform the very important function of acting as a layer of abstraction between the hardware and other programs.

The concept of abstraction should be familiar to anyone working in software or cognitive science (and I'm sure other fields), but it's something that everyone uses whether they are aware of it or not.  In fact, we as humans wouldn't be able to have any kind of real thought without it.

In simple terms, abstraction is the practice of hiding or ignoring details that are unimportant in a given context.  As you're reading this post, you're seeing words and sentences.  You don't think of them as individual letters, even though you're aware that's what words are made of.  To take it a step further, those letters are actually made of dots of light on your screen (or, if this ever makes it into print, blobs of ink).  But you're not consciously aware of any of that while reading, because those details are unimportant and only get in the way of the meaning.

In the same way, an operating system hides away details like whether you have a USB mouse or a PS/2 mouse, or which port it's plugged into.  99% of programs don't need or want to care about that.  The OS knows where the mouse is and listens to the correct port for its signals.  The program just asks, "What's the mouse doing?"  It doesn't need to know how the OS got that information.
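To make that concrete, here's a minimal sketch of what "just ask the OS" looks like from a program's side.  It uses the Win32 GetCursorPos call as the example question; the point is that the program never learns (or cares) what kind of mouse answered.

    #include <stdio.h>
    #include <windows.h>

    int main(void)
    {
        POINT p;

        /* Ask the OS where the cursor is.  The OS deals with whichever
           mouse is attached (USB, PS/2, Bluetooth) and whichever port
           it's on; the program only sees the answer. */
        if (GetCursorPos(&p))
            printf("Mouse is at (%ld, %ld)\n", p.x, p.y);
        else
            printf("Couldn't ask the OS about the mouse.\n");

        return 0;
    }

Without the OS in the middle, that same question would mean talking to the mouse hardware directly, and the code would look different for every kind of mouse.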


Surprisingly, I find that currency makes for a good analogy to an operating system insofar as it acts as an abstraction.  In fact, they probably came about for the same reason.  Under a barter system, it gets annoying trying to find someone with corn who wants the chair you just made.  Currency came about because it provided a common way to exchange goods with one another.

For a similar reason, operating systems came into existence because they provided a common way to access a variety of hardware, once it got annoying dealing with all the different hardware configurations that differ in ways your program doesn't care about.

If you're writing a program for one very specific piece of hardware, an OS isn't necessary; it can even get in the way, or simply not be available.  That's a bit like trying to trade with a culture that doesn't use currency because it lives a self-sufficient lifestyle.

That brings us to an extension of the analogy.  If an OS is trying to keep programs from having to worry about hardware differences, what about OS differences?  It's like different countries having different currencies.  Most currencies can be exchanged for one another (for a fee).  In a similar fashion, there are now virtual machines that can run another OS inside a parent OS.  It's not quite the same as currency exchange, but it serves a similar purpose: it's generally a little less efficient (like the exchange fee), and it lets you use a different OS than your primary one.


What spurred this topic was actually a question from a friend: "Why do new computers frequently refuse to run old programs? What does changing 'Compatibility modes' actually do to make them work? What do I need to do or understand if I want to play older games, especially with the new Windows os?"

The short answer is that the program was either using undocumented functionality or making assumptions about its environment that are no longer true.  "Compatibility mode" in Windows Vista and higher makes Windows pretend to be an older version.  There are many things it can do, but since the underlying OS really is different, there are things it can't emulate.  Unfortunately, while there may be fixes for specific games or programs, there isn't much you can do in general to make older programs work beyond compatibility mode.
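One concrete thing compatibility mode changes is what Windows reports as its version.  Here's a hypothetical sketch of the kind of startup check an old program might do; under compatibility mode, Windows "lies" to calls like GetVersionEx and reports the older version the user selected, which is often enough to get past a check like this.

    #include <stdio.h>
    #include <windows.h>

    int main(void)
    {
        OSVERSIONINFO vi;
        vi.dwOSVersionInfoSize = sizeof(vi);

        /* Deprecated on modern Windows, but it's what many old programs called. */
        GetVersionEx(&vi);

        /* A brittle check like this breaks on every new Windows release;
           compatibility mode reports an older version so it still passes. */
        if (vi.dwMajorVersion > 5) {
            printf("Unrecognized Windows version, refusing to start.\n");
            return 1;
        }

        printf("Windows %lu.%lu detected, starting up.\n",
               vi.dwMajorVersion, vi.dwMinorVersion);
        return 0;
    }

The version check is a hypothetical example, but it's representative: the program assumed the set of Windows versions it knew about was the set that would ever exist.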

Documented functions are like a contract that says "the OS works this way"; they're essentially a list of guarantees.  Undocumented functions have no such guarantee; they're just things that happen to work a certain way.
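A small, hypothetical C illustration of the difference: calloc documents that the memory it returns is zeroed, while malloc makes no such promise, even though the memory it hands back often happens to be zero anyway.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Documented: calloc guarantees zeroed memory.  That's the contract. */
        int *safe = calloc(100, sizeof(int));

        /* Undocumented assumption: malloc'd memory often *happens* to be
           zero (fresh pages from the OS usually are), so a program that
           skips the memset below can appear to work for years, right up
           until it doesn't. */
        int *risky = malloc(100 * sizeof(int));
        if (risky)
            memset(risky, 0, 100 * sizeof(int));  /* the step the "it just works" program leaves out */

        printf("%d %d\n", safe ? safe[0] : -1, risky ? risky[0] : -1);
        free(safe);
        free(risky);
        return 0;
    }

Code relying on the second version isn't broken because malloc changed its contract; it was never promised that behavior in the first place.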

Let's take a real-world example.  Most traffic lights in the US have three lights, with Red on top, Yellow in the middle, and Green on the bottom.  Red means Stop and Green means Go; that's how it's "documented".  Now let's say you ignore that and instead treat the bottom light as Go.  This works fine for all the traffic lights you've seen, so you write those as the instructions for your car program (yes, I realize that's redundant, since a program is a set of instructions).

Now your car program runs into one of those funky horizontal traffic lights instead of the normal vertical ones.  It sees that the bottom light is on (they're all on the bottom, but it wasn't told to care about that).  The light happened to be Red, so your car program goes and promptly crashes (double entendre!).

That's roughly the equivalent of what happens.  Although, to be fair to the developer, sometimes Windows just doesn't provide a documented way to do what you want (maybe the program reading the traffic lights above was colorblind), and so they cheat to get the job done.

Windows actually goes to great lengths to stay backward-compatible so that old programs keep running on new versions, but it's not always possible.  Joel Spolsky talks about one such special case:

I first heard about this from one of the developers of the hit game SimCity, who told me that there was a critical bug in his application: it used memory right after freeing it, a major no-no that happened to work OK on DOS but would not work under Windows where memory that is freed is likely to be snatched up by another running application right away. The testers on the Windows team were going through various popular applications, testing them to make sure they worked OK, but SimCity kept crashing. They reported this to the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if it did, ran the memory allocator in a special mode in which you could still use memory after freeing it.

Basically, SimCity said "I'm done with this piece of memory" and then quickly used it anyway.  That was fine in DOS because only one program ran at a time: even though it said it was done, nothing else would take the memory, so it could still be used.  That assumption was no longer true in Windows, where multiple programs run at the same time and have to share resources.
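Here's a minimal, hypothetical C sketch of that kind of mistake (not SimCity's actual code):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *name = malloc(32);
        if (!name)
            return 1;
        strcpy(name, "My City");

        free(name);              /* "I'm done with this piece of memory" */

        /* Use after free: undefined behavior.  Under single-tasking DOS
           the bytes usually sat there untouched, so this often "worked".
           Once the memory can be handed out again right away, this reads
           garbage or crashes. */
        printf("City name: %s\n", name);

        return 0;
    }

Nothing in the documentation ever said a freed block keeps its contents; it just happened to, until the environment changed.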

Raymond Chen, one of the biggest proponents of maintaining compatibility with older programs, had the following to say on the topic (from 2003):

I could probably write for months solely about bad things apps do and what we had to do to get them to work again (often in spite of themselves). Which is why I get particularly furious when people accuse Microsoft of maliciously breaking applications during OS upgrades. If any application failed to run on Windows 95, I took it as a personal failure. I spent many sleepless nights fixing bugs in third-party programs just so they could keep running on Windows 95. (Games were the worst. Often the game vendor didn't even care that their program didn't run on Windows 95!)