Archive for November, 2010

What is "The Cloud"?

Cloud computing and "The Cloud" have become popular buzzwords lately.  The latest Microsoft commercials are making the term better known outside of the IT world.  But what exactly is it?

You'll get different answers depending on who you ask, and all the hype would have you thinking it's the latest and greatest invention.  But in the broadest sense, it is simply the concept of having work done and data residing somewhere other than your physical machine.

I stress the word "concept" because that's the part that I think a lot of people miss.  It's not some specific technology or product, but just an idea.  Not even a new idea.

I'll let that sink in for a bit.

The idea (not the term) of cloud computing has been around since the very early days of computing, as early as the 1960s, with one of the first manifestations being the ARPAnet, the predecessor to what is now the internet.

Various forms of what could be called cloud computing are already commonplace in the consumer space.  Any kind of web app, whether webmail, Google Docs, or a Facebook app, any program you don't have to download, is living "in the cloud".

Two questions came to my mind when the term started picking up as a buzzword: "Why now?" and "What's the big deal?"  I'll tackle these questions in reverse order.

First off, what's the big deal?  In the consumer space I think it's somewhat old news; it's something most internet users are already using.  The business world is usually more cautious and slower to adopt.  Businesses are also traditionally more protective of privacy and ownership of data.  The idea of their (intellectual) property being in the hands of someone else is less than ideal.

There are also issues of reliability and control.  If your servers go down, you have control and can make getting your customers back up and running the top priority.  If you're relying on someone else to keep your customers online, they may have other priorities, and that is out of your control.

Many businesses, especially smaller ones, are getting over this mentality because, honestly, a dedicated company can do a more reliable job, and it means you don't need to hire someone whose only job is to make sure the machines serving your customers are always up and running.

So why now?  Technology is finally at a place where this has become feasible.  The internet has become ubiquitous.  You generally wouldn't ask someone if they have internet access; it's just assumed.  You don't ask if someone has an email address; you just ask what it is and assume that they have one.  Not only has the internet in general become commonplace, but broadband internet is also widespread, meaning not only is the channel there, it is also fast enough to deliver data and services efficiently.

The internet has become a valid distribution channel.  Web apps bypass a lot of the issues with distributing traditional applications.  Cross-platform support comes for free: no need to worry about creating a Mac version and a Windows version (or Linux if that fits your target audience).  You can use them on public computers (with the usual precautions, like remembering to log off whatever service you're using) without having to worry about installation.  You can access your stuff from anywhere.

That last sentence is important.  It is an answer to both "Why now?" and "What's the big deal?".

Mobility is shifting towards being something that's expected.  Being able to work on things from anywhere is quickly becoming an expectation from consumers, and in the near future, I suspect, from businesses.  For many companies right now, mobility is limited to mostly email, while most work still needs to be done in the office.

The cloud is a big deal because it allows for unprecedented mobility.  I can start writing a post at home and finish it at a friend's house.  In fact, I've done so.  The cloud is important now because mobility is important now, and will only become more so as it becomes an expectation.  Internet access on phones is now common and moving towards becoming standard.

The other benefit of the cloud that I haven't touched on yet is that it enables sharing and collaboration on a level that has not been possible in the past, whether it's sharing the latest news on Twitter, working together on a spreadsheet in Google Docs, or sharing all your personal information on Facebook (j/k, hopefully).  The world has become more international than ever.  I don't have any statistics, but I'm sure the number of people from other countries that the average person knows is probably twice what it was 10-20 years ago.

So what is "The Cloud"?  It is an ancient idea (on the computing timeline) of having "stuff" that you can use from anywhere, and technology has finally come to a saturation point where it has become both feasible and economical to do at a large scale.  From a conceptual standpoint, it is also another layer of abstraction, like the operating system described in a previous post.

What exactly does an operating system do?

To adequately answer that question we must first look at what exactly an operating system is.

Most people know an operating system as that thing that lets you run other programs.  Wikipedia defines an OS as "software, consisting of programs and data, that runs on computers and manages the computer hardware and provides common services for efficient execution of various application software."

I like to think of an operating system as a facilitator between machine and other programs/users.

How Stuff Works has a pretty good article about How Operating Systems Work, but I'd like to focus more on the why.  Operating systems perform the very important function of acting as a layer of abstraction between the hardware and other programs.

The concept of abstraction should be familiar to anyone working in software or cognitive science (and I'm sure other fields), but it's something that everyone uses whether they are aware of it or not.  In fact we as humans wouldn't be able to have any kind of real thought without it.

In simple terms, abstraction is the idea of hiding or ignoring details that are unimportant in a given context.  As you're reading this post, you're seeing words and sentences.  You don't think of them as individual letters, even though you're aware that's what words are made of.  To take it a step further, those letters are actually made of dots of light on your screen (or if this ever makes it into print, blobs of ink).  But that's not what you are consciously aware of when you are reading, because those are unimportant details that get in the way of understanding the meaning.

In the same way, an operating system hides away certain details, like whether you have a USB mouse or a PS/2 mouse, or which port it's plugged into.  99% of programs don't need or want to care about that.  The OS knows where it is and will listen on the correct port to get signals from it.  The program just asks, "What's the mouse doing?"  It doesn't need to know how the OS got that information.
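The mouse example above can be turned into a small sketch.  This is purely illustrative (the class and method names here are made up; real OS interfaces are nowhere near this simple): the point is that the program calls one common function, and the OS layer is the only part that knows which kind of mouse is attached.

```python
# A toy sketch of the abstraction described above (hypothetical names).
# The OS layer knows which device is attached; the program never does.

class USBMouse:
    def read_position(self):
        # Imagine this talks to a USB port.
        return (120, 45)

class PS2Mouse:
    def read_position(self):
        # Imagine this talks to a PS/2 port.
        return (120, 45)

class OperatingSystem:
    """Hides the hardware details behind one common call."""
    def __init__(self, mouse):
        self._mouse = mouse  # USB or PS/2 -- the program can't tell

    def get_mouse_position(self):
        return self._mouse.read_position()

# The program just asks "what's the mouse doing?" and nothing more.
os_layer = OperatingSystem(USBMouse())
print(os_layer.get_mouse_position())  # (120, 45)
```

Swap in `PS2Mouse()` and the program's code doesn't change at all, which is exactly the point of the abstraction.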

Surprisingly, I find that currency makes for a good analogy to an operating system insofar as it acts as an abstraction.  In fact, they probably came about for the same reason.  With the barter system, it gets annoying trying to find someone with corn who wants the chair you just made.  Currency came about because it provided a common way to exchange things with one another.

For a similar reason, operating systems came into existence because they provided a common way to access a variety of hardware, after it got annoying dealing with all the hardware configurations that differed in ways you don't care about.

If you're writing a program for a very specific piece of hardware, then an OS isn't necessary and can even get in the way, or simply may not be available, similar to trying to trade with a culture that doesn't use currency because it lives a self-sufficient lifestyle.

That brings us to an extension of the analogy.  If an OS is trying to keep programs from having to worry about hardware differences, what about OS differences?  It is like different countries having different currencies.  Most currencies can be exchanged for one another (for a fee).  In a similar fashion, there are now virtual machines that can run another OS inside of a parent OS.  It's not quite the same as currency exchange, but similar in purpose: it's generally a little less efficient (like the currency exchange fee) and allows you to use a different OS than your primary one.

What spurred this topic was actually a question from a friend: "Why do new computers frequently refuse to run old programs? What does changing 'Compatibility modes' actually do to make them work? What do I need to do or understand if I want to play older games, especially with the new Windows os?"

The short answer is that the program was either using undocumented functionality or was making assumptions about the environment that are no longer true.  "Compatibility mode" in Windows Vista and higher makes Windows pretend to be an older version.  There are many things it can do, but since the OS actually is different, there are things it can't emulate.  Unfortunately, while there may be things you can do for specific games/programs, there isn't much you can do in general to make older programs work outside of compatibility mode.

Documented functions are sort of like a contract that says, "The OS works this way."  It's essentially a list of guarantees.  Undocumented functions have no such guarantee; they're just things that happen to work a certain way.

Let's take a real-world example.  Most traffic lights in the US have three lights, with Red on top, Yellow in the middle, and Green on the bottom.  Red means Stop and Green means Go; that's how it's "documented".  Let's say you ignore that and instead treat the bottom light as Go.  This works fine for all the traffic lights you've seen, so you write those as the instructions for your car program (yes, I realize that's redundant, as a program is a set of instructions).

Now your car program runs into one of those funky horizontal traffic lights instead of the normal vertical ones.  It sees that the bottom light is on (they're all on the bottom, but it wasn't told to care about that).  The light happened to be Red; your car program goes and promptly crashes.  (Double entendre!)

That's roughly the equivalent of what happens.  Although, to be fair to the developer, sometimes Windows just doesn't provide a documented way to do what you want (maybe the program reading the traffic lights above was colorblind), so they cheat to get the job done.
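The traffic-light story can be sketched in code.  The data shapes here are invented for illustration: each lamp is a (color, lit, height) tuple, the "documented" rule checks the green lamp, and the "undocumented" shortcut checks whichever lamp is physically lowest.

```python
# A sketch (invented data shapes) of the traffic-light story above.

def should_go_documented(lamps):
    """The documented contract: go only when the green lamp is lit."""
    return any(color == "green" and lit for color, lit, _ in lamps)

def should_go_undocumented(lamps):
    """The undocumented shortcut: go when a lamp at the lowest height
    is lit.  Works on every vertical light ever seen."""
    lowest = min(height for _, _, height in lamps)
    return any(lit and height == lowest for _, lit, height in lamps)

# A normal vertical light with green lit: (color, lit, height from ground).
vertical_green = [("red", False, 3), ("yellow", False, 2), ("green", True, 1)]
print(should_go_documented(vertical_green))    # True
print(should_go_undocumented(vertical_green))  # True -- happens to agree

# A horizontal light with red lit: every lamp is at the same (bottom) height.
horizontal_red = [("red", True, 1), ("yellow", False, 1), ("green", False, 1)]
print(should_go_documented(horizontal_red))    # False -- stop!
print(should_go_undocumented(horizontal_red))  # True -- and... crash
```

On every light the shortcut had ever seen, the two functions agreed, which is exactly why nobody noticed the bug until the environment changed.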

Windows actually goes to great lengths to keep old programs running on new versions (backward compatibility), but it's not always possible.  Joel Spolsky talks about one such special case:

I first heard about this from one of the developers of the hit game SimCity, who told me that there was a critical bug in his application: it used memory right after freeing it, a major no-no that happened to work OK on DOS but would not work under Windows where memory that is freed is likely to be snatched up by another running application right away. The testers on the Windows team were going through various popular applications, testing them to make sure they worked OK, but SimCity kept crashing. They reported this to the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if it did, ran the memory allocator in a special mode in which you could still use memory after freeing it.

Basically, SimCity said "I'm done with this piece of memory" and then quickly used it after saying that.  The reason this was fine in DOS was that only one program would be running at a time, so even though it said it was done, no one else would take the memory, and it could still be used.  This assumption was no longer true in Windows, where multiple programs run at the same time and have to share resources.

Raymond Chen, one of the biggest proponents of maintaining compatibility with older programs, had the following to say about the topic (from 2003):

I could probably write for months solely about bad things apps do and what we had to do to get them to work again (often in spite of themselves). Which is why I get particularly furious when people accuse Microsoft of maliciously breaking applications during OS upgrades. If any application failed to run on Windows 95, I took it as a personal failure. I spent many sleepless nights fixing bugs in third-party programs just so they could keep running on Windows 95. (Games were the worst. Often the game vendor didn't even care that their program didn't run on Windows 95!)
