
What is "The Cloud"?

Cloud computing and "The Cloud" have become popular buzzwords lately.  The latest Microsoft commercials are starting to make the term better known outside of the IT world.  But what exactly is it?

You'll get different answers depending on who you ask, and all the hype would have you thinking it's the latest and greatest invention, but in the broadest sense, it is simply the concept of having work done and data residing somewhere other than your physical machine.

I emphasize the word "concept" because that's the part that I think a lot of people miss.  It's not some specific technology or product, but just an idea.  Not even a new idea.

I'll let that sink in for a bit.

The idea (not the term) of cloud computing has been around since the very early days of computing, as far back as the 1960s, with one of its first manifestations being the ARPAnet, the predecessor to what is now the internet.

Various forms of what could be called cloud computing are already commonplace in the consumer space.  Any kind of web app, webmail, Google Docs, every Facebook app: any program you can use without downloading it is living "in the cloud".

Two questions came to my mind when the term started picking up as a buzzword: "Why now?" and "What's the big deal?"  I'll tackle these questions in reverse order.

First off, what's the big deal?  In the consumer space I think it's somewhat old news; it's something most internet users are already using.  The business world is usually more cautious and slower to adopt.  Businesses are also traditionally more protective of the privacy and ownership of their data.  The idea of their (intellectual) property being in the hands of someone else is less than ideal.

There are also issues of reliability and control.  If your servers go down, you have control and can make it top priority to get your customers back up and running.  If you're relying on someone else to keep your customers online they may have other priorities and that is out of your control.

Many businesses, especially smaller ones, are getting over this mentality because, honestly, a dedicated company is going to do a more reliable job, and it means you don't need to hire a dedicated person to make sure the machines serving your customers are always up and running.

So why now?  Technology is finally at a place where this has become feasible.  The internet has become ubiquitous.  You generally wouldn't ask someone if they have internet access, it's just assumed.  You don't ask if someone has an email address, you just ask what it is and assume that they have one.  Not only has internet in general become commonplace, but broadband internet is also widespread, meaning not only is the channel there, it is also fast enough to deliver the data and services efficiently.

The internet has become a valid distribution channel.  Web apps bypass a lot of the issues with distribution of traditional applications.  Cross platform comes for free.  No need to worry about creating a Mac version and Windows version (or Linux if that fits your target audience).  You can use them on public computers (with the usual precautions like remembering to log off whatever service you're using) without having to worry about installation.  You can access your stuff from anywhere.

That last sentence is important.  It is an answer to both "Why now?" and "What's the big deal?".

Mobility is shifting towards being something that's expected.  Being able to work on things from anywhere is quickly becoming an expectation among consumers and, I suspect, in the near future among businesses.  For many companies right now, mobility is limited to mostly email, while most work still needs to be done in the office.

The cloud is a big deal because it allows for unprecedented mobility.  I can start writing a post at home and finish it at a friend's house; in fact I've done so.  The cloud is important now because mobility is important now, and it will only become more so as it becomes an expectation.  Internet access on phones is now common and moving towards becoming standard.

The other benefit of the cloud that I haven't touched on yet is that it enables sharing and collaboration on a level that has not been possible in the past, whether it's sharing the latest news on Twitter, working together on a spreadsheet in Google Docs, or sharing all your personal information on Facebook (j/k, hopefully).  The world has become more international than ever.  I don't have any statistics, but I'd guess the number of people from other countries that the average person knows is probably twice what it was 10-20 years ago.

So what is "The Cloud"?  It is an ancient idea (on the computing timescale) of having "stuff" that you can use from anywhere, and technology has finally reached a saturation point where doing this at large scale has become both feasible and economical.  From a conceptual standpoint it is also another layer of abstraction, like the operating system described in a previous post.

Why do web designers/developers hate Internet Explorer?

If you know anyone that works with creating websites in any capacity you've probably been encouraged to switch away from Internet Explorer.  But why should they care what browser you use?  It's not like you're making them use it.

To be honest, most of the hate is directed towards IE6 which comes with Windows XP.  IE7 was an improvement, and IE8 is actually considered acceptable by most web designers (unless they're just Microsoft haters).

So why all the hate for IE6?  The short answer is that supporting it is a pain because of its poor standards compliance.  But as long as a browser is still used by a decent proportion of users, it will need to be supported for business reasons.  So the sooner the majority of people switch off it, the sooner web designers don't have to worry about it anymore.

But let's back up for a bit.  What exactly does "poor standards compliance" mean?  In fact, what do these standards mean?  Stepping back even more, what exactly does a web browser do?

A web browser is a program that interprets HyperText Markup Language (HTML) and displays it in some fashion to a user.  The "standard" is what defines how a browser ought to interpret HTML.  I put "standard" in quotes because it is somewhat of a misnomer.  Standards are partly determined by implementation: by how browsers actually work, not by how they should work.
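As a concrete (if tiny) example, here is a complete HTML document; the browser's job is to turn markup like this into a rendered page.  Note that the standard says what each tag means, while how exactly a heading looks is largely up to the browser:

```html
<!-- A minimal HTML document.  The browser reads the tags (the "angle
     brackets") and decides how to display the content they enclose. -->
<html>
  <head>
    <title>Hello</title>  <!-- shown in the window/tab title, not the page -->
  </head>
  <body>
    <h1>Hello, world</h1>               <!-- rendered as a large heading -->
    <p>This is a <b>paragraph</b>.</p>  <!-- "b" is typically rendered bold -->
  </body>
</html>
```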

There's a long (relatively speaking) history of how HTML in its current form came to be.  Dive Into HTML5 has a very nice summary of it.  I'm not going to repeat what it says, but I do want to quote the following:

HTML has always been a conversation between browser makers, authors, standards wonks, and other people who just showed up and liked to talk about angle brackets. Most of the successful versions of HTML have been "retro-specs," catching up to the world while simultaneously trying to nudge it in the right direction. Anyone who tells you that HTML should be kept "pure" (presumably by ignoring browser makers, or ignoring authors, or both) is simply misinformed. HTML has never been pure, and all attempts to purify it have been spectacular failures, matched only by the attempts to replace it.

But if there is no "pure" standard, why is there such a push for using "standards compliant" browsers?

It's because, for the first time in the history of the web, the latest versions of browsers from all the major manufacturers are actually close to implementing a unified standard.  I believe this is largely due to the fact that browser usage is more diversified than ever.  According to most sources, Internet Explorer (all versions combined) now has less than 50% of browser usage for the first time since the late 1990s.

While it is still the most used browser, the fact that it no longer has an overwhelming majority means that Microsoft can no longer run the show without consulting and collaborating with other browser makers.  At the same time, they've dug themselves into a hole: they don't want to break existing sites that were made to work on previous versions of IE, yet they still want to adhere to standards so that web developers can design pages that work in all browsers without resorting to ugly hacks that literally say, if this is IE do this, otherwise do the normal thing.
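Those hacks are not an exaggeration.  One common technique of the era was Microsoft's own "conditional comments", which only IE parses; every other browser sees an ordinary HTML comment and ignores it.  A sketch of the pattern (the stylesheet file names are made up for illustration):

```html
<!-- Normal stylesheet, used by all browsers -->
<link rel="stylesheet" href="styles.css">

<!-- Conditional comment: only IE version 6 or lower ("lte" = less than
     or equal) acts on this; other browsers treat it as a plain comment -->
<!--[if lte IE 6]>
<link rel="stylesheet" href="ie6-fixes.css">
<![endif]-->
```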

Joel Spolsky has a very insightful post on how difficult a situation Microsoft was in when creating IE8.  Particularly apt I feel are the following analogies:

And if you’re a pragmatist on the Internet Explorer 8.0 team, you might have these words from Raymond Chen seared into your cortex. He was writing about how Windows XP had to emulate buggy behavior from old versions of Windows:

Look at the scenario from the customer’s standpoint. You bought programs X, Y and Z. You then upgraded to Windows XP. Your computer now crashes randomly, and program Z doesn’t work at all. You’re going to tell your friends, “Don’t upgrade to Windows XP. It crashes randomly, and it’s not compatible with program Z.” Are you going to debug your system to determine that program X is causing the crashes, and that program Z doesn’t work because it is using undocumented window messages? Of course not. You’re going to return the Windows XP box for a refund. (You bought programs X, Y, and Z some months ago. The 30-day return policy no longer applies to them. The only thing you can return is Windows XP.)

And you’re thinking, hmm, let’s update this for today:

Look at the scenario from the customer’s standpoint. You bought programs X, Y and Z. You then upgraded to Windows Vista. Your computer now crashes randomly, and program Z doesn’t work at all. You’re going to tell your friends, “Don’t upgrade to Windows Vista. It crashes randomly, and it’s not compatible with program Z.” Are you going to debug your system to determine that program X is causing the crashes, and that program Z doesn’t work because it is using insecure window messages? Of course not. You’re going to return the Windows Vista box for a refund. (You bought programs X, Y, and Z some months ago. The 30-day return policy no longer applies to them. The only thing you can return is Windows Vista.)

This was posted in 2008 while IE8 was still in Beta testing.  He speculated that the IE8 team would probably reverse their decision to have the default mode be standards compliant because it broke about half the pages in some way.  However Microsoft found a clever solution that I feel was the best possible compromise between the pragmatists and idealists.

During the Beta (as well as in the final release) IE8 defaulted to "standards" mode, which broke many pages; however, there is a button to view a page in "compatibility mode".  During the Beta phase Microsoft recorded popular sites where "compatibility mode" was turned on by many users, and from that came up with a "Compatibility View list" that users can choose whether to use when they run IE8 for the first time.  The list is updated through Windows Update about every two months, and the owners of the sites on it are contacted to let them know they are on the list and how they can get off it by bringing their sites up to standards.
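Site owners don't have to wait for the list, either: IE8 also honors a per-page switch via a meta tag in the page's head.  A sketch showing the two directions a site can opt (both values are real IE8 tokens; a page would use one or the other):

```html
<!-- Ask IE8 to render this page as IE7 would (a legacy site
     that hasn't been updated to the standards yet) -->
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7">

<!-- Or opt in to the most standards-compliant mode available -->
<meta http-equiv="X-UA-Compatible" content="IE=edge">
```

The tag needs to appear near the top of the head, before most other elements, for IE to honor it.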

So how did this whole mess get started?

Looking at a history of HTML we see that HTML was first introduced in about 1990 and there wasn't really a formal specification.

After several browsers started appearing on the scene, HTML 2 was published in 1995 as a retro-spec "roughly corresponding to the capabilities of HTML in common use prior to June 1994."

In 1996 "HTML 3.2 released which retrofitted the Netscape inventions into the HTML 'standard'" and "Internet Explorer 2 [was] released, virtually bug for bug compatible with Netscape (except for a few new bugs…)"

Finally, HTML 4 was "released" in 1997, and it represents more or less the current state today in 2010.  (There are technically newer specs like XHTML, but the feature set has been more or less untouched.)

The First Browser War greatly contributed to the mess by pushing Microsoft and Netscape to furiously try to outdo each other in features (such as the classic 90's effects Blink and Marquee), with little attention paid to the buggy consequences of rapid feature development.

After winning the browser war, Microsoft didn't really have much reason to spend further development time and money on browser development, leading to a mini-"Dark Ages" for the browser world.

If we look at the timeline we can see that IE1 was released on August 16, 1995.  IE2 was released on November 22, 1995, a mere three months later, trying desperately to play catch-up with Netscape, which had almost a year's head start, having been released on December 15, 1994.

IE continued with about 1 major version every 1-1.5 years until IE6 on August 27, 2001, by which point IE (all versions) had over 90% of the browser market.  When you own 90% of the market, standards be damned, you are the standard.

After that, development stagnated even as IE usage continued to rise until 2004 when Firefox 1 was released.

On February 15, 2005, plans for IE7 were announced, with the final release happening on October 18, 2006, a full five years after IE6.  Six days later, on October 24, 2006, Firefox 2.0 was released.  Both featured tabbed browsing along with enhanced security and phishing filters.

Also in 2006 the Mozilla Corporation (a commercial subsidiary of the non-profit Mozilla Foundation, formed to help fund the operations of the Foundation and get around the limitations of a non-profit entity) received a large sum of money (85% of their $66.8 million revenue in 2006) from "assigning [Google] as the browser's default search engine, and for click-throughs on ads placed on the ensuing search results pages."  With that kind of funding in addition to community contributions to the project Firefox has been able to successfully keep up and even pull ahead of Microsoft's efforts.

With the ubiquity of the internet and Google fervently pushing its own browser offering, Google Chrome, browser usage is more diversified than ever, making standardization ever more important.  The benefit of standardization, somewhat counter-intuitively, is giving choice to the consumer.

Standardization encourages features that are external to defining how a page is displayed.  With designers spending less time trying to simply get a site working properly in every major browser, they now have more time to spend on things that actually matter: giving customers and users a better experience.  Browsers also become more diversified in their personality, out of necessity to differentiate themselves from each other.  Chrome favors minimalism and simplicity, Firefox favors customization and flexibility, and Internet Explorer favors… familiarity?  I'm not actually sure to be honest, but it definitely has its own personality.

The push for upgrading browsers isn't an anti-Microsoft movement (although it probably was historically) but a movement to increase the amount of choice a user has without worrying about their favorite site not working in their favorite browser.

Coming back, after a long tangent, to the main question: "Why do web designers/developers hate Internet Explorer (6/7)?"  The answer is that they are the last remaining browsers in popular use that still regularly need to be handled differently to get sites working properly.  At least this is true on the desktop; mobile browsers are a whole other can of worms, with the difference that most people don't yet expect every site to work perfectly on mobile devices.  There are still tiny quirks here and there in current browsers, but for the most part pages are write once, run anywhere.
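As a sketch of the kind of "tiny quirks" that remain as of 2010: experimental CSS features ship behind per-browser vendor prefixes, so getting something like rounded corners to work everywhere means repeating the same rule several times:

```html
<style>
  /* Rounded corners, circa 2010: one declaration per browser family */
  .box {
    -moz-border-radius: 8px;     /* Firefox */
    -webkit-border-radius: 8px;  /* Chrome and Safari */
    border-radius: 8px;          /* the standard property, where supported */
  }
</style>
```

It's repetitive, but it's a far cry from the IE6 days of maintaining an entirely separate stylesheet for one browser.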

(HTML5 is coming/partially here, but it is a more thought-out standard that seeks to build on top of, rather than replace, the current one.)
