Browse any major newspaper (digitally, of course) from the last three years and there's bound to be some story misrepresenting cloud computing. There's Amazon Elastic Compute Cloud, Google Apps, Microsoft's Azure, the Apple iCloud, Salesforce.com, and a host of others that are kind of "Cloud-lite" (Dropbox, Box.net, et al.)
They all fall under the umbrella term "cloud computing," and any article on them tends to conflate very different services into the same amorphous water vapor. For instance, Amazon's cloud is run like a standard data center: there are computers running operating systems, and it's up to the system administrators to configure those systems in much the same manner as they would the rack-and-steel versions. By contrast, Google's cloud is not a cloud-computing bank at all (at least relative to Amazon); it's rentable services. So instead of configuring a computer, a Google cloud developer is only concerned with the programs. Apple's cloud is another beast entirely: it's Internet storage with the Apple brand name stamped on it.
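To make the Amazon/Google contrast concrete, here's a minimal sketch in Python. The handler below is illustrative, not any platform's exact API, but it's the shape of what a platform-style cloud like Google's asks for: just the program, nothing else.

```python
# On a rentable-services cloud (Google's model), the developer ships only
# application code -- a WSGI callable like this -- and the platform
# supplies the machines, operating system, and scaling.

def app(environ, start_response):
    """Tiny WSGI application: the entire deployable unit on this model."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from the cloud"]

# On Amazon's model, the same developer would first provision and
# configure a virtual machine (choose an OS image, open firewall ports,
# install a web server) before this code could run -- exactly the
# sysadmin work the other model takes off your hands.
```

That's the trade in a nutshell: the Amazon side gives you the whole machine to configure; the Google side gives you only the slot where your program goes.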
So then, what is cloud computing? All three of those examples are very different and not interchangeable. I do most of my development in Amazon's cloud, but if I wanted to switch to Google, I would need to change everything: I'd have to learn a completely different architecture and give up a lot of the system controls I'm accustomed to. Is that bad? Not really, but it's not practical for my company. Amazon is really good for shops that need more computing power and already have a data center infrastructure. Often (but not always) getting the services to work in the cloud is just a matter of uploading a few configuration files to a virtual cloud host. With Google, we'd have to start from scratch, which might explain why established businesses are loath to adopt the platform (so far). And Salesforce.com, even though it's more like Google than Amazon, pivots its success on its core competency: it does financial, sales, and marketing calculations the way Microsoft Excel does, a tool the business community knows well, rather than in Python or Ruby (as Google does), which it doesn't.
What I'm getting at is whether cloud computing even means anything. There are other buzzwords and jargon orbiting cloud computing (IaaS and SaaS being high on my hit list), but none of these terms help clarify what should be simple to explain. So simple that virtual computing designers dropped "virtual" in favor of "cloud" a long time ago.
If we stick to the metaphor, imagine a cloud. It has a shape and a size, but it's ever-changing and always moving, and yet we still call it a cloud even when an hour has gone by and it no longer looks like a pinwheel and has drifted out of sight. Cloud computing is like that: the resources available to any given user at any given time change internally, but externally its function is always recognizable. Under that model, all of the above are cloud-computing environments, and just like real clouds they have different specialties and functions. In that regard, I'm not a cloud-computing purist: it doesn't need to be AWS-like to be a cloud.
But I do think that cloud computing isn't so special or game-changing. It's more bar-lowering, since small businesses can now get fast computers and hunks of storage on the cheap. Before virtual computing, though, we still had fast computers and hunks of storage, and that's the same hardware powering all these clouds. And even though the concept has been around for a while and companies have always had their own internal compute farms (basically private clouds, whether virtualized or not), it was Amazon that sparked this change when it experimented with leasing its excess Christmas computing capacity.