Sunday, February 15, 2009

Why overcast skies are the way forward

No, I am not referring to the economy, which has probably tanked a little further since you started reading this. I am talking about enterprise applications, and how independent, irreversible forces are shaping their future and leading them to run in a cloud. Cloud computing has been around since long before the term came into vogue. Several government initiatives (Singapore and Malaysia, to name a couple) provide large on-demand computing capabilities for furthering all kinds of research. Universities around the world have had such capabilities for a while now. More recently, private enterprises have jumped into the fray, with companies like Tata (through their CRL facility in India) providing state-of-the-art computing capabilities for large-scale engineering problems.

What has generated the buzz around cloud computing, however, is that the technology is now available to all comers at a very affordable price. In many ways, cloud computing is a leveler that puts scale within the reach of a small enterprise, much as the Internet made universal reach possible for the small business, and much as blogs spawned a million new authors and found fans for these hitherto unknown writers.

Just as eBay created a slew of businesses that provided the picks and shovels for the eBay community, cloud computing will create its own PayPals, without which the business of cloud computing cannot succeed or grow.


Cloud computing is a new capability that removes upfront cost barriers for a growing enterprise. And like any new capability, it requires the creation of an ecosystem before it can go mainstream. The next two years will see the emergence of dominant players in that ecosystem, who will then guide the evolution of cloud computing in the years to come.


So what are the forces that are shaping cloud computing today?




  • Emergence of public clouds from Amazon, IBM, Google, Microsoft, Sun, and others. The capability exists today, and the players are not fly-by-night operators.


  • Investing huge amounts of money upfront is problematic even at the best of times, not to mention the pain of doing so during a recession.


  • In-house hardware becomes obsolete and turns into a bottleneck even as the software evolves to higher levels of complexity over the same period. And then there are the usual problems with air conditioning, cabling, power capacity, rack space, etc.


  • Increased connectivity and the growing volume of available information demand more granular decision making, which pushes most enterprise applications to consider migrating to the cloud.


  • Smaller players in information-driven industry segments are no longer constrained by capital expenditure, which demands true innovation in those industries since the playing field is a lot more level than before. (As a leading CTO of a financial services firm once told me, "I want my competition to beat me at trading because they executed a better trade, not because they had better hardware or a bigger database.")


  • A shorter hype cycle for most new technologies. The time it takes for something to go from obscurity to ubiquity has gone down considerably, which requires companies to execute and deliver faster than conventional Internet-time metrics would suggest.


So what are some of the prerequisites for this technology to become truly mainstream? For one, the picks and shovels need to be tailored to this environment.


And that entails:



  • Platform neutrality

  • Language neutrality

  • Open protocols for communicating with the cloud ecosystem

  • Ability to manage large amounts of data with the lowest possible latency (the higher the latency, the longer your application has to run to complete its work, and the more you pay; see the back-of-the-envelope sketch after this list)

  • Monitoring capabilities for online systems

  • Analysis capabilities for runs that have completed


Most of all, it requires a change in mindset for everyone associated with the process of architecting, building, testing, and deploying applications that are designed to run in a cloud. Scaling, performance, high availability (HA), and volume characteristics have to be designed into the enterprise application and cannot be deferred to version 2.0 or later.



And it is as cataclysmic a change as the one client-server applications had to go through with the advent of the web.

Data Management: The elephant in the room



Once you have the hardware, the provisioning system, a state-of-the-art network, on-demand acquisition of computing resources, and applications that are multi-threaded and can take advantage of the available computing power, there is just one thing that prevents you from completing your work quickly enough and releasing the resources back into the cloud (whether it is CAPEX or OPEX, the people responsible for P&L statements are always looking to bring costs down). And that is latency. The typical application spends more time acquiring its data than operating on it, and if you are tied to disk-page-based data storage and retrieval systems, your latency is limited by how fast the slowest piece of the application runs.

Cloud computing is not necessarily cheaper; if anything, it puts a premium on failures and poor-quality software. It is only more affordable, in that you pay for what you use and can amortize the cost over a longer period of time. Provisioning your data so that it is readily available, or even pushed to you ahead of time, and managing it in memory rather than on disk, is the only way to cut that latency down.
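
As a rough illustration of what "readily available or even pushed to you ahead of time" can mean in practice, here is a minimal sketch in plain Java, deliberately not tied to any particular product: a worker that warms an in-memory map from slow storage before the paid compute phase begins, so the hot loop never touches disk. The SlowStore interface and the working set are hypothetical stand-ins.

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Hypothetical sketch: prefetch the working set into memory before the
    // compute phase, so paid-for compute time is not spent waiting on disk.
    public class PrefetchingWorker {

        /** Stand-in for any slow, disk- or network-backed store (assumed). */
        interface SlowStore {
            byte[] read(String key);   // expensive: a disk seek or a network hop
        }

        private final Map<String, byte[]> hotData = new ConcurrentHashMap<>();

        /** Warm-up phase: pull (or have pushed) everything the job will need. */
        void prefetch(SlowStore store, List<String> workingSet) {
            for (String key : workingSet) {
                hotData.put(key, store.read(key));   // pay the slow read once, up front
            }
        }

        /** Compute phase: every lookup is an in-memory hit, not a disk read. */
        long process(List<String> workingSet) {
            long checksum = 0;
            for (String key : workingSet) {
                byte[] value = hotData.get(key);     // memory access, no I/O wait
                for (byte b : value) {
                    checksum += b;
                }
            }
            return checksum;
        }
    }

The same idea underlies read-through caching and server-initiated push: the expensive reads happen before, not during, the metered compute phase.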


A distributed data management solution like GemFire (the one I am most familiar with) from GemStone Systems fits this requirement very nicely. Built for scale, with dynamic resource management (the ability to expand and contract in response to load characteristics), and with the ability to partition data and route behavior to the nodes that have the data, GemFire provides one of the more important shovels for the cloud computing community. And while this aspect of cloud computing still needs further validation before winners are picked, being the most scalable data management platform certainly does not hurt. Throw in support for multiple languages and multiple interfaces, like objects and SQL, into this elastic cluster, and you have something that has been in production for years at high-profile institutions.
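
To make "partition data and route behavior to nodes that have the data" concrete, here is a deliberately simplified sketch in plain Java. It is not the GemFire API; it only illustrates the idea of hashing each key to an owning node and shipping the computation to that node instead of dragging the data across the network. The Node class and the single-process "cluster" are assumptions for illustration.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Function;

    // Simplified, hypothetical sketch of partitioned data with data-aware routing:
    // each key is hashed to an owning node, and the computation is executed
    // against that node's local partition instead of pulling the data to the
    // caller. This illustrates the idea only; it is not the GemFire API.
    public class PartitionedClusterSketch {

        /** One cluster member, holding only its own partition of the data. */
        static class Node {
            final Map<String, Double> localData = new HashMap<>();

            <R> R execute(String key, Function<Double, R> task) {
                return task.apply(localData.get(key));   // behavior runs where the data lives
            }
        }

        private final Node[] nodes;

        PartitionedClusterSketch(int nodeCount) {
            nodes = new Node[nodeCount];
            for (int i = 0; i < nodeCount; i++) {
                nodes[i] = new Node();
            }
        }

        private Node ownerOf(String key) {
            int h = key.hashCode() % nodes.length;
            return nodes[h < 0 ? h + nodes.length : h];
        }

        void put(String key, double value) {
            ownerOf(key).localData.put(key, value);
        }

        /** Route the function to the node that owns the key, not the data to the caller. */
        <R> R onKey(String key, Function<Double, R> task) {
            return ownerOf(key).execute(key, task);
        }

        public static void main(String[] args) {
            PartitionedClusterSketch cluster = new PartitionedClusterSketch(4);
            cluster.put("IBM", 93.50);
            // The mark-to-market behavior travels to the owning node; the price stays put.
            double exposure = cluster.onKey("IBM", price -> price * 1000);
            System.out.println("Exposure: " + exposure);
        }
    }

In a real product such as GemFire, membership, redundancy, and rebalancing are handled for you; the sketch only shows why co-locating behavior with data removes a network hop from the critical path.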


Candidate applications for the cloud:


  • Applications that need elastic scaling capabilities, like online gaming

  • Simulation runs for risk management

  • Portals and software-as-a-service applications

  • Collaborative applications with bursty load characteristics that need to provide good QoS to their end users

  • Design and rendering software

  • Animation studios



Note that all of these have to be designed to be seamlessly scalable from the ground up in order to benefit from the cloud. As the years go by and economies of scale reduce the cost of running in a cloud, many more will get added to this list.


Overcast skies are here to stay and I think that is a good thing for the software community.




