Saturday, January 5, 2013

The uncertainty of forecasting clouds


As I wrote in a previous post, the adoption of large-scale cloud computing services suffers from many ills, for which I can only envisage one medicine: regulation.

It is tempting to think that cloud adoption is just a matter of time; that companies giving up their data centres and most of their IT staff is about to happen; that so-called "mega-vendors" will dominate the market, offering all the application, platform and infrastructure services that any non-IT company might ever desire. I, for one, am a believer. I think that most companies will have much smaller information departments than they do today. They will no longer have IT departments as such, but rather a few roles such as Information Architects, Security Experts, Data Scientists (high in the hype cycle right now) and, depending on the type of business, possibly a handful of developers.

Big corporations and institutions may keep a "private cloud", either for cost reasons, to keep highly confidential data in-house, or both. Cost is indeed an issue: nobody has yet proved that renting capacity in a large data centre is cheaper than owning one, and in fact it may never be. Office buildings are a case in point: it is far cheaper for a corporation to own one than to rent it. On the other end of the spectrum, small to mid-sized companies may simply outsource everything to cloud providers and not have an IT department at all. I am sure such cases already exist.

But for most companies cloud is still a challenge and will remain one until there is enough regulation and enforcement thereof. Technically, moving to the cloud is not difficult; the services have been accessible for a few years already. But there are a few reasons why businesses may not be prepared yet:

1- The cloud provider may go down, possibly even because of an outage at some other cloud provider with whom your prime contractor has an underpinning contract. A fresh example: Netflix going down, possibly because of an outage at Amazon. (A sketch of the client-side failover logic this forces on customers follows this list.)

2- Data leakage: it is almost impossible to prevent at least a few system administrators from accessing your data. The only effective protection is encryption, but encryption is a tricky control, extremely hard to manage. First, there is ultimately a key that is a single point of failure: encrypting is pointless if the system administrators, or anybody else, have access to the key. Second, if the key is lost then the data is lost with it. Third, keys should be changed regularly, but then they need to be backed up somewhere safe (certainly not with the data itself). And finally, depending on what is encrypted, different qualities are affected: database encryption usually has an impact on performance, may restrict available features and requires licensing of additional modules; file-system encryption has more or less the same issues; and volume encryption is useless while the system is running and the administrator has access to it. Encrypting data that needs to be accessed by multiple people is obviously complex, as many keys must be managed; and if applications need to access the data, the keys have to be saved somewhere in the system, which makes them accessible to system administrators (and hackers). A minimal key-management sketch also follows this list.

3- Undefined boundaries: who else is using the infrastructure? What if your system is running on a virtual machine and one of your "neighbours", hosted in another VM on the same physical server, breaks out and takes control of that server? Then your machine is compromised too.

4- Legal/compliance: who knows where your systems are running? There may be constraints on location; for example, your data may not legally be allowed to leave its country of origin. Moreover, if there is a police investigation into one of your "neighbours" (i.e. someone using the same infrastructure as you) and the police need access to your files to follow a trail, you will have to deal with that too.
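To make the first point concrete, below is a minimal sketch, in Python, of the kind of client-side failover logic that customers end up writing because providers can and do go down. The endpoint URLs are hypothetical placeholders for a primary provider and an independent fallback; this illustrates the defensive posture, not any particular vendor's API.

    import time
    import urllib.request

    # Hypothetical endpoints: a primary cloud provider and an
    # independent fallback. Not any real vendor's API.
    PRIMARY = "https://api.primary-provider.example/v1/status"
    FALLBACK = "https://api.fallback-provider.example/v1/status"

    def fetch_with_failover(retries=3, timeout=5):
        """Try the primary with exponential backoff, then fail over."""
        for endpoint in (PRIMARY, FALLBACK):
            delay = 1.0
            for _ in range(retries):
                try:
                    with urllib.request.urlopen(endpoint, timeout=timeout) as resp:
                        return resp.read()
                except OSError:
                    # Provider unreachable: wait, retry, eventually move on.
                    time.sleep(delay)
                    delay *= 2
        raise RuntimeError("both providers are down")

Note that the fallback only helps if it does not share an underpinning contract with the primary: if both ultimately run on the same infrastructure, they go down together.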
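The second point is easier to see with an example. Here is a minimal sketch of symmetric encryption with key rotation, assuming the third-party Python "cryptography" package. The interesting part is not the API but where the key lives: whoever can read the key can read the data, and an application that needs the data has to keep a key somewhere a system administrator (or an intruder) can find it.

    from cryptography.fernet import Fernet, MultiFernet

    # The key is the single point of failure: anyone who reads it reads
    # the data, and if it is lost the data is lost. It must therefore be
    # stored (and backed up) away from the data it protects.
    key = Fernet.generate_key()
    token = Fernet(key).encrypt(b"customer record")

    # Regular key rotation: re-encrypt old tokens under a new key while
    # remaining able to decrypt anything written under the old one.
    new_key = Fernet.generate_key()
    rotator = MultiFernet([Fernet(new_key), Fernet(key)])
    token = rotator.rotate(token)  # now encrypted under new_key only

    print(Fernet(new_key).decrypt(token))  # b'customer record'

Even in this toy, the operational questions from the list above appear immediately: where do key and new_key live, who may read them, and where is the old key backed up while data encrypted under it still exists?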

The main obstacle to solving these issues may be a lack of motivation on the part of those "mega-vendors". Lack of regulation benefits the established large players. First, they are happy to host systems for anyone on the planet within their own legal framework and on their own terms. Second, without regulation only the big vendors have enough credibility in the customers' eyes. And then there are the technology issues: regulation leads to standardisation and accreditation, and consequently to more competitors in the market and reduced profit margins. Besides, there is no doubt that in the cloud world open source is king: a provider will always prefer using and developing open source to paying licence fees to someone else, since they are not in the business of selling code but services. So the large vendors that live on selling licences are not really interested in standardisation, but rather in keeping a competitive advantage by offering more and better-integrated services than their competitors.

Anyway, it is certainly interesting to live in such a transition era.
