Thursday, 30 January 2014

Cloud Computing: Cloud Storage And Data Recovery Strategies

Data backup, and data recovery when something goes wrong at the vendor's end, is a critical issue in cloud computing. Many vendors provide storage at their own premises at a price that small and medium businesses and other customers cannot afford. In this article I discuss three strategies to address this issue: developing an application at the customer's premises that collaborates with the cloud and records the data on local drives, deploying inter-private cloud storage, and adopting a geographical redundancy approach.

Cloud vendors assure their customers of data security, but a strong virus attack or some other unforeseen event can still do damage, no matter how much transparency vendors provide for their data services and applications. The cloud storage system must therefore be robust: if a failure occurs in one area, the whole system should not fail at once but should continue to work. This concept is commonly known as "no single point of failure". The system should also be able to recover in a minimum span of time, and it should keep its own backups to ensure maximum data security.

The following are the disaster recovery strategies discussed in this article.

·         Using an application at the user's premises to back up the cloud onto local drives
·         Inter-private cloud storage (SAN deployment)
·         Geographical redundancy approach

Using a Software Application to Back Up Cloud Data onto Local Drives at the Customer's Premises

In this approach, a computer placed at the customer's premises runs a purpose-built backup application. The application interacts with the cloud vendor on a regular basis through a secured channel and backs up the data from the cloud storage onto local drives. It ensures security by enforcing a good data encryption policy, and only authorized users can use it; data moves over secured routes protected by strong encryption and network security controls. At first it backs up the complete data set from the cloud, and after that it checks for changes and backs up each update one by one.
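To make the idea concrete, below is a minimal Python sketch of such a backup application, assuming the vendor exposes an S3-compatible API; the bucket name, endpoint and local paths are placeholders, and a real deployment would add encryption of the local copies and proper credential handling.

import os
import json
import boto3  # assumes the vendor exposes an S3-compatible API over a secured (TLS) channel

BUCKET = "customer-data"          # placeholder bucket name
STATE_FILE = "backup_state.json"  # remembers what has already been copied
LOCAL_ROOT = "/backup/cloud"      # local drive at the customer's premises

def load_state():
    # ETags of objects already backed up, so later runs fetch only the changes
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {}

def backup(s3):
    state = load_state()
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            key, etag = obj["Key"], obj["ETag"]
            if state.get(key) == etag:
                continue  # unchanged since the last run
            target = os.path.join(LOCAL_ROOT, key)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            s3.download_file(BUCKET, key, target)  # copy the object to the local drive
            state[key] = etag
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)

if __name__ == "__main__":
    # endpoint and credentials are placeholders for the vendor's secured channel
    s3 = boto3.client("s3", endpoint_url="https://storage.cloud-vendor.example")
    backup(s3)

The first run copies everything (the one-time full backup); later runs copy only objects whose ETags have changed, which matches the "complete backup first, then updates one by one" behaviour described above.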

This approach is feasible if the quantity of data is small and changes do not occur rapidly. But what if a huge bulk of data has to be managed: will the local storage drives be enough? Rapid changes in the data are another concern, and they require that the software application interacting with the cloud be efficient enough to handle such scenarios.

In the case of a huge bulk of data, only the needed data is backed up, just as in SharePoint hosting. This approach is very useful and has a low maintenance cost. Moreover, by using this approach customers are not entirely dependent on their vendor and can shift their data backups to another vendor easily.

To guard against accidental deletion of data, employees at the customer's end are advised to take a backup every hour, so that in any data-loss scenario the lost data can be copied back from the local storage devices. Moreover, in case of a network failure, the services provided by the cloud vendor can continue to be used from the local copy at the customer's end, so a network-failure disaster can also be survived. As soon as transmission between the customer's end and the cloud vendor's end is restored, the data can be copied back to cloud storage. In this way the customer can continue routine work without any service degradation.
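A hedged sketch of that hourly cycle is shown below; it reuses a backup function like the one sketched earlier (passed in as backup_fn) and a simple pending list of files edited locally while the link was down, both of which are assumptions rather than any vendor API.

import time
from botocore.exceptions import ClientError, EndpointConnectionError

BUCKET = "customer-data"  # placeholder bucket name

def hourly_cycle(s3, backup_fn, pending):
    # pending holds (local_path, key) pairs changed while the cloud link was down
    while True:
        try:
            while pending:
                local_path, key = pending[0]
                s3.upload_file(local_path, BUCKET, key)  # push local edits back to the cloud
                pending.pop(0)
            backup_fn(s3)  # hourly incremental copy onto the local drives
        except (EndpointConnectionError, ClientError):
            pass  # vendor unreachable: keep working from the local copy, retry next hour
        time.sleep(3600)  # one backup cycle per hour, as suggested above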



Inter-Private Cloud Storage

Another important strategy is inter-private cloud storage using a Storage Area Network (SAN) to enhance data accessibility and backup. A SAN consists of very efficient, fault-tolerant machines.
A SAN is an expensive technology, because delivering maximum performance requires a large number of high-performance routing devices.
Since a SAN only provides accessibility and storage, efficient routing protocols must be designed to achieve maximum performance in minimum time. One way is to acquire ISP lines, but this does not help much, because ISP bandwidth is limited and network uniformity is another issue; that is why the required KPIs are hard to meet. Technically, cloud storage is an efficient, reliable and adaptable technology, and it is becoming the choice of many enterprises for online backup. An enterprise can access the cloud storage via its intranet (such as a SAN solution), which is essentially the idea of private cloud storage: a storage cloud designed behind the enterprise's firewall, holding all of the enterprise's data.
The best way to achieve maximum efficiency is to allow private clouds to contribute to a common cloud service. A shared public cloud storage service incorporated into a number of private clouds is known as inter-private cloud storage, and it provides an efficient backup for the enterprise.
All data is stored on servers, and the key is that every server should have backup servers as well, located in widespread geographical locations. These backup servers comprise local backup servers (LBS) and remote backup servers (RBS). Data is stored in the LBS first. Storing the data in the RBS depends on the speed of the network: if a huge bulk of data whose size runs into gigabytes has to be stored, a fast network and efficient algorithms are required, which makes storing the data in the RBS a very challenging task. A one-time full backup is therefore used to address this issue, and later data updates are backed up on a daily basis. After the one-time full backup, daily updates can easily be backed up and the network is used more efficiently, because the size of the transferred data is reduced. Minimizing the size of the data reduces the amount of data flowing over the network, which in turn reduces the load on it, so a large number of queries can be handled very efficiently.
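As an illustration of the LBS/RBS split, the sketch below writes incoming data to the local backup server first and then, once a day, copies only the files that differ to the remote backup server; the mount paths are hypothetical, and the SHA-256 comparison simply stands in for whatever change detection a real deployment uses.

import hashlib
import os
import shutil

LBS_ROOT = "/mnt/lbs"   # local backup server mount (hypothetical path)
RBS_ROOT = "/mnt/rbs"   # remote backup server mount over the WAN (hypothetical path)

def file_digest(path):
    # Hash a file so only changed files are shipped to the remote site
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def store(relative_path, source_path):
    # Write incoming data to the local backup server first (fast, same site)
    target = os.path.join(LBS_ROOT, relative_path)
    os.makedirs(os.path.dirname(target), exist_ok=True)
    shutil.copy2(source_path, target)

def replicate_daily():
    # Once a day, copy only the files that differ to the remote backup server
    for root, _, files in os.walk(LBS_ROOT):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, LBS_ROOT)
            dst = os.path.join(RBS_ROOT, rel)
            if os.path.exists(dst) and file_digest(src) == file_digest(dst):
                continue  # unchanged since the last replication
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)  # only the delta crosses the slower WAN link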

Geographical Redundancy Approach

Another possible approach is geographical redundancy. This strategy is also used in conventional infrastructure, but because of its huge resource requirements it is very costly and often not feasible. There are two regions, Region 1 and Region 2, and each region is a replica of the other. If one region fails, the second region takes over, so the availability of the system is ensured.
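A minimal sketch of the failover decision is shown below, assuming each region exposes a health-check URL (the endpoints here are hypothetical); clients are simply directed to the first region that answers, so Region 2 takes over when Region 1 stops responding.

import urllib.request
import urllib.error

# Hypothetical health-check endpoints for the two replicated regions
REGIONS = [
    "https://region1.example.com/health",
    "https://region2.example.com/health",
]

def healthy(url, timeout=3):
    # A region is considered healthy if its health check answers with HTTP 200
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def active_region():
    # Pick the first healthy region; Region 2 takes over if Region 1 fails
    for url in REGIONS:
        if healthy(url):
            return url
    raise RuntimeError("no healthy region available")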

Conclusion


The cloud is a very quick and versatile technology, but disaster recovery of the data is of great concern. I have discussed three different strategies for disaster recovery, each with its own advantages and disadvantages. The first strategy, backing up cloud data onto local drives at the customer's premises, is simple and cheap but will not be feasible for large amounts of data. The second strategy, inter-private cloud storage using a SAN, is expensive and will not be feasible for geographically widespread data. Thirdly, the geographical redundancy approach is again very expensive. So we see that we can adopt different disaster recovery strategies depending on our requirements.
