Among the previous articles you’ve read in our blog, you may have noticed that besides discussing how good business continuity management can save organisations from disaster, we also like to point out where it can simply save you money. Here’s one of those cases. Satellite communications may intuitively seem to be more expensive than landline links. It’s easy to assume that with project and launch costs running into astronomical amounts, it won’t necessarily be the cheapest option for making phone calls or network connections. But is that really the case?
So what will you choose: public cloud, private cloud – or perhaps a solution in between? The flexibility and scalability of the cloud have also made it well suited to partial use, namely the hybrid cloud solution. Those who can’t quite make up their mind can have as much or as little of the cloud as suits them. However, it’s better still to approach this resource with a clear IT strategy in mind and to make a hybrid cloud solution a deliberate choice, rather than a vague default. Here are two possibilities that could drive a hybrid cloud decision.
Business continuity is often about reinforcing existing infrastructure or eliminating sources of business disruption. Bringing in techniques to accelerate or multiply results thanks to good business continuity may not be so frequent, but here’s one that may well do that. It’s version control, which is used when several knowledge workers need to simultaneously work on the same computer files to create advantage for the organisation – but without stepping on each other’s toes. Version control technology started in software development. However, it can be used for projects to create web content, coordinated product rollouts, corporate business plans and more.
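The core trick that lets several people work on the same files without stepping on each other’s toes can be sketched in a few lines. Below is a toy three-way merge in Python, invented purely for illustration and far simpler than what real tools like Git do: given the original file and two independently edited copies, any line changed by only one person merges automatically, and only lines both people changed become conflicts.

```python
# A toy illustration of what a version control system does when two people
# edit the same file: each line where only one side diverged from the base
# can be merged automatically; lines both sides changed are conflicts.
# (Real tools such as Git use far more sophisticated diff algorithms.)

def three_way_merge(base, alice, bob):
    """Naive line-by-line merge; assumes no lines were added or removed."""
    merged, conflicts = [], []
    for b, a, c in zip(base, alice, bob):
        if a == c:                 # both agree (possibly both unchanged)
            merged.append(a)
        elif a == b:               # only Bob changed this line
            merged.append(c)
        elif c == b:               # only Alice changed this line
            merged.append(a)
        else:                      # both changed it differently: conflict
            conflicts.append((b, a, c))
            merged.append(a)       # keep Alice's version, flag for review
    return merged, conflicts

base  = ["title: Q3 plan", "owner: TBD",   "budget: TBD"]
alice = ["title: Q3 plan", "owner: Alice", "budget: TBD"]
bob   = ["title: Q3 plan", "owner: TBD",   "budget: 50k"]

merged, conflicts = three_way_merge(base, alice, bob)
print(merged)      # both edits land in one merged file
print(conflicts)   # empty: the two edits did not collide
```

Because Alice and Bob touched different lines, both changes survive in the merged result with no manual intervention, which is exactly the advantage the technique offers to teams working on web content, rollout plans or business documents.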
From the title of this post, some people might immediately think of intuition: that vague and rather flaky resource used when that’s all you have. However, we’re actually thinking of something a little more structured in this context. In the coming age of Big Data and associated worldwide online resources, analytical techniques like those used in business intelligence can be used to detect trends and tipping points. They can give individuals and organisations meaningful information about how likely certain disasters are: for example, ‘there is currently a 90 percent chance that your factory will be flooded to a depth of eighteen inches of water’.
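As a toy illustration of the kind of analysis involved, here is a minimal Python sketch that turns historical river-level records into an empirical flood probability. The data, the threshold and the hypothetical factory are all invented for illustration; real business intelligence tools would use far richer data and models.

```python
# A minimal sketch of the kind of estimate the post describes: turning
# historical records into a probability. The figures here are invented.

# Peak annual river level (metres) near a hypothetical factory, last 10 years.
peak_levels = [1.2, 2.8, 1.9, 3.1, 2.5, 3.4, 2.9, 3.6, 3.0, 3.8]
flood_threshold = 2.4   # level at which the factory floor takes water

# Empirical probability: the fraction of years the threshold was exceeded.
flood_years = sum(1 for level in peak_levels if level > flood_threshold)
probability = flood_years / len(peak_levels)
print(f"Estimated annual flood probability: {probability:.0%}")
```

Even this crude frequency count hints at a trend: the later years in the series sit well above the threshold, which is the sort of tipping point a proper analytical model would flag.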
Let’s proceed by elimination. Servers? Those are the things that fall over when your data centre is hit by lightning and for which you do your disaster recovery planning anyway. Desktop PCs? They’re practically nailed to your desk, so they won’t be going with you as you run for the exit. Laptops? Maybe, although battery power and hard drive fragility may be issues. Smartphone? Compact, highly portable, runs tons of apps but has such a tiny screen. So finally, is the tablet computer the best compromise for IT on the run while you’re trying to get everything else back to normal?
Set it and forget it? Not if it’s a cloud computing solution on which your enterprise is relying to accomplish its daily operations. Due diligence in cloud vendor selection and frequent, regular testing are both key components of the overall process. Taking a leaf out of the banks’ books can be instructive in this context. While many banks have recognised the advantages to be gained by using cloud-based resources, they also know that security and reliability are of paramount importance for efficient, uninterrupted business.
US statesman Benjamin Franklin was famous for many things and for one in particular: his proclamation that “in this world nothing can be said to be certain, except death and taxes”. Well, Benjamin, it seems that modern technology and inflation have conspired to add a couple more items: server crashes and data security breaches. In other words, it’s not a matter of if these events will occur; it’s a matter of when. It’s true that robust, quality IT products can push the ‘when’ so far out that it seems to disappear into the distant future. However, smart organisations assume that both things will happen and take appropriate precautions.
It’s funny how some myths continue to be believed, even by hard-nosed business people. The notion that virtualisation will save a company’s data is such a myth. Although it can be valuable in optimising an organisation’s use of IT resources and reacting quickly to changing IT needs, virtual environments are not inherently safer than independent physical servers. Yet data recovery provider Kroll Ontrack found that 80 percent of companies believe that storing data virtually is no riskier, or even less risky, than storing it physically. Beliefs are one thing, statistics are another: 40 percent of companies using virtual storage were hit with data loss in 2012–2013. What’s going on?
Businesses can’t function if they don’t have customers. When customers find other solutions and move away, it’s therefore a threat to business continuity. Conventional banks may be at risk if a new development in online-only banking takes off. Startup ‘Simple’ (that’s the company’s name), for instance, is giving clients an innovative alternative. Its solution is to eliminate fees, move all the banking activity to the Internet and offer online apps to help track budgets and finances. It makes its money from interest charges and interchange fees, but can work with lower margins than conventional bricks-and-mortar banks that must pay for the operation of high street branches. Is this the end of the old-style banks?
The data snooping debate has quietened down a little recently, even if Edward Snowden’s name still crops up here and there. Whether or not the revelations about intelligence activities have changed much in terms of governmental attitude and behaviour remains to be seen. Pressure can still be applied to Internet, cloud and telecommunications service providers to provide data about users, and the only safe data encryption may be the one you do yourself. Indeed, increasingly large quantities of information are generated every day and are available for analysis by government agencies. But who decides what to do with all the data?
Data deduplication – eliminating repeated data to save storage space and speed transmission over the network – sounds good, right? ‘Data deduping’ is currently in the spotlight as a technique to help organisations boost efficiency and save money, although it’s not new. PC utilities like WinZip have been compressing files for some time. The new angle is doing this systematically across vast swathes of data. By reducing the storage volume required, enterprises may be able to keep more data on disk or even in flash memory, rather than in tape archives. Vendor estimates indicate customers might store up to 30 terabytes of digital data in a physical space of just one terabyte.
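As a rough sketch of how systematic deduplication works, the following Python example splits data into fixed-size blocks and stores each unique block only once, keyed by its hash, keeping just a list of references to reconstruct the original. The tiny block size and sample data are invented for illustration; real systems work with blocks of 4 KB and up.

```python
# A sketch of block-level deduplication: split data into fixed-size blocks,
# store each unique block once (keyed by its hash), keep only references.
import hashlib

BLOCK_SIZE = 8  # tiny for illustration; real systems use 4 KB and up

def dedupe(data: bytes):
    store = {}      # hash -> unique block content
    refs = []       # the file, expressed as a list of block hashes
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep one copy per unique block
        refs.append(digest)
    return store, refs

def rehydrate(store, refs):
    return b"".join(store[d] for d in refs)

data = b"ABCDEFGH" * 5 + b"12345678"     # heavily repetitive input
store, refs = dedupe(data)
print(len(data), "bytes in,", sum(len(b) for b in store.values()), "bytes stored")
assert rehydrate(store, refs) == data    # reconstruction is lossless
```

The more repetition the data contains, the better the ratio: here 48 bytes shrink to 16 stored bytes, and vendor claims of 30:1 rest on the same principle applied to backup sets full of near-identical files.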
Think you need advanced computer skills to set up a phoney bank website and fool people into giving you their money? Think again. DIY phishing is now on offer in kit form. Someone who knows how to set up a personal website or even a Facebook page probably has the level of knowhow required to get started in fraud and identity theft. For business continuity, the threats are multiplied. Instead of having to deal (only) with specialised cybercriminals, organisations and their employees must now be wary of almost anyone and everyone. But is that such a bad thing?
Not everybody chooses the cloud as the first option for backing up data. Despite the advantages of practically limitless storage, pay-as-you-go pricing and resilience, a weak point for the cloud is the network speed for uploading or downloading all those gigabytes (terabytes, petabytes…). The alternative for organisations is to put their own solution in place, something that will let them blast large amounts of data backwards and forwards at high speed. In the old days of IT, an IT team would have been tasked with assembling the requisite components and tweaking them to make them work properly together. But now IT vendors have spotted the need and produced the PBBA (purpose-built backup appliance), a solution whose popularity is growing steadily.
Did you know that in six years’ time each individual on the planet will correspond to over 5,000 gigabytes of stored data? That’s the estimate from market research company IDC and digital storage enterprise EMC, who see worldwide data holdings doubling about every two years to reach 40,000 exabytes (40 trillion gigabytes) by 2020. Right now in 2014, that means making moves to extend and enhance data storage solutions appropriately, and to update those disaster recovery plans too. To store and manage all the data forecast to arrive, new techniques and technologies are available to blend with revamps of existing ones.
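A quick back-of-the-envelope check of those figures, assuming a world population of roughly 7.6 billion by 2020 (our assumption, not IDC’s published input):

```python
# Checking IDC's figures: 40,000 exabytes spread over the world population.
exabyte_in_gb = 10**9            # 1 EB = 1 billion GB
total_gb = 40_000 * exabyte_in_gb
population = 7.6e9               # rough 2020 world population (assumption)
per_person_gb = total_gb / population
print(f"{per_person_gb:,.0f} GB per person")   # a little over 5,000 GB

# Doubling every two years back from 2020 to 2014 is three halvings,
# so 2014 holdings would be about 40,000 / 2**3 = 5,000 exabytes.
print(40_000 / 2**3, "exabytes in 2014")
```

The arithmetic lines up with the headline claim: over 5,000 GB per person, starting from roughly 5,000 exabytes today.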
Despite the publicity given to Big Data and (to a lesser extent) the Internet of Things, their practical advantage has yet to be clarified. It’s difficult to think of them in terms of business continuity when they don’t influence the fortunes of an enterprise, unless you count the negative impact of money spent investigating them. A few companies cite gains in marketing effectiveness, for example by analysing huge amounts of online data from customer interactions, but Big Data is not mainstream – or not yet. Similarly, the Internet of Things, in which phones, PCs, cars, fridges and more are all web-enabled, is a conversation starter rather than a reality. Things would change if either one acquired a killer app.