
How to secure and improve your data connections

Posted by Graham Jarvis, Friday, 07 August 2015

Security is an important issue when connecting to a data centre or the cloud, but adding encryption can increase latency. Graham Jarvis looks at how businesses are solving the security and latency issues.

Data centres are only as secure as the connectivity that links them to a network. Otherwise they are prone to cyber-attacks and information security breaches, which can have catastrophic consequences for any organisation that transmits and backs up data between its own data centres or to the cloud. Network latency compounds the problem: the longer data spends travelling to and from data centres, or to the outside world, the greater its exposure to these threats.

According to Clive Longbottom, analyst and Client Services Director at Quocirca, latency can lead to lost transactions whenever there is a failure in the connectivity, application or platform. High latency also makes real-time uses of IT, such as voice or video transmission, impossible. But he thinks latency is only one part of the equation: the complex mix of network, application and hardware also has to be borne in mind.

“Latency has two effects on security in data centres: the first issue is about how closely you can keep your data centres in synchronisation with each other, and when you are transmitting data you have got to have security keys,” explains David Trossell, CEO of Bridgeworks. So whether you are putting data into the cloud or into a multi-tenanted data centre, it’s crucial to be secure. In other words, no unauthorised person should have the ability to pry into the data itself.

Encrypt data

Underlying the protection of the data centre from an information security perspective, then, is the need for enterprises to encrypt data whenever it is uploaded to the cloud or transmitted between data centres for back-up and retrieval. The encryption needs to happen while the data is at rest, before it is ever sent across a network. It is also worth noting that encrypted data is at its most secure when only one person holds the keys to it.
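As a concrete illustration, here is a minimal sketch of client-side encryption before upload, written in Python against the widely used cryptography package. It is not anything named in the article; the function names are hypothetical, and it simply shows the principle that data is sealed with a locally held key before it ever leaves the site.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt with AES-256-GCM; the nonce is prepended so that the
    key holder (and only the key holder) can decrypt after retrieval."""
    nonce = os.urandom(12)                 # unique per message, never reused
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # stays on site, never uploaded
blob = encrypt_for_upload(b"quarterly back-up data", key)
assert decrypt_after_download(blob, key) == b"quarterly back-up data"
```

Because the key never travels with the data, a breach at the cloud provider or the remote data centre exposes only ciphertext.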

“An AES 256-bit key offers the strongest encryption, but the problem is that a strong security key takes up more computing power, and with encrypted data you shouldn’t be able to perform any deduplication, which looks for repeated patterns in the data,” says Trossell. To improve the speed of data transmission, most organisations would traditionally opt for a wide area network (WAN) optimisation tool, where the encryption occurs while the data is in transit using IPsec.
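Trossell’s point about deduplication is easy to demonstrate. The hypothetical sketch below stores the same 4KB block twice: the plaintext copies hash identically, so a deduplicator keeps only one, while the encrypted copies share no patterns at all, leaving nothing for it to exploit.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
block = b"A" * 4096                         # the same 4KB block, stored twice
plain_blocks = [block, block]
cipher_blocks = [AESGCM(key).encrypt(os.urandom(12), b, None)
                 for b in plain_blocks]

def unique_blocks(blocks):
    """Count distinct blocks the way a deduplicator would: by fingerprint."""
    return len({hashlib.sha256(b).digest() for b in blocks})

print(unique_blocks(plain_blocks))   # 1 -> deduplication stores one copy
print(unique_blocks(cipher_blocks))  # 2 -> no repeated patterns to remove
```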

Encrypting the data at rest makes it more secure, but it sits awkwardly with WAN optimisation: the keys would have to be handed to the WAN optimisation engine so that it could decrypt the data, deduplicate it and then apply the IPsec security protocol for the journey across the wide area network. At the far end, the WAN optimisation engine would strip off IPsec and re-encrypt the data. You now have security keys in two places, and that can be the biggest security risk of all.

“For the highest levels of security, data should be encrypted before it hits storage,” says Longbottom. He adds: “This requires full-stream, speed-capable encryption, and yet this is often not feasible.” The next level down, he says, is to store the data, encrypt it, and then delete the unencrypted version: “This is encryption at rest and on the move, but too many organisations just go for encryption on the move, so if someone can get to the storage media, all of the information is there for them to access. And what is typically overlooked is key management.”
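Longbottom’s “full stream” encryption can be pictured with a hypothetical Python sketch, again assuming the cryptography package: data is encrypted chunk by chunk on its way to storage, so no unencrypted copy ever has to be written and deleted afterwards.

```python
import io
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_stream(src, dst, key: bytes, chunk_size: int = 1 << 20) -> None:
    """Read plaintext from src and write ciphertext to dst, chunk by
    chunk, so the plaintext as a whole never hits storage."""
    nonce = os.urandom(16)                   # CTR initial counter block
    dst.write(nonce)                         # stored in the clear, not secret
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    while chunk := src.read(chunk_size):
        dst.write(encryptor.update(chunk))
    dst.write(encryptor.finalize())

key = os.urandom(32)                         # AES-256 key, held on site
src, dst = io.BytesIO(b"x" * 3_000_000), io.BytesIO()
encrypt_stream(src, dst, key)
```

Note that CTR mode alone provides confidentiality but not integrity; a production system would add authentication, for example GCM or an HMAC over the ciphertext.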

Don’t lose the keys

“If you lose the keys completely, no-one should be able to get at the information – even yourself; and if they are compromised, you at least have control over them, as long as you are aware and can do something about it,” he explains. However, if the keys are held by a third party, he says, that party becomes a richer target for hackers, “as it will likely hold the keys for a group of companies rather than just one, and the time from noticing the breach, to notifying the customer, to action being taken could be a lot longer.”

The trouble is that data encrypted only while in transit is often not truly secure, while data encrypted at rest travels slowly. “The issue here is that even if you have a high-speed WAN link, encrypted data will inhibit the movement of data, and then you are not fulfilling your WAN optimisation,” comments Trossell. His colleague Claire Buchanan, CCO at Bridgeworks, adds: “You are impacting on your recovery time objective (RTO) and on your recovery point objective (RPO).” The RPO marks the last point at which the data was backed up, and therefore how much work could be lost; the RTO is how quickly the data can be retrieved and put back to work.
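The two metrics are simple to reason about. The toy calculation below uses made-up figures, not numbers from the article, purely to show what each objective measures.

```python
from datetime import datetime, timedelta

last_backup  = datetime(2015, 8, 7, 2, 0)    # nightly back-up completed
failure_time = datetime(2015, 8, 7, 14, 30)  # outage strikes mid-afternoon
restore_time = timedelta(hours=3)            # time to retrieve and restore

rpo_exposure = failure_time - last_backup    # work at risk since last back-up
print(f"Up to {rpo_exposure} of data is lost; "
      f"systems are back {restore_time} after the failure.")
```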

Gain control

“With encryption at rest the corporate is in full control and it is the sole owner of the key, but normally WAN optimisation tools simply pass the data through with no acceleration. In order to provide some level of security, WAN optimisation tools provide an IPsec layer – but this is not anywhere close to the levels of security that many corporations require,” she explains.

Security compliance used to be comparatively light, but a range of new threats has arisen and the stakes are far higher than they once were. “You have the internal problems, such as the one represented by Snowden, and with more powerful machines the lower encryption of 128-bit is far easier to crack than something with 256-bit encryption, which adds layers of complexity,” says Trossell. He claims that there are more disgruntled employees than ever (look at Wikileaks if you need proof), but employees still have to have the keys before they can access encrypted data.

Longbottom says that it wasn’t long ago that 40-bit encryption was seen as sufficient: it required few computing resources and in most cases was hard enough to break. As computing resources became more readily available, though, it could be broken in a matter of minutes. “Therefore the move has been to 256-bit – AES, 3DES, Blowfish and so on,” he says, adding that cloud computing gives hackers a means of applying brute force to try to break the keys.
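The arithmetic behind that shift is stark. The back-of-envelope sketch below assumes a hypothetical rig testing a trillion keys per second; the exact speed hardly matters, because each added bit doubles the keyspace.

```python
KEYS_PER_SECOND = 1e12        # hypothetical brute-force rig (an assumption)
SECONDS_PER_YEAR = 3.15e7

for bits in (40, 128, 256):
    keyspace = 2 ** bits
    avg_years = keyspace / 2 / KEYS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit key: ~{avg_years:.3g} years on average")

# 40-bit falls in well under a second; 128-bit already takes ~5e18 years,
# and 256-bit is beyond any conceivable amount of computing power.
```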

The solution is to keep the keys on site and to limit the number of people who have access to them. By doing this, the data, and therefore the data centres, remain secure. “Previously organisations have had no choice but to simply move the encrypted data at a slow speed, and with WAN optimisation it simply passes the data along the pipe without any acceleration,” says Buchanan. Many corporations still think that is the only way to go, but it no longer is: data can be transmitted between data centres or to the cloud without compromising on either speed or security.

Speed and security

Buchanan adds that a tool such as Bridgeworks’ own WANrockIT can help organisations improve the speed and security of this process: “With WANrockIT your encryption is just another block of data to us, accelerated just like any other data without it being touched – plus, if you are using encrypted data, the software has the ability to put IPsec on top so that you effectively get double encryption.”

One anonymous Bridgeworks customer, for example, tried to transfer a 32GB video file over a 500Mbps satellite link with 600ms of latency, and it took 20 hours to complete. With WANrockIT, installed in just 11 minutes, the transfer took only 10 minutes. Another customer could manage only incremental back-ups of 50GB rather than full nightly back-ups of 430GB; again the issue was latency, in this case 86ms. The 50GB incrementals took 12 hours over the customer’s OC-12 pipes, but once WANrockIT was installed they completed securely within 45 minutes. That allowed the full nightly back-ups to run, and the organisation could rest in the knowledge that its data was secure.
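Those before-and-after figures are consistent with how plain TCP behaves on a long, fat pipe: it can keep only one window of data in flight per round trip, so latency, not bandwidth, sets the speed limit. The sketch below uses an assumed 256KB window, a figure chosen because it roughly reproduces the 20-hour transfer the customer saw.

```python
WINDOW_BYTES = 256 * 1024          # assumed untuned TCP window
RTT_SECONDS  = 0.6                 # 600ms satellite round trip
LINK_BPS     = 500e6               # 500Mbps of raw link capacity

# Window-limited throughput: one window per round trip, capped by the link.
throughput_bps = min(LINK_BPS, WINDOW_BYTES * 8 / RTT_SECONDS)

FILE_BITS = 32 * 8e9               # the 32GB video file
hours = FILE_BITS / throughput_bps / 3600
print(f"~{throughput_bps / 1e6:.1f} Mbps effective, ~{hours:.0f} hour transfer")
# ~3.5 Mbps effective, ~20 hour transfer on a 500 Mbps link
```

Acceleration tools of this kind typically attack the problem by keeping the pipe full rather than waiting on each round trip.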

The security of an organisation’s data centre is therefore as much about its data as about preventing hacking and unplanned incidents that could stop it operating. A data centre that cannot back up its data quickly and securely is insecure by nature, because it will not be able to respond when disaster strikes.

So if your cloud provider or data centre relies on sending sensitive data across a network without securing it at rest, before it is transmitted to another data centre or to the cloud, it is potentially putting itself at risk. With data loss and downtime costing the UK £10.5bn a year, according to the EMC Global Data Protection Index, is it worth the risk?

About the author

Graham Jarvis is a contributing editor to CCi and an expert in cloud technologies.
