Solving the Windows 8 backup headache
22nd Nov 2012 | 10:13
Why Windows 8 and BYOD could bring network headaches for IT Managers
Phil Evans, Vice President of Sales at Datacastle, explains why the move to new Windows 8 devices and the bring-your-own-device culture could bring down already congested networks unless IT managers start to act now.
With Windows 8 now available on a range of devices, there will inevitably be a rise in devices containing corporate information connecting to the network, increasing security risks. With many networks already congested, Windows 8 will have a knock-on effect on backup and network systems, and IT managers must ask themselves whether they can deal with the increase in devices and still run an effective network.
These extra devices will also need to back up over a network, whose capacity is finite, and they consume bandwidth both when they are in use and when they are being backed up. So what do you do with these extra devices, especially if they are being used in a remote office where the connection is slow?
This is one of the major headaches for companies with satellite offices all over the world. Those offices produce and access important data that has to be backed up, but over a slow connection. So instead of everyone in the office fighting to push their backup to a centralised vault somewhere else in the world, eating into already crowded bandwidth, why not back up to a local solution that connects back to the main vault when the network is quiet at night or over the weekend?
Backup to the LAN not the WAN
This is what some new backup services now provide. Using NAS devices installed locally on the office LAN, they back up at network speed without taking up huge amounts of WAN bandwidth. A device running Windows Server or Windows Storage Server can start at as little as £400. All the local staff back up to the NAS over the LAN, and it stores all the backups for that office. The network manager has control over when the connections out to the main vault happen - for instance, when the network is quieter.
Therefore, instead of lots of different connections happening haphazardly, you can optimise for the speed and capacity of the specific network. The initial backup has the largest footprint, but you can set this one consolidated backup to upload to the main vault only when the office is quietest - at the weekend, perhaps.
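The scheduling idea above - sync to the vault only off-peak - can be sketched in a few lines. This is a minimal illustration, not any vendor's actual product; the window boundaries and function names are assumptions chosen for the example.

```python
from datetime import datetime

# Hypothetical off-peak window: weekday nights (after 20:00, before 06:00)
# and any time at the weekend. The hours are illustrative assumptions.
OFF_PEAK_START = 20  # 8 pm
OFF_PEAK_END = 6     # 6 am


def in_sync_window(now: datetime) -> bool:
    """Return True when the NAS is allowed to sync to the central vault."""
    if now.weekday() >= 5:  # Saturday (5) or Sunday (6): always allowed
        return True
    # On weekdays, only outside office hours
    return now.hour >= OFF_PEAK_START or now.hour < OFF_PEAK_END


# A Wednesday evening falls inside the window; a Wednesday afternoon does not
print(in_sync_window(datetime(2012, 11, 21, 22, 30)))  # True
print(in_sync_window(datetime(2012, 11, 21, 14, 0)))   # False
```

In practice the NAS would poll a check like this before opening its WAN connection, so haphazard per-user uploads are replaced by one scheduled transfer.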
This sort of solution is ideal for areas where bandwidth is expensive, and it has been thoroughly proven in less developed parts of the world where connections are akin to the speeds of old 14.4k modems.
Reduce the backup overhead with deduplication
Another benefit of this approach is that these tools often come with deduplication features, so every bit and byte you send over your precious bandwidth is unique and hasn't already been sent by another device in an earlier backup. The system checks every block of data before sending it, to ensure it doesn't already exist in the vault. Many backup solutions swamp the network by sending the same document or PowerPoint off-site from every user who has a copy. The first backup of a system is always the largest, but subsequent backups are faster and more efficient because only changed blocks are sent.
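The block-check described above can be sketched like this: split each file into fixed-size blocks, hash each block, and send only blocks the vault hasn't already got. This is a simplified illustration of block-level deduplication in general, not a description of any particular vendor's engine; the block size and names are assumptions.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size


def dedup_blocks(data: bytes, vault_index: set) -> list:
    """Split data into blocks and return only the blocks whose hash
    the vault hasn't seen. vault_index stands in for the vault's
    catalogue of already-stored block hashes."""
    to_send = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in vault_index:
            vault_index.add(digest)
            to_send.append(block)
    return to_send


vault_index = set()
doc = b"quarterly report " * 1000        # the same file two users hold
first = dedup_blocks(doc, vault_index)   # first user: all blocks are new
second = dedup_blocks(doc, vault_index)  # second user: nothing to send
print(len(first), len(second))
```

The second user's identical copy costs no bandwidth at all, which is exactly why a shared PowerPoint stops swamping the network.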
The flipside of backup is the time it takes to restore. By keeping backups on the local NAS, systems can be restored quickly, unlike backups stored in a centralised vault that could be thousands of miles away. This dramatically improves recovery times for users and saves bandwidth too. For example, if someone loses their laptop en route from one office to another, a replacement laptop can be restored with that person's data quickly and easily at LAN speed.
In addition, some solutions have a feature that enables users to back up even when they are out of the office and have no internet connection. The software runs in the background, and when a connection becomes available it starts synchronising the backup to the vault.
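The queue-then-sync behaviour just described can be sketched as follows. This is a minimal, hypothetical model - the class and method names are invented for illustration, and a real agent would track changed files and handle retries.

```python
import queue


class OfflineBackupQueue:
    """Hold changed files locally while offline; flush them to the
    vault when a connection returns. `send` is a stand-in for the
    real upload call."""

    def __init__(self, send):
        self._pending = queue.Queue()
        self._send = send

    def backup(self, path: str, online: bool) -> None:
        if online:
            self._send(path)          # connection available: upload now
        else:
            self._pending.put(path)   # no connection: queue locally

    def on_reconnect(self) -> None:
        # Drain everything queued while we were offline
        while not self._pending.empty():
            self._send(self._pending.get())


sent = []
q = OfflineBackupQueue(sent.append)
q.backup("report.docx", online=False)  # on the train, no connection
q.backup("slides.pptx", online=False)
q.on_reconnect()                       # back in the office
print(sent)  # ['report.docx', 'slides.pptx']
```

The point is that the user never has to remember to back up: the agent quietly catches up as soon as it can see the vault again.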
Reducing the potential for data loss
Device theft or loss, and the consequent loss of data, is probably one of the biggest headaches for organisations. With the EU looking at fining companies even more than the Information Commissioner's Office (ICO) currently can (the maximum penalty being £500,000), minimising the risk of a data breach is essential. With this in mind, you need to make sure that backups are not only encrypted, but that the software provides a detailed audit of what was on the device at the time of loss. Many companies don't actually know precisely what data was lost, and often admit to losing more than they did, simply because the user wasn't sure what data was involved.
This may seem a little irrelevant when you consider that the data is encrypted on the device as well as during the entire backup process, and is therefore relatively safe. However, knowing exactly what data was lost lets you review your processes and, potentially, reduce future risk. The software can also be used to block ports, so you can control the ways in which data leaves your company. Added to this, many companies have found that as few as 25 per cent of devices actually use encryption, because end users disable it in the belief that it slows down their productivity - this kind of system allows centralised management of the entire encryption process.
Consider backup when you build the network
Backing up is an essential part of everyday life, but you don't want it to slow your business down when staff are already working over slow network connections. Over the years the cost of storage has fallen and fallen, but the cost of bandwidth hasn't - and it is unlikely to, as businesses and individuals demand ever more, ever faster networks.
Companies need a way to back up data without affecting productivity, while remaining secure and being able to call on those backups again when the data needs to be restored. Couple this with the influx of devices running new software such as Windows 8 and the problem gets a lot more complicated - unless you keep a local cache that holds the data close before copying it over to a centralised backup vault.
So the next time you think about your backup, think about your network and the impact all those backups have on it. If your network is slow when people are working on it, how much slower is it when they are all backing up over it at the same time?