Glitches Are More Dangerous Than Cyber Attacks
Iain Chidgey has some ideas on how to avoid human error
Cyber attacks and cyber war may be headline-grabbing, but a bigger threat to data security could be the software glitch. And the cause of this glitch is the data itself – or rather the sheer amount of it stored in our databases.
Databases are now so large that it is impossible to refresh them regularly, to run tests frequently and to fix errors quickly. This leads to more frequent and more dangerous software glitches. Whilst threats from large-scale cyber attacks should not be overlooked, for most organisations it is the software glitch that presents the clearer danger.
Beware of the foul-up
We live in an increasingly technology-dependent society where a glitch can have a hugely detrimental impact on both a business and its customers.
In the last two years alone, we have seen an Amazon Web Services outage take down a huge number of websites, American Airlines ground a whole fleet of planes after a glitch took out its computerised reservation system, and a software glitch push US investment bank Knight Capital to the brink of bankruptcy after it lost $461.1 million in less than an hour.
You might also remember one of the most publicised glitches of all, which brought down the banking capabilities of Royal Bank of Scotland (RBS), NatWest and Ulster Bank, affecting 17 million accounts and leaving customers locked out for over a week.
It seems these types of glitches are happening more frequently than ever before. Why? Often the cause lies in insufficient testing. With databases as large and complex as they are today, duplicating and refreshing data sets for testing is becoming harder and harder. IT teams are spending more and more time responding to requests for copies of databases, and in some cases they can’t fulfil those requests at all. Developers are pushed to finish projects faster, but often IT can’t meet their demands, which results in minimal testing of applications before they go live.
In other instances, IT departments provide copies of databases for testing, but by the time a copy is available, the data is old. Data can be obsolete after only a couple of hours, yet refreshing a single testing data set can take days, so most tests never run on data that is current enough to be risk-free.
Test on a virtual copy
Stopping glitches from occurring isn’t always possible and implementing a new application will always have inherent risks, but more can be done. Companies need to make testing a priority and equip their IT teams with technology and resources that will enable them to test often and on recent data. Neglecting testing, as some of the recent examples show, can have dire consequences.
One way to make testing easier is ‘database virtualisation’, which lets teams test on live replicas of data, so there are no surprises when the application goes live and developers can fix any issues early on. It’s also much faster, because database virtualisation removes the need to make and move physical copies of databases.
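To make the idea concrete, here is a minimal Python sketch of the copy-on-write principle that makes virtual database copies cheap to provision. It is an illustration of the concept only, not how any particular product implements it: a virtual copy shares unchanged blocks with the source database and stores only the blocks it modifies.

```python
# Conceptual sketch of copy-on-write "virtual copies" (illustrative only).
# A test copy shares the source database's blocks and records only its
# own changes, so provisioning it is instant and uses almost no storage.

class VirtualCopy:
    def __init__(self, source_blocks):
        # Reference the source's blocks; nothing is physically copied.
        self._source = source_blocks
        self._changed = {}  # block_id -> locally modified contents

    def read(self, block_id):
        # Reads fall through to the source unless this copy changed the block.
        return self._changed.get(block_id, self._source[block_id])

    def write(self, block_id, data):
        # Writes stay local, leaving the production data untouched.
        self._changed[block_id] = data


# Production database, grossly simplified as a set of storage blocks.
production = {0: "customers", 1: "orders", 2: "payments"}

# Provisioning a test copy is instant: no blocks are duplicated.
test_copy = VirtualCopy(production)
test_copy.write(1, "orders-after-migration-test")

print(test_copy.read(1))  # orders-after-migration-test (local change only)
print(production[1])      # orders (source is untouched)
```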
Furthermore, if a glitch in an application does slip through, having a virtualised live copy of the database allows a business to rewind the system to a point in time before the glitch occurred, quickly and efficiently discarding the corrupted information and enabling IT to restore the database to its prior form.
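Again purely as an illustration, and assuming nothing about any specific tool, the rewind idea can be sketched as keeping lightweight, timestamped snapshots and restoring the latest one taken before the glitch struck:

```python
# Hedged sketch of point-in-time rewind (illustrative names and structures).
# Timestamped snapshots are kept, and the latest one taken before the
# glitch is restored, discarding the corrupted state.

import copy
from datetime import datetime, timedelta

class SnapshotStore:
    def __init__(self):
        self._snapshots = []  # (timestamp, state) pairs

    def take(self, state, at):
        # Record a point-in-time copy of the database state.
        self._snapshots.append((at, copy.deepcopy(state)))

    def rewind_to(self, moment):
        # Return the most recent snapshot taken at or before `moment`.
        candidates = [(t, s) for t, s in self._snapshots if t <= moment]
        if not candidates:
            raise ValueError("no snapshot exists before the requested time")
        return max(candidates, key=lambda pair: pair[0])[1]


# Example: a glitch corrupts data shortly after 10:00; rewind to 10:00.
store = SnapshotStore()
ten_oclock = datetime(2013, 6, 1, 10, 0)
store.take({"balance": 100}, at=ten_oclock)

corrupted_state = {"balance": -999999}  # what the glitch left behind
restored = store.rewind_to(ten_oclock + timedelta(minutes=5))
print(restored)  # {'balance': 100}
```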
Whatever the method, testing needs to be a priority. As we become ever more dependent on software applications, and as the size and complexity of the data those applications run on increase, so will the risk of software glitches. Cyber threats have been in the news, but for many organisations software glitches are the more present threat. Don’t neglect cyber security, but make sure that your glitches don’t become your Achilles’ heel – keep testing.
Iain Chidgey is EMEA vice president of data management specialist Delphix