Data sprawl, fragmentation and silos are often seen as the inevitable fallout of the ever-accelerating rate at which technology evolves, problems that are bound to get worse as companies look to the cloud to deliver digital transformation. But they do not have to be. According to Andrew Fitzgerald, sales director for Western Europe and Sub-Saharan Africa at Cohesity, businesses willing to make fundamental changes in the way they provision, manage and protect their data will find that the modern technologies driving these problems can also help solve them.
The scale of the problem
“The main issue here is one of scale – not so much the sheer volume of data involved, but how we go about managing it. Instead of a joined-up approach, most companies tend to divide data into more manageable chunks, misleadingly referred to as systems, each with its own infrastructure,” Fitzgerald explains. “Infrastructure which is typically hardware-based and dedicated to, for example, backup, disaster recovery, archiving, analytics and so on, with little or no data sharing between their fragmented stores.”
Not only is this costly, but it can also lead to multiple copies of the same data propagating across these silos. Indeed, 73 per cent of respondents to an ESG survey reported that their organisation today stores data in multiple public clouds in addition to its own data centres. “Not only are there massive volumes of copied data, but it is spread everywhere,” Fitzgerald continues. “And that, in turn, can lead to operational issues caused by inconsistencies between those copies and in how they are encoded, stored and accessed. It is also hugely expensive.”
Then there is the little matter of data visibility. Believe it or not, most organisations have little idea where most of their data resides, let alone who owns and uses it, or whether it contains sensitive information. In other words, most of their data is dark, making it a considerable risk to the business.
How, for example, do you police compliance if you do not know what data you have got, or whether it is even all in the same area of jurisdiction? Likewise, how do you protect against ransomware attacks or – at a more mundane level – manage, optimise and make the best use of this most valuable of resources on a day-to-day basis?
Google it or lose it
The typical response is to tackle these issues in the same piecemeal manner, typically on a per-application or per-function basis. “Do not – it will only make the problem worse, for example by creating silos within silos,” Fitzgerald says. “Take a look instead at how the big boys do it. Global players like Amazon, Google and Facebook, with exabytes of data to cope with, ought to have huge data issues, but do not. And that is because they have sidestepped the hardware-led system route in favour of a unified data management approach, using modern technology to avoid these very modern problems.”
The details, of course, differ, but each has opted to build an infrastructure of commodity hardware on which everything is virtualised and software-defined. Moreover, as part of that software-defined architecture, each has a modern data management layer comprising three key elements: a single unified file system to store data consistently regardless of location, technology or application requirements; a single logical control plane to manage it; and the ability to run and expose the data services applications need, regardless of how they consume them.
“This approach not only saves money by creating, in effect, a single shared data storage pool, it is also much more efficient, secure and easier to manage,” Fitzgerald explains. “There is much less risk of multiple copies being created, for example. The same backup and archiving technologies can also be applied across the entire infrastructure estate, so it is all highly visible – no dark data, no matter where it’s stored.”
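To make those three elements a little more concrete, here is a minimal Python sketch of the shape such a unified data management layer might take. Every name in it (UnifiedFileSystem, DataService, ControlPlane) is hypothetical and invented purely for illustration; this is not any particular vendor’s implementation.

```python
# Sketch of a unified data management layer: one file-system abstraction,
# one logical control plane, and pluggable data services. All names here
# are hypothetical, for illustration only.

from abc import ABC, abstractmethod


class UnifiedFileSystem(ABC):
    """One namespace for data, regardless of where it physically lives."""

    @abstractmethod
    def read(self, path: str) -> bytes: ...

    @abstractmethod
    def write(self, path: str, data: bytes) -> None: ...


class DataService(ABC):
    """A data service (backup, archiving, analytics) exposed to applications."""

    @abstractmethod
    def run(self, fs: UnifiedFileSystem) -> None: ...


class ControlPlane:
    """Single logical control plane: registers storage back ends and
    applies the same data services across all of them."""

    def __init__(self) -> None:
        self._backends: dict[str, UnifiedFileSystem] = {}
        self._services: list[DataService] = []

    def register_backend(self, name: str, fs: UnifiedFileSystem) -> None:
        # On-premises arrays, public clouds and edge sites all register
        # through the same interface, forming one shared pool to manage.
        self._backends[name] = fs

    def add_service(self, service: DataService) -> None:
        self._services.append(service)

    def run_services(self) -> None:
        # The same backup or archiving service runs over every back end,
        # rather than one point product per silo.
        for fs in self._backends.values():
            for service in self._services:
                service.run(fs)
```

The point of the sketch is the shape: one registry, one namespace and one set of services applied everywhere, instead of a separate product and a separate copy of the data in each silo.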
What is right for one is right for all
But (I hear you cry) the likes of Google, Amazon and Facebook have the money and the technical resources to do that. We do not, and we also have lots of legacy solutions that we would have to upgrade or replace, which would be both expensive and disruptive to the business.
“The answer to which is: yes, there is a cost and, yes, there will be disruption, but no more than with a lot of other IT investments,” Fitzgerald adds. “How much is your existing legacy setup hurting your business in terms of total cost of ownership and lack of insight? Plus, you do not necessarily have to sweep everything aside in one fell swoop in order to copy the big boys.
“Demand is building, and the IT industry is rapidly waking up to this need, making it increasingly possible, and easy, for businesses of any size to deploy a modern data management solution. Moreover, because these solutions are software-defined, they really can be implemented and managed across an existing infrastructure with minimal disruption, whether on-premises, in the cloud or over a hybrid mix of the two.”
Which leaves the question of whether you think it is worth it, so ponder this. Data sprawl, data fragmentation and siloed data are big problems already, and if you do not do something, they will only get worse. They are already acknowledged as a significant roadblock to business transformation, making it all the more imperative to act sooner rather than later if you aspire to digitise your business.
According to Fitzgerald, that something is modern data management. “By leveraging new technologies that are future-proofed and designed to operate with legacy technologies too, you overcome the issues caused by the use of point products,” he concludes. “And, by bringing your data back under proper control to support risk mitigation and compliance, you also get a more holistic view of it for business insight and new service development. Can you really afford not to?”