- By Kirsten Doyle for Brainstorm
If data is the new oil, then grab yourself some cans before things get messy.
The most successful organisations understand the potential of their data as a competitive lever to gain insights into their customers. “However, any effective data management plan must be founded on an acknowledgement that data types have different needs for access, storage, and management, even as priorities around data usage change quickly,” says Rupert Brazier, country manager, Pure Storage.
Says Kate Mollett, regional manager for Africa, Veeam, the challenge for organisations is creating an integrated data management strategy that consolidates disparate cloud services, automates the movement of data across multiple workloads when and where it’s needed, and ensures the right data is available to the right decision-makers to add business value.
“We often find that people in charge of managing data sit in the security team where they’re responsible for data access, security and governance, or they sit in an infrastructure role where they’re responsible for data storage, backup and archiving. There seems to be a grey area in terms of who the ‘landlord’ of data should be. There’s also a third stakeholder to keep in mind, the people mining the data who sit in sales, marketing and even HR. The challenge currently is that everyone is operating in a silo and none of these roles form part of the same team.”
She says many local companies are using the cloud to automate processes. “However, for this to work effectively, data must evolve from policy-based to behaviour-based. Data must have built-in ML and AI to keep getting smarter about the best actions to take in any given situation. Automation improves the responsiveness, security and business value of data while reducing the cost and time that staff spend on manually managing and storing data, empowering the IT team to focus on getting real insights from their data.”
Tailored to fit
“It’s important to think about what your business needs from a data management plan and take into account when and how data is backed up and stored, how often, and how it is then accessed or managed once backed up,” says Joshua Grunewald, cloud hosting manager at Saicom. “Businesses need to consider how they use their data, how quickly they want to be able to restore data and how long and how far back they keep data. The plan should be a blueprint for the organisation – data management should never be seen as an insurance policy in the event of data loss or corruption, but as an opportunity to test, run reports and do integrity checks in an isolated environment. The plan also needs to make provision for storing multiple copies of data. The risk of a single copy is that if it’s compromised, there’s no backup available.”
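The integrity checks Grunewald mentions can be as simple as recording a checksum for every file at backup time and re-verifying later. A minimal sketch (the function names and manifest format here are illustrative, not any product's API):

```python
import hashlib
import pathlib

def snapshot_checksums(root):
    """Record a SHA-256 checksum for every file under a backup root."""
    root = pathlib.Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

def verify(root, manifest):
    """Return the files whose current checksum no longer matches the manifest,
    i.e. files that were corrupted, tampered with, or deleted since backup."""
    current = snapshot_checksums(root)
    return sorted(
        path for path, digest in manifest.items()
        if current.get(path) != digest
    )
```

Running `verify` against an isolated restore of the backup, rather than against production, is what turns the backup from an insurance policy into something routinely tested.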
Brazier adds that a properly executed data management plan must be built on a modern IT environment, consisting of data strategies based on flexible consumption models across on-premises, hosted, and public cloud, aligning application workloads with the most effective infrastructure. “Most importantly, a modern IT environment should work harmoniously with a common management interface, 100% non-disruptive architecture and proactive and predictive support services.”
However, the ‘big data’ trend has had a significant impact on data management. According to Mollett, the explosion of big data makes traditional backup and recovery inadequate. “Data availability is something that has become a business necessity. Companies can ill afford not to have access to data, especially when customers expect products and services to be available to them around the clock. To meet the challenges of this complex environment, businesses need a new approach to data management. Good data management practices need to ensure that data remains secure no matter the device being used to access it,” she says.
“I think the two biggest points here are that big data has created a much bigger need for finding more effective ways of doing data deduplication and compression because we’re storing so much of it,” adds Grunewald. “That technology has to stay ahead of the game and keep improving – otherwise you’re consuming more and more disk. And while it’s getting cheaper, it’s not endless or free, and moving things into the hyperscalers only works to a point. It’s important to do deduplication, but once you restore from it, it can be slow, so ensure that the hardware you’re using is purpose-built and can restore at speeds that match your business requirements. The second and biggest point is now that we are collecting so much data, what are we doing with it? How are we using it to make better decisions and how can we apply business intelligence to the sheer volume of data available?”
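The deduplication Grunewald describes is, at its core, content addressing: store each unique chunk once, keyed by its hash, and keep a manifest so the original stream can be reassembled. A minimal sketch under that assumption (real systems add variable-size chunking, compression and purpose-built hardware for fast restores):

```python
import hashlib

def dedup_store(chunks):
    """Store only one copy of each unique chunk, keyed by content hash.
    Returns (store, manifest); the manifest preserves the original order."""
    store, manifest = {}, []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicates cost only a manifest entry
        manifest.append(digest)
    return store, manifest

def restore(store, manifest):
    """Reassemble the original stream from the manifest."""
    return b"".join(store[digest] for digest in manifest)

# Four chunks in, only two unique copies kept on disk.
chunks = [b"header", b"payload", b"payload", b"header"]
store, manifest = dedup_store(chunks)
```

The restore path illustrates his caveat: every read must chase the manifest back through the store, which is why restore speed depends on the underlying hardware.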
This brings us to dark data. “With the exponential increase in data volumes, it’s predicted that by 2020, every human will generate 1.7MB of data per second,” says Brazier. “Despite the growing recognition of data’s importance, only 0.5% is analysed and used. The remaining data, known as ‘dark’ or ‘cold’ data, is a big issue for companies.”
Rezelde Botha, business unit manager at Axiz, says this dark data represents a massive, untapped opportunity, as well as a major threat. “This data must be classified, managed and analysed appropriately to gain business insights and identify any data that might be putting the business at risk. If this doesn’t happen, businesses can’t use this data to gain a competitive advantage, and have little hope of remaining compliant in an increasingly harsh regulatory environment.”
Botha cites a recent report by Veritas, which showed that mobile and public cloud environments are two of the weakest chinks in a business’ information security armour, as the majority of data within these environments remains unclassified. “A large part of the problem could arise from the inability to assign responsibility for this data. The report revealed that a staggering 69% of global companies believe that data privacy and protection are their cloud service providers’ responsibility. This isn’t the case. Most contracts with cloud providers put the responsibility of data management squarely in the hands of the business. The problem isn’t helped by the fact that modern workforces are becoming increasingly mobile, and data increasingly distributed. Strengthening data security is key, as is gaining data visibility and control. You can’t protect what you don’t know you have, or what you can’t see.”
Says Botha: “The better the business understands its data, the better chance it has of lowering its risks. Given the flood of data drowning enterprises today, there’s no way that billions of files can be manually checked and classified. Luckily, there are good data management tools that employ intelligent algorithms, machine learning, and advanced policies to do this for you.”
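The automated classification Botha describes can be pictured as a triage pass over the corpus. A deliberately simple sketch using pattern rules as a stand-in for the ML-driven policies she refers to (the patterns and bucket names here are illustrative only):

```python
import re

# Illustrative detectors; commercial tools use trained models and far richer policies.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text):
    """Return the set of sensitive categories detected in a document."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

def triage(documents):
    """Split a corpus of {name: text} into sensitive and unclassified buckets,
    so dark data at risk can be surfaced without manual review."""
    buckets = {"sensitive": [], "unclassified": []}
    for name, text in documents.items():
        key = "sensitive" if classify(text) else "unclassified"
        buckets[key].append(name)
    return buckets
```

The point is the workflow, not the rules: no one checks billions of files by hand, so classification has to run as an automated pass that flags what needs human attention.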
Brazier says implementing a modern infrastructure and utilising today’s leading-edge technologies such as hybrid cloud, flash and NVMe can ensure that hot and cold categories are eliminated and all data can be considered hot.
Balancing on-prem and cloud
But while hybrid environments can help get a grip on dark data, to ensure effective data management across a hybrid environment, a lot of work is needed, says Brazier. “Businesses need to make strategic decisions about which cloud environment to leverage based on the type of data they’re dealing with, and the applications making use of that data. For example, if you have a mission-critical application that runs consistently every day, an on-premise configuration is best. It’s less expensive than running these kinds of apps in the cloud. On the other hand, workloads that typically spin up or down with some frequency, and that require lots of distributed compute, may be better suited for the public cloud, where they can take advantage of the economies of scale.”
For Grunewald, when it comes to hybrid environments, establishing a good framework to ensure that the data lives where it needs to is key to accessing data when it’s needed as quickly and seamlessly as possible. “Data required for long-term retention doesn’t need to live on-premise, because it’s accessed far less often. If you’re going to have a hybrid environment, only keep close what you need frequent access to, everything else can live in centralised environments in the cloud.”
Sean Hurwitz, BU lead: Insights at Trackmatic, says: “Hybrid data management is a mantra for a successful digital organisation. The strategy isn’t to move to a total, single environment, but to end up with a hybrid architecture and, possibly, multiple services and solution providers. Organisations need to move from Capex step changes to software, computing, storage, and resourcing on an elastic model. They also need to manage the explosion of data with increased governance while ensuring appropriate levels of data security and privacy. The future architecture of the digital organisation rests on a multicloud, hybrid DevOps platform underpinned by connected and trusted data. Hybrid environments try to compromise on the ownership of data versus the high costs to maintain the infrastructure. Companies opt to store less sensitive data in the cloud, while maintaining internal servers for critical data.”
So what can we expect from data management in the next five years? Hurwitz believes there will be a huge focus on data ingestion and storage. “The impact on structured data ingested is the loss of the associated metadata and data lineage, combined with a reduction in data quality, and lifecycle management for the data being ingested. In addition, the next-generation data platform needs to have the ability to ingest and govern data at speed, supported by a bimodal IT application/product development model.”
Research suggests 80% of worldwide data will be unstructured by 2025. For many large companies, it’s reached that critical mass already. Unstructured data creates a unique challenge for organisations wishing to use their information for analysis.
Brazier believes that in the next five years, we’ll see more AI and a greater degree of autonomous management of data through cloud-based software. “As we continue to generate more data, the importance of software can’t be overstated. According to Gartner, 2019 should see worldwide spending on enterprise software reach $439 billion, an increase of 8.3% from 2018. Cloud-based management software that monitors data storage and the underlying infrastructure is the key to taking infrastructure to the next level, building in automation that saves time and money, speeding up innovation and insight for the benefit of the organisation and end-user. This also allows companies to access their data from anywhere, with 24/7 predictive support that can autonomously find and fix issues before you’re even aware of them. Also, due to the nature of SaaS, upon every login, you will automatically be using the newest version of the software to benefit from the latest features and improvements. Given these benefits, it’s not an exaggeration to say that deploying the right software can have the same effect as employing a team of highly skilled IT professionals.”
Data management is not only an imperative, but an end in itself, since data will be the currency that drives how future businesses shape their products and growing needs. “All industries will benefit from the universal skills of data management and analytics because all industries will just be creating more data,” concludes Hurwitz.