We Need a New Approach to Beat Data Management Complexity

Not a day goes by without us coming across a business article discussing the benefits of data and its ability to transform organizations. We read about a new world of possibilities, with data ushering in new customer experiences and powering the next wave of applications that will connect data with insights and results. But the disparity between where organizations are today on their transformational journeys and where they want to go is growing, and it is a cause for concern.

What gets in the way of innovation is complexity, a complexity that spans people, processes, and technology, rooted in how data and data infrastructure are managed. What does that look like? A recent ESG study found that 93% of IT decision makers say the complexity of data storage and management impedes digital transformation. As an example of that overhead, these organizations rely on an average of 23 different data management tools. That's an overwhelming amount of disparate hardware and software (not to mention silos of people) required to manage the lifecycle of data and data infrastructure, from how data is accessed, protected, governed, analyzed, and mobilized to how infrastructure is provisioned and upgraded.

Complexity impacts everyone, and it’s only growing

Organizations have lived with this complexity for years, and yet it is precisely what stands in the way of transformation. How is that? It all comes down to the current focus of data management and infrastructure and the way it impacts everyone. For starters, think about storage and the headaches IT faces every day – countless hours spent adjusting, maintaining, and upgrading storage across fleets. Trade-offs must be made between resilience, efficiency, and performance. Provisioning is manual and fraught with guesswork. The cloud seems like a potential answer, but data and applications are needed everywhere.

And the impact of complexity goes beyond IT. Data innovators, those who turn bits and bytes into new applications and knowledge, cannot access data fast enough. Manual processes inhibit data utilization and slow down time to value. Data managers are challenged to optimize data access and protect that very data within an increasingly intense threat landscape.

So what can be done? Today, by capitalizing on the power of data, the cloud, and artificial intelligence, we can reinvent the data experience.

A new data and infrastructure paradigm

Let’s get to the essentials. We need an architecture spanning data, cloud, and artificial intelligence to create a new data experience through data-centric policy and automation, a cloud operating model, and AI-powered intelligence and information. These are the essential aspects of a modern approach to data and infrastructure management.

Develop data-centric policies and automation

Data has a continuous lifecycle spanning test/development, production, protection, and analysis. It needs to be managed holistically, from creation to deletion. Software that can only manage individual parts of the lifecycle is inefficient and creates visibility gaps. Instead, we want holistic, data-centric policies and automation that collapse silos and unify workflows across the data lifecycle. That means ensuring that the policies governing how data is stored, accessed, protected, and mobilized, including how applications are provisioned, are data-centric and automated.
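To make the idea concrete, here is a minimal sketch of what a data-centric policy might look like in code. All names and fields here are hypothetical illustrations, not any vendor's actual API: the point is that one declarative policy per lifecycle stage travels with the data and is applied automatically, rather than being reimplemented per storage system or per tool.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical, illustrative policy object: the rules (placement, protection
# cadence, retention, encryption) are attached to the data, not to a device.
@dataclass
class DataPolicy:
    name: str
    tier: str                  # desired placement, e.g. "performance" or "capacity"
    snapshot_every_hours: int  # protection cadence
    retain_days: int           # how long protection copies are kept
    encrypt: bool = True

@dataclass
class Dataset:
    name: str
    stage: str                 # lifecycle stage: "dev", "production", "analytics", ...
    policy: Optional[DataPolicy] = None

# One policy per lifecycle stage, defined once for the whole fleet.
POLICIES = {
    "dev":        DataPolicy("dev-default", "capacity",    snapshot_every_hours=24, retain_days=7),
    "production": DataPolicy("prod-gold",   "performance", snapshot_every_hours=1,  retain_days=90),
    "analytics":  DataPolicy("analytics",   "capacity",    snapshot_every_hours=12, retain_days=30),
}

def apply_policies(datasets: List[Dataset]) -> List[Dataset]:
    """Attach the stage-appropriate policy to every dataset, automatically."""
    for ds in datasets:
        ds.policy = POLICIES[ds.stage]
    return datasets

fleet = apply_policies([
    Dataset("orders-db", "production"),
    Dataset("ml-features", "analytics"),
])
```

In this sketch, moving a dataset from "dev" to "production" automatically changes its protection cadence and placement; no one has to reconfigure individual arrays or juggle separate backup, tiering, and provisioning tools.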

Take advantage of a cloud operating model

The cloud has set the standard for agility – the cloud operating model enables line-of-business (LOB) owners and developers to create and deploy new applications, services, and projects faster than ever. This makes the underlying data foundation invisible and changes operations to be application-centric, not infrastructure-centric. Expanding that idea further, organizations must take advantage of cloud operational expertise wherever their application and data workloads live, from the edge to the cloud.

Part of the transformation journey involves evolving IT to an "as a service" model. Modeled on the cloud operational experience, as-a-service infrastructure radically simplifies and automates management, freeing up staff to work on higher-value initiatives and providing the self-service agility that LOB owners and developers need to move faster.

Take advantage of AI-powered information and intelligence

AI is a critical dimension in any modern IT architecture. It continues to transform every industry with unprecedented intelligence, enabling autonomous operations in manufacturing, transportation, and healthcare, to name a few. Just as we rely on Google Maps to anticipate traffic ahead and reroute if necessary, businesses need artificial intelligence deeply embedded in data operations. Imagine being told that you can avoid an outage by making a network configuration change, or improve application performance by rebalancing workloads and resources in a specific way, or instantly provision applications across your entire fleet without any planning or calculation. That's the power of AI-powered information and intelligence.

With this new paradigm for data, we will transform the data experience in every organization, creating value for everyone from IT administrators to data innovators. Rather than fine-tune and maintain infrastructure, IT administrators simply deploy cloud services with instant application provisioning. Instead of waiting days to access data, developers and data scientists get optimized access on demand. Rather than worrying about data threats, data administrators can set protection policies with a single click wherever the data is located.

HPE is taking the lead in redefining data and infrastructure management with this new vision of data, combining cloud data services, cloud-native infrastructure, and AI-powered intelligence, all delivered as a service. HPE GreenLake uniquely provides a single, edge-to-cloud platform for connecting applications to infrastructure, innovators to data, and automation to policy in a seamless, unified cloud operational experience wherever the data is.

For IT leaders, there is finally an answer to the growing challenges of complexity. With the HPE GreenLake edge-to-cloud platform, you can accelerate data modernization by collapsing silos between people, processes, and technology, unlocking data, agility, and innovation for your organization.


About Sandeep Singh

Sandeep is vice president of storage marketing for HPE. He is a 15-year storage industry veteran with first-hand experience driving data storage innovation. Sandeep joined HPE from Pure Storage, where he led product marketing from a $100 million pre-IPO run rate to a public company with revenues in excess of $1 billion. Prior to Pure, Sandeep led product strategy and management for 3PAR from pre-revenue to more than $1 billion in revenue, including a four-year tenure with HP after the 3PAR acquisition. Sandeep has a BS in Computer Engineering from UC San Diego and an MBA from the Haas School of Business at UC Berkeley.

Copyright © 2021 IDG Communications, Inc.
