
Here, Shaun Collings from Pure Storage argues that a new approach to storage can help accelerate the strategy behind government digital transformation

According to figures presented in a 2018 HM Treasury discussion paper entitled “The Economic Value of Data”, data proliferation is accelerating rapidly. The Organisation for Economic Co-operation and Development (OECD) estimated that in 2015, global data volumes stood at 8 trillion gigabytes, an eight-fold increase on 2010. By 2020, that volume is forecast to grow as much as 40-fold, as new technologies generate huge quantities of data.

The pressure is on

Clearly, this puts pressure on legacy public sector storage infrastructures, as well as on future IT investment budgets. But that is not the only place where costs are becoming a challenge: managing such huge data workloads also carries a serious human resource cost for government digital transformation.

Research firm Vanson Bourne surveyed 900 IT professionals about the proliferation of data across locations and storage silos, and their experiences managing it. The findings highlighted some striking issues: 63% of respondents said they hold between four and 15 copies of the same data, and 26% said they would quit their jobs if the proliferation of data was not slowed or restricted. Other concerns respondents cited included loss of morale (42%) and massive staff turnover (38%). Any loss of IT resources will be particularly costly to government organisations as they battle to compete with the salaries offered in the private sector.

So is the cost really worth it? Absolutely, because cost is not the only factor: the value of the resulting benefits matters just as much. In the UK, an independent review of Artificial Intelligence (AI) by Professor Dame Wendy Hall and Jérôme Pesenti identified access to data as a major success factor in the continued growth of the UK’s AI industry. Why should we care? Because data-dependent technologies such as AI, which rely on vast data sets, offer massive untapped potential for economic growth.

An estimate cited from PwC suggests that by 2030, AI could increase GDP by 10%, and in the public sector AI has the power to radically transform service quality for citizens. Again, the Treasury paper offers a great example of the tangible, financial difference data can make to local service delivery in the transport sector. Transport for London (TfL) has led the world in making transport data available at no charge to external app developers. The result of this open data approach, as calculated by Deloitte, is that the use of this data now contributes up to £130 million per year to the London economy.

Data use contributes to costs

The final cost issue we will consider is the impact of greater data utilisation. The way data is used, or analysed, can also add to the cost of storing and managing it, and in part that is down to the nature of the data being considered. Big data analytics tools are often concerned with analysing web traffic, sensor data, video, images and other contextual data, cross-referencing these sources in real time to draw deeper, more accurate insights that can be put into action within fractions of a second. This means dealing with huge quantities of very small data objects and, because of the time pressure involved, the process has minimal tolerance for latency.
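To make that latency point concrete, here is a toy sketch (illustrative only, with hypothetical sensor names and fields) that cross-references a small event object against contextual data held in memory. At real-time scale, any storage layer serving these lookups has to respond in a tiny fraction of the overall budget.

```python
import time

# A toy sketch, not production code: it enriches small event objects
# (e.g. sensor readings) with contextual data held in memory. The
# sensor IDs and fields are hypothetical.
context_by_sensor = {
    "sensor-17": {"location": "Junction 4", "road": "A12"},
}

def enrich(event):
    """Join a small event object with its context entirely in memory,
    avoiding the disk round trips that would add latency."""
    return {**event, **context_by_sensor.get(event["sensor_id"], {})}

start = time.perf_counter()
result = enrich({"sensor_id": "sensor-17", "speed_kmh": 48})
print(result, f"({(time.perf_counter() - start) * 1000:.3f} ms)")
```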

However, analysis can happen synchronously or asynchronously. Imagine a retailer providing continuous marketing to customers in a multichannel environment. Analysis must happen in real time or near real time to optimise the customer experience, so there is no time, or need, to capture and store all the data first. In this instance, storage architecture must be designed to ensure that data is served up to analysis and decisioning applications instantly. Latency and availability sit at the core of infrastructure design here, which is why all-flash is ideal.
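As a rough illustration of what “instantly” means here, consider a decisioning function working within a fixed latency budget. The 50 ms budget and the offer logic below are assumptions made for the sake of the sketch, not a real retail system.

```python
import time

# A minimal sketch of synchronous decisioning under a latency budget.
# The 50 ms budget and the scoring rule are illustrative assumptions.
LATENCY_BUDGET_MS = 50

def next_best_offer(customer):
    deadline = time.perf_counter() + LATENCY_BUDGET_MS / 1000
    # Decide using only profile data already sitting in fast storage;
    # there is no time to fetch and capture everything first.
    offer = "loyalty-discount" if customer.get("visits", 0) > 3 else "welcome-offer"
    # If the budget is blown, degrade gracefully rather than block the channel.
    return offer if time.perf_counter() <= deadline else "default-offer"

print(next_best_offer({"id": "c-101", "visits": 5}))
```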

By contrast, if sensor or web data is going to be used in asynchronous analytics, it will need to be captured, prepared (typically for relational databases) and stored for future analysis. This extract, transform and load (ETL) process can become cumbersome, slow and complex, since different sources of data must be handled differently. In this instance, it is critical that data storage infrastructure is designed with performance, capacity and scalability in mind. Here too, all-flash is the ideal option: with such huge datasets, tape-based storage comes with too much latency, and traditional scale-up architectures become too expensive.
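The sketch below shows, in miniature, why that ETL step gets cumbersome: each source needs its own transform logic before it can land in a relational store. The file names and schema are hypothetical.

```python
import csv
import json
import sqlite3

# A minimal ETL sketch with hypothetical sources (sensor.csv, web.json)
# landing in one relational table for later, asynchronous analysis.

def extract_sensor(path):
    # CSV source: one numeric reading per row.
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield ("sensor", float(row["reading"]))

def extract_web(path):
    # JSON source: a list of event records with a duration field.
    with open(path) as f:
        for event in json.load(f):
            yield ("web", float(event["duration_ms"]))

def load(records, db="warehouse.db"):
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS facts (source TEXT, value REAL)")
    con.executemany("INSERT INTO facts VALUES (?, ?)", records)
    con.commit()
    con.close()

# Each source needs different handling, which is where the complexity
# (and the storage performance demand) creeps in.
load(list(extract_sensor("sensor.csv")) + list(extract_web("web.json")))
```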

Given that most organisations will practise a mix of synchronous and asynchronous analytics, with all the attendant latency and capacity demands, a more fluid and flexible architecture is required to truly unlock the value of data. A data-centric architecture orientated around the needs of the business delivers the components required for success.

Not only is it designed so that compute facilities and applications have dynamic access to storage where needed, it delivers data as a service to both web-scale and traditional applications, with near-zero latency. A modern, data-centric IT environment consists of data strategies based on flexible consumption models across on-premises, hosted and public cloud environments, aligning application workloads with the most effective infrastructure.
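One way to picture that alignment is a simple routing rule mapping workload profiles to infrastructure tiers. The tiers and thresholds below are illustrative assumptions, not a Pure Storage API.

```python
# A hypothetical routing rule: align each workload with the most
# effective tier. The tiers and thresholds are made-up examples.
def choose_tier(workload):
    if workload["latency_ms"] <= 1:
        return "on-premises all-flash"
    if workload["access"] == "frequent":
        return "hosted flash"
    return "public cloud object storage"

for w in [{"name": "decisioning", "latency_ms": 1, "access": "frequent"},
          {"name": "archive analytics", "latency_ms": 100, "access": "rare"}]:
    print(w["name"], "->", choose_tier(w))
```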

Operational costs can be reduced through compression and scale-up/scale-out capabilities, but also through management simplicity, which makes day-to-day administration far more cost-effective. For example, one of Pure’s international banking customers manages 30% of its data on Pure with just one full-time employee, who does so from a smartphone, and improved performance has helped one start-up accelerate application development times by 40%.
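To show the compression effect in miniature, the snippet below uses Python’s standard zlib library on made-up, highly repetitive data; real-world ratios depend entirely on the data being stored.

```python
import zlib

# Made-up, highly repetitive data to illustrate why compression cuts
# capacity costs; actual ratios vary with the workload.
data = ("sensor-17,48.0,Junction 4\n" * 10_000).encode()
compressed = zlib.compress(data, level=6)
print(f"raw: {len(data):,} bytes, compressed: {len(compressed):,} bytes "
      f"({len(data) / len(compressed):.1f}x reduction)")
```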

In essence, when you are forward-planning and listening to the business imperative for far greater value and advantage from data, know that leading organisations including Man AHL and Mercedes-AMG Petronas Motorsport are switching to a more fluid, dynamic, high-performance, cost-effective, data-centric architecture.

This article was written by Shaun Collings, Director, Public Sector UK at Pure Storage.
