Data virtualisation

Alberto Pan, Chief Technical Officer, Denodo, explores the role that data integration is currently playing in the global pandemic by enabling response teams and governments to gain the upper hand and start planning for a future post COVID-19

More than seven months after the initial outbreak emerged, COVID-19 is continuing to place immense strain on healthcare systems, economies and governments worldwide.

In response, we have seen both the public and private sectors become increasingly reliant on technology – and more specifically, data – in a bid to alleviate pressure and keep people informed.

Whether it is finding out about the number of new cases in certain regions, accessing real-time testing updates, or debunking harmful myths and misinformation, data has made a meaningful difference to the unforeseen turmoil caused by the pandemic.

What has also become clear, however, is how much harder it is to control and contain critical situations when pertinent information is not made readily available. This has led to discussions around the role that data virtualisation could play in the ongoing fight against COVID-19.

Connecting the data dots

The UK government is a treasure trove of data, spanning 24 ministerial departments, 20 non-ministerial departments and more than 300 agencies and other public bodies. That’s a lot of information, in a lot of places.

Now imagine each of these individual departments or agencies is storing data in multiple different sources – whether in cloud environments, data lakes, or legacy on-premise systems. For those wanting to easily access information, it becomes almost impossible to work out what it is they need, and where it is located.

This already extensive challenge becomes even more of a problem when numerous parties are involved. In the majority of cases, even those working with the same data will have copied or filtered it in a way that caters to their particular needs and use case. Not only does this cause obvious up-front issues during the extraction and analysis phases, it is also extremely time-consuming, immediately hindering any hope of agility.

Time is of the essence

In the midst of a global pandemic, it is essential that those most involved are the most informed. This means having an easy avenue to real-time insights. On the front line, for example, A&E staff, emergency medical dispatchers and experienced paramedics need fast, seamless access to decision-making data. For health workers, having an unhindered view of vital information could directly impact their ability to administer life-saving treatment.

For government officials providing daily updates on COVID-19 to the nation, it is crucial that they are aware of the latest infection rates, hospital admissions and prevention measures. For hospital staff treating the most severe virus patients, previous medical history, allergies and symptoms must be made known immediately.

It is impossible, however, for all of this information to be delivered promptly when it is coming from multiple data sources and being accessed through a range of systems. Even when separate data points are ‘stitched’ together to form one combined view, that view is useless if it is not delivered in time.

In today’s complex data landscape, it is no longer feasible to replicate data from myriad sources into a central repository because of the associated costs and delays in accessing the information. This is where data virtualisation – a modern approach to data integration – can help.

Going beyond integration

Data virtualisation delivers a simplified, unified and integrated view of trusted data – in real-time or near real-time – as needed by consuming applications, processes, analytics or users. By combining data from disparate sources, locations and formats, without replicating the information itself, data virtualisation transcends the limitations of more traditional techniques.

Older legacy systems no longer have to communicate directly with more modern platforms, on-premise technologies don’t need to connect directly to cloud-based systems, and there is no problem achieving the single ‘source of truth’ that data users are constantly chasing. Data virtualisation also doesn’t require information to be moved, leaving the source data exactly where it is.
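To make the principle concrete, the sketch below shows – in deliberately simplified form – how a ‘virtual’ combined view can be assembled by querying two sources at request time, rather than copying their contents into a central repository. The API endpoint, database file, table and column names are hypothetical illustrations; a data virtualisation platform would expose this kind of view as a governed, queryable layer rather than hand-written code.

# Minimal sketch of the data virtualisation idea: a "virtual" view that
# federates a query across two hypothetical sources at request time,
# instead of replicating their contents into a central repository.
# The endpoint URL, database file, table and column names are illustrative
# assumptions, not references to any real system.

import json
import sqlite3
from urllib.request import urlopen

def regional_cases_from_api(region):
    """Fetch live case counts from a (hypothetical) cloud REST API."""
    with urlopen(f"https://api.example.gov.uk/cases?region={region}") as resp:
        return json.load(resp)  # e.g. {"region": "London", "new_cases": 512}

def hospital_capacity_from_db(region):
    """Read bed availability from a (hypothetical) on-premise database."""
    conn = sqlite3.connect("hospital_ops.db")
    try:
        beds_free, icu_free = conn.execute(
            "SELECT beds_free, icu_free FROM capacity WHERE region = ?", (region,)
        ).fetchone()
    finally:
        conn.close()
    return {"beds_free": beds_free, "icu_free": icu_free}

def unified_view(region):
    """Combine both sources on demand – no batch ETL, no copied data."""
    return {**regional_cases_from_api(region), **hospital_capacity_from_db(region)}

if __name__ == "__main__":
    print(unified_view("London"))

Because the view is assembled only when it is queried, consumers always see the latest figures from each source – the real-time access and agility described above.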

The result is faster access to all data, less replication and lower cost, and greater agility in the face of change. This has been noted by the major market analysts, with Gartner estimating that organisations using data virtualisation decrease their data integration costs by 45%. The analyst firm has also predicted 60% market penetration of the technology by 2022.

While the majority of us have had no choice but to sit back and watch COVID-19 wreak havoc on every aspect of life as we know it, those working in roles across the public sector have been forced to face the disruption head-on. As well as the many medical professionals working tirelessly to save lives and provide critical care, the government has played an indispensable role in navigating the UK through the biggest global health crisis in years.

During this period, data silos and legacy systems have made it challenging for users to access and analyse all of the available data across the public sector. Data silos can lead to inaccurate results or delayed decision-making – two risks that cannot be afforded during a pandemic. With its seamless agility, unified data governance and logical abstraction capabilities – amongst many other benefits – data virtualisation can aid the public and private sectors during, and after, the COVID-19 crisis.
