Here, Chris Cherry discusses the insights that can be drawn from the use of legacy data in the public sector

Data is increasingly the key to harnessing the critical insights that departments across local and central government need to make better, more informed decisions at a faster pace.

With enormous amounts of data available to public sector organisations, there are huge opportunities to detect crucial patterns faster, plan sustainably for the future and devise solutions to fundamental societal issues. Data collected from a wide range of repositories over many decades can, when used appropriately, be a tremendous force for good.

However, despite data's importance, public sector organisations are still not maximising its potential. All too often this is because their data sits in disparate, legacy IT infrastructure, which makes it nearly impossible to access or process. For example, the IT system used by the Department for Work and Pensions (DWP) to run the state pension is over 30 years old, according to its CDIO, Simon McKinnon.

Having data stored away in systems that are disjointed, obsolete and impossible to maintain limits its usefulness, as it becomes too difficult to access. Furthermore, when data records are not managed effectively, for example when accurate metadata is not maintained, their usefulness is hindered further. In fact, low-quality data can be worse than no data at all, as inaccuracies can lead to flawed conclusions on crucial matters.
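
To make the metadata point concrete, here is a minimal sketch in Python (the field names and the pensions record are purely illustrative, not a standard schema or any real system's format) of wrapping a raw legacy record in a provenance envelope at ingest time. Records captured without this kind of metadata are far harder to interpret or trust later:

```python
from datetime import datetime, timezone

def wrap_with_metadata(raw_record: dict, source_system: str) -> dict:
    """Attach provenance metadata to a raw legacy record at ingest time.

    Field names here are illustrative, not a standard schema.
    """
    return {
        "payload": raw_record,                         # the record, kept as-is
        "metadata": {
            "source_system": source_system,            # where it came from
            "ingested_at": datetime.now(timezone.utc).isoformat(),
            "schema_hint": sorted(raw_record.keys()),  # aids later mapping
        },
    }

# Hypothetical example: a record from an ageing pensions system, wrapped on ingest
record = wrap_with_metadata(
    {"nino": "QQ123456C", "weekly_amount": "185.15"},
    source_system="state_pension_legacy",
)
print(record["metadata"]["source_system"])
```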

This is why effective data management, integration and storage strategies are crucial if the public sector is to analyse big data and draw benefits from it. Artificial intelligence and machine learning are being touted as revolutionary solutions to the efficiency and data challenges facing the public sector, but the truth is that without a solid data strategy and foundation in place, the operational benefits promised by such technologies will not be realised.

Adopting a data-first approach

By managing access to and harnessing legacy data, organisations across the public sector can provide new levels of service. Vital to this is looking at the data itself first, rather than at how to model it, which is a different perspective from that of traditional IT transformation programmes. Strategically, organisations must take a data-first approach when it comes to transforming legacy data into actionable situational awareness.

Whether the data concerns businesses, organisations or individual citizens, systems need to be built more consistently and with a data-first approach that prioritises accuracy and maximises usability.

Recent technological innovation has made collecting and storing data easier than ever. Although data sets can be large, raw and unstructured, especially in the public sector, technologies such as data hubs allow data to be ingested ‘as is’, regardless of format and type, and then integrated so that historic records can be analysed in parallel. In legacy systems, between 60% and 80% of the cost of data warehouse projects is spent on ETL (the Extract, Transform, Load process used to prepare data for analysis), so using a data hub can free up significant funds for use elsewhere.
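
As a rough illustration of the ‘as is’ ingestion pattern, here is a minimal Python sketch (a stand-in for the idea, not any particular data hub product; all names are hypothetical). Records of different formats are loaded unchanged, and harmonisation is deferred to query time, which is where the savings on upfront ETL come from:

```python
import csv
import io
import json

data_hub = []  # stand-in for a document store; real hubs persist and index

def ingest(raw: str, fmt: str, source: str) -> None:
    """Load records without upfront transformation (no ETL step)."""
    if fmt == "json":
        docs = [json.loads(raw)]
    elif fmt == "csv":
        docs = list(csv.DictReader(io.StringIO(raw)))
    else:
        docs = [{"raw_text": raw}]  # keep unknown formats as-is
    for doc in docs:
        data_hub.append({"source": source, "format": fmt, "doc": doc})

# Two historic sources in different formats, ingested side by side
ingest('{"case_id": 101, "year": 1998}', "json", "benefits_archive")
ingest("case_id,year\n102,2004\n", "csv", "regional_office_export")

# Harmonise only at query time: historic records analysed in parallel
years = [int(entry["doc"].get("year", 0)) for entry in data_hub]
print(sorted(years))  # [1998, 2004]
```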

It is only by integrating all sources of data that governments can effectively use the unprecedented amount now available. Insights gained from historical data can then be extrapolated to make important predictions for the future and to plan appropriately, although, as discussed below, there are further challenges that organisations will face in doing so.
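
As a toy illustration of that kind of extrapolation, the sketch below fits a simple linear trend to historic annual figures and projects it forward (the numbers are invented purely for the example; real planning would use richer models and real record sets):

```python
from statistics import linear_regression

# Invented annual caseload figures from a historic record set
years = [2016, 2017, 2018, 2019, 2020]
caseload = [10_200, 10_650, 11_040, 11_530, 11_980]

# Fit a least-squares linear trend and project it forward for planning
slope, intercept = linear_regression(years, caseload)
forecast_2025 = slope * 2025 + intercept
print(f"Projected 2025 caseload: {forecast_2025:,.0f}")
```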

The challenges of data management

As there are so many different departments and data repositories in the public sector, it can be difficult to know where to start the data transformation process. The solution is to work backwards from an issue the government is facing and identify the data required to solve it. An iterative process that delivers value early can then surface crucial insights.

Furthermore, the introduction of regulations such as the EU's GDPR and MiFID II has heightened privacy concerns, and data breaches are becoming more commonplace, meaning that governments must rethink how they store and manage their data responsibly. Brexit, too, has placed new pressures on how data and information are processed for visas and passports.

As technological capability continues to improve, there is mounting pressure on public services to work faster and more effectively. For example, sharing data insights in a seamless, timely and efficient way can help anticipate natural disasters and acts of terrorism.

The ‘natural monopoly’ that government departments hold over citizen-centric information means they are ideally suited to adopting forward-thinking data management technologies. In doing so, they can ensure not only that data is captured responsibly, but also that it can be used effectively by those who need access to it, to develop important solutions faster than ever.

By Chris Cherry, Public Sector Lead, MarkLogic.
