Joe Kim, EVP, Engineering & Global CTO at SolarWinds, explores how government IT departments can effectively utilise the large amounts of data they collect
The past few decades have seen a digital revolution across the globe. Breakthroughs in technology have transformed society and day-to-day life for millions. The Internet of Things (IoT) and the rise of connected devices have been at the centre of this change, and their numbers and applications continue to grow.
However, one of the most astonishing developments of this revolution is the volume of data that has been, and continues to be, created. It’s unclear just how much data there is in the world right now, but in 2013, there were about 4.4 zettabytes of data in total. With one zettabyte being equivalent to a trillion gigabytes, this was already a huge amount. What’s more striking is that between now and 2020, the digital universe is predicted to double every two years.
As always, it’s up to IT professionals to manage and analyse this ever-growing volume of data. This places a huge strain on IT departments, particularly in organisations that deal with exceptionally large amounts of data, such as the government. After years of watching data levels rise, government IT professionals are only now realising they need to change their data-mining methodology to handle the big data challenge more effectively.
New SolarWinds research has found that 95% of government organisations have adopted hybrid IT. So when applying these new data-mining strategies, the first thing government IT professionals need to think about is adopting an approach suited to a hybrid cloud environment. At SolarWinds, our advice is to implement automated and intelligent decision-making driven by predictive analytics. This not only helps with data mining in a hybrid environment but also improves the efficiency of the network.
Changing the approach
Before the amount of stored data skyrocketed to its current, unimaginable levels, data analysis was largely a matter of manual labour – a task that could be accomplished by data scientists diving into the data to retrieve vital information. However, the speed at which data is now being created, combined with IoT, connected devices, and hybrid cloud environments, means this traditional approach would be a near-impossible task for even a small business today, let alone a government organisation.
In a government organisation, data will not only live in numerous departmental silos, but will also be scattered across multiple IT environments, making it even harder for IT departments to keep track of. With the data hidden across the organisation, traditional data mining approaches also make it difficult to identify insights within the data, and even harder to use these insights to create a consistent and flawless network performance.
Tools need to be implemented to make it easier for government IT professionals to monitor and analyse data that lives both on-premises and across multiple clouds. Having this cross-stack view of IT data can help the IT team compare disparate metrics and events across hybrid infrastructure, identify patterns and the root of problems, and analyse historical data to help pinpoint the causes of system behaviour.
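As a minimal sketch of what comparing disparate metrics across a hybrid stack can mean in practice, the snippet below correlates two hypothetical metric series, one from an on-premises database and one from a cloud application. The metric names and sample values are invented for illustration; a high correlation hints that the two symptoms share a root cause.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equally sampled metric series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-minute samples from two tiers of a hybrid stack
on_prem_db_latency_ms = [12, 13, 12, 40, 45, 41, 13, 12]
cloud_app_errors      = [0,  1,  0,  9, 11, 10,  1,  0]

r = pearson(on_prem_db_latency_ms, cloud_app_errors)
# r close to 1 suggests the cloud-side error spike tracks the
# on-premises latency spike, pointing to a shared root cause
```

Real monitoring tools do far more than this, of course, but even a simple correlation across environments turns two separate alerts into one investigable pattern.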
What’s coming next?
The best way for government IT organisations to effectively identify data patterns and use that insight to predict and prevent potential network issues is through pairing predictive analysis with automated data mining. Predictive analytics allows IT professionals to anticipate future system states by automatically analysing and acting on past trends. Historical performance issues can be assessed alongside current environments, enabling the system to “learn” from prior incidents and avert future issues.
The idea behind predictive analysis is to give network admins more time to jump on a problem by alerting them to a potential issue, such as disk space running out or a patch failing upon installation, before it actually occurs. The system can use the knowledge of past experiences and performance issues and apply it to the present situation to avoid a network slowdown or downtime. By comparing both historical and recent data, predictive analytics can help IT professionals make informed predictions about the future.
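To make the disk-space example concrete, here is a minimal sketch of the underlying idea: fit a linear trend to historical usage samples and project when capacity will be exhausted. This is an illustration only, with invented sample data; production predictive-analytics tools use far more sophisticated models.

```python
from datetime import datetime, timedelta

def days_until_disk_full(samples, capacity_gb):
    """Fit a least-squares linear trend to (timestamp, used_gb) samples
    and estimate how many days remain before usage reaches capacity."""
    t0 = samples[0][0]
    xs = [(t - t0).total_seconds() / 86400 for t, _ in samples]  # days
    ys = [used for _, used in samples]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    if slope <= 0:
        return None  # usage flat or shrinking: no exhaustion predicted
    intercept = mean_y - slope * mean_x
    # Day on which the trend line crosses capacity, minus today
    return (capacity_gb - intercept) / slope - xs[-1]

# Hypothetical daily samples: usage growing by ~10 GB per day
start = datetime(2018, 1, 1)
samples = [(start + timedelta(days=d), 500 + 10 * d) for d in range(14)]
remaining = days_until_disk_full(samples, capacity_gb=1000)
```

An alert raised when `remaining` drops below a chosen threshold gives admins the head start the paragraph above describes, warning before the disk actually fills.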
Take it to the next level
Government IT professionals could even take things a step further and incorporate prescriptive analytics and machine learning into their data analysis mix as well. While predictive analytics is needed to highlight opportunities and possible risks, prescriptive analytics and machine learning can go to the next level and provide recommendations to prevent problems, such as intrusions, before they occur. Prescriptive analytics will allow IT organisations to overcome threats and react appropriately to suspicious behaviour by establishing what “normal” network activity looks like.
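The “establish what normal looks like” step can be sketched very simply: compare a new measurement against the statistical baseline of recent history and flag large deviations. The baseline data and threshold below are hypothetical, and real prescriptive-analytics systems layer recommendations and machine-learned models on top of this kind of check.

```python
from statistics import mean, stdev

def is_anomalous(history, value, threshold=3.0):
    """Flag a measurement as suspicious if it deviates from the
    historical baseline by more than `threshold` standard deviations."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return value != mu  # perfectly flat baseline: any change is odd
    return abs(value - mu) / sigma > threshold

# Hypothetical baseline: login attempts per minute over recent history
baseline = [4, 5, 6, 5, 4, 6, 5, 5, 4, 6]

normal_reading = is_anomalous(baseline, 5)    # typical activity
suspicious_reading = is_anomalous(baseline, 250)  # sudden burst
```

A prescriptive system would go one step further than flagging the burst, recommending (or automatically taking) a response such as rate-limiting the source of the login attempts.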
Government IT professionals need to tackle the big data challenge head on before it becomes too difficult to manage. Implementing sophisticated and modern approaches to data mining, such as predictive and prescriptive analysis along with machine learning, can help prevent potential issues before they occur and keep the network running smoothly. All of this will be invaluable for government IT professionals who, without a doubt, will see the level of data they deal with continue to rise.