Why the 20s will be the “Data Decade” and How to Prepare

In the next decade, the world will produce an unfathomable amount of machine data — metrics, measurements, and telemetry emitted by everything from servers to robots and satellites. And the pace of that data generation is increasing exponentially. I wrote in a paper published last year about how this data is a virtually untapped source of extraordinary business insights — data so dense and rich it will dwarf traditional business intelligence in terms of its potential value.

So why do I think this is important to you and your business? Because this tsunami of data is going to totally reshape the business landscape in the next decade. It will unleash new business models, give birth to innovative new products and services, disrupt existing markets and players, and give rise to an entirely new set of players who will become the new market leaders. Incumbents will stumble and some may cease to exist altogether.

“Machine data intelligence” is the next frontier of insight and competitive advantage for the enterprise. Having been a part of the “data processing” industry for the past four decades, I have no doubt that companies that learn to harness and leverage this data will be the clear winners in their respective industries.

Macro Trends Driving Data Generation

There are five major trends that will drive the exponential growth of machine data over the next decade:

  1. Continued adoption of cloud and hybrid-cloud computing. It’s tempting to think that the entire world has already adopted cloud computing and that the wave has nearly run its course. While it’s true that the majority of enterprises have moved at least some of their workloads to the cloud, a long tail of workloads continues to run on premises. Spending on cloud computing continues to rise: a recent Flexera survey showed budgets increasing by 50%, and Gartner projects that the market as a whole will reach $330B in the next few years. Another sure sign is the continued, unabated investment in hyperscale data center construction; the number of hyperscale data centers has tripled since 2013. There is still much more to come here.
  2. Adoption of virtualization, containerization, and serverless technologies. One comment I hear often is, “We’re not really a large-scale operation; we don’t generate a lot of metrics.” If you’re implementing a microservices architecture using containers à la Kubernetes, the amount of data you’re generating may surprise you. A typical server generates something in the neighborhood of 150–200 metrics, whereas a typical container will generate 1,000 times that amount. I think most people would be shocked to learn how many metrics and how much telemetry their deployments are generating. Most do not keep track.
  3. IoT deployments. The internet of things is big data on steroids. In a recent cover story on the IoT, The Economist described a world with a trillion connected computers. IDC estimates that the IoT will generate some 79.4 zettabytes of data by 2025. The promise of IoT is great, but there are also challenges. Handling all those individual telemetry data streams — one for every metric emitted by every sensor and device — is hard. Cisco estimates that 60 percent of IoT initiatives stall at the proof-of-concept stage, and only 26 percent of companies have had an IoT initiative they considered a complete success. One big reason is the difficulty of harnessing and making sense of all of that data.
  4. 5G applications and edge processing. With faster speeds, lower latency, and higher operating frequencies, “ultra-wideband” 5G networks will unleash an enormous amount of network capacity over today’s 4G networks. According to Verizon, the 5G standard will support millions of devices per square mile and is designed to support data volumes of up to 10 TB/s per square kilometer. Couple that with CCS Insight’s projection that worldwide 5G connections will grow 20x to 2.7B by 2025 and you get a sense of the massive impact 5G will have. What will generate all that data? A whole new generation of innovative applications that take advantage of 5G networks and edge processing capabilities, using technology such as virtual reality. In addition to powering our smartphones and mobile apps, 5G will be essential in realizing the promise of tomorrow’s “smart” cities, grids, factories, cars, and homes.
  5. Digital transformation. Much like cloud computing adoption, it’s hard to get an accurate picture of just how far enterprises have progressed in their digital transformation initiatives. (An IDC survey from 2018 suggested nearly half of all businesses were in the “very early stages” of digital transformation and that only 7% had completed their journey.) But one thing is certain: if they hadn’t already embarked on this journey, odds are high that they will now, as the COVID-19 pandemic is dramatically accelerating digital transformation initiatives globally. As I wrote in a recent post, literally overnight we’ve been forced to find new ways of working, meeting, shopping, managing healthcare, and even staying entertained. Savvy businesses are recognizing the opportunity this presents and increasing market share by providing exceptional online and digital experiences.
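The container arithmetic in trend 2 is easy to make concrete. Below is a back-of-envelope sketch using the per-source figures quoted above; the fleet sizes are purely hypothetical:

```python
# Back-of-envelope metric volume estimate. The ~200 metrics-per-server figure
# and the ~1,000x container multiplier come from the text above; the fleet
# sizes below are invented for illustration.
SERVER_METRICS = 200                       # metrics emitted by a typical server
CONTAINER_METRICS = SERVER_METRICS * 1000  # a typical container: ~1,000x that

def total_metrics(n_servers, n_containers):
    """Total distinct metrics emitted by a hypothetical fleet."""
    return n_servers * SERVER_METRICS + n_containers * CONTAINER_METRICS

# A modest fleet: 50 servers plus 20 containers.
print(total_metrics(50, 20))  # 4,010,000 distinct metrics
```

Even at these modest counts, collecting each metric every ten seconds would mean tens of billions of data points per day — which is exactly why most teams lose track.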

Unleashing the Full Potential of Machine Data

So let’s take a look at some examples of where innovations and disruptions using machine data are already beginning to occur.


The insurance industry is a great example of how machine data from IoT devices and sensors is revolutionizing (or disrupting, depending on your point of view) the process of underwriting risk. The conventional process of assessing and underwriting risk is reactive, relying on static data from a given point in time. With technology like biometric sensors and wearables, connected equipment and smart facilities, environmental sensors, and telematics, underwriters can now see real-world behavior in real time. This provides carriers with a wealth of information they can use to design entirely new insurance products like “usage-based” insurance: using telematics data to more precisely assess driver safety, for example, or offering lower health insurance premiums to people who achieve certain health and fitness goals. Carriers can also predict future behavior and partner with customers and consumers to reduce or avoid risk altogether.
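To make the usage-based idea concrete, here is a minimal, entirely hypothetical sketch of scoring a driver from telematics events. The event weights and premium tiers are invented for illustration, not actuarial science:

```python
# Hypothetical usage-based insurance sketch: derive a driver risk score from
# telematics events. All weights and tiers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Trip:
    miles: float
    hard_brakes: int    # sudden decelerations detected by the accelerometer
    night_miles: float  # miles driven during high-risk late-night hours

def risk_score(trips):
    """Weighted risk events per 100 miles; lower is safer."""
    miles = sum(t.miles for t in trips) or 1.0
    weighted = sum(t.hard_brakes * 2.0 + t.night_miles * 0.1 for t in trips)
    return 100.0 * weighted / miles

def premium_multiplier(score):
    # Illustrative tiers: safe drivers earn a discount on the base premium.
    if score < 5:
        return 0.85
    if score < 15:
        return 1.00
    return 1.25
```

A carrier would, of course, calibrate weights and tiers against actual loss data; the point is only that continuous telemetry turns pricing into a function of observed behavior rather than a static snapshot.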


As the cost of computerization has continued to plummet, the use of IoT sensors has proliferated dramatically. This is particularly true of the oil and gas industry, where a modern-day drilling platform can have some 10,000 sensors emitting sub-second information. Every aspect of production can now be monitored and measured: every platform, every well, every wellhead, every segment. Opportunity abounds. That data can be used for optimizing operations, predictive maintenance, compliance with insurance programs, and even enhancing worker safety. We recently worked with a major services company that was ingesting high-frequency measurements off its wellheads and using that rich data to optimize its fracking operations. Optimization leads to better effectiveness and efficiency of fracking techniques, improving both the top and bottom lines of the business.
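One of the simplest uses of that high-frequency wellhead data is flagging readings that deviate sharply from recent behavior, a building block of predictive maintenance. A minimal sketch, assuming a rolling z-score approach (window size and threshold are illustrative, and a production system would process the stream incrementally and tune per sensor):

```python
# Sketch: flag anomalous sensor readings (e.g., wellhead pressure) whose
# deviation from the trailing window's mean exceeds a z-score threshold.
# Window size and threshold are illustrative assumptions.
from collections import deque
import math

def rolling_anomalies(readings, window=20, threshold=3.0):
    """Yield (index, value) for readings far outside the trailing window."""
    buf = deque(maxlen=window)
    for i, x in enumerate(readings):
        if len(buf) == window:
            mean = sum(buf) / window
            std = math.sqrt(sum((v - mean) ** 2 for v in buf) / window)
            dev = abs(x - mean)
            # std can be 0 for flat data; any deviation then counts.
            if dev > threshold * std and dev > 0:
                yield i, x
        buf.append(x)
```

The same pattern extends naturally to multivariate models once the plumbing for ingesting sub-second streams is in place.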


Smart factories, part of what’s being called “Industry 4.0,” are another use case that exemplifies the power of machine data. In doing research for this post, I came across a great article by Suraj Rao of Western Digital Corporation — makers of disk drives and data storage solutions. Suraj heads the Digital Analytics Office, with the mission of “creating competitive advantage through enterprise-wide business-insight with digital analytics.” Western Digital’s use of automation, machine data, and machine learning to power predictive maintenance, improve operational efficiency and product quality, and drive bottom-line results is impressive. In one example, they’ve replaced the time-intensive and fallible process of human inspection of wafers by training a machine learning model to recognize defective and non-defective wafer images, then classify new inspection images and assign a probability of failure with high accuracy. The article is well worth the time to read.
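Western Digital’s actual pipeline isn’t public, but the idea behind the wafer-inspection example can be sketched with a stand-in: train a binary classifier on labeled images and emit a failure probability for new ones. Here, a tiny logistic regression on synthetic 8×8 “images” whose defects show up as bright pixels:

```python
# Stand-in for the wafer-inspection idea: binary classification of images
# with a probability of failure. The data is synthetic and the model is a
# deliberately tiny logistic regression, not Western Digital's pipeline.
import numpy as np

rng = np.random.default_rng(0)

def make_wafers(n, defective):
    """Synthetic 8x8 wafer 'images' flattened to 64 features."""
    imgs = rng.normal(0.0, 1.0, size=(n, 64))
    if defective:
        imgs[:, :8] += 3.0  # defect signature: bright top row of pixels
    return imgs

X = np.vstack([make_wafers(200, False), make_wafers(200, True)])
y = np.concatenate([np.zeros(200), np.ones(200)])

# Train logistic regression with plain gradient descent.
w, b = np.zeros(64), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * (p - y).mean()

def failure_probability(img):
    """Probability that a new inspection image shows a defective wafer."""
    return float(1.0 / (1.0 + np.exp(-(img @ w + b))))
```

A real deployment would use a convolutional network on full-resolution images, but the workflow is the same: label historical inspections, train, then score every new wafer automatically instead of relying on human eyes.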

How to Get Prepared

Ok so back to you. You’re interested in leveraging machine data for competitive advantage but where do you start? If you are just beginning this journey, the effort can seem a bit daunting at first. Just remember, the two best times to plant a tree were 20 years ago and today. Start now and your future self will thank you.

By far the most important thing you will need to do is harness the vast amount of machine data your business is generating and make sense of it. That may seem like an obvious statement, but it’s easier said than done. There are three fundamental steps to building a robust machine data intelligence program.

  1. Build an inventory. A good place to start is knowing what you have. What infrastructure do you have? What services are you running? What metrics are you generating? It’s critical to have an always-up-to-date inventory of all your infrastructure, connected equipment, sensors, etc., and the associated metrics you are generating. Develop a plan and procedures through which new services and infrastructure automatically move into the inventory and get monitored by default whenever they are provisioned. (If you are new to monitoring, you can download and read our e-book on how to get started.) It may take some time to complete the inventory, but it will be well worth the effort: it is incredibly valuable just knowing what you have.
  2. Create a monitoring plan linked to business success. The key here is to first determine what you’re trying to achieve as a business and what KPIs you use to measure success, then map the metrics you need to collect to ensure you are achieving those KPIs. A great mental exercise is to ask, “If we could only monitor one KPI, one metric, one telemetry point, what would it be?” You want to home in on the metrics that move the needle the most and make sure you have a monitoring plan to cover them. Build your initial monitoring plan in collaboration with business leaders, set what you believe to be acceptable performance levels, and measure results. Then meet regularly with business leaders to share data and results and further refine your monitoring plan.
  3. Implement a unified monitoring and analytics platform with plenty of horsepower and capacity. Your platform will need to handle the volume and frequency of the data being generated, at sufficient granularity to power capabilities such as fault and anomaly detection and predictive analytics, and it should scale easily to cover both your core infrastructure and your IoT deployments. Data is the missing link to creating real business value: the greater the density and richness of the data, the greater the accuracy, precision, confidence, predictive power, and insight you’ll get from your metrics and analytics.
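The first two steps above can be sketched in a few lines of code: keep an inventory of sources and their metrics, map each business KPI to the metrics that drive it, and regularly surface the metrics no KPI covers yet. All names below are illustrative, not a real schema:

```python
# Sketch of steps 1 and 2: a metric inventory plus a KPI-to-metric map.
# Every source name, metric name, and KPI here is hypothetical; in practice
# the inventory would be populated automatically at provisioning time.
inventory = {
    "checkout-api":   {"type": "service",    "metrics": ["latency_p99_ms", "error_rate", "requests_per_s"]},
    "db-primary":     {"type": "server",     "metrics": ["cpu_pct", "disk_io_ops", "replication_lag_s"]},
    "warehouse-temp": {"type": "iot_sensor", "metrics": ["temperature_c", "battery_pct"]},
}

kpi_map = {
    "cart_conversion_rate":  ["checkout-api.latency_p99_ms", "checkout-api.error_rate"],
    "cold_chain_compliance": ["warehouse-temp.temperature_c"],
}

def unmonitored_metrics(inventory, kpi_map):
    """Inventory metrics no KPI covers yet. Review these when refining the
    monitoring plan with business leaders."""
    covered = {m for metrics in kpi_map.values() for m in metrics}
    all_metrics = {f"{name}.{m}"
                   for name, src in inventory.items()
                   for m in src["metrics"]}
    return sorted(all_metrics - covered)
```

Running the gap check on each provisioning event keeps the inventory honest and gives the regular business reviews a concrete agenda: which uncovered metrics matter, and which can safely stay unwatched.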
In Summary

The possibilities of machine data intelligence, across all sectors including healthcare, insurance, utilities, manufacturing, and many others, are endless — limited only by our creativity and imagination. The ability to tap into this oncoming tsunami of data, monitor and analyze it in real time, and collect and store it in such a way that it can be mined at will, without compromise or constraint, is at the heart of machine data intelligence. As I said in my introduction, the companies that learn to harness machine data to optimize operations, innovate new products and services, and create entirely new revenue streams will be the clear winners.

It should be an exciting decade ahead.