New Tech-driven Operations Harvest Data to Eliminate System Failure

By Mike Yager

Data is ubiquitous and becoming a form of capital. According to Jacquelyn Bulao, in her blog post "How Much Data Is Created Every Day in 2020?", 1.7 MB of data is created every second for every person on Earth, and 90% of the world's data has been created in the last two years. There is an enormous amount of actionable, relevant information floating around in the systems we use today, but it can be incredibly difficult to pinpoint what is relevant. When I need to research a problem, my course of action often includes getting a massive data export, putting it into a spreadsheet, running pivot tables, and slicing the data into digestible chunks so I can determine what the data is trying to tell me. This is the same method I've been using since I learned to use a spreadsheet, maybe with a few tweaks here and there.
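That export-and-pivot workflow can be sketched in a few lines of code. The sketch below is purely illustrative: the field names (hour, payment_type, amount) stand in for whatever columns a real toll-system export would contain, and the rows are made-up sample data.

```python
import csv
import io
from collections import defaultdict

# Hypothetical export: each row is one toll transaction.
# The schema here is illustrative, not a real toll-system format.
export = io.StringIO(
    "hour,payment_type,amount\n"
    "08,transponder,2.50\n"
    "08,plate,4.00\n"
    "09,transponder,2.50\n"
    "09,transponder,2.50\n"
)

# Pivot: count transactions by hour and payment type -- the same
# slicing a spreadsheet pivot table would do.
pivot = defaultdict(lambda: defaultdict(int))
for row in csv.DictReader(export):
    pivot[row["hour"]][row["payment_type"]] += 1

for hour in sorted(pivot):
    print(hour, dict(pivot[hour]))
```

The manual step that no pivot table automates is the last one: deciding which of those counts actually signals a problem.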

Data is vital to a business because it allows everyone on the team to make informed decisions. Visualization and business intelligence tools allow large volumes of data to be presented in such a way that anomalies and areas of interest stand out clearly from the rest of the data, reducing the time between when a problem occurs and when an actionable decision can be made.

Think of this in the context of an Open Road Tolling system. I can run a report to see how many transactions have been created in the past 24 hours, how many of those transactions were paid by transponder, what the vehicle classifications were, and myriad other types of information. But unless I am familiar with the roadway, what is happening in the local community, and the technology used to collect the data, I may not be able to recognize that the data indicates an issue. This is where business analytics and machine learning come in; these analytic tools bring in historical data, trends, and information from other sources to present the data consumer with information in a logical view. They cut through the noise and surface, for example, a degradation in the expected number of transponder transactions at a given time of day. Thus, action can be taken to address the anomaly before a subsystem degrades into a critical state and a critical failure occurs.
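One simple way such a tool could flag that degradation is to compare the current hour's transponder count against the historical distribution for the same hour of day. The article does not describe any specific algorithm; the snippet below is a minimal sketch of one common approach (a standard-deviation threshold), with made-up historical counts.

```python
import statistics

# Illustrative historical counts of transponder transactions observed
# at the same hour of day over previous weeks (hypothetical numbers).
history = [980, 1010, 995, 1005, 990, 1002, 998]

mean = statistics.mean(history)
stdev = statistics.stdev(history)

def is_degraded(count, k=3.0):
    """Flag a count that falls more than k standard deviations
    below the historical mean for this hour."""
    return count < mean - k * stdev

print(is_degraded(1000))  # a normal hour
print(is_degraded(850))   # a sharp drop worth investigating
```

A production system would account for seasonality, weather, and special events, but the core idea is the same: compare what is happening now against what history says should be happening.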

I know it sounds like science fiction, but today's robust business intelligence and data visualization tools spare us the work of combing through raw data to find a needle in a haystack by delivering simple data representations that point us to potential issues we might otherwise have missed. This is the power of presenting consumable data.

Bulao, J. "How Much Data Is Created Every Day in 2020?" Retrieved from https://techjury.net/blog/how-much-data-is-created-every-day/#gref