This is Part 3 in a series about actionable ways to start treating your data like an asset.
Parts 1 and 2 of this series introduced ways to think about your data as a true asset, and what distinguishes it from a consumable. In Part 3, it's time to put that to the test.
To think of data as an asset, think about your data and analytics environment as a factory of pipes, pumps, reactors, and tanks.
These assets work together to accomplish organizational goals. But if data is the asset in your factory, the assets can't also be the physical servers and cables that run your analytics, or the chemicals that flow through the factory's channels. Servers are like the ground on which the assets are built, while information and knowledge flow through the pipelines of your organization and are ultimately what gets delivered to customers.
Knowledge requires a vessel to hold it, and data is the steel that gives that vessel its shape and strength. Data's job is to deliver knowledge to the end of the pipe without spilling a drop.
If we want to stop treating data as a nebulous, ever-changing, soupy mixture of rows and columns, bits and bytes, we have to envision it as something that is deliberately constructed. Data management and fixed asset management then become parallel exercises of defining, measuring, and managing the performance and health of each asset, and of ensuring that it performs its function. In this case, we want to ensure that knowledge is delivered through data to wherever it's needed in the organization without being spilled or contaminated. Data quality testing and cleansing become maintenance activities in your asset management plan, and you'll no longer need to reach for analogies that don't hold up when speaking to your executive team or other areas of the business.
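To make "data quality testing as maintenance" concrete, here is a minimal sketch of a routine check written as a small automated task. It assumes a hypothetical customer_orders.csv extract with order_id, customer_id, and order_total columns; the thresholds are illustrative, not prescriptive.

```python
# A minimal sketch of a routine data quality "maintenance check", assuming a
# hypothetical customer_orders.csv extract. Thresholds are illustrative only.
import pandas as pd

def run_maintenance_checks(path: str) -> list[str]:
    """Return a list of findings; an empty list means the asset passed inspection."""
    df = pd.read_csv(path)
    findings = []

    # Completeness: knowledge "leaks" when required fields are missing.
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:
        findings.append(f"customer_id missing in {null_rate:.1%} of rows")

    # Uniqueness: duplicate keys are a form of contamination.
    dupes = df.duplicated(subset=["order_id"]).sum()
    if dupes > 0:
        findings.append(f"{dupes} duplicate order_id values")

    # Validity: totals should never be negative.
    invalid = (df["order_total"] < 0).sum()
    if invalid > 0:
        findings.append(f"{invalid} rows with negative order_total")

    return findings

if __name__ == "__main__":
    for finding in run_maintenance_checks("customer_orders.csv"):
        print("MAINTENANCE FINDING:", finding)
```

Run on a schedule, a script like this plays the same role as a preventive maintenance inspection: it catches erosion before the asset fails in service.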
Through quality management, we achieve the goal of delivering knowledge across the organization. But before we talk about quality, we have to talk about design.
Good data design is the first step toward quality: it ensures that data can efficiently transport knowledge. A poorly designed data model isn't a good vessel for knowledge; it might work at first, but any added stress can cause the whole thing to fall apart. A well-designed data model, on the other hand, can deliver concentrated, "high pressure" information to high-capacity knowledge stores, ready to be used by your organization. Poor data design may be one of your biggest risks, and it will certainly be one of the costliest to fix if you let it reach production.
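As a rough illustration of what "deliberately constructed" data looks like in practice, here is a sketch that contrasts an explicitly typed, validated record with a loose dictionary. The OrderRecord class and its fields are hypothetical examples, not a prescribed model.

```python
# A sketch of a "well-designed vessel": a record whose shape and constraints
# are explicit and enforced at construction time. All names are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class OrderRecord:
    order_id: str
    customer_id: str
    order_date: date
    order_total: float

    def __post_init__(self):
        # Validation at the point of construction keeps contamination out of the pipe.
        if not self.order_id:
            raise ValueError("order_id is required")
        if self.order_total < 0:
            raise ValueError("order_total cannot be negative")

# Well-designed: the shape is explicit and bad values are rejected immediately.
good = OrderRecord("A-1001", "C-42", date(2023, 2, 7), 199.99)

# Poorly designed: nothing stops a missing key or a negative total from
# flowing downstream until something breaks.
bad = {"order_id": None, "total": -5}
```

The point is not the specific tooling; it is that a well-designed vessel enforces its shape where data enters the pipe, rather than downstream where failures are expensive.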
No matter how robustly assets are designed, they all need maintenance. Your job, once data is put into production, is to preserve the quality of that initial design and not let the data erode (and the value with it). That holds whether you're maintaining traditional fixed assets or data.
To ensure that your assets keep their quality and perform in service of the organization's goals, implement an asset management plan. No asset management plan is complete without an assessment of risk. In the data context, the risk of poor quality is that the knowledge containers you've constructed erode and leak knowledge before it's delivered, or become contaminated along the way. To be clear, this is different from thinking about information leakage from a cybersecurity perspective. Decision makers in your organization need complete, interpretable, and accurate information; data is the asset that delivers it.
So your data factory, full of pipes and pumps and reactors and tanks, is a collection of assets that move information and knowledge. If you hear that "data is an asset" at your organization but don't believe it's actually being treated that way, there's something you can do.
Ask your team to do a data quality risk assessment on one - JUST ONE - critical data asset. The reaction that you get will be telling.
You may get confused looks or protests about more work. You may hear that "we already have Critical Data Elements (CDEs)" or "our cybersecurity team handles risk". What you're really looking for is identification of the hazards to a data-dependent decision (or a critical process step), their impacts, and the controls you have in place to manage them; a minimal example of what one such record might capture is sketched below.
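For illustration only, here is a sketch of how a single risk assessment record could be structured, using a hypothetical customer_orders asset with made-up hazards, impacts, and controls.

```python
# A minimal sketch of one data quality risk record. The asset, hazard, impact,
# and controls below are hypothetical examples, not a recommended taxonomy.
from dataclasses import dataclass, field

@dataclass
class DataQualityRisk:
    asset: str                 # the critical data asset under assessment
    decision_or_process: str   # the decision or process step that depends on it
    hazard: str                # what could go wrong with the data
    impact: str                # consequence for the decision if it does
    existing_controls: list[str] = field(default_factory=list)

example = DataQualityRisk(
    asset="customer_orders",
    decision_or_process="Monthly revenue forecast",
    hazard="Duplicate order records inflate totals",
    impact="Forecast overstates revenue; inventory is over-ordered",
    existing_controls=[
        "Unique key constraint on order_id",
        "Weekly reconciliation report",
    ],
)

print(example)
```

Even a handful of records like this, for one critical asset, is enough to reveal whether hazards, impacts, and controls have ever actually been written down.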
The next installment in this series will explain the details, including where CDEs can go wrong, how to frame the risk assessment exercise, and how to conduct one. You'll have a few choices to make, so I'll outline both quantitative and qualitative risk assessment methods. We'll even use a bowtie diagram to visualize threats and incidents in data operations.
Find out how to do a data quality risk assessment in the next article.
Ultranauts helps companies establish and continually improve data quality through efficient, effective data governance frameworks and other aspects of data quality management systems (DQMS), especially high-impact data value audits. If you need quality management systems for your data pipelines, Ultranauts can quickly help you identify opportunities for improvement that will drive value, mitigate risks, reduce costs, and increase impact.
Additional Reading:
Dobson, P. (2023, February 7). Part 1: Is Data Really Your Greatest Asset? Ultranauts Blog. Available from https://info.ultranauts.co/blog/is-data-really-your-greatest-asset
Dobson, P. (2023, February 7). Part 2: Is Your Data Worth Maintaining? Ultranauts Blog. Available from https://info.ultranauts.co/blog/is-your-data-worth-maintaining
Author: Peter Dobson is a data quality professional with an M.Sc. in Mechanical Engineering and a background in industrial inspection and maintenance.