Getting the “data break-even point” right means developing a thorough understanding of the cost structure of maintaining data, so your organisation knows when it is appropriate to invest in new technology, processes or people to reduce those costs and increase data value.
There is an ever-increasing demand for correct data to be delivered in a timely manner, so smart business decisions can be made with confidence. This demand has created the need for businesses to manage and view their data not as technology, but just like any other asset, through the implementation of a data governance programme.
A data governance programme identifies key data areas and establishes processes, supporting technology and, most importantly, the people to manage the programmes of work. Unsurprisingly, though, a successful data governance programme requires significant monetary investment in both people and technology. But how do you know how much investment is the right amount?
The right point of investment is called the data break-even point. Too much investment and internal stakeholders begin to lose confidence in the data and processes as projects cost too much to deliver. Too little investment, and projects take too long due to a lack of resources.
Unfortunately, the data break-even point is something many organisations get wrong: it is very common for there to be no clear understanding of what it costs a business to manage its data.
Just as a business establishes processes and guidelines for creating a new product line, the same due diligence needs to be undertaken to identify the break-even point for data investments, so the organisation can evaluate the point at which its data management practices start generating value for the business.
This is something that needs to be a whole-of-organisation responsibility, from the top down. The C-suite team needs to have a good understanding of what the data break-even point is, and drive awareness and change the internal culture.
As with standard business practice, it’s important to understand at which point in time your investment is going to start generating value. When a business creates a new product line, due diligence means investing in market research, design, prototyping and building the production line, all of which count as production costs. It then sets the price it wishes to sell the product for and calculates the number of units required to break even: the point at which the costs of the product are just being covered.
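The product break-even arithmetic above can be sketched in a few lines. All of the figures below are illustrative assumptions, not numbers from this article:

```python
# Break-even for a hypothetical product line.
# Fixed costs cover market research, design, prototyping and the production line;
# the variable cost is what it takes to produce each additional unit.
fixed_costs = 120_000.0   # one-off production costs (illustrative)
unit_cost = 8.0           # variable cost per unit (illustrative)
unit_price = 20.0         # chosen selling price (illustrative)

# Units needed before revenue just covers costs.
break_even_units = fixed_costs / (unit_price - unit_cost)
print(break_even_units)   # 10000.0
```

The same shape of calculation applies to data: the "fixed costs" become governance tooling and people, the "variable costs" become ongoing maintenance, and the "revenue" is the value reliable data generates.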
It’s the same for data. Perform due diligence and get an understanding of the data, outlining the formats, rules and processes under which that data operates and generates value for your organisation.
Unfortunately, the ROI of data is difficult to define. There are many cost factors associated with data assets, ranging from software development and the human resources needed to maintain the data, to technology investment, business-as-usual operational costs, and the cost of fixing data faults.
There are two options available to help your organisation move above the data break-even point:
Reduce costs: You can achieve this by using data integration automation tools, such as Biml, to reduce the amount of time needed to develop data integration solutions.
You can also make use of tools that help identify and automatically fix data quality errors (e.g. Microsoft Data Quality Services), or support master data concepts through tools such as Maestro. Cost reduction can also be achieved by ensuring you have enterprise data modelling processes in place; documenting data lineage, traceability and the data elements needed reduces development time and improves overall understanding.
Increase data value: You can increase the value of data by adding metadata and ensuring everyone within the organisation has the same understanding of what the data represents. In addition, adding master data management concepts ensures there is a single version of the truth, or one ‘gold record’, which improves shared understanding as well as reducing cost.
Your data governance programme should address the processes, tools and framework for managing your data assets. This also implies implementing a continuous monitoring programme with a range of key performance indicators that measure how often data quality problems occur, of what types, and in what volumes. This information will help you address problems upstream, by modifying the applications that manage the data or by changing the business processes that operate on it.
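A minimal sketch of that monitoring idea is to tally data quality issues by type, so the most frequent problems can be prioritised for an upstream fix. The issue-log format and issue names below are hypothetical examples, not a prescribed schema:

```python
from collections import Counter

# Hypothetical data quality issue log, e.g. produced by a validation job.
issue_log = [
    {"record_id": 101, "issue": "missing_postcode"},
    {"record_id": 102, "issue": "invalid_email"},
    {"record_id": 103, "issue": "missing_postcode"},
    {"record_id": 104, "issue": "duplicate_customer"},
    {"record_id": 105, "issue": "missing_postcode"},
]

# Count occurrences of each issue type: a simple KPI input.
counts = Counter(entry["issue"] for entry in issue_log)
for issue, n in counts.most_common():
    print(f"{issue}: {n}")
```

Tracking these counts over time shows whether upstream fixes are actually reducing the volume of each problem type.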
Of course, the big question is how you can measure successful business decision-making. Increased reliability of, and confidence in, CRM data could be measured through successful marketing campaigns: the percentage of customers contacted accurately (i.e. the contact data in the CRM system was correct), or the percentage of items returned due to incorrect contact details. For an insurance business, it could be measured through a noticeable increase in retained customers or a noticeable reduction in processed claims, the latter indicating an increase in portfolio value.
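The CRM contact-accuracy KPIs described above reduce to simple percentages. The field names and figures here are illustrative assumptions:

```python
# Hypothetical results from one marketing campaign.
contacts_attempted = 5_000
contacts_reached = 4_650   # contact details in the CRM were correct
items_returned = 120       # mail returned due to incorrect contact details

# Two KPIs: accuracy of contact data, and the return rate it caused.
contact_accuracy_pct = 100.0 * contacts_reached / contacts_attempted
return_rate_pct = 100.0 * items_returned / contacts_attempted

print(f"Contact accuracy: {contact_accuracy_pct:.1f}%")  # 93.0%
print(f"Return rate: {return_rate_pct:.1f}%")            # 2.4%
```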
KPIs on data integration project delivery will show how productive your processes and toolsets are for developing data integration applications. Measured against the amount of rework, these KPIs will highlight whether processes or toolsets need to be updated or replaced.
So, there are many methods of gathering the baseline information needed to start creating a data break-even dashboard. The road may be arduous, but the end goal of a data break-even chart, a very clear understanding of where that point lies, and of how much investment is going into the management of your organisation’s data, will help ensure you are getting the most out of one of your business’s most valuable assets.
You can get more information on this and many other information management related issues at www.dama.org
Article by Robert Blaas of Mettle Consulting