Archive for the ‘Delivery’ Category


When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 6 of 6)

April 5, 2013

In my previous post I explored why a firm understanding of commercial packaged applications is so important to data quality success. In this final post, I will examine the benefits of operational experience as a key enabler of effective data quality delivery.

Similar to understanding packaged application functionality, hands-on experience in an operational capacity (procurement, production planning, order fulfillment, etc.) provides an invaluable perspective on data’s intended purpose and the processes it enables. Since data permeates every facet of the enterprise, it should be no surprise that operational roles work very closely with data – typically as both consumers and producers. Regardless of which end of the data supply chain you reside on, folks in these roles have first-hand experience with the impacts of bad data on day-to-day operations. For example, consumers of bad data often end up designing elaborate workarounds to compensate for deficiencies in the data they receive. These exception processes drive increased overhead and drag, resulting in operational inefficiencies and work output delays – which in turn may cause further downstream impacts.

Not only do these consumers understand the importance of good data to operational effectiveness, they have an innate understanding of which data attributes are more critical than others within their domain of expertise (e.g. non-stock procurement). This expert perspective, coupled with a profound appreciation for the importance of good data, makes folks with operational experience great data quality candidates. All that remains is making sure these resources have the necessary technical skills to perform the job. In my experience, layering technical proficiency on top of these functional skills is much easier than the other way around.

Lately, we have been witnessing the emergence of the data scientist role. The recognition of this role is helping to elevate data’s importance to the enterprise, along with the need for the “right” skills to harness the business benefits that effective data management and use can deliver. However, to date I have seen a disproportionate emphasis on the more technical aspects of the role. It will take more than advanced analytics, predictive modeling and slick algorithms to advance the cause of data management as a means of optimizing business performance. To be truly effective, these new roles will have to balance the “hard stuff” with the “soft stuff”.


When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 5 of 6)

March 11, 2013

In my last posting I discussed why an understanding of corporate financial concepts is so important to data quality success. In this blog, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.

Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes and streamlined transaction processing. While one can argue whether these applications have lived up to all of the hype, the reality is that they have been successful and are here to stay. As these backbone systems continued to evolve and mature, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry-standard processes, and specialized variants were born (e.g. vertical system solutions). With the widespread adoption of these solutions, the days of custom-building an application to meet the business’s needs have largely disappeared (although exceptions persist to support specialized requirements).

Since these software applications are built to address specific business function needs, their underlying data models are a direct reflection of the functions they enable. Basically, they are process-driven data models. As these applications have evolved, so have their data models – the two are inextricably bound together. When these applications are implemented, the system is designed for each client’s specific use. Along with any industry data standards, this client specific use dictates the data requirements and business rules – the data’s intended use or purpose. However, it is important to note that these data requirements are within the construct of the packaged application’s data model.
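
To make the idea of a process-driven data model concrete, here is a minimal sketch in Python. The entity and field names are hypothetical, invented purely for illustration (they are not taken from any particular package), but they show the essential point: each field exists because a specific process step consumes it.

    # A minimal sketch of a "process-driven" data model: every field is there
    # because a specific process step consumes it. The entity and field names
    # are hypothetical, not drawn from any particular ERP package.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SalesOrderLine:
        material_id: str        # consumed by the availability check
        quantity: float         # drives planning demand and picking
        plant: str              # determines which site fulfills the line
        shipping_point: str     # required before a delivery can be created
        payment_terms: str      # copied to billing; gates invoice creation
        export_license: Optional[str] = None  # only needed for cross-border flows

    # A client-specific implementation layers its own rules on top of this
    # model, e.g. "export_license is mandatory when the ship-to country
    # differs from the plant country" - a rule captured during system design.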

From my perspective, harvesting business rules is one of the more challenging aspects of data quality delivery. Therefore, a firm understanding of how a particular packaged application works, how it can be configured or set up, and how it integrates from module to module is critical to understanding the data’s business rules. Typically, the analysts who design the system also specify the functional data requirements needed to enable and support the intended functional design. Who, then, is in a better position to know which data is important, and what the likely business rules are, than a functional analyst with deep experience designing and implementing a certain packaged application? In contrast, having technical data analysts compile or infer business rules without packaged application experience is largely a trial-and-error approach that is highly inefficient and can contribute to an incomplete understanding of the data’s intended purpose. This deficient set of business rules can lead to the identification of false data defects, or to real data defects that remain undetected. Either way, there can be a profound impact on business performance.
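
To illustrate what harvested business rules might look like once made executable, here is a minimal sketch that expresses a few hypothetical vendor master rules as data quality checks. The rules, field names and values (payment_terms, purch_org, etc.) are invented for illustration; in a real engagement they would come from the application’s configuration and from the functional analysts who designed the implementation.

    # A minimal sketch of business rules expressed as executable data quality
    # checks. The rules and field names are hypothetical; in practice they
    # would be harvested from the packaged application's configuration and
    # from the analysts who designed the implementation.

    def check_vendor_record(vendor: dict) -> list[str]:
        """Return the rule violations found in a single vendor master record."""
        violations = []

        # Rule 1: payment terms must be one of the values configured in the
        # accounts payable module (hypothetical values shown).
        if vendor.get("payment_terms") not in {"NET30", "NET45", "NET60"}:
            violations.append("payment_terms missing or not a configured value")

        # Rule 2: purchasing vendors must reference a purchasing organization.
        if vendor.get("vendor_type") == "PURCHASING" and not vendor.get("purch_org"):
            violations.append("purchasing vendor lacks a purchasing organization")

        # Rule 3: electronic payment requires bank details on file.
        if vendor.get("payment_method") == "EFT" and not vendor.get("bank_account"):
            violations.append("EFT payment method requires bank account details")

        return violations

    # Usage: profile a batch of records and report defects per record.
    vendors = [
        {"vendor_id": "V001", "payment_terms": "NET30", "vendor_type": "PURCHASING",
         "purch_org": "P100", "payment_method": "EFT", "bank_account": "12345678"},
        {"vendor_id": "V002", "payment_terms": "NET90", "vendor_type": "PURCHASING",
         "purch_org": None, "payment_method": "CHECK"},
    ]
    for v in vendors:
        for violation in check_vendor_record(v):
            print(f"{v['vendor_id']}: {violation}")

Notice that a functional analyst could read each rule and immediately confirm or correct it; a technical analyst working without application knowledge would have to discover these constraints through trial and error.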

In my next post, I will examine the benefits of having operational experience when it comes to effectively delivering data quality.


When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 4 of 6)

November 28, 2012

In my previous post I emphasized the importance of demonstrated project management fundamentals as a key enabler of effective data quality delivery. In this blog, I will discuss why an understanding of corporate financial concepts is so important to data quality success.

Despite the continued evolution of data management technologies and the growing awareness of the challenges and promise of data quality, business buy-in is still a major barrier to the widespread adoption of data quality as another lever for achieving operational effectiveness. One of the key reasons for limited adoption is the lack of a clear linkage between data quality and a business’s performance, which is measured in a myriad of ways, from operational metrics to managerial reports to formal KPIs. But, eventually, the enterprise’s performance is summarized in three key financial statements: the income statement, the balance sheet and the cash flow statement. Positioning data quality impacts or improvements in the context of these financial statements begins to “connect the dots”, moving data quality from the abstract to the concrete and from the theoretical to the practical. To illustrate this point, let’s take a look at the impacts and implications of a simple data quality issue like “undeliverable” billing addresses.

If billing address data defects within a company’s billing system prevent the delivery of customer invoices, there will be an increase in the company’s return mail volume. The obvious implication of this additional return mail is an increase in shipping and handling expenses, along with the cost of analyzing and correcting the billing address defects and reprocessing the invoices. While a localized problem like this is typically not material (i.e. significant) from an accounting standpoint, when combined with the inefficiencies and costs associated with other data defects found throughout the enterprise, the impacts can manifest themselves as higher operating expenses on the income statement, which reduces operating income.

Another impact associated with “undeliverable” billing addresses is the delay in invoices reaching the customer. However, the implications of billing delays are less obvious. If the undeliverable billing address issue is large enough, invoice delays will likely have an adverse effect on the company’s Days Sales Outstanding (DSO – the average number of days it takes to collect revenue after a sale has been made), and therefore on the Cash Conversion Cycle (CCC – the time between the outlay of cash and its recovery). Additionally, an increase in DSO can have a negative impact on collections, since there is often a correlation between the age of receivables and write-offs. To counter the increased risk of customer non-payment, the company will have to increase its “reserve for bad A/R” on the balance sheet. The net implication of delayed billing is a weakened cash position as reflected on the company’s cash flow statement. This “tightening” of cash increases the company’s need for, and cost of, borrowing for expansion, inventory, product development, sales and marketing efforts, etc.
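
To make the arithmetic behind this chain of implications concrete, here is a back-of-the-envelope sketch using the standard DSO and CCC formulas described above. All of the figures are illustrative assumptions, not drawn from any real company.

    # A back-of-the-envelope sketch of how undeliverable invoices ripple into
    # DSO and the cash conversion cycle. All figures are illustrative
    # assumptions, not drawn from any real company.

    annual_revenue = 120_000_000       # $ per year
    receivables_baseline = 15_000_000  # average A/R before the defect
    invoices_delayed_pct = 0.04        # share of invoices undeliverable
    avg_delay_days = 20                # extra days to collect a delayed invoice

    # DSO = (accounts receivable / annual revenue) * 365
    dso_baseline = receivables_baseline / annual_revenue * 365

    # Delayed invoices sit in A/R roughly avg_delay_days longer, inflating
    # the average receivables balance.
    extra_receivables = annual_revenue * invoices_delayed_pct * (avg_delay_days / 365)
    dso_impaired = (receivables_baseline + extra_receivables) / annual_revenue * 365

    print(f"Baseline DSO: {dso_baseline:.1f} days")
    print(f"Impaired DSO: {dso_impaired:.1f} days")
    print(f"Cash tied up by the defect: ${extra_receivables:,.0f}")

    # CCC = DIO + DSO - DPO; holding inventory (DIO) and payables (DPO)
    # constant, every extra day of DSO adds a day to the cycle.
    dio, dpo = 40.0, 35.0
    print(f"Baseline CCC: {dio + dso_baseline - dpo:.1f} days")
    print(f"Impaired CCC: {dio + dso_impaired - dpo:.1f} days")

Even with a modest 4% of invoices delayed, a calculation like this puts a dollar figure on the cash tied up by a data defect – exactly the kind of concrete number that helps “connect the dots” for the business.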

While proactively eliminating billing address data defects can sufficiently mitigate these risks to business performance, gaining business buy-in and support is still a necessary first step. However, simply telling the business that they have data quality challenges that need to be corrected, without communicating the impacts and implications in the context of business performance, misses the mark and makes the job more difficult. In my next post, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.


When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 3 of 6)

August 17, 2012

In my previous post I discussed effective stakeholder management and communications as a key enabler of successful data quality delivery. In this blog, I will discuss the importance of demonstrated project management fundamentals.

Large-scale, complex enterprise Data Quality and Data Management efforts are characterized by numerous activities and tasks being performed iteratively by multiple resources, across multiple work streams, with high-volume units of work (i.e. dozens of source systems and data objects, hundreds of tables, thousands of data elements, hundreds of thousands of data defects and millions of records). Without the means to effectively define, plan and manage these efforts, success is nearly impossible.

Thankfully, there are many recognized paths to attaining and developing solid project management skills: certification from an industry-recognized organization like the Project Management Institute (PMI), a degree from an accredited university, formal on-the-job training, etc. Yet one of the more common pitfalls continues to be the belief that smart people, common sense, regular meetings and a spreadsheet are adequate to effectively manage a project or a portfolio. One glance at any project management manual or textbook, or a quick spin through any formal project management framework, should be enough to convince even the biggest skeptic that there is much more to it. Take PMI’s Project Management Body of Knowledge (PMBOK) for example. Within PMBOK there are 9 discrete knowledge areas, covering everything from scoping to quality assurance to risk management to financial controls, and each of these knowledge areas demands specific skills, practices and techniques to be performed effectively.

Without a formal project management structure, rigorous discipline, and appropriately skilled resources, even small work efforts can spiral out of control as seemingly minor issues and challenges quickly accumulate and lead to missed deadlines and blown budgets. And when you multiply these “minor” issues and challenges across an entire portfolio, the impact can be disastrous. So the next time you’re tempted to cut a few corners, think twice before skimping on project management – it will likely come back to bite you.

In my next post I will discuss the importance of grasping basic business financial concepts to enable effective Data Quality delivery. Stay tuned.