When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 5 of 6)

March 11, 2013

In my last post I discussed why an understanding of corporate financial concepts is so important to data quality success. In this post, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.

Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes and streamlined transaction processing. While one can argue whether or not these applications have lived up to all of the hyperbole, the reality is that they have been successful and are here to stay. As these backbone systems continued to evolve and mature, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry standard processes and specialized variants were born (e.g. vertical systems solutions). With the widespread adoption of these solutions, the days of custom building an application to meet the business’s needs have largely disappeared (although exceptions do persist to support specialized needs).

Since these software applications are built to address specific business function needs, their underlying data models are a direct reflection of the functions they enable. Basically, they are process-driven data models. As these applications have evolved, so have their data models – the two are inextricably bound together. When these applications are implemented, the system is designed for each client’s specific use. Along with any industry data standards, this client specific use dictates the data requirements and business rules – the data’s intended use or purpose. However, it is important to note that these data requirements are within the construct of the packaged application’s data model.

From my perspective, harvesting business rules is one of the more challenging aspects of data quality delivery. A firm understanding of how a particular packaged application works, how it can be configured or set up, and how it integrates from module to module is therefore critical to understanding the data’s business rules. Typically, the analysts who design the system also specify the functional data requirements needed to enable and support the intended functional design. Who, then, is better positioned to know which data is important, and what the likely business rules are, than a functional analyst with deep experience designing and implementing a particular packaged application? In contrast, having technical data analysts compile or determine business rules without packaged-application experience is largely a trial-and-error approach that is highly inefficient and can contribute to an incomplete understanding of the data’s intended purpose. That deficient set of business rules can lead to false data defects being identified, or to real defects going undetected. Either way, the impact on business performance can be profound.
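To make this concrete, here is a simplified, hypothetical sketch of how an incomplete rule set produces a false defect. All of the field names, order types, and rules below are illustrative inventions, not drawn from any particular packaged application; the point is only that a rule written without knowledge of the application’s configuration can flag perfectly valid records.

```python
# Hypothetical records as they might appear in a packaged application.
# In this invented configuration, "IMMEDIATE" payment terms are valid
# only for the cash-sale order type -- the kind of configuration detail
# a functional analyst would know, but a technical analyst might miss.

records = [
    {"id": 1, "order_type": "STANDARD", "payment_terms": "NET30"},
    {"id": 2, "order_type": "CASH", "payment_terms": "IMMEDIATE"},
]

def naive_rule(rec):
    """Incomplete rule: assumes NET30 is the only valid payment term."""
    return rec["payment_terms"] == "NET30"

def informed_rule(rec):
    """Rule informed by how the application is actually configured."""
    if rec["order_type"] == "CASH":
        return rec["payment_terms"] == "IMMEDIATE"
    return rec["payment_terms"] == "NET30"

# Records the naive rule flags as defects even though the informed
# rule shows they are valid -- i.e., false data defects.
false_defects = [r["id"] for r in records
                 if not naive_rule(r) and informed_rule(r)]
print(false_defects)  # [2]
```

Record 2 is perfectly valid under the application’s configuration, yet the naive rule reports it as a defect. Multiply this by thousands of rules and millions of records, and the cost of rules written without application knowledge becomes clear.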

In my next post, I will examine the benefits of having operational experience when it comes to effectively delivering data quality.

