Archive for the ‘Data Quality Value Proposition’ Category

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 5 of 6)

March 11, 2013

In my last posting I discussed why an understanding of corporate financial concepts is so important to data quality success. In this blog, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.

Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes and streamlined transaction processing. While one can argue whether these applications have lived up to all of the hyperbole, the reality is that they have been successful and are here to stay. As these backbone systems continued to evolve and mature, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry standard processes, and specialized variants were born (e.g. vertical systems solutions). With the widespread adoption of these solutions, the days of custom-building an application to meet the business’s needs have largely disappeared (although exceptions do persist to support specialized needs).

Since these software applications are built to address specific business function needs, their underlying data models are a direct reflection of the functions they enable. Basically, they are process-driven data models. As these applications have evolved, so have their data models – the two are inextricably bound together. When these applications are implemented, the system is designed for each client’s specific use. Along with any industry data standards, this client specific use dictates the data requirements and business rules – the data’s intended use or purpose. However, it is important to note that these data requirements are within the construct of the packaged application’s data model.

From my perspective, harvesting business rules is one of the more challenging aspects of data quality delivery. Therefore, a firm understanding of how a particular packaged application works, how it can be configured or set up, and how it integrates from module to module is critical when it comes to understanding the data’s business rules. Typically, the analysts who design the system also specify the functional data requirements needed to enable and support the intended functional design. So who is in a better position to know which data is important, and what the likely business rules are, than a functional analyst with deep experience designing and implementing a certain packaged application? In contrast, having technical data analysts compile or determine business rules without packaged application experience is largely a trial-and-error approach that is highly inefficient and can contribute to an incomplete understanding of the data’s intended purpose. This deficient set of business rules can lead to the identification of false data defects, or to data defects that remain undetected. Either way, there can be a profound impact on business performance.
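
To make this concrete, below is a minimal sketch of rule-based defect detection. The records, field names and rules are hypothetical illustrations of the kind of requirements a functional analyst would harvest from a packaged application’s design; they are not taken from any particular ERP or CRM product.

```python
# A minimal sketch of checking records against harvested business rules.
# All field names and rules here are hypothetical illustrations.

records = [
    {"customer_id": "C001", "payment_terms": "NET30", "credit_limit": 50000},
    {"customer_id": "C002", "payment_terms": "NET90", "credit_limit": -100},
    {"customer_id": "", "payment_terms": "NET30", "credit_limit": 10000},
]

# Each rule encodes part of the data's intended use as specified by the
# functional design, e.g. the payment terms actually configured in the system.
rules = {
    "customer_id is required": lambda r: bool(r["customer_id"]),
    "payment_terms in approved list": lambda r: r["payment_terms"] in {"NET30", "NET60"},
    "credit_limit is non-negative": lambda r: r["credit_limit"] >= 0,
}

for record in records:
    failures = [name for name, check in rules.items() if not check(record)]
    if failures:
        print(record["customer_id"] or "<missing id>", "->", failures)
```

An incomplete rule set here would either flag valid records (false defects) or let real defects pass silently, which is exactly the risk described above.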

In my next post, I will examine the benefits of having operational experience when it comes to effectively delivering data quality.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 4 of 6)

November 28, 2012

In my previous post I emphasized the importance of demonstrated project management fundamentals as a key enabler of effective data quality delivery. In this blog, I will discuss why an understanding of corporate financial concepts is so important to data quality success.

Despite the continued evolution of data management technologies and the growing awareness of the challenges and promises of data quality, business buy-in is still a major barrier to the widespread adoption of data quality as another lever to achieve operational effectiveness. One of the key reasons for limited adoption is the lack of a clear linkage between data quality and a business’s performance, which is measured in a myriad of ways, from operational metrics to managerial reports to formal KPIs. But ultimately, the enterprise’s performance is summarized in three key financial statements: the income statement, the balance sheet and the cash flow statement. Positioning data quality impacts or improvements in the context of these financial statements begins to “connect the dots” and moves data quality from the abstract to the concrete and from the theoretical to the practical. To illustrate this point, let’s take a look at the impacts and implications of a simple data quality issue like “undeliverable” billing addresses.

If billing address data defects within a company’s billing system prevent the delivery of customer invoices, there will be an increase in the company’s return mail volume. The obvious implication of this additional return mail is an increase in the company’s shipping and handling expenses, along with the costs of analyzing and correcting the billing address defects and reprocessing the invoices. While a localized problem like this is typically not material from an accounting standpoint (i.e. significant), when combined with the inefficiencies and costs associated with other data defects found throughout the enterprise, the impacts can manifest themselves as higher operating expenses on the income statement, which reduces operating income.

Another impact associated with “undeliverable” billing addresses is the delay in invoices being received by the customer. The implications of billing delays, however, are less obvious. If the undeliverable billing address issue is large enough, invoice delays will likely have an adverse effect on the company’s Days Sales Outstanding (DSO – the average number of days it takes to collect revenue after a sale has been made), and therefore on the Cash Conversion Cycle (CCC – the time between the outlay of cash and its recovery). Additionally, an increase in DSO can have a negative impact on collections, since there is often a correlation between the age of receivables and write-offs. To counter the increased risk of customer non-payment, the company will have to increase its “reserve for bad A/R” on the balance sheet. The net implication of delayed billing is a weakened cash position as reflected on the company’s cash flow statement. This “tightening” of cash increases the company’s need for, and cost of, borrowing for expansion, inventory, product development, sales and marketing efforts, and so on.
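
To put rough numbers on this, here is an illustrative calculation of how delayed invoices flow through to DSO and cash. Every figure below is a hypothetical assumption, chosen only to show the mechanics:

```python
# Illustrative DSO impact of undeliverable invoices; all figures are hypothetical.

annual_credit_sales = 120_000_000        # assumed $120M in annual credit sales
daily_sales = annual_credit_sales / 365

receivables_baseline = 16_000_000        # assumed A/R balance before the defect
dso_baseline = receivables_baseline / daily_sales

undeliverable_rate = 0.02                # assume 2% of invoices go undelivered
average_delay_days = 20                  # assume each is delayed 20 days on average

# The extra receivables sitting in the "delayed" state equal the daily flow of
# undeliverable sales times the average extra days those invoices stay outstanding.
added_receivables = daily_sales * undeliverable_rate * average_delay_days

dso_impacted = (receivables_baseline + added_receivables) / daily_sales
print(f"Baseline DSO:              {dso_baseline:.1f} days")    # ~48.7 days
print(f"DSO with billing defects:  {dso_impacted:.1f} days")    # ~49.1 days
print(f"Cash tied up by the delay: ${added_receivables:,.0f}")  # ~$131,507
```

Even a fraction of a day of DSO on a large revenue base ties up meaningful cash, and that is before adding the increased reserve for bad A/R described above.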

While proactively eliminating billing address data defects can sufficiently mitigate these risks to business performance, gaining business buy-in and support is still a necessary first step. However, simply telling the business that they have data quality challenges that need to be corrected, without communicating the impacts and implications in the context of business performance, misses the mark and makes the job more difficult. In my next post, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 2 of 6)

March 30, 2012

In my first post I introduced the concepts of hard skills and soft skills in the context of data quality delivery, and I identified five soft skills that I think are critical to data quality delivery success, and which are typically underestimated: stakeholder management and communications, financial management, project management, commercial applications and operations. In this blog, I will discuss effective stakeholder management and communications as a key enabler of successful data quality delivery.

As data quality professionals, we are continually competing with others in the enterprise for mind share and wallet share. One of the quickest and easiest ways to distinguish your data quality efforts from other IT and business initiatives is by clearly communicating with your key stakeholders. Why? Because very few are doing it, or doing it well. But it shouldn’t be treated as a one-time event. You need to make a commitment to stakeholder management as a regular practice and make it part of everything you do.

As a discipline, stakeholder management can be extremely nuanced, and there are very sophisticated techniques and approaches. But as a primer, here are the high points. First, who are your stakeholders? The simple answer is “any person or group who benefits from or is impacted by your efforts”. Are there more elaborate definitions? Sure, but this will get you started. Now that we’ve defined a stakeholder, how do we manage and communicate with them? Here’s a high-level process to guide you: 1) identify key stakeholders, 2) determine their roles, 3) assess their perception of data quality or your data quality effort (e.g. advocate, agnostic or adversary), 4) define what you want from them and where they are today (e.g. influence, engagement and support) and 5) figure out what you need to do to get them there. Once you’ve identified and assessed your stakeholders, assign team members to manage specific stakeholders, prepare the necessary communications materials, and measure and refine repeatedly.
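
For teams that want to operationalize these steps, here is one minimal way to sketch a stakeholder register. All names, roles and assessments below are hypothetical placeholders:

```python
# A minimal stakeholder register following the five steps above.
# All entries are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    role: str          # step 2: their role
    perception: str    # step 3: advocate, agnostic or adversary
    target_state: str  # step 4: the influence/engagement/support you need
    next_action: str   # step 5: what you will do to get them there
    owner: str         # team member assigned to manage this stakeholder

register = [
    Stakeholder("J. Rivera", "VP, Sales Operations", "agnostic",
                "active sponsor", "walk through billing-defect cost analysis", "A. Chen"),
    Stakeholder("M. Osei", "Finance Director", "advocate",
                "steering committee member", "monthly DSO impact briefing", "B. Patel"),
]

for s in register:
    print(f"{s.name} ({s.perception} -> {s.target_state}): {s.next_action} [{s.owner}]")
```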

Here are a few practical tips to get you on your way to better stakeholder management:

  1. Make it a priority and stick with it.
  2. For some, it’s not intuitive or comfortable to regularly engage key stakeholders, particularly those a few levels above you in the organization. Challenge yourself and get out of your comfort zone; it will eventually become second nature.
  3. Regularly ask your key stakeholders for feedback. It’s not necessary to have formal surveys, though they are nice. A simple “how am I doing” works just fine.
  4. Become externally focused and adopt a different mindset. Think like your customers.
  5. Always be selling. Celebrate your successes and share them with customers and prospective customers. Nothing sells data quality like data quality success.
  6. Use graphs and charts as aids to communicate visually; a picture is worth a thousand words (see the sketch after this list).
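
On that last tip, a simple trend line often lands better than a table of numbers. Here is a minimal sketch using matplotlib; the data points are hypothetical:

```python
# A minimal sketch of a visual data quality story; the data is hypothetical.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
defect_rate = [4.8, 4.1, 3.5, 2.9, 2.2, 1.7]  # % of records failing business rules

plt.plot(months, defect_rate, marker="o")
plt.title("Billing Address Defect Rate Trend")
plt.xlabel("Month")
plt.ylabel("% of records failing validation")
plt.tight_layout()
plt.show()
```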

Once you make a conscious decision to anticipate and manage stakeholder needs, it will start to become intuitive, and your skills will develop through trial and experience. More importantly, as stakeholder expectations are met and exceeded, demand for your services will increase. That’s the best measure of success.

In the upcoming 3rd part of this series I will discuss the next “soft skill” – demonstrated project management fundamentals. Stay tuned.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 1 of 6)

March 7, 2012

I regularly receive questions regarding the types of skills data quality analysts should have in order to be effective. In my experience, regardless of scope, high-performing data quality analysts need to possess a well-rounded, balanced skill set – one that marries technical “know-how” and aptitude with solid business understanding and acumen. But, far too often, it seems that undue importance is placed on what I call the data quality “hard skills”, which include: a firm grasp of database concepts, hands-on data analysis experience using standard analytical tool sets, expertise with commercial data quality technologies, knowledge of data management best practices and an understanding of the software development life cycle.

Don’t get me wrong, these “hard skills” are an important and necessary ingredient in the successful achievement of data quality delivery goals. However, I believe there is disproportionate emphasis placed on these hard skills, while data quality “soft skills” remain unaddressed or are given considerably less weight. The “soft skills” I am referring to are: effective stakeholder management and communications, an understanding of corporate financial concepts, demonstrated project management fundamentals, knowledge of commercial enterprise applications and experience in an operations capacity. In my opinion, the soft stuff is the hard stuff, and these soft skills are harder to acquire and take considerably longer to develop than hard skills. I further believe that these soft skills are the real enablers of a successful data quality effort, without which the achievement of data quality goals and the advancement of data quality as a means of optimizing business performance are severely compromised.

So how should one go about staffing a data quality team? It used to be rare to find people in the enterprise with well-rounded functional and technical skill sets. While these well-rounded resources are still not commonplace, the continued pressure put on the business analyst community by the outsourcing and offshoring of core technical capabilities has forced business analysts to become more technically self-sufficient. As a result, business analysts are developing much better data analysis skills and database aptitudes. Additionally, over the past several years commercial application providers have done a great job of demystifying their technology and tailoring their solutions’ interfaces, usability and experience to the business analyst community. Consequently, I have found it easier, and have had more success, taking business analyst resources with solid soft skills and layering on hard skills to make highly effective data quality analysts.

Over the next 5 postings, I will explore these data quality soft skills in more detail, discuss their importance as key enablers of successful data quality delivery and share some thoughts on how to attract resources with these skills. Stay tuned.

The Non-Traditional Challenges to Achieving Data Quality Success – Part 5 of 5

February 10, 2012

This is the last posting in a five-part series on the non-traditional challenges to achieving data quality success. In Part 4, I reviewed the Data Quality Perception Gap. In this post, I will conclude with the Delivery Gap.

The Data Quality Delivery Gap

Once we have successfully marketed, positioned and sold our data quality solution, we must shift our focus to delivery. The surest way to secure additional business is to gain customer confidence, and there is no better way to do this than through demonstrated competence. While there are many variables that can impact delivery effectiveness, of those that we can control, skills are the most critical. This brings us to the seventh non-traditional challenge… the belief that successful data quality projects can be delivered with generalists. If the business needs an experienced product manager, they don’t hire a payroll specialist. Then why staff a data quality role with an accountant, or a sales operations manager, or an SQL developer? Yet this is often what happens, and when the effort fails, it is at the expense of data quality’s reputation.

But when it comes to effective data quality delivery, having the right skills is only part of the human resource equation. This brings us to the eighth and final non-traditional challenge… data quality professionals are not equipped with the proper mindset. In a nutshell, data quality success should not be measured by the number of data defects cleansed, but rather by the degree of business improvement achieved. Having data quality professionals who understand and embrace this perspective is integral to any meaningful data quality success. If data quality is to claim its position as a valued business discipline, we need to recognize that there is more to it than just getting a few smart people in a room. Doing otherwise devalues the proposition and diminishes our profession.

If you focus on these data quality gap areas, a better appreciation for data quality’s value proposition will start to take hold in your organization, and the old, traditional challenges will seem… less challenging.

The Non-Traditional Challenges to Achieving Data Quality Success – Part 4 of 5

February 3, 2012

If you haven’t been following along, in my previous posting I reviewed the Data Quality Positioning Gap as a non-traditional challenge to achieving data quality success.  In this post, I will discuss the Perception Gap.

The Data Quality Perception Gap

Assuming we have properly met the challenges associated with customer expectations and solution positioning, chances are that our customers are still not “buying” because of the fifth non-traditional challenge… data quality solutions are perceived as theoretical or impractical. Oftentimes, data quality solutions appear to boil the ocean, and our customers become overwhelmed by the scope and complexity, or rightfully dubious of the likelihood of success. While this may not be readily apparent from the customer’s objections or from their rationale for not proceeding, it is a leading reason why data quality solutions never see the light of day. In order to win our customers’ confidence and their business, we need to be viewed as data quality experts. Proposing solutions that strain credulity calls this expertise into question.

Even if we are successful in proposing a practical and actionable solution, we need to be mindful of the sixth non-traditional challenge… data quality solutions are perceived as creative ways not to address the problem. If the customer’s data quality problem can be solved by targeted data cleansing in the source system, then propose a solution that does just that. If the customer is unsure of the degree and impact of their data quality gaps, then propose a data quality solution to help them quantify and qualify their data quality issues. It is never one size fits all, and there’s no quicker way to lose credibility than to propose a solution that doesn’t address the customer’s needs.
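
As a hypothetical example of helping a customer quantify and qualify their issues, a lightweight column profile puts hard numbers on the gaps before anyone commits to a fix. This sketch assumes pandas and an illustrative two-column dataset:

```python
# A minimal, illustrative column profile; the dataset and checks are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "billing_zip": ["30301", None, "ABCDE", "98101", None],
    "email": ["a@x.com", "b@y.com", "not-an-email", None, "c@z.com"],
})

profile = pd.DataFrame({
    "null_rate": df.isna().mean(),    # share of missing values per column
    "distinct_values": df.nunique(),  # cardinality per column
})

# Share of non-null values failing a simple format check, per column.
profile["format_failure_rate"] = [
    (~df["billing_zip"].dropna().str.fullmatch(r"\d{5}")).mean(),
    (~df["email"].dropna().str.contains("@")).mean(),
]

print(profile)
```

A one-page profile like this frames the conversation around the customer’s actual gaps rather than a boil-the-ocean program.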

My next posting will conclude the series on non-traditional data quality challenges.  Until then….

The Non-Traditional Challenges to Achieving Data Quality Success – Part 3 of 5

January 27, 2012

In my last posting, I discussed the Data Quality Expectations Gap and considerations for overcoming it.  In this post, I will cover the Positioning Gap.

The Data Quality Positioning Gap

Once we have identified our customers, determined what motivates them and defined the offer, we need to market, or “position”, our solution. And it goes without saying that we need to do this in the context of the problem we are trying to solve. Enter the third non-traditional challenge… data quality is incorrectly positioned as an end rather than a means. More often than not, this is the direct result of not understanding customer motivators, as outlined in the previous posting on the Expectations Gap. As a result, we erroneously conclude that the customer is looking for data quality, and we further perpetuate the mismatch between expectation and message.

However, even if we have done a good job understanding our customers and their expectations, and realize that data quality is an enabler and not the real business goal, we still have to overcome the fourth non-traditional challenge… data quality is primarily positioned as a technology and not a business solution. Oftentimes I see data quality professionals leading with technical features and functions instead of business benefits. For example, terms like entity resolution, standardization, normalization, enrichment and domain integrity all ring hollow if they are not positioned relative to the business problem our customers are attempting to solve. Let’s be honest, a VP of Sales and Marketing wouldn’t recognize domain integrity in product data if it jumped up and bit her, nor should she.

In my next posting, I will discuss the Data Quality Perception Gap.  Until next time.