Archive for the ‘Data Quality’ Category

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 6 of 6)

April 5, 2013

In my previous blog I explored the importance of a firm understanding of commercial packaged applications to data quality success. In this final post, I will examine the benefits of having operational experience as a key enabler of effective data quality delivery.

As with understanding packaged application functionality, hands-on experience in an operational capacity (procurement, production planning, order fulfillment, etc.) provides an invaluable perspective on, and appreciation for, the intended purpose of data and the processes it enables. Since data permeates every facet of the enterprise, it should be no surprise that operational roles work very closely with data – typically as both consumer and producer. Regardless of which end of the data supply chain they sit on, folks in these roles have first-hand experience with the impacts of bad data on day-to-day operations. For example, consumers of bad data often end up designing elaborate workarounds to compensate for deficiencies in the data they receive. These exception processes drive increased overhead and drag, resulting in operational inefficiencies and work output delays – which in turn may cause further downstream impacts.

Not only do these consumers understand the importance of good data to operational effectiveness, they also have an innate sense of which data attributes are more critical than others within their domain of expertise (e.g. non-stock procurement). This expert perspective, coupled with a profound appreciation for the importance of good data, makes folks with operational experience great data quality candidates. All that remains is making sure these resources have the necessary technical skills to perform the job. In my experience, layering technical proficiency onto these functional skills is much easier than the other way around.

Lately, we have been witnessing the emergence of the data scientist role. The recognition of this role is helping to elevate data’s importance to the enterprise, along with the need for the “right” skills to harness the business benefits that effective data management and use can deliver. However, to date I have seen a disproportionate emphasis on the more technical aspects of the role. It will take more than advanced analytics, predictive modeling and slick algorithms to advance the cause of data management as a means of optimizing business performance. To be truly effective, these new roles will have to balance the “hard stuff” with the “soft stuff”.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 5 of 6)

March 11, 2013

In my last posting I discussed why an understanding of corporate financial concepts is so important to data quality success. In this blog, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.

Packaged applications for ERP, CRM, MRP, HCM, etc. were first introduced decades ago to provide tightly integrated business management functions, standardized processes and streamlined transaction processing. While one can argue whether these applications have lived up to all of the hype, the reality is that they have been successful and are here to stay. As these backbone systems continued to evolve and mature, lessons learned from thousands of implementations were incorporated into the model solutions as best practices. These best practices spawned industry-standard processes, and specialized variants were born (e.g. vertical systems solutions). With the widespread adoption of these solutions, the days of custom-building an application to meet the business’s needs have largely disappeared (although exceptions persist to support specialized needs).

Since these software applications are built to address specific business function needs, their underlying data models are a direct reflection of the functions they enable – they are, in essence, process-driven data models. As these applications have evolved, so have their data models; the two are inextricably bound together. When these applications are implemented, the system is designed for each client’s specific use. Along with any industry data standards, this client-specific use dictates the data requirements and business rules – the data’s intended use or purpose. However, it is important to note that these data requirements sit within the construct of the packaged application’s data model.

From my perspective, harvesting business rules is one of the more challenging aspects of data quality delivery. Therefore, a firm understanding of how a particular packaged application works, how it can be configured or set up, and how it integrates from module to module is critical to understanding the data’s business rules. Typically, the analysts who design the system also specify the functional data requirements needed to enable and support the intended functional design. So who is in a better position to know which data is important, and what the likely business rules are, than a functional analyst with deep experience designing and implementing a particular packaged application? In contrast, having technical data analysts compile or determine business rules without packaged application experience is largely a trial-and-error approach that is highly inefficient and can contribute to an incomplete understanding of the data’s intended purpose. A deficient set of business rules can lead to the identification of false data defects, or to data defects that remain undetected. Either way, there can be a profound impact on business performance.
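
To make this concrete, here is a minimal sketch, in Python, of what configuration-derived business rules might look like. The field names, configured value sets and the cross-field rule are hypothetical illustrations, not any vendor’s actual data model:

```python
# A sketch only: field and configuration names are illustrative.

# Allowed values as they might be configured in a given ERP instance
CONFIGURED_PAYMENT_TERMS = {"NET30", "NET45", "NET60"}
CONFIGURED_PLANTS = {"1000", "1100", "2000"}

def validate_purchase_order(po: dict) -> list:
    """Return the business rule violations for one purchase order record."""
    defects = []
    if po.get("payment_terms") not in CONFIGURED_PAYMENT_TERMS:
        defects.append("payment_terms not in configured value set")
    if po.get("plant") not in CONFIGURED_PLANTS:
        defects.append("plant is not a configured plant")
    # A cross-field rule a functional analyst would know from the module
    # design: assume non-stock items require an account assignment
    if po.get("item_category") == "NON_STOCK" and not po.get("account_assignment"):
        defects.append("non-stock item missing account assignment")
    return defects

po = {"payment_terms": "NET90", "plant": "1000",
      "item_category": "NON_STOCK", "account_assignment": ""}
print(validate_purchase_order(po))
# ['payment_terms not in configured value set',
#  'non-stock item missing account assignment']
```

The value sets and the cross-field rule come straight from how the application was configured and designed – exactly the knowledge a functional analyst carries, and which a technical analyst without packaged application experience would have to rediscover by trial and error.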

In my next post, I will examine the benefits of having operational experience when it comes to effectively delivering data quality.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 4 of 6)

November 28, 2012

In my previous post I emphasized the importance of demonstrated project management fundamentals as a key enabler of effective data quality delivery. In this blog, I will discuss why an understanding of corporate financial concepts is so important to data quality success.

Despite the continued evolution of data management technologies and the growing awareness of the challenges and promise of data quality, business buy-in is still a major barrier to the widespread adoption of data quality as another lever for achieving operational effectiveness. One of the key reasons for limited adoption is the lack of a clear linkage between data quality and a business’s performance, which is measured in a myriad of ways, from operational metrics to managerial reports to formal KPIs. But, eventually, the enterprise’s performance is summarized in three key financial statements: the income statement, the balance sheet and the cash flow statement. Positioning data quality impacts or improvements in the context of these financial statements begins to “connect the dots” and moves data quality from the abstract to the concrete, and from the theoretical to the practical. To illustrate this point, let’s take a look at the impacts and implications of a simple data quality issue like “undeliverable” billing addresses.

If billing address data defects within a company’s billing system prevent the delivery of customer invoices, there will be an increase in the company’s return mail volume. The obvious implication of this additional return mail is an increase in shipping and handling expenses, along with the cost of analyzing and correcting the billing address defects and reprocessing the invoices. While a localized problem like this is typically not material (i.e. significant) from an accounting standpoint, when combined with the inefficiencies and costs associated with other data defects found throughout the enterprise, the impacts can manifest themselves as higher operating expenses on the income statement, which reduces operating income.

Another impact associated with “undeliverable” billing addresses is the delay in invoices reaching the customer. However, the implications of billing delays are less obvious. If the undeliverable billing address issue is large enough, invoice delays will likely have an adverse effect on the company’s Days Sales Outstanding (DSO – the average number of days it takes to collect revenue after a sale has been made), and therefore on the Cash Conversion Cycle (CCC – the time between the outlay of cash and its recovery). Additionally, an increase in DSO can have a negative impact on collections, since there is often a correlation between the age of receivables and write-offs. To counter the increased risk of customer non-payment, the company will have to increase its “reserve for bad A/R” on the balance sheet. The net implication of delayed billing is a weakened cash position, as reflected on the company’s cash flow statement. This “tightening” of cash increases the company’s need for, and cost of, borrowing for expansion, inventory, product development, sales and marketing efforts, etc.
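
To make the arithmetic concrete, here is a minimal sketch, in Python, that ties both effects together using the standard DSO formula. Every figure below is a purely illustrative assumption, not data from any actual company:

```python
# Illustrative figures only – chosen to make the math easy to follow.

# Income statement effect: return mail drives added handling expense
undeliverable_invoices = 2_000        # per month
handling_cost_per_invoice = 12.50     # research, correction, reprint, re-mail
extra_expense = undeliverable_invoices * handling_cost_per_invoice
print(f"Added monthly operating expense: ${extra_expense:,.0f}")  # $25,000

# Cash flow effect: the standard DSO formula is
# (accounts receivable / credit sales) * days in period
receivables, credit_sales, days = 4_100_000, 12_000_000, 90
dso = receivables / credit_sales * days
print(f"Baseline DSO: {dso:.1f} days")  # roughly 31 days

# If 5% of invoices are delayed ~20 days while addresses are corrected,
# average collection time rises by roughly the weighted delay
delayed_share, delay_days = 0.05, 20
print(f"Approximate DSO impact: +{delayed_share * delay_days:.1f} days")
```

Even with these modest assumptions, a recurring expense line and an extra day of DSO are stated in financial statement terms the business recognizes – which is precisely what “connecting the dots” looks like.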

While proactively eliminating billing address data defects can sufficiently mitigate these risks to business performance, gaining business buy-in and support is still a necessary first step. However, simply telling the business that they have data quality challenges that need to be corrected, without communicating the impacts and implications in the context of business performance, misses the mark and makes the job more difficult. In my next post, I will examine knowledge of commercial enterprise applications as a key enabler of effective data quality delivery.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 3 of 6)

August 17, 2012

In my previous post I discussed effective stakeholder management and communications as a key enabler of successful data quality delivery. In this blog, I will discuss the importance of demonstrated project management fundamentals.

Large-scale, complex enterprise Data Quality and Data Management efforts are characterized by numerous activities and tasks performed iteratively by multiple resources, across multiple work streams, with high-volume units of work (i.e. dozens of source systems and data objects, hundreds of tables, thousands of data elements, hundreds of thousands of data defects and millions of records). Without the means to effectively define, plan and manage these efforts, success is nearly impossible.

Thankfully, there are many recognized paths to attaining and developing solid project management skills: certification from an industry-recognized organization like the Project Management Institute (PMI), a degree from an accredited university, formal on-the-job training, etc. Yet one of the more common pitfalls continues to be the belief that smart people, common sense, regular meetings and a spreadsheet are adequate to effectively manage a project or a portfolio. One glance at any project management manual or textbook, or a quick spin through any formal project management framework, should be all it takes to convince even the biggest skeptic that there is much more to it. Take PMI’s Project Management Body of Knowledge (PMBOK) for example: it defines 9 discrete knowledge areas covering everything from scoping to quality assurance to risk management to financial controls, and each of these knowledge areas demands specific skills, practices and techniques to be performed effectively.

Without a formal project management structure, rigorous discipline and appropriately skilled resources, even small work efforts can spiral out of control as seemingly minor issues and challenges accumulate into missed deadlines and blown budgets. And when you multiply these “minor” issues and challenges across an entire portfolio, the impact can be disastrous. So the next time you’re tempted to cut a few corners, think twice before skimping on project management – it will likely come back to bite you.

In my next post I will discuss the importance of grasping basic business financial concepts to enable effective Data Quality delivery. Stay tuned.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 2 of 6)

March 30, 2012

In my first post I introduced the concepts of hard skills and soft skills in the context of data quality delivery, and identified 5 soft skills that I think are critical to data quality delivery success, yet are typically underestimated: stakeholder management and communications, financial management, project management, commercial applications and operations. In this blog, I will discuss effective stakeholder management and communications as a key enabler of successful data quality delivery.

As data quality professionals, we are continually competing with others in the enterprise for mind share and wallet share. One of the quickest and easiest ways to distinguish your data quality efforts from other IT and business initiatives is to communicate clearly with your key stakeholders. Why? Because very few are doing it, or doing it well. But it shouldn’t be treated as a one-time event. You need to make a commitment to stakeholder management as a regular practice and make it part of everything you do.

As a discipline, stakeholder management can be extremely nuanced, and there are very sophisticated techniques and approaches. But as a primer, here are the high points. First, who are your stakeholders? The simple answer is “any person or group who benefits from or is impacted by your efforts”. Are there more elaborate definitions? Sure, but this will get you started. Now that we’ve defined a stakeholder, how do we manage and communicate with them? Here’s a high-level process to guide you:

  1. Identify your key stakeholders.
  2. Determine their roles.
  3. Assess their perception of data quality or of your data quality effort (e.g. advocate, agnostic or adversary).
  4. Define what you want from them and where they stand today (e.g. influence, engagement and support).
  5. Figure out what you need to do to get them there.

Once you’ve identified and assessed your stakeholders, assign team members to manage specific stakeholders, prepare the necessary communications materials, and measure and refine repeatedly.
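
For those who want to operationalize these steps, they map naturally onto a simple stakeholder register. The structure and field names below are my own illustration in Python, not part of any formal methodology:

```python
# A lightweight, hypothetical stakeholder register mirroring steps 1-5.
from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    role: str           # step 2: their role
    perception: str     # step 3: "advocate", "agnostic" or "adversary"
    target_state: str   # step 4: where you need them to be
    next_action: str    # step 5: what it takes to get them there
    owner: str          # team member managing the relationship

register = [
    Stakeholder("CFO", "executive sponsor", "agnostic",
                "advocate", "brief on billing-defect cost impact", "me"),
    Stakeholder("AR manager", "data consumer", "advocate",
                "advocate", "share monthly defect trend chart", "analyst"),
]

for s in register:
    if s.perception != s.target_state:
        print(f"{s.name}: {s.perception} -> {s.target_state} ({s.next_action})")
```

Even a spreadsheet with these same columns works; the value is in recording the gap between current perception and target state, and assigning someone to close it.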

Here are a few practical tips to get you on your way to better stakeholder management:

  1. Make it a priority and stick with it.
  2. For some, it’s not intuitive or comfortable to regularly engage key stakeholders, particularly those a few levels above you in the organization. Challenge yourself and get out of your comfort zone; it will eventually become second nature.
  3. Regularly ask your key stakeholders for feedback. It’s not necessary to have formal surveys, though they are nice. A simple “how am I doing” works just fine.
  4. Become externally focused and adopt a different mindset. Think like your customers.
  5. Always be selling. Celebrate your successes and share them with customers and prospective customers. Nothing sells data quality like data quality success.
  6. Use graphs and charts as aids to communicate visually. A picture is worth a thousand words.

Once you make a conscious decision to anticipate and manage stakeholder needs, it will start to become intuitive, and your skills will develop through trial and experience. More importantly, as stakeholder expectations are met and exceeded, demand for your services will increase. That’s the best measure of success.

In the upcoming 3rd part of this series I will discuss the next “soft skill” – demonstrated project management fundamentals. Stay tuned.

When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 1 of 6)

March 7, 2012

I regularly receive questions about the types of skills data quality analysts should have in order to be effective. In my experience, regardless of scope, high-performing data quality analysts need a well-rounded, balanced skill set – one that marries technical “know how” and aptitude with solid business understanding and acumen. But far too often, undue importance is placed on what I call the data quality “hard skills”: a firm grasp of database concepts, hands-on data analysis experience using standard analytical tool sets, expertise with commercial data quality technologies, knowledge of data management best practices and an understanding of the software development life cycle.

Don’t get me wrong, these “hard skills” are an important and necessary ingredient in the successful achievement of data quality delivery goals. However, I believe there is disproportionate emphasis placed on them, while the data quality “soft skills” remain unaddressed or are given considerably less weight. The soft skills I am referring to are: effective stakeholder management and communications, an understanding of corporate financial concepts, demonstrated project management fundamentals, knowledge of commercial enterprise applications and experience in an operations capacity. In my opinion, the soft stuff is the hard stuff: these soft skills are harder to acquire and take considerably longer to develop than hard skills. I further believe that these soft skills are the real enablers of a successful data quality effort, without which the achievement of data quality goals and the advancement of data quality as a means of optimizing business performance are severely compromised.

So how should one go about staffing a data quality team? It used to be rare to find people in the enterprise with well-rounded functional and technical skill sets. While such well-rounded resources are still not commonplace, the continued pressure put on the business analyst community by the outsourcing and offshoring of core technical capabilities has forced business analysts to become more technically self-sufficient. As a result, business analysts are developing much better data analysis skills and database aptitude. Additionally, over the past several years commercial application providers have done a great job of demystifying their technology and tailoring their solutions’ interfaces, usability and experience to the business analyst community. Consequently, I have found it easier, and have had more success, taking business analysts with solid soft skills and layering on the hard skills to make highly effective data quality analysts.

Over the next 5 postings, I will explore these data quality soft skills in more detail, discuss their importance as key enablers of successful data quality delivery and share some thoughts on how to attract resources with these skills. Stay tuned.

The Non-Traditional Challenges to Achieving Data Quality Success – Part 5 of 5

February 10, 2012

This is the last posting in a 5-part series on the non-traditional challenges to achieving data quality success. In Part 4, I reviewed the Data Quality Perception Gap. In this post, I will conclude with the Delivery Gap.

The Data Quality Delivery Gap

Once we have successfully marketed, positioned and sold our data quality solution, we must shift our focus to delivery. The surest way to secure additional business is to gain customer confidence, and there is no better way to do this than through demonstrated competence. While there are many variables that can impact delivery effectiveness, of those that we can control, skills are the most critical. This brings us to the seventh non-traditional challenge… the belief that successful data quality projects can be delivered with generalists. If the business needs an experienced product manager, it doesn’t hire a payroll specialist. Why, then, staff a data quality role with an accountant, a sales operations manager, or an SQL developer? Yet this is often what happens, and when the effort fails it is at the expense of data quality’s reputation.

But when it comes to effective data quality delivery, having the right skills is only part of the human resource equation. Which brings us to the eighth and final non-traditional challenge… data quality professionals are not equipped with the proper mindset. In a nutshell, data quality success should not be measured by the number of data defects cleansed, but rather by the degree of business improvement achieved. Having data quality professionals who understand and embrace this perspective is integral to any meaningful data quality success. If data quality is to claim its position as a valued business discipline, we need to recognize that there is more to it than just getting a few smart people in a room. Doing otherwise devalues the proposition and diminishes our profession.

By focusing on these data quality gap areas, you will help a better appreciation for data quality’s value proposition take hold in your organization, and the old, traditional challenges will seem… less challenging.