Archive for the ‘Data Governance’ Category


When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 2 of 6)

March 30, 2012

In my first post I introduced the concepts of hard skills and soft skills in the context of data quality delivery, and I identified 5 soft skills that I think are highly critical to data quality delivery success, and which are typically underestimated: stakeholder management and communications, financial management, project management, commercial applications and operations. In this post, I will discuss effective stakeholder management and communications as a key enabler of successful data quality delivery.

As data quality professionals, we are continually competing with others in the enterprise for mind share and wallet share. One of the quickest and easiest ways to distinguish your data quality efforts from other IT and business initiatives is by clearly communicating with your key stakeholders. Why? Because very few are doing it, or doing it well. But it shouldn’t be treated as a one-time event. You need to make a commitment to stakeholder management as a regular practice and make it part of everything you do.

As a discipline, stakeholder management can be extremely nuanced and there are very sophisticated techniques and approaches. But as a primer, here are the high points. First, who are your stakeholders? The simple answer is “any person or group who benefits from or is impacted by your efforts”. Are there more elaborate definitions? Sure, but this will get you started. Now that we’ve defined a stakeholder, how do we manage and communicate with them? Here’s a high-level process to guide you: 1) Identify key stakeholders, 2) Determine their roles, 3) Assess their perception of data quality or your data quality effort (e.g. advocate, agnostic or adversary), 4) Define what you want from them and where they are today (e.g. influence, engagement and support) and 5) Figure out what you need to do to get them there. Once you’ve identified and assessed your stakeholders, assign team members to manage specific stakeholders, prepare the necessary communications materials, and measure and refine repeatedly.

Here are a few practical tips to get you on your way to better stakeholder management:

  1. Make it a priority and stick with it.
  2. For some, it’s not intuitive or comfortable to regularly engage key stakeholders, particularly those a few levels above you in the organization. Challenge yourself and get out of your comfort zone, it will eventually become second nature.
  3. Regularly ask your key stakeholders for feedback. It’s not necessary to have formal surveys, though they are nice. A simple “how am I doing” works just fine.
  4. Become externally focused and adopt a different mindset. Think like your customers.
  5. Always be selling. Celebrate your successes and share them with customers and prospective customers. Nothing sells data quality like data quality success.
  6. Use graphs and charts as aids to communicate visually. A picture is worth a thousand words.

Once you make a conscious decision to anticipate and manage stakeholder needs, it will start to become intuitive and your skills will develop through trial and experience. More importantly, as stakeholder expectations are met and exceeded, demand for your services will increase. That’s the best measure of success.

In the upcoming 3rd part of this series I will discuss the next “soft skill” – demonstrated project management fundamentals. Stay tuned.


When it comes to Data Quality Delivery, the Soft Stuff is the Hard Stuff (Part 1 of 6)

March 7, 2012

I regularly receive questions regarding the types of skills data quality analysts should have in order to be effective. In my experience, regardless of scope, high performing data quality analysts need to possess a well-rounded, balanced skill set – one that marries technical “know how” and aptitude with a solid business understanding and acumen. But, far too often, it seems that undue importance is placed on what I call the data quality “hard skills”, which include: a firm grasp of database concepts, hands-on data analysis experience using standard analytical tool sets, expertise with commercial data quality technologies, knowledge of data management best practices and an understanding of the software development life cycle.

Don’t get me wrong, these “hard skills” are an important and necessary ingredient in the successful achievement of data quality delivery goals. However, I believe there is disproportionate emphasis placed on these hard skills, while data quality “soft skills” remain unaddressed or are given considerably less weight. The “soft skills” I am referring to are: effective stakeholder management and communications, an understanding of corporate financial concepts, demonstrated project management fundamentals, knowledge of commercial enterprise applications and experience in an operations capacity. In my opinion, the soft stuff is the hard stuff, and these soft skills are harder to acquire and take considerably longer to develop than hard skills. I further believe that these soft skills are the real enabler of a successful data quality effort, without which the achievement of data quality goals and the advancement of data quality as a means of optimizing business performance are severely compromised.

So how should one go about staffing a data quality team? It used to be that finding people in the enterprise with well-rounded functional and technical skill sets was rare. While these well-rounded resources are still not commonplace, the continued pressure put on the business analyst community by the outsourcing and offshoring of core technical capabilities has forced business analysts to become more technically self-sufficient. As a result, business analysts are developing much better data analysis skills and database aptitudes. Additionally, over the past several years commercial application providers have done a great job of demystifying their technology and tailoring their solutions’ interface, usability and experience to the business analyst community. Consequently, I have found it easier and have had more success taking business analyst resources with solid soft skills and layering on hard skills to make a highly effective data quality analyst.

Over the next 5 postings, I will explore these data quality soft skills in more detail, discuss their importance as key enablers of successful data quality delivery and share some thoughts on how to attract resources with these skills. Stay tuned.


The Non-Traditional Challenges to Achieving Data Quality Success – Part 3 of 5

January 27, 2012

In my last posting, I discussed the Data Quality Expectations Gap and considerations for overcoming it.  In this post, I will cover the Positioning Gap.

The Data Quality Positioning Gap

Once we have identified our customers, determined what motivates them and defined the offer, we need to market or “position” our solution. And it goes without saying that we need to do this within the context of the problem we are trying to solve. Enter the third non-traditional challenge: data quality is incorrectly positioned as an end rather than a means. More often than not, this is the direct result of not understanding customer motivators, as outlined in the previous section on the Expectations Gap. As a result, we erroneously conclude that the customer is looking for data quality and we further perpetuate the mismatch between expectation and message.

However, if we have done a good job understanding our customers and their expectations, and even realize that data quality is an enabler and not the real business goal, we still have to overcome the fourth non-traditional challenge: data quality is primarily positioned as a technology and not a business solution. Too often I see data quality professionals leading with technical features and functions instead of business benefits. For example, terms like entity resolution, standardization, normalization, enrichment and domain integrity all ring hollow if they are not positioned relative to the business problem our customers are attempting to solve. Let’s be honest, a VP of Sales and Marketing wouldn’t recognize domain integrity in product data if it jumped up and bit her, nor should she.

In my next posting, I will discuss the Data Quality Perception Gap.  Until next time.


The Non-Traditional Challenges to Achieving Data Quality Success – Part 1 of 5

January 11, 2012

As Data Quality professionals, it seems like we are continually confronted with many of the same barriers that we faced years ago when it comes to positioning and achieving the “promise” of data quality. So why has data quality been slow in gaining traction as a valued and integral part of the business operating model? What can we do to overcome this inertia and advance the data quality culture?

Before we can attempt to answer these questions, we need to be able to recognize the challenges that are impeding progress. If you ask any data quality professional to identify the key data quality challenges that they face, the list will invariably include: lack of sponsorship, unclear ownership, environment complexity, high volumes, limited documentation, prohibitive cost, insufficient resources, inadequate tools, etc. These are the “traditional” challenges that most everyone cites and they are certainly real.

However, over the course of the last 10 years I have identified 8 “non-traditional” challenges that I believe present an even greater barrier to data quality success. I have classified these 8 challenges into 4 distinct data quality gap areas: Expectations, Positioning, Perception and Delivery. Over the next 4 postings, I will discuss each of these challenges and provide some considerations for overcoming them. Stay tuned!


For Successful Data Governance – Start Small

December 10, 2011

Two of the more common questions that arise when trying to effectively deploy Data Governance are: “Where do I begin?” and “What business areas should I include?” If you start too narrowly, the value and credibility of the effort is questioned. Be too aggressive, and delivery risk and scalability become a problem. As usual, success comes down to defining and managing scope. More often than not, however, it is prudent to err on the small side, and here’s why. Most organizations are more comfortable making smaller decisions, the likelihood of success is greater, and small failures are less costly (both in capital and in reputation). Besides, if you start small and are successful, you can always grow. But if you go big and fail, your odds of a second chance are diminished.

So how do you keep the scale down, but still deliver something meaningful? Follow the money, because it’s all about value. The quote-to-cash and procure-to-pay lifecycles are rife with opportunities. Go and speak with business and operational leaders. Familiarize yourself with the “customer’s” strategies and objectives. Get your hands on the IT project portfolio. These channels are excellent sources of information about what problems the business is trying to solve. Once you’ve captured a handful of opportunities, do some basic analysis and fact gathering. Then short-list the 2 or 3 you think provide the most value and the best likelihood of delivery success. Here’s your starting point. Good luck!


If You Build It, Will They Come… If So, For How Long?

November 27, 2011

It should be no surprise to anyone, but there is much talk in the industry today regarding “data” and the management or control of it. To that end, commonly used terms such as Master Data Management (MDM) and Data Governance are sometimes used interchangeably and other times have wildly different definitions and applications. Whether or not the industry should standardize on common terms and definitions is another subject altogether – and one that won’t be resolved any time soon. But, regardless of what it’s called, the enterprise’s desire to better manage and control data is a hot topic, and deservedly so. But where does that leave Data Quality?

While not typically discussed in detail, it is often implied (and almost always inferred) that the successful deployment of MDM or Data Governance will achieve desired levels of data quality or data integrity. That’s just not the case. The arguments or analogies that are offered usually revolve around the concept of prevention. As this concept relates to the creation and maintenance of master data in the post-MDM environment, it is fundamentally sound. However, the new tools, processes, roles and responsibilities that have been developed and implemented to better manage and control data apply primarily to new data. What about the significant volume of master data that existed in the legacy environment prior to the deployment of MDM? At the time of MDM go-live, this legacy-created master data comprises the vast majority, if not all, of the relevant master data – it drives the enterprise. In order to fully realize the MDM value proposition, there also needs to be a comprehensive undertaking to bring legacy-created master data up to acceptable quality levels. Deploying data management without adequately addressing legacy data readiness is where the full promise of data management falls short.


The Northfield Point of View

November 21, 2011

Welcome to the Northfield POV. On a regular basis we will be bringing you our unique insight and perspective on a wide range of topics that are related to the rapidly growing and evolving fields of Data Quality, Data Management and Data Governance.