Barbara Latulippe is enterprise data architect at Smith & Nephew, a global supplier of medical devices. Rob Barry interviewed Barbara about her use of MDM. You can download this and other interviews about SOA, BPM, and MDM as a podcast.
Rob Barry, SearchSOA.com: So is Smith & Nephew taking a more batch-oriented approach to MDM, or a real-time, SOA-oriented approach?
Barbara Latulippe, Smith & Nephew: Currently we're taking more of a batch approach to MDM. But throughout our landscape and our strategy, we will be looking to move to a real-time, SOA- and business-process-oriented roadmap.
SearchSOA: And what brought about the need for master data?
Latulippe: We had different sources of information. It was manually intensive to get some of the information out. We lacked standards across our environment and there were a lot of business process breakdowns.
SearchSOA: And where would you say Smith & Nephew is in terms of its MDM maturity now?
Latulippe: We've had a four-year roadmap for our maturity level. So I'd say we started off with basically very few standards across the enterprise. And we've built up data governance and put in place the entire MDM framework with the vision, the strategy and, certainly, the metrics and business governance—which is critical to any project. So I believe at this point we're hoping to achieve a lean, service-oriented architecture by 2012, completing the full maturity roadmap.
SearchSOA: Now you say that currently it's more of a batch-oriented MDM system. What kind of frequency do you have with the updates?
Latulippe: Currently we support three major initiatives. One is data synchronization between multiple SAP clients, and that runs nightly. Then we have data quality and data profiling, where we have scorecards for our business users who are trying to be a little proactive in cleaning up that data. Those run weekly. And then we support multiple projects, and certainly those have much larger volumes and more critical needs. Those run basically on demand and, I'd say, every few days.
SearchSOA: So when it comes time to make updates to these various systems, what's the speed of updating the master data set?
Latulippe: From our recent data migration project, we timed that our current programs using [Legacy System Migration Workbench, an SAP-based tool] were taking 12-plus hours. When we moved to Informatica data integration, it took less than half an hour to run that same set of data.
SearchSOA: And is volume ever an issue in maintaining master data?
Latulippe: Currently, while we're in batch mode, volume is not an issue for us at all. The products have been fast and have been able to move large volumes of data in support of the data migration projects.
SearchSOA: So what have the biggest challenges been?
Latulippe: The biggest challenge has been getting business collaboration. It's very business-rules intensive. Another challenge has been getting the business ownership to drive the levels of standards that you need across an enterprise. And initially we had no tools. So it was challenging just to make a tool selection that the business could use—it was important to select tools that were business-friendly.
SearchSOA: You mentioned that issue of ownership. I know often different areas of an enterprise will define their data differently. How did you approach creating common definitions for the data?
Latulippe: We did use Informatica to do some data profiling, and based on that we were able to generate a data dictionary. From there we got the three business units together, and we had to roll up our sleeves and do some workshops, facilitated by myself, to try to drive agreement on all the enterprise data objects. For the most part there was a lot of synergy and a willingness to go to a common definition. Where there were issues, we really looked at whether we needed a new data field. But for the most part we had a high level of success driving commonality across the enterprise for those key fields. And as a result, those are measured through our scorecards. The business then takes that scorecard information and makes sure it's clean on a weekly basis.
SearchSOA: And when was the shift to Informatica?
Latulippe: I'd say that was late 2008 and through 2009. We continue to want to build out our portfolio of services with Informatica.
SearchSOA: And what kind of results has Smith & Nephew realized from embracing master data?
Latulippe: Right now we have enterprise collaboration. We have a lot of what I would call proprietary knowledge content with business rules. The three [business units] have been really excited about improving their data cleanup. We've been able to avoid data collisions as well as identify at least $1.4 million in actual and missed opportunity costs. And I can say today the businesses are pretty proud that they have been able to achieve at least 95% data quality in some key areas such as products and customers. Right now we're tackling vendors.
SearchSOA: And finally can you offer any best practices for others who might just be starting to look at master data for their own enterprise?
Latulippe: I'd say that the data structures need to reflect the way you want to do business today. We did struggle with a lot of past issues. So what we really tried to focus on was the business benefit and where we wanted to go. And we also tried to structure the data around how the business organizes information. We did a lot of workshops and reengineering once we were tackling some of these data structures and standards. I'd also say that we wanted to go to common tools across the enterprise so we could leverage them from both an IT and business perspective. And then data ownership and incentives are critical to data quality, accuracy and completeness. So we invested in data quality analyst roles. We had the business present their results back to the senior leadership team and we're looking to put data quality in individual MBOs.
This was first published in April 2010.