SAPPHIRE '06 PARIS
SAP NetWeaver MDM: Empowering Business with a Single Version of the Truth
Sunil Gupta, SAP
Rod Hall, Senior Project Manager, Ericsson
Gary Biggar, Data Quality Manager, Diageo
Sunil Gupta, SAP
Quite an audience here. Ladies and gentlemen, welcome. My name is Sunil Gupta. I drive solution marketing at SAP. With me here are my partners and colleagues in crime: Rod Hall from Ericsson and Gary Biggar from Diageo. You may also know his company as a maker of some fine beverages. What we are going to take you through, very quickly, is first of all an exposure to and understanding of a few key concepts: what is Master Data, and what is the pain? Then we will get to a little bit of the solution, followed by some real-life customer deployments of MDM.
Simplifying your business: let us look very quickly at the problem, the state of your business today. Here is a simple diagram of the state of your business. Across different departments, you have Master Data trapped in different silos today. Different processes generate Master Data every time. You have an incomplete view today of your business within your own four walls. Now, does the problem stop there, or does it get bigger?
Well, if you expand the picture to your ecosystem, you are doing business with your distributors, your suppliers, your end customers. A lot depends on sharing accurate information about your products, about the vendors you are dealing with, and so forth. The problem is really not limited to your business alone; it extends out into your ecosystem. And as you can see here, you are sharing, for example, a different view of the same product (maybe a boot, for example) or of the same end customer. And as a result of doing that, what is the pain you are feeling? Lack of unification really impacts your bottom line. Here are some sample cases from some of our customers today, and you will hear more today on what their business pains are. Whether it is lost sales for GE, or compliance, financial reporting, and Sarbanes-Oxley compliance for Nortel, to a large B2B distributor who was looking at $400 million being lost every year, a direct 20% margin hit, because they did not have an accurate view, in this case, of the products they were selling. The problem is really large. The question is: what is the impact on your business today?
And we can almost guarantee that in every line of business you will see this problem. Now let us look at some of the principles behind Master Data and understand the data quality challenge that exists today. On the left-hand side, you see a chart that shows data quality on the vertical axis versus time, as events take place, on the horizontal axis. What is happening? Over time, your business is dynamic. Data gets generated. Events happen. You go out and do an acquisition. Lo and behold, you are now acquiring other systems that you need to integrate. As another example, you may have a new product introduction cycle; again, the same dilemma. As you go through these event cycles, you are constantly trying to make sure your data stays accurate. The challenge is that as soon as you start fixing it, something else comes along and you have the same problem again. You never get to a point where you have optimum data quality. So what should the ideal solution be, given that the events will never stop happening? You want to put a solution in place that will hopefully minimize the time it takes, if you look at the graph, to recover from the impact of these events. If you do an acquisition, how quickly can you integrate the data about the new products, the new customers, the new suppliers within your landscape and continue your business? How can you have a foundation for creating business processes that you can creatively design and make flexible, to drive value in your business?
That, in short, is what we believe the ideal solution should be. This is what brings us to data unification with NetWeaver. It is designed so that you not only consolidate and have one view of your products, your customers, your vendors, and any other data elements you designate as Master Data, but you also facilitate their consumption within processes. If you look at the world of Enterprise SOA, you are trying to get to a point where these business processes can automatically consume the right data at the right time and in the right place within your environment. That is where we are going with MDM, and that is what it brings today as data unification within NetWeaver. Now some more data points about MDM itself. What are some of the things about it? It is a fifth-generation technology with infinitely configurable schemas, which means you can quickly and accurately create schemas to model your own data. It supports the full set of phases of consolidating, harmonizing or syndicating, and central management, along with prepackaged IT and business scenarios. If you are, for example, in the consumer products or retail business, you may be familiar with things like GDS, Global Data Synchronization. That is something built and delivered right on top of MDM. Or you may be looking at a specific vertical application, and we will see that, for example with customer data integration. Today we have over 230 customers in many vertical industries, and obviously we have some customers who are speaking with us today.
Now let us look very quickly through some of these scenarios. Consolidation is the first step: it lays the foundation for quickly and accurately consolidating the data coming from various sources, whether they are SAP or non-SAP sources. This is probably the single most important step for anything you do. If you do not consolidate, cleanse, and normalize your data, you really do not have a foundation for moving forward. Various elements around products, customers, etcetera, can be quickly harmonized once you have this foundation.
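To make that consolidation step concrete, here is a minimal sketch in Python of pulling vendor records from two sources, normalizing the obvious formatting differences, and collapsing exact duplicates. The field names, sources, and matching key are invented for illustration and are not part of MDM's interfaces.

```python
# Minimal consolidation sketch: normalize records from several sources and
# collapse exact duplicates on a simple key. Field names are illustrative only.

def normalize(record: dict) -> dict:
    """Trim whitespace, unify case, and strip separators from key fields."""
    return {
        "name": " ".join(record.get("name", "").upper().split()),
        "country": record.get("country", "").strip().upper(),
        "tax_id": record.get("tax_id", "").replace("-", "").strip(),
    }

def consolidate(sources):
    """Merge records from all sources, keeping one copy per normalized key."""
    seen = {}
    for source in sources:
        for record in source:
            clean = normalize(record)
            key = (clean["name"], clean["country"], clean["tax_id"])
            seen.setdefault(key, clean)  # first occurrence wins
    return list(seen.values())

erp_vendors = [{"name": " Acme  GmbH ", "country": "de", "tax_id": "DE-123"}]
legacy_vendors = [{"name": "ACME GMBH", "country": "DE", "tax_id": "DE123"}]
print(consolidate([erp_vendors, legacy_vendors]))  # one consolidated record
```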
Once you have that foundation, you are at the next step of actually being able to syndicate this data as needed. Now, different companies are at different stages of their business. If I did an acquisition today, it may take me one or two years to get to a point where I have one central view across all of my systems. During that time, I need a strategy that lets me share data from one system to another. That is what harmonization, or syndication in this case, lets you do. And then there is central management, where you centrally define the information elements of your Master Data and then synchronize and syndicate that to any system or application that needs it.
Now let us look at how you can apply these core building blocks to various business problems. You will hear today something about vendor consolidation, but let me walk you through some examples. One of them is the ability to manage information about your products. Products demand not only information like the part number or the material, but also, for example, images, part specification documents, and any other pieces of information you designate as Master Data. What are the value elements we bring here? The ability to manage taxonomies and hierarchies, built-in units of measure and conversions, and the ability to search dynamically on part numbers or other attributes where you only know a partial string. For example, you know part of the product: that its color is red. You do not know anything else, and you want to search for all red hoses, for example. You can quickly and easily find this information in the system. Not only that, if you are in the business of requiring printed versions of this information, whether it is printed catalogs or circulars, you can do the same thing.
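As a rough illustration of that kind of parametric search, the sketch below filters a small, invented product catalog on an attribute value plus a fragment of the description. Real MDM exposes this through its own search UI and APIs rather than through code like this.

```python
# Parametric search sketch: match on known attributes plus a partial string.
# The catalog structure and field names are invented for the example.

products = [
    {"part_no": "HSE-1001", "description": "garden hose 25m", "color": "red"},
    {"part_no": "HSE-2040", "description": "hydraulic hose", "color": "red"},
    {"part_no": "CBL-0007", "description": "power cable", "color": "red"},
]

def search(catalog, text_fragment=None, **attributes):
    """Return items matching all given attributes and the text fragment."""
    hits = []
    for item in catalog:
        if any(item.get(k) != v for k, v in attributes.items()):
            continue
        if text_fragment and text_fragment not in item["description"]:
            continue
        hits.append(item)
    return hits

# "All red hoses", knowing only the color and part of the description:
print(search(products, text_fragment="hose", color="red"))  # two matches
```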
Global Data Synchronization allows you to syndicate trade item data in one language to data pools like Transora or UCCnet, and we are working on supporting some of the other data pools, for example those used in Europe and other regions. Now, what this lets you do is build on top of MDM. It comes with its own predefined data model, UI, and validation rules, which allow you to bring data from your ERP systems, your R/3 systems, augment that data as required for the data pool, validate what you are publishing, whether by trade item number or by supplier, and then publish this data into the data pool.
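As a small, simplified illustration of the kind of validation that happens before publishing, the sketch below checks a trade item's GS1 check digit and a handful of required attributes. The required-attribute list here is invented; the actual attribute sets and validation rules are defined by the data pools and by the GDS content delivered on top of MDM.

```python
# Simplified pre-publication checks for a trade item. The required fields are
# invented for illustration; real data pools define their own attribute sets.

def gtin_check_digit_ok(gtin: str) -> bool:
    """Validate the GS1 mod-10 check digit of a GTIN (8, 12, 13, or 14 digits)."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False
    digits = [int(d) for d in gtin]
    body, check = digits[:-1], digits[-1]
    # Weights alternate 3, 1, 3, 1, ... starting next to the check digit.
    total = sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10 == check

REQUIRED = ("gtin", "target_market", "description", "net_content")

def publication_problems(item: dict) -> list:
    """Return a list of problems; an empty list means the item can be sent."""
    problems = [f"missing {field}" for field in REQUIRED if not item.get(field)]
    if item.get("gtin") and not gtin_check_digit_ok(item["gtin"]):
        problems.append("invalid GTIN check digit")
    return problems

item = {"gtin": "4006381333931", "target_market": "276",
        "description": "Pen", "net_content": "1 EA"}
print(publication_problems(item))  # [] means ready to publish
```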
Again, this is built on top of MDM. It utilizes the various NetWeaver components, such as XI for syndicating the data, and that is really where the beauty of the architecture comes in. All of this information, by the way, you can also analyze by bringing the data from MDM into analytics, into the world of BI; the predefined, pre-built integration makes that happen. One last example, something new that we are introducing with customer data integration: it gives you the ability to quickly have one view of your customers, bring together the various elements, and define matching and consolidation strategies. Once we are done here, if you have time, I would encourage you to go to our demo pods and take a look at some of the new matching functionality that is coming out in the Q3 time frame. I think you will find it to be of interest. On the road ahead are new matching and merging capabilities that we are defining: the ability to locate identical and duplicate records and to expose the matching process, vastly enhanced from what we had earlier. This will again simplify the way you consolidate information.
So, to summarize quickly, the benefits of data unification: it accelerates process change. You cannot have processes without an accurate foundation of data, and it is not useful to have data without using it in processes. You need both working together, and that is what we are delivering with data unification and with Enterprise SOA. It helps you improve productivity. Companies like GE see not only top-line benefits but bottom-line benefits as well, so both cost reductions and revenue generation, which is an important point. In cases like Ebuild, we have seen SKU maintenance costs go from $19 a part to $6 a part, for example. But enough of that; why don't we hear something more real and tangible from our guests. I will introduce Rod.
Rod Hall, Senior Project Manager, Ericsson
My name is Rod Hall. I work for Ericsson in Sweden, as part of the IT management function there. I want to share with you this afternoon some of the experiences we have had working with MDM in some of the areas that Sunil has mentioned so far.
I am going to take you through a series of pictures here. I am going to talk about the plan we have for MDM, the time plan we have worked to so far and how we have actually got on with it. I am going to tell you a little bit about the landscape we work with at Ericsson and the processes we are trying to support, then about our approach and the selection criteria, and then talk a little bit about our experiences so far. Because I understand there is a certain amount of skepticism about the ability of MDM to measure up. That has not been our experience so far, at least, and we will talk a little bit about that anyway.
First, a little bit about Ericsson. As you can see, Ericsson is a large international corporation. We operate in virtually every country around the world. We have 180 operating companies, and these companies do a certain amount of trading with each other as well as with the outside world, at both the supplier and the customer level. We employ just over 55,000 people at the current state of play. We have focused our initial MDM pilot on the purchasing area, in particular on trying to consolidate the data we actually have in the various purchasing systems in use. To summarize the picture we have: we have basically two large global systems supporting purchasing activities within Ericsson. One, which we call our Market Unit Solution, is supporting over 95 companies at the present moment in time, and growing almost by the month. The second large system basically supports the major supply functions, which tend to be mainly in Sweden and China from our point of view. In addition to that, like most large corporations, we are not 100% pure SAP. We actually have about forty other local systems that are also involved in purchasing activities. We had to look for a solution that would cover that full spectrum of systems. What we needed, therefore, was a tool that would support our Global Master Data Group. We have established a Global Master Data Group to try to manage this monster of containing different views of Master Data across the whole corporation.
The group has been functioning for several years now, and largely it has worked with manual tools up until now. We needed to give them a tool that would enable them to automate some of their work, take away some of what we call the grunt work from what they are doing, and let them put their time into the brainwork, using their brains rather than spending the time crunching through the sort of manual tasks they have had up to the present moment. We started to look last year for suitable tools, and as an SAP shop, I guess it is fair to say that we looked first and foremost for something that would work with our major SAP systems. But let me say, it is not really enough to do that. We had to have a tool that would also work with the whole of our landscape, including the non-SAP systems that we have.
We had an evening session where we invited our Master Data Group colleagues in and we spoke with SAP, and I think we all left the room fairly enthused that MDM looked like a good solution. But of course, as we say in English, the proof of the pudding is in the eating. Although it looked very good on paper, we really did not know what we were getting into. I think our experience has been pretty good, though, and I will come back to that a bit later on. We were really looking for a tool that would enable us to work with SAP and non-SAP solutions. Most importantly as well, we are fairly committed to the idea of extending our use of service-oriented architecture within Ericsson, using NetWeaver as the driving platform for that, so we wanted a tool that would be compatible with that. We were fairly convinced from talking to SAP that MDM is very much in that line.
We set up a pilot. We did not want to go forward on too broad a front to begin with, because like everything else it is a new product, and it is new for us, so we wanted to test something first. Unfortunately, with a company as complex as Ericsson, even a pilot can be fairly large and extensive. We focused very much on the supplier Master Data area. In Ericsson, we have something like 130,000 vendors. Given the number of employees we have, it works out that we actually have roughly two vendors for every employee. But we have a large number of vendors to deal with, roughly about 130,000 at the present moment in time, including duplicates. That, of course, is what we are trying to find out: we do not know how many of them are unique vendors yet. That is what we are going to find out over the next few weeks. We are talking here about everything from one-person consultancies to multinational corporations; it is really the whole mix of different vendors that we are discussing. The application we chose to target, first and foremost, was to support a BI solution whose task is basically to report on our global spend across the full range of suppliers. Up until now we have been trying to do this manually by aligning the different suppliers, a fairly tedious sort of task that somebody needs to do, and we want to make it more automated. That is what we are trying to get towards; that is our target system. To get there, of course, we need to clean the data, and that is where we are at the present moment. I will come back to that.
We also wanted to introduce the whole concept of MDM to our Global Master Data Group, because we see this not finishing with suppliers, of course. This is just the first step. The intention is that it should then become the standard tool used for managing Master Data within the whole Ericsson corporation, and hopefully it is going to be a major labor saver for the Master Data Group, enabling them to do much cleverer things with their time than just crunching through the manual realignment they do today. Last but not least, we had to set up the operational infrastructure; we had to actually place this within our fairly complex SAP landscape, and that we have successfully done.
The project timeline looks something like this. We started around August last year, after the summer, with the preparation stage, went through the classic exercise of making a blueprint of how we were going to hang it all together, and then went into the realization phase. It is fair to say that our realization phase took a little longer than you might expect, because we basically had to wait for SAP to release the various service packs. We needed service pack three, but we saw some value in starting on service pack two to get some experience. That is what we did: we did service pack two, then we upgraded to service pack three, and then did an in-house realization after that. We were also fairly cautious, it is fair to say. We were expecting there to be some problems because it was a Ramp-Up product, and for those of you who have been through it (I think this has been my fourth Ramp-Up), experience has taught me it is always best not to assume anything. I have to say that this Ramp-Up has been a real revelation. It has been really very smooth; we have had very few problems with the actual product itself. So for those of you thinking of doing this now that it is out of Ramp-Up, you should not expect any problems, but we did not get any in the Ramp-Up either. We probably could have been more aggressive in our timelines than we actually were, as it happens. We had a few challenges just getting the platform delivered to us, but that had nothing to do with the product; it had to do with our hardware supplier.
Where are we today? We are in cleansing mode. We went live with the product about three weeks ago, loaded all the data across as a one-time load, and we are now cleansing the data. Exactly how long it is going to take is not absolutely clear, but reckon on probably the next four to six weeks at least of further work. It is going to take us up to about the summer break in Sweden to do the initial cleansing, and then we will go live with the BI application off the back of that after the summer. That is where we are today. For the actual data flow itself, we are using, as far as possible, the various NetWeaver components we have. We use XI to bring data in from the SAP solutions we are linked to. That has worked extremely smoothly; it has been our first major use of XI within our landscape. We have had it there for a while but had not really used it very much up until now. This has been our first big use of it, and it has gone extremely well. We also use XI to pump the data out of the MDM tool, back out to the business warehouse system at the end of the day. Once again, that has gone fairly smoothly. For the non-SAP systems, we have chosen to use one of our existing pieces of architecture, SeeBeyond, which is now owned by Sun Microsystems. In fact, I do not think it is going to be called SeeBeyond very much longer, but nevertheless it is a product that we have some experience and confidence with within our organization, and we chose to use it to pull the non-SAP data into MDM. We have used a mixture of approaches for the different data sources and pumped the data into MDM. That too has gone fairly well and smoothly; we have had no real issues with the SeeBeyond side either.
You can see the sort of data flow described there. I will not take you through it point by point, but it is very much the classic pattern used for consolidation mode, and that is what we are in at the moment, by the way. We chose to do consolidation: consolidating data from the different sources, loading it the same way from those different data sources, and consolidating it into a single view within our business warehouse system. That is what we are trying to do as a first step. I will come back later to what we are planning to do with it going forward, because we see it very much as a first step in this area.
This is the detailed scenario. Again, it is pretty much classic, out-of-the-book stuff for those of you who are familiar with MDM. We have not really tried to do anything particularly clever or different here. We have tried to stick as far as possible to absolutely standard SAP approaches, and I think it has paid dividends, because we have had good support and, as I say, very few problems. We certainly have not made the mistake we made in previous years of trying to customize SAP to exactly meet our requirements. We have taken it as a good-enough solution, and our experience has very much been that it is good enough. We have not had very much resistance to it either, so that has been a positive experience all around. Down at the bottom of the slide you can see the range of systems we are connecting. The two major inputs are the two global SAP systems covering all those companies. We also have some local R/3 solutions out there, because we have a number of those in place supporting various industry solutions, cable solutions and things like that. Then we have a number of Scala systems; those of you in Sweden will know Scala, a small ERP system which is used for our very small companies, typically of 10-50 employees. We have various other systems in the mix as well, and that is going to be extended later on.
It is pretty much the standard sort of flow you would expect of an MDM solution. We have matching; we are trying to match something like 90% automatically, and we are pretty much achieving that at the present moment in time. The key players we have involved from the business side are largely the MDM group, our Master Data Group; these are the central functions supporting our Master Data function within Ericsson. Supplementing them, we have the sourcing group, the central sourcing team, who provide certain key attributes of data that only they are able to set. It is a collaborative effort between the two. And then, when the data is finished, we pass it over to the BI team to actually consolidate into the BI solution. I will not go through this in detail, but just to say one thing: you have to spend a good deal of time getting the matching strategy right. For those of you looking into the sourcing area, you are going to get this as a bit of free consultancy from us at Ericsson in the material that is sent out. A lot of work has gone into getting this scoring table working exactly as it should, and I recommend it to you as a very good starting point for your own work if you are looking into this area, because it certainly did take us a good bit of time.
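To give a feel for what such a scoring table does, here is a much-simplified sketch: each rule contributes a weight when two vendor records agree (or nearly agree) on a field, and the total decides whether the pair is matched automatically, queued for review, or left alone. The fields, weights, and thresholds are invented for illustration; the real strategy is configured inside MDM, not coded by hand.

```python
# Simplified vendor-matching score: weighted agreement on a few fields.
# Fields, weights, and thresholds are illustrative, not Ericsson's actual table.
from difflib import SequenceMatcher

RULES = [
    ("tax_id", 50),       # identical tax/VAT number is a very strong signal
    ("name", 30),         # similar legal name
    ("city", 10),
    ("postal_code", 10),
]

def field_score(a: str, b: str) -> float:
    """1.0 for an exact match, otherwise a fuzzy similarity between 0 and 1."""
    a, b = a.strip().upper(), b.strip().upper()
    return 1.0 if a and a == b else SequenceMatcher(None, a, b).ratio()

def match_score(v1: dict, v2: dict) -> float:
    return sum(w * field_score(v1.get(f, ""), v2.get(f, "")) for f, w in RULES)

def classify(v1: dict, v2: dict, auto=85, review=60) -> str:
    score = match_score(v1, v2)
    if score >= auto:
        return "automatic match"
    return "manual review" if score >= review else "no match"

a = {"tax_id": "SE000000000001", "name": "Example Cables AB",
     "city": "Stockholm", "postal_code": "16483"}
b = {"tax_id": "SE000000000001", "name": "Example Cables Aktiebolag",
     "city": "STOCKHOLM", "postal_code": "164 83"}
print(classify(a, b))  # likely "automatic match"
```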
What we noticed right away is the intuitive way that MDM works. It was very easy for us to talk to our Master Data Group; they liked what they saw, and they understood very early on exactly what we were trying to explain to them. I think the tool is very professionally put together in that respect, which is a very good thing, but it does take time to fine-tune it to get the right results. As for the experience as a whole, it has actually been very positive. I think MDM has been quite well received by our Master Data Group. We have had a small group of people from the Global Master Data team involved so far, and now we are spreading it out to the whole team. Because it is fairly intuitive, because it actually matches the way they would naturally tend to work in any case, we have had very little resistance. Often when people see a new product, they get very enthusiastic to begin with, and that enthusiasm tends to go down as they get into the experience. That has not been our experience here. It has been a generally positive, steady state of enthusiasm all the way through. This is not to say we have not had our moments of ups and downs and doubts along the way, but we have come through all of that, and I think we are in very good shape now. Generally speaking, there is real enthusiasm to carry on with this product, not only in this area but looking to extend it into other areas as well. I think that is a very positive signal for taking it to the next stage. And to stress, it is still early days for us; we have only been using it for a few weeks in real terms, though we do have extensive testing as well to base these experiences on.
I think it is fair to say that the enthusiasm is also taking hold, but we want to see SAP moving faster to release what we call standard business content to us. It already has a very good set of standard links into the SAP products; we would like to see even more of that. I think it is no secret that we have expressed to SAP that there are some standard financial data elements we would like to see, and much more assistance from SAP on how to manage those going forward. Consider that our Master Data Group today has to separately maintain consistent Master Data in 21 different SAP clients; you can see it is pretty much what we would call in Swedish a horse job. There is an awful lot of grinding away, maintaining the same data in all of those different clients, and it is not a fun job; nobody would wish it on his or her worst enemy. So to get some assistance in that area would be extremely valuable to us.
So we are very much pushing SAP to up the pace and give us those things. We have had some challenges, so to say. MDM was a product that SAP bought, and I think we rather naively expected it would slot very nicely into the standard global security concept we have for the rest of the landscape. To be fair, it does not slot in quite as nicely as you might like, and we have had a few challenges getting that working. SAP has listened to us on that, and I think some other customers have said the same thing, so we expect to see some improvements coming. But we have managed to get an acceptable security solution working at least. I think I can sum up by saying that overall MDM has really fulfilled the goals we set for it. We are pretty pleased as a customer generally. It is doing the job we expected, and it is doing it in a fairly undramatic way, which is what you really want to see in this sort of situation. It is not very exciting or sexy, if you like, but it does the job, and that is really what you want to see.
Some general comments, as I said earlier. The Ramp-Up experience has been almost unnaturally positive. I have been waiting for the first big foul-up to come up that we would need help with, but we really have not seen that, and we are past that point now. Generally speaking, that suggests the product itself is pretty stable: when you go through Ramp-Up and you do not hit major issues, you know you have a good product underlying the whole thing. The use of XI has also been extremely positive. Although MDM is a new product that SAP brought in, and you might expect it to have had some problems working with some of the other NetWeaver components, that has not been our experience. It has gone extremely smoothly together with both the Portal and XI, the two major areas of NetWeaver we have chosen to integrate it with, and indeed so far it has gone well with BI as well, although we have not really explored those capabilities to the fullest extent yet. As for things to watch for: clearly it is on the leading edge, so you have to make sure you are at fairly high levels with the various service packs in your landscape. Luckily, we at Ericsson try to keep fairly well up to date with the latest service packs in the NetWeaver area. We did have to make some fairly unanticipated updates, however, to keep our Portal current and make sure that everything worked, and even in some of the back-end systems, which was a bit of a surprise.
That was probably one of the real risks we faced by being in the Ramp-Up situation. It did not actually pose any problems, but it could have done. One thing we have found, and this again is a bit of advance warning for you: tuning the Microsoft platform that we run on to get the right performance has been a bit of an issue. We have managed to get good performance in every case, but you have to tune in two ways. You have to tune the platform for the mass load to begin with, and you then have to retune the platform for the incremental changes that come afterwards. We had a few challenges at the start getting the initial load to perform the way we wanted it to, and then we had to retune the solution beyond that. As you bring on new data elements, it is a constant balancing act between tuning for the initial load and tuning for ongoing performance. That is something to be aware of. It may be that we have not sized the platform correctly; it may be our own fault. But we have had some challenges in that area, so it is something to be aware of anyway. And make sure you focus on the security set-up quite early on, because it is going to take longer than you think to hook up your major SAP clients.
customs. What are our next steps
planned? We plan to upgrade to service pack four, as soon as it is
possibly available. August we believe. Possibly, even sooner. We’re
looking for the new capabilities that are listed here basically. Dun and
Bradstreet integration, improve matching capabilities, and the
performance enhancements that I mentioned earlier.
What we plan to do with it when we actually got this service pack in
place, is that we plan to then go on and start to extend the thing. We
want to move from consolidation to at least harmonization mode.
The rest of the discussion we should go straight to centralization mode.
But that’s not something to be discussed. It’s the case of we don’t want
to jump too far ahead. But clearly, we want to move from just being a
consolidation shop as far as MDM into harmonization. Really driving this
supply Master Data down. We want to get into using workflow in a much
bigger way and to use the guided procedures and there’ll be forms in
connection with that.
We need to extend the base. I listed some systems before; we have more procurement systems out there that we need to bring in, and that is what we will also be looking to do. Next, we want to extend to new data types. Customer Master Data is going to be our next area, more than likely. We are fairly convinced that we have a job worth doing here, and this tool can really help us in that area as well. And we would very much like, with SAP's assistance, to get into modeling other objects, indeed the financial objects we have discussed. So we are looking for collaboration; we do not want to custom build too much ourselves. Okay, that is all I wanted to say from Ericsson. There will be time for questions at the end, but I think I am going to pass over now to my colleague, Gary, to tell you about his experiences. Gary.
Gary Biggar, Data Quality Manager, Diageo
I am Gary Biggar. I have been with Diageo for some time, and, if we can grab the Diageo slide to see where we are, I work in the data function within IS at Diageo. I have had a lot of experience looking at global product data issues, and I am sure we have all had experiences of trying to align product data across our organizations. From that experience, Master Data Management for other objects such as customers and vendors came along, and for the last two years I have been spending time looking at SAP's offerings in this area.
But in Diageo it is not all SAP, and that is part of the problem. Similar to Ericsson, we have many legacy systems, and we are going to hook them all up together. I will just tell you a little bit about Diageo first. Now, I do see many people here, and I heard there was a rumor going round that I was going to be giving away product samples at the end. So if anybody is here just for that, you can go, because I am afraid there are none on me.
Diageo, of course, is the leading branded drinks manufacturer, and we make a quite substantial amount of money out of that activity, close to two billion pounds sterling. We split our business fairly equally across the world, between North America, where people obviously drink Johnnie Walker, Europe, where most people drink Gordon's, and International, where most people drink all the rest. We are reasonably well set up in terms of global reach. I must always say, though, that we do promote responsible drinking; as a Scot, of course, when I have a kilt on, that is quite difficult, but I do manage it. And we put a lot of effort into educating the public about the misuse of alcohol and so on; 1% of our pre-tax profits goes into that, as you can see.
Let us look at Diageo's data challenge. First of all, I must say that Diageo has a data strategy, which comes as a surprise to quite a lot of people. Having a data strategy, working out what you are going to do with your data, is not something I have experienced a lot in other companies. What we are attempting to do with our strategy is to promote high-quality data management processes. In many cases, when people put in SAP R/3 or various other solutions, they focus on their order-to-cash, their accounts and reporting, and their purchase-to-pay, and they expect the data to just look after itself, as you might imagine. There are five key themes in our data strategy. First of all, you have to design your data. Data management is about quality, enrichment, and so on. You have to have a strategy for how you do your data maintenance: are you going to centralize it, are you going to devolve it, are you going to make it self-service? You have to work out how you avoid introducing poor-quality data, even during data migration, and make sure that you can support that activity. And then you have to think about what infrastructure you need for all of that.
This strategy, as a Word document, is about 30-40 pages. We measured our progress against it around two years ago, and that diagram tries to show just how good or bad we are in terms of our data management. One of the things I should draw your attention to is the data management capability by geography: we do not necessarily have an evenhanded approach to how we do data management across the organization, and that is part of the challenge. We looked at that assessment and we certainly saw the chance to make a positive step change in data quality. What we realized fairly early on was that if you want to increase your data quality, you have to find out who owns each individual piece of data. That can go right down to the granular level, but the key thing we tried to do was to get senior-level sponsorship of data objects. We took our procurement organization and said, you own vendors. We looked at our packaging organization and said, you own the components and, in effect, the bills of material. In order-to-cash, we said, you own the customer. We then created a forum of senior business owners of those objects, which means we can take issues about data quality and report data quality to that steering committee, and in effect challenge them to make Master Data global. Because if we do not have that will from an organizational point of view, then all system efforts are futile. We looked at some of the issues we are facing. We believe that common data would provide us with much reduced cost in terms of software design and systems design.
We said that data is not common today: it is captured in many and various legacy systems. We would like to put it into a single source, and then we can leverage the origination of that data in one place, rather than having lots of people originating the data more than once. Data is manipulated a lot to get to consolidated reporting, especially within our Sarbanes-Oxley regulated business, and doing that is not compliant under Sarbanes-Oxley. We have to understand and normalize the terminology we use when we describe objects; for instance, mapping keys in MDM is one of the things we would have to do. Our resource levels for handling data maintenance have risen, and I think that is a direct result of implementing SAP. But when I say they have risen, they have really just become more visible; they were always there. When you implement SAP, people realize that data management, or data maintenance, becomes a key activity. Our data quality is not consistently measured; in fact, we were not doing any data quality measurement at all. I do not know if anybody here can say how clean their product data is. What would you say, 50%, 60%? How would you measure it? We worked out a way of measuring our quality against our objects, and I will talk a little bit about that in a minute. The next area is that data is treated in the context of single applications. Procurement in North America think of their vendors; the thought does not enter their heads that they might be using vendors that are also used by procurement in the UK or in Europe. And we have under-invested in the technology to support all elements of data. I do not know how much we have invested in our ERP program and so on, but we have invested a fraction of that in data management technology. It is therefore quite easy to argue that we have these million-pound systems and we have spent thirty-five pence on data management.
What are we doing to realize our data strategy? Well, our data strategy resulted in three key areas. First of all (that slide is in the wrong order), XI is in the area of linking all those systems together. Our data dictionary is an in-house solution, and it is there to define the data, because one of the things we realized is that we do not have consistent definitions of data, so we created our data dictionary. SAP does not have a Master Data definition solution, so we put our in-house system in place for that. Then what we realized we needed was something to keep our data quality up: the data cleansing toolkit, things like the basics of quality, integrity, and harmonization and so on, and MDM caters for that very well. But it is key in that area that we have the KPI reporting capability too; MDM does not report on itself, so we use BI, say, to do that. As far as Master Data maintenance and management goes, MDM fulfills our requirements in that area. Now let us look at those scenarios, or components, in detail.
them. The data maintenance, our objective for data maintenance is to
gather all the data about be it customer, product, or whatever into one
place centrally. To do that, we have to get it all into that one place
and Sunil talked about consolidation and so on. But for instance, vendor
data
maintenance we do internally and customer data maintenance internally.
It is done by different groups in different places to different levels
of quality and so on. If we want to get consistency to that data, then
applying a single process to that makes a hell of a difference. Our
basic steps with that would be to consolidate our sources, cleanse them,
harmonize them and create some data capture interface.
Now, MDM itself, you can maintain your data in MDM. But our feeling is
that we need maybe a webbased interface to do that. You map the data to
the outbound receiving targets. Create your workflow and this great
strong workflow support in MDM and distribute that on a scheduled basis.
Be near real time or on a nightly basis. Then you begin your data
maintenance; seems easy. From a data-cleansing point of view, there are
occurrences when a business rule changes. You might find then in your
own organization that because you’ve started your own process
differently then the whole of your customer file has to have a certain
field changed and rules.
Now that’s quite difficult in an R/3 environment for instance. And
requires somebody with some
programmatic capability I suspect. However, you can easily make a change
to your global file on a certain aspect, according to some rules within
MDM. You could do that if you had already consolidated your data in the
MDM repository and then release to the R/3 environment. Or for instance
if you want to do it as a one off change, you could do just exactly
those steps that I’ve outlined there. Use it as a tool to make a global
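A minimal sketch of that idea, applying one rule-based change across a consolidated file rather than touching records system by system, might look like the following. The field names and the rule itself are invented, and in MDM this would be done with the tool's own mass-change features rather than with code.

```python
# Rule-based bulk change sketch: apply one change wherever a predicate holds.
# Field names and the rule are invented for illustration.

def apply_rule(records, predicate, change):
    """Return new records, with `change` applied wherever `predicate` holds."""
    changed = 0
    updated = []
    for rec in records:
        if predicate(rec):
            rec = {**rec, **change(rec)}
            changed += 1
        updated.append(rec)
    return updated, changed

customers = [
    {"id": "C001", "country": "GB", "payment_terms": "NET30"},
    {"id": "C002", "country": "FR", "payment_terms": "NET30"},
]

# Hypothetical rule change: all UK customers move to 45-day payment terms.
updated, n = apply_rule(
    customers,
    predicate=lambda r: r["country"] == "GB",
    change=lambda r: {"payment_terms": "NET45"},
)
print(n, updated)  # 1 record changed
```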
From a data quality point of view, we use our data dictionary to set quality rules. There are some of the rules up there; for example, the invoice-to customer number must have a trade class of zero. I do not know exactly what that means, and I do not know the detail of them, but these are rules that we have set, and we measure those rules in a system that we call our Anchors. In effect, you have to assign owners to those rules and create KPIs against them. That is where we take those KPIs to the Data Steering Group I was talking about before. We can actually say to them: by measuring all the various quality rules you may have about a Master Data object, you can then give a percentage, a single number that says data quality for product is at 61%. You can then draw a graph and show it to the stakeholder.
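As a rough sketch of how such a single percentage can be derived, the example below runs a handful of rules over customer records and reports the share of checks passed. The rules and records are invented; the real rules live in the data dictionary and are measured in Diageo's own system.

```python
# Data-quality KPI sketch: percentage of rule checks passed across records.
# The rules and sample records are invented for illustration.

RULES = {
    "has_trade_class": lambda c: c.get("trade_class") is not None,
    "has_country": lambda c: bool(c.get("country")),
    "valid_vat": lambda c: str(c.get("vat_no", "")).startswith("GB"),
}

def quality_kpi(records) -> float:
    """Percentage of rule checks passed across all records (0-100)."""
    checks = [rule(rec) for rec in records for rule in RULES.values()]
    return 100.0 * sum(checks) / len(checks) if checks else 100.0

customers = [
    {"trade_class": 0, "country": "GB", "vat_no": "GB123456789"},
    {"trade_class": None, "country": "", "vat_no": "GB999999999"},
]
print(f"customer data quality: {quality_kpi(customers):.0f}%")  # 67%
```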
From a migration point of view, we foresee using MDM in a migration environment: taking the data into MDM and then using it to load extra parts of our business into the R/3 environment. For instance, we may acquire a company, or we may want to expand the number of in-market companies we have using SAP R/3. Our steps there would be to define the object, extract the data, do the cleansing, if you like, in MDM, and then release that over time into the R/3 system.
On Global Data Synchronization: being in the consumer goods industry, we are keen to do that. This slide indicates what our architecture would look like were we to do it. Our PLM and bar code management processes would probably still sit in some sort of product data management area, and then MDM and the MDM GDS component would be used to serve up the data to the data pools. By using the MDM product database to feed that GDS environment, we can also synchronize the data into the R/3 and legacy environments. Let us just talk about the journey with some bullet points. Establishing the organization first is certainly the way we found success in getting buy-in to the MDM and data management roadmap. In establishing a Data Steering Group, we also established stakeholder groups at a lower level to create forums for people to discuss data issues. Then the infrastructure came next. The data dictionary is a huge part of our infrastructure, and we have already installed MDM as a foundational layer in Diageo. But there is a great future for MDM in Diageo: there is an MDM roadmap covering, for instance, vendor data maintenance, and that project is up and running. We are looking at data quality cleansing and consolidation, if you like, which is already in progress. And supply chain data alignment, as you might imagine with GDS in mind, is a major project for us too. We are quite busy. But our vision is based on a single source of the truth, and other MDM presentations have highlighted this too; we have all arrived at the same thing.
One single place to get an authoritative definition of your products and your customers is a very useful thing to have. We established a data strategy to achieve that, and it was focused on a forum for transformation. To make it happen we had to target business support: it is not an IS-led activity, it is not a systems-led activity. So we told them, retold them, and we will tell them again. As Johnnie Walker might say, on a journey of a thousand miles it might seem that we have a long way to go, but we have a good starting position with SAP. That is all I have to say.
Thanks. Thank you very much.
Sunil Gupta
Thank you very much. I think we have a few minutes to take a few
questions if you’re interested. We’re also around here if you want to
ask some questions. Okay? Thank you very much.