Enterprise performance management (EPM) spending remains strong in 2006, with nearly $23B planned for software, hardware, labor (internal and external), and integration services. This is according to the results of a detailed survey AMR Research conducted of more than 200 companies across all industry sectors within North America. The data reflects each organization’s plans for expenditures and technology deployment, as well as their goals and objectives for EPM. The details of this study will be published in AMR Research’s Market Analytix Report “Trends in Enterprise Performance Management, 2006” in the next few weeks.
Year-Over-Year Comparison: Modest Growth, Big Shifts in Spending
We queried companies about their planned spending in five major areas:
The following are some trends we uncovered:
Better management is still the impetus for investment
This year, decision makers noted what they hoped to achieve with their
performance management initiatives. In order, the top three responses were as
follows:
Additional Information:
Research provided by kind permission of
AMR Research.
10 Mistakes to Avoid in Data Warehousing
INFORMATION IMPACT International, Inc. white paper by Larry English
Five Steps to Evolving into an Intelligent, High-Performing Enterprise
SAS white paper
This executive overview of the Basel II initiative from a credit and operational risk perspective offers a primer on Basel II methodologies and associated technologies. It is a condensed version of a recent Aberdeen White Paper entitled Analytics and Reporting: A Basel II Requirement. For a more in-depth look at the specific technological requirements of the legislation, you can read the full paper free of charge at Analytics and Reporting: A Basel II Requirement.
Executive Summary
Although most bankers know that reducing risk can directly boost earnings, few
banks other than the big money centers have developed and implemented
comprehensive risk programs and platforms.
This disparity between the need for risk prudence and the apparent unwillingness among many financial institutions to pick up the pace has not gone unnoticed by the banking industry's watchdog, the Basel Committee. This international rulemaking body for banking compliance issued the Basel II Accord in 2001, demanding that banks either increase their capital reserves or demonstrate that they can systematically control both their credit and operational risk.
Under the accord, banks that exhibit the appropriate risk diligence get a big payoff. They can use a significant portion of the money formerly forced into reserve to fund other business. Although the accord officially begins in 2007, the Basel Committee expects banks to start proving that they already are getting a handle on risk compliance today. Failure to demonstrate an upward trend in compliance will force “risky” banks, starting in 2007, to reserve far more money than they have had to in previous years.
Proving and documenting risk acumen demand taking credit and operational risk analysis to the extreme. The risk manager’s technology arsenal — data warehousing, multidimensional analysis, data mining, and reporting –– will be more necessary than ever before and will need to be applied in novel ways. Institutions will not only have to gather, organize, and analyze new types of data linked to the risk characteristics of financial instruments and transactions but also measure and document the effectiveness of their risk mitigation strategies. Risk officers and other senior management will need metrics that enable them to assess their risk positions and observe the continuing effects of actions on that position.
Although Basel II is often confusing, the accord could not be clearer on one point: banks must build “an appropriate systems infrastructure” to identify and gather data for an enterprise-level risk database. To meet these demands, banks will need robust, risk-related analytic technology and processes (Figure 1).
Figure 1: Basel II Analytic Requirements for Risk Management
Source: Aberdeen Group, April 2003
What Capital Reserves Mean to Banks
Aberdeen conservatively estimates that banks will spend $3.2 billion over the next four years preparing for Basel II. Compared with the $400 million that banks spent on risk management technologies in 2002, that figure reflects a compound annual growth rate of 20%. The $3.2 billion signals the significance of Basel II: the accord determines the amount of a bank's capital reserves, and thus directly influences a bank's financial and operational performance.
As a rule of thumb, the larger the reserve requirement, the less business a bank can transact. In essence, the Basel II initiative boils down to a complex calculation: the capital ratio applied to a bank's risk-related assets, including its trading portfolio, book of loans, and operational systems. This percentage is based on Basel dictates as well as on the bank's own assessment of its performance. To derive its risk position, and thereby calculate the amount of money it must place in reserve, the bank must tally the risk associated with each asset.
Basel II must be examined relative to its predecessor. Much to the dissatisfaction of banks that already have the technological wherewithal to measure themselves, Basel I demanded (and still does) that banks reserve approximately 8% of each asset class. Thus, if a bank has about $1 billion in loans, it must reserve approximately $80 million, or 8% of this asset class. Basel II is much more liberal, but much more demanding. Basel II is a much more granular calculation than its predecessor, but the onus is on a bank to derive and prove its risk position.
As every banker knows, the actual amount of capital reserve pales in comparison with the detrimental, multiplier effect it has on the bank’s ability to invest and lend. In short, the capital reserve takes money out of the hands of bankers who want to do business with it. Here is why. Banks are built on a leveraged model. Basel II dictates that a bank must reserve between $0.37 and $42.00 for every $100.00 in business loans. Banks obviously want to keep to the low end of the spectrum.
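To make the arithmetic concrete, here is a minimal sketch (in Python, with invented figures) of how a flat Basel I charge compares with a risk-sensitive Basel II charge. The $0.37 and $42.00 endpoints come from the range cited above; everything else is illustrative, not the accord's actual formulas:

```python
# Illustrative sketch only: Basel I applies a flat ~8% charge, while Basel II
# scales the charge with the measured riskiness of each exposure. The loan
# amounts and risk charges below are assumptions for illustration.

def basel_i_reserve(assets: float, flat_rate: float = 0.08) -> float:
    """Flat-rate reserve: e.g., $1B in loans -> $80M set aside."""
    return assets * flat_rate

def basel_ii_reserve(assets: float, charge_per_100: float) -> float:
    """Risk-sensitive reserve: the article cites a range of roughly
    $0.37 to $42.00 per $100.00 of business loans."""
    return assets * (charge_per_100 / 100.0)

loans = 1_000_000_000  # a $1B loan book
print(basel_i_reserve(loans))          # 80,000,000 under Basel I
print(basel_ii_reserve(loans, 0.37))   # 3,700,000 at the low end
print(basel_ii_reserve(loans, 42.00))  # 420,000,000 at the high end
```

At the $0.37 end, a bank frees up more than $76 million on that $1 billion book relative to Basel I, which is precisely the payoff the accord dangles in front of risk-diligent institutions.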
Aggressive banks play with riskier activities, and the accord says that these banks have one of two options. The banks either must keep a larger capital reserve than their more conservative brethren or demonstrate a superior ability to monitor and control the elements of risk exposure.
Basel II: Incenting Risk Experience and Computational Skill
Many leading banks are ready to prove that they can demonstrate the “right
stuff.” In fact, these institutions lobbied to remove the “one size fits all”
dictates of Basel I. The more sophisticated financial institutions contend that
their risk-based internal models protect them from running into financial
trouble. According to the argument, this extra margin of safety should lower the
requirements for capital reserves, in both absolute and relative terms. And the
Basel II Accord has embraced the argument. To get the preferential treatment,
banks must compile an auditable trail that compares what they think they will
lose with their actual performance.
To remove a potential bias toward rich and powerful institutions and incent banks of all sizes to get on the risk bandwagon, the accord has fashioned a two-phased approach to risk reduction, with increased benefits at successive stages. In the first phase of risk measurement, called the Foundation Methodology, banks compute the probability of loss for each customer, translate those losses into hard dollars, and apply a multistep risk formula to that profile. With the Advanced Methodologies Approach (AMA), banks measure themselves against the same risk factors and weightings as in Phase I, but they are allowed to apply their own internal calculations. In both phases, these computations differ somewhat between credit and operations, reflecting their different natures, but they are essentially the same in structure. In the credit discipline, the phased approach is called the Internal Ratings-Based (IRB) method; in the operational context, Phase I is called the Standardized Approach and Phase II the Internal Measurement Approach.
In Phase II for both credit and operations, the weightings and the resulting reserve obligation can change dramatically across the entire book of business, thus rewarding the experience and computational acumen of the financial institution.
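As a rough illustration of the Foundation-style calculation, the sketch below decomposes each exposure into the standard Basel expected-loss terms: probability of default, loss given default, and exposure at default. The sample exposures and figures are invented, and the real IRB formulas layer further risk weights on top of this basic arithmetic:

```python
# Hedged sketch: estimate a probability of loss per obligor, translate it
# into dollars, and sum across the book. PD x LGD x EAD is the standard
# Basel expected-loss decomposition; the sample figures are invented.

from dataclasses import dataclass

@dataclass
class Exposure:
    pd: float   # one-year probability of default
    lgd: float  # loss given default (fraction of exposure lost)
    ead: float  # exposure at default, in dollars

def expected_loss(book: list[Exposure]) -> float:
    """Translate per-customer loss probabilities into hard dollars."""
    return sum(e.pd * e.lgd * e.ead for e in book)

book = [
    Exposure(pd=0.020, lgd=0.45, ead=5_000_000),   # mid-grade corporate loan
    Exposure(pd=0.001, lgd=0.40, ead=20_000_000),  # high-grade corporate loan
]
print(expected_loss(book))  # the dollar figure the multistep risk formula then weights
```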
A Multidimensional Primer on Credit Risk
Credit risk applies to all banks because they make loans, guarantee the
performance of other parties' loans, or extend lines of credit to customers. Basel II
requires banks to measure credit risk for the following types of credit:
corporate, interbank, sovereign (i.e., government), retail, project finance, and
equity.
Although the Basel II methodology can still evolve, its current IRB requirements are quite rigorous and in principle will remain the same. Once a bank elects to use the IRB approach in one portion of its loan book, it must do so for all of its loans. This universal application will challenge most banks because they typically run their lending businesses by department or branch.
As a result, banks can expect to undergo a huge cultural shift as they attempt to consolidate the lending risk. Besides changing practices that have until now been based on qualitative factors, the need for hard numbers will overwhelm many currently used technologies. Measuring the effect of many loan characteristics will exceed, for example, the dimensional, volume, and computational capacity of most online analytic processing (OLAP) structures. OLAP can be quite useful, but it should not be viewed as the sole tool for credit risk.
Operational Risk and Everyday Business
Basel II defines operational risk as the threat of loss because of failed
internal processes, systems, people, or external events. For the most part,
banks consider operational risk to be a blanket expense, a charge that is built
into the cost of other business functions. Like Basel I, Basel II permits banks
to set aside 20% of their gross income to protect against operational risk. By
accepting this so-called Basic Indicator Approach, banks forgo the ability to
further reduce their reserve requirements. However, Basel II did introduce a
stepped approach (similar to credit risk) for obtaining capital relief. These
phases, the Foundation and AMA, respectively, require a core competence in
operational risk analysis, and additional process and technology outlays.
Basel II’s Technology Demands
The accord specifically charges banks to continuously monitor their operational risk exposures using advanced techniques such as risk-related key performance indicators (KPIs), scorecards, event alerts, and predictive measures. And, leaving no room for ambiguity, it requires banks to build "an appropriate systems infrastructure" to identify and gather data for a substantial risk database.
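A minimal sketch of the KPI-and-alert pattern the accord calls for might look like the following; the indicator names and thresholds are hypothetical, not drawn from Basel II itself:

```python
# Hedged sketch of continuous monitoring: compare risk-related KPIs against
# thresholds and raise event alerts on breaches. KPI names and limits are
# invented for illustration.

THRESHOLDS = {
    "failed_settlements_per_day": 25,
    "system_downtime_minutes": 60,
    "fraud_losses_usd": 100_000,
}

def check_kpis(observed: dict[str, float]) -> list[str]:
    """Return an alert for every observed KPI that exceeds its threshold."""
    return [
        f"ALERT: {kpi} = {value} exceeds limit {THRESHOLDS[kpi]}"
        for kpi, value in observed.items()
        if kpi in THRESHOLDS and value > THRESHOLDS[kpi]
    ]

for alert in check_kpis({"failed_settlements_per_day": 31,
                         "system_downtime_minutes": 12}):
    print(alert)  # only the settlements KPI trips its threshold here
```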
IT departments within banks and their technology suppliers and integrators will need to consider the following analytic risk requirements:
This shortlist of technology does not exclude other software that banks can deploy to keep to the letter and the spirit of the accord. Given the Basel evolution, banks should expect the committee to raise the technological bar as it discovers what technology can deliver in power and timeliness. In short, banks should expect tighter reporting windows and greater demands on the infrastructure.
Aberdeen Conclusions
Basel II is a difficult and changing body of regulations. Although it is easy to
get lost in the tricky rules and calculations of Basel II, the rules themselves
bring other “big picture” implications for banks and their technology suppliers
to contemplate. Universally, the accord’s new approach for calculating capital
reserves –– and the competitive advantage a lower capital reserve brings –– will
transform the way the banking industry approaches strategic, financial, and
operational planning.
These considerations require immediate action. Financial institutions will need to develop Basel II understanding and incorporate it into BI and various other analytic and reporting technologies. The more tightly suppliers weave these technologies into a Basel-ready platform, the better off they will be. Bankers and suppliers should note that the Basel Committee will measure the caliber of a solution, and it clearly has shown an appetite for repeatability and extensibility in solutions.
Whether they know it or not, banks that do not take Basel II seriously are putting themselves at risk in a different way. The danger is twofold: banks that embrace Basel II will gain a better handle on their risks, and with it a better competitive position, while banks that remain unaware of Basel risk a drastic increase in their capital reserves, making them less competitive. But the bottom line is this: Basel II should be a call to action for banks to develop a technology-enabled risk process. In an era of increased uncertainty and business volatility, banks that heed the Basel wake-up call and learn to control their risk stand a good chance of making a lot more money.
Additional Information:
All material copyright © 1996-2003 by Aberdeen Group, Inc. All rights
reserved.
What Exactly Is Operations Research?
Basically, operations research (O.R.) is the discipline of applying advanced
analytical methods to help make better decisions. It is applied mathematics that
follows the “scientific method” to deliver uniquely powerful enhancements to
decision making in real-life human situations.
O.R. traces its roots back to World War II, long before high-speed digital computers. It originated as a way to bring scientific calculation to Allied warfare against Nazi Germany: executive departments were given a quantitative basis for decisions that helped win the Battle of Britain. What began with the study of radar signals in 1936 has yet to reach its full potential. Since the 1990s, however, as numerical processing became faster and more widely available, O.R. has joined forces with high-speed digital computing, mathematics, and industrial engineering to pursue new ways to apply mathematics with far-reaching impact.
With O.R., decision makers need no longer rely solely on intuition. Today, O.R. gives executives the power to make effective decisions and build productive systems based on:
Articles and ads about software solutions that claim to enhance decision-making capabilities are commonplace. O.R. is not a buzzword for a fad with a claim to fame, so you won't find it listed under the "what's hot in technology" section. Operations research is best of breed, employing highly developed methods practiced by specially trained professionals. It's powerful, using advanced tools and technologies to provide analytical power that no ordinary software or spreadsheet can deliver out of the box. And it can be tailored to you, because an O.R. professional can define your specific challenge in ways that make the most of your data and uncover your most beneficial options.
To achieve these results, O.R. professionals draw upon the latest analytical technologies, including simulation, optimization, visualization, queuing theory, scheduling, game theory, probability and statistics.
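For a flavor of what one of these techniques looks like in practice, here is a small linear-programming sketch using SciPy; the products, profits, and capacities are invented for illustration:

```python
# A toy optimization example: choose a production mix that maximizes profit
# under resource limits. All figures are hypothetical.

from scipy.optimize import linprog

# Maximize 40*x1 + 30*x2 (linprog minimizes, so negate the objective)
c = [-40, -30]

# Resource constraints: machine hours and labor hours
A_ub = [[2, 1],   # 2h machine time per unit of x1, 1h per unit of x2
        [1, 3]]   # 1h labor per unit of x1, 3h per unit of x2
b_ub = [100, 90]  # available machine hours, available labor hours

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimal plan (42, 16) yields profit 2160
```

Real O.R. engagements scale this same idea to thousands of variables and constraints, which is exactly where ordinary spreadsheets give out.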
What Operations Research Can Do For Your Enterprise
O.R. delivers significant value to the organizations and executives who take
advantage of it. As organizations continue to become more sophisticated and
collect more electronic records, the task of analyzing data becomes much more
daunting. Fortunately, business intelligence software, data warehouses and O.R.
have all matured to the point of giving companies that employ them more precise information, and deeper insight, than they've ever had before. With these software
resources at their fingertips, O.R. professionals confront and overcome
challenges that involve large numbers of variables, complex systems and
significant risks.
In fact, the essence of O.R. is to deliver accurate knowledge in a timely fashion to make confident, calculated decisions with less risk than ever before.
Today’s software technologies for optimization and management science methods are used to profitably tackle a wide range of business issues, including:
Organizations that use O.R. have found it to be a strategic weapon in the fight for competitive advantage. According to the www.scienceofbetter.org website, here are a few examples:
Five Sure Signs That You Could Benefit From Operations Research
According to the Institute for Operations Research and the Management Sciences
(INFORMS®), if one or more of the following applies to your organization, O.R.
can deliver what you need to confidently make better decisions:
Conclusion
Strategic decision making is crucial to the success of business initiatives and entire organizations. It's important to ask the right questions, think outside the box, sort through the myriad factors, and consider all potential options before selecting a course of action, in order to achieve maximum results.
Operations research is a proven management solution in today's analytics field, and it continues to grow from its World War II roots. Whatever an organization's stage of growth, O.R. techniques can help make dramatic improvements, decision by decision. The sooner O.R. is incorporated into a company's decision-making process, the more far-reaching the benefits will be.
Additional Information:
Mary Crissey is manager of worldwide strategy for advanced analytics at SAS, and a chapter officer for the Institute of Operations Research and the Management Sciences (INFORMS®). Mary can be reached at mary.crissey@sas.com.
Today's giant corporation is much like the CIA in its ability to gather and generate terabytes of information and intelligence. It can produce unlimited reports, metrics, benchmarks, and segmentation of every customer, operation, process, and transaction. But what does it all mean? That's where good analytics—based on both technology and human interaction—come into play; analytics make the difference between information overload and real business value. What follows are best practices of four businesses in very different industries that each found the right mix of tools, processes, and management to maximize the value of their data.
Nowhere is the value of business intelligence (BI) and enterprise data management more evident than in the gaming, entertainment, and hospitality industries. When a loyal casino customer hits the jackpot at a slot machine in Reno, Nev., or Atlantic City, N.J., what's important is not how many terabytes of data are stored in our data warehouse or the kinds of reports some analyst can run after the fact. Rather, it's how the information is leveraged in real time to serve the customer immediately on the casino floor. If the customer is celebrating a birthday, for example, that data will let employees know instantly so they can surprise the guest with a special greeting or gift in the casino or hotel room.
To take another example, consider a Total Rewards Diamond card customer who's a regular at one of our Midwest riverboat casinos. BI lets us know what gaming experience that customer most enjoys, as well as personal preferences as to room accommodations, restaurants, and entertainment amenities, so we can determine what incentives will best persuade the customer to visit our Rio or Flamingo hotel casinos in Las Vegas.
These are just some of the personalized services that Harrah's Entertainment can increasingly provide guests as a result of our focus on customer-facing, analytical, and operational BI efforts. No matter how broadly our casino entertainment locations continue to grow, or how complex our business and IT operations get behind the scenes, we recognize that we're in a discretionary consumer-service industry where customer loyalty and satisfaction, at the individual interaction level, make or break our business every day. That's why we've been investing in innovative CRM approaches and BI processes and technologies, as well as refining our core Total Rewards loyalty program. We're also integrating myriad other customer-facing and back-office systems and capabilities to give us a 360-degree view of our business.
Our goal is to create individual relationships and differentiated service among the nearly 40 million customers in our database who choose us for casino entertainment.
Setting us apart from competitors is the fact that all these customers—including our recently acquired base of Horseshoe Gaming and Caesars Entertainment guests—will enjoy the benefits of Total Rewards and our data-driven CRM and BI approaches. Our expanded customer base lets us offer an even broader set of locations, brands, and amenities in the United States—and, increasingly, abroad. In fact, international development is an exciting new growth area for us. We now operate in Canada and South America, and recently announced planned projects in Europe, the Caribbean, and Asia. These will leverage and enhance our existing capabilities and also let us create new value and insights. (For more on the integration of Caesars, see below.)
As with any successful enterprise BI approach, many sources of business data can be leveraged at the local or enterprise level for analytical insights. At Harrah's, these include financial, operational, transactional, marketing, product, labor, and customer-satisfaction data. Business and technical processes are used to acquire, archive, and organize that data into a data warehouse so that standardized and ad hoc reporting and analysis tools can ensure the efficient communication of the insights gleaned from the data in the right format, at the right time, and to the right people.
We've partnered with Cognos, IBM, SAS, Teradata, Tibco, and others to create a highly integrated and customized solution to meet our needs. And while BI technology and techniques are critical to the execution of our strategy, another key is our strategic customer focus and a culture of analytical inquisitiveness and data-driven decision-making throughout the company.
Loyal to the Core
Usually, one core element at a BI-focused company really drives the technology's
use and adoption. At Harrah's, it's perhaps no surprise that the core of our CRM
and BI efforts remains our Total Rewards loyalty program, together with the data
and operational insights it provides. By refining our closed-loop insights and
efforts around this core, we've established a scalable, extensible, and
differentiated loyalty and service framework within our company to continuously
improve customer interactions and business outcomes. Through these core
efforts—and the leverage of our enterprisewide BI tools, processes, and
metrics—comes growth in terms of customers, locations, and revenue, as well as
valued relationships and greater loyalty with new and existing customers.
As a result, we've seen our share of customers' discretionary gaming entertainment spending, versus competitors, jump from just over 30% a few years ago to nearly 50% today. This is a solid indication of brand loyalty, relationship marketing, and closed-loop BI in action.
BI at Harrah's permeates the organization and is firmly embedded in our culture. For example, there's a dedicated BI analytics group that reports to the CFO and is supported by the IT team to provide daily automated and ad hoc reporting, both online and in printed "white books." The content of these reports can be parsed many ways depending on who has access, be it the CEO, the CFO, or one of our three regional operating-division presidents or corporate functional areas. Some may want general top-line trends, while others may wish to drill down to find out which games did well at which locations, how bad weather affected business, and so on. The white books usually include key performance metrics, revenue analysis, and the amount of game play and hold at every location, as well as food-and-beverage sales, Total Rewards metrics, retail statistics, and other data. We can then identify hot spots, or attention areas, and quickly engage resources to dig deeper and implement marketing, staffing, or operational changes as needed.
Likewise, our marketing, gaming, hotel, retail, and customer-service and satisfaction departments—among others—have access to a rich set of predefined reports and ad hoc data on slot and table play, player behavior, hotel occupancy, marketing and promotional campaigns, restaurants, events, and retail sales. With this, they can design, test, and roll out campaigns, promotions, game changes, and service and staffing changes targeted to specific customers during specific times at specific locations. This personalized marketing, using a variety of delivery channels such as VIP hosts, e-mail, and direct mail, lets us create the right incentives for each customer and improve everyone's overall experience at Harrah's. In turn, we can use these insights to create future campaigns that give us the highest ROI and ensure our marketing dollars actively contribute to the bottom line.
We aim to transform our decision-making approach from "I think" to "I know." Virtually every customer-facing initiative—whether it's a direct-mail campaign or a new promotion or slot theme—is tested in a controlled environment using a test campaign and a control segment. With our BI systems and processes, we can then monitor the behavior of the test group and measure the relative effectiveness of the initiative, rolling out only those we think will be most successful. With this philosophy, we've evolved from simply analyzing historical data to predicting the future performance of many marketing campaigns—a critical lever in maintaining competitive advantage and effective margins in our markets.
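In code, the core of that test-and-control measurement is simple. The sketch below, with hypothetical response counts, computes the relative lift a campaign produced over its held-out control group:

```python
# Hedged sketch of test-versus-control measurement: expose one segment to an
# initiative, hold another back, and compare response rates before rollout.
# The figures are invented for illustration.

def lift(test_responders: int, test_size: int,
         control_responders: int, control_size: int) -> float:
    """Relative lift of the test group's response rate over the control's."""
    test_rate = test_responders / test_size
    control_rate = control_responders / control_size
    return (test_rate - control_rate) / control_rate

# e.g., a direct-mail offer: 540 of 10,000 test customers responded,
# versus 400 of 10,000 in the held-out control group
print(f"{lift(540, 10_000, 400, 10_000):.0%} lift")  # prints "35% lift"
```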
Our corporate structure, from a people and system perspective, also bolsters our BI efforts and sets us apart. Unlike most of our competitors, Harrah's is organized with a federated mind-set of how corporate and properties work together and share accountability—similar to the way a world-class retailer or financial-services company operates.
In general, key business strategies and automated systems are developed collaboratively but deployed centrally. Daily usage of tools and capabilities is executed locally by the properties by means of best practices and templates across the company. While we appreciate the market insights and agility of each of our local properties, we've created centralized functions and processes so that customers see a coordinated experience no matter where they stay or play.
This federated approach also can be seen in our infrastructure and how data is collected. We've shifted almost all our systems to two redundant, centralized data centers. Only a few front-of-house systems, such as slot systems and point-of-sale devices, are located at the properties themselves. And all properties feed data into our central data warehouse.
In all we do, privacy and security are a big priority. The vast majority of the data we collect is provided voluntarily by our customers and based on their activities using our customer card. This voluntary approach to capturing player activity has proved to be a win-win model for our customers and Harrah's: We've grown Total Rewards card usage from approximately 50% of tracked revenue in our casinos to nearly 80%. And with our integrated systems and enterprise view of customers' preferences and transactions, we can accurately recognize and reward them through a variety of channels and touchpoints across any Harrah's, Caesars, or Horseshoe property.
This approach allows Harrah's and our customers to mutually build upon and benefit from the relationship, and ensures that we can best provide differentiated service to our Total Rewards customers, as well as gauge their satisfaction with our performance.
Over time, we've evolved our BI systems and processes to take advantage of new technologies and best practices from other industries like retail. An example is a unique spatial and data-visualization tool we co-developed with a New Zealand company named Compudigm. With it, we not only have a dynamic "heat map" view that displays the layout of the casino floor down to the individual slot machine, but also can visualize and drill down on the transactional data flowing through the games over any period of time. It addresses another key challenge data warehousing can pose: allowing us to sort through extensive amounts of transactional data in a non-numeric way.
Going into the new year, we'll continue our spirit and approach as we create new capabilities to enhance and differentiate the customer's overall experience. We'd like to be even more service-oriented and to provide more offers, amenities, and individualized experiences for our best customers. Our vision is to react in near-real time to serve customers while they're at our casinos, whether through employee interaction, an automated business process, or a self-service transaction.
Betting on Better Services
We're migrating from a purely analytical CRM approach—looking only backward at
past performance—to a more operational CRM approach that uses both historical
and current information. We'll also use a business-rules engine and defined
business processes to trigger actions we can do now to help create a
differentiated and personalized experience for customers "in the moment."
The technologies and initiatives we're pursuing to reach our goals are well under way. One pilot project, called Moonshot, uses sophisticated SOA-based application integration and business-rules engines as well as real-time messaging between customer-facing touchpoints to integrate front-of-house transactional systems with customer data available in real time from our Active Data Warehousing platform.
With these new capabilities, we can pilot and evaluate a number of new ideas and initiatives that range from proactive and personalized service on the gaming floor, to innovative promotions and product introductions. We've only begun to scratch the surface, but these innovations wouldn't be possible without our solid foundation of core analytical CRM systems and BI tools.
From a BI vantage point, the odds seem to indicate that the House has indeed created some compelling competitive advantages. Looking ahead, one sure bet is that we'll continue to leverage our new and existing capabilities to achieve our business-growth strategies in the United States and abroad.
Sidebar: Caesars Joins The Game
The acquisition of Caesars was our sixth in as many years, but at $6.8 billion
and with more than a dozen properties and locations, it's by far the biggest
deal we've done. It will also be the fastest and most complex IT integration
effort we've ever taken on. The deal was finalized the afternoon of June 13,
2005, but when folks came in the next morning, we already had the network
infrastructure and electronic communications systems up and integrated, and
employees from both companies were working together. Even I was a bit amazed
that when I first fired up my laptop that morning and connected through the
Caesars LAN, I could get to everything I needed.
We've since completed the rest of the multimillion-dollar infrastructure build-out of our redundant network and back-of-house operational middleware, as well as data-warehousing platforms in our two enterprise data centers. We closed out 2005 by completing the entire integration, conversion, and centralization of all Caesars back-office systems and functions. Everything has been running well, despite essentially doubling the number of employees to nearly 100,000 and handling significantly more daily, monthly, and year-end financial and procurement transactions than we could have imagined.
In the third and fourth quarters, we also deployed new capabilities for cross-property customer hotel booking and in December, we successfully completed our first full front-of-house conversion of the Flamingo in Las Vegas including integrating the slots, tables, casino, hotel, events, and point of sale. This let us offer all of our Total Rewards capabilities in record time and with encouraging results. Through March, we'll convert front-of-house systems for the rest of the Caesars properties in Nevada, Mississippi, Indiana, and New Jersey, and then we can relaunch and market Total Rewards across more than 40 U.S. properties. Then we'll start the process internationally, first in Canada, then likely in South America. Those will be good tests as we begin adapting our systems in the Bahamas, Spain, and Slovenia, and for planned expansion in Asia.
Financially, fully two-thirds of the value expected from this merger is revenue upside, with only about a third in cost savings. That's why there's a lot of focus on getting Total Rewards into the Caesars properties. For the third quarter of 2005, the first one reflecting the Caesars acquisition, the company reported record revenue of $2.3 billion, up 78.2% from the previous year.
Tim Stanley
Sidebar: 12,000 Opportunities At Dunkin'
Launching successful business-intelligence initiatives isn't about doing things
you couldn't accomplish before. Rather, it means collecting data and making
decisions in a more efficient and effective manner. And when you have 12,000
worldwide retail outlets—including Dunkin' Donuts, Baskin-Robbins, and Togo's
restaurants—collecting that data and executing a comprehensive BI strategy is a
big job.
We at Dunkin' Brands previously lacked the tools to efficiently gather the information necessary to better understand our products, franchisees, and customers. To gain insight into the business, our knowledge workers would have to spend as much as 80% of their time piecing together data from disparate systems.
Three years ago, we launched a BI strategy primarily to analyze what was going on in the market for our brands. We were particularly interested in analyzing marketing and promotional efforts and developing applications that would support our strategic vision. To achieve our goals, it was critical to develop an integrated, single source of the truth and then leverage that information to measure the success of our business investments.
But as a 100%-franchised organization, we faced unique challenges. We needed to implement technologies and applications that could gather information in the most efficient way possible, then quickly turn that information into products and services to drive the success of our franchisees. By making accurate, timelier decisions about the products and services we delivered to our franchisees and customers, we could boost revenue and profitability across the system.
To start, we had to create a data warehouse to integrate this information. Doing this required us to ask: What type of data would we need to gather? How available would that data be? And what would be the most efficient way to store and analyze it?
At first, we took an iterative approach, with input from both technology and business partners. The process included gathering the available data and identifying gaps in that data, such as missing elements or incorrectly coded item numbers. This is another unique feature of a franchise-business model: The franchisee owns the source of the data and can inadvertently introduce anomalies. But we resolved many of these gaps and completed the analysis that would give us the business insights we sought.
We have an Oracle back-end database and Oracle Discoverer on the front end to create reports and analysis. Each night, we collect a feed of point-of-sale (POS) data from a representative sample of our restaurants. This data is aggregated in the data warehouse, along with other pertinent data points such as retail information, profitability metrics, and external information from third parties.
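A simplified sketch of that nightly aggregation step is shown below; the column names and the pandas approach are assumptions for illustration, not a description of Dunkin' Brands' actual ETL:

```python
# Illustrative sketch: roll raw point-of-sale transactions up to
# store/day/item totals before loading the warehouse. All names and
# figures are invented.

import pandas as pd

def aggregate_pos(transactions: pd.DataFrame) -> pd.DataFrame:
    """Summarize one night's POS feed by store, date, and item."""
    return (
        transactions
        .groupby(["store_id", "business_date", "item_number"], as_index=False)
        .agg(units=("quantity", "sum"), revenue=("net_amount", "sum"))
    )

feed = pd.DataFrame({
    "store_id": [101, 101, 102],
    "business_date": ["2006-01-09"] * 3,
    "item_number": ["ESP-12", "ESP-12", "DON-01"],
    "quantity": [2, 1, 12],
    "net_amount": [5.98, 2.99, 7.08],
})
print(aggregate_pos(feed))  # two rows for store 101 collapse into one
```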
A good example of such an analysis involves the introduction of our line of espresso beverages. Using tools from product inception through national rollout, we could immediately gather and evaluate POS information from participating restaurants. In short order, we could compare test markets with control markets to see the effect of our promotional campaigns and measure outcomes, even unintended ones.
These tools have greatly enhanced the analytical capabilities of our knowledge workers, and we require every user to be certified. Currently, we have 260 users who are fully trained across the business, including finance, marketing, operations, and new-product development. It's important for these power users to have full control and a deep understanding of the data and analysis tools. Our main goal is to support them.
The biggest challenge so far has been the volume of data. In 2004, the company had $4.8 billion in annual sales, and since each transaction is relatively small, the volume of data is enormous. But this is also one of the benefits; previously, it was a labor-intensive effort to provide a clear view of enterprisewide sales. In the past year, we've seen the data-collection process accelerate and adoption within the organization grow more widely.
To determine the ROI of this ongoing initiative, we've used time and efficiencies as a measure of success. Reducing the time it takes to gather information from weeks to hours is a real money-saver. And taking action on that information has clearly let the organization make better business decisions.
Assessing business opportunities—such as identifying new markets in which to launch stores—is critical to our company's continued success. Our BI platform will scale as we gather data that accurately reflects the expanding universe of Dunkin' Brands. Armed with this information, our BI users will effectively measure business results and thereby help the company achieve its strategic goals.
Rick Broughton is acting CIO of Dunkin' Brands.
Sidebar: BI Wins At McKesson
About three years ago, McKesson Pharmaceutical, a $71 billion business unit of
San Francisco-based McKesson Corp., began an extensive business-intelligence
implementation. Executive management had identified opportunities where BI could
provide gains in productivity, close off profit leaks, and improve business
processes. They asked that the team focus on these areas by developing specific,
process-based analytic solutions.
To start, the strategy needed to capture the new data from our ERP implementation. My team faced the challenge of getting the ERP team on the same page. It was critical for them to jointly deliver the enterprise-data-warehouse layer—a more robust architecture connecting financial information with new sales, logistics, and inventory information. We brought all of this information into an SAP Business Intelligence solution with the goal of deriving significant benefits to the company's bottom line.
Within nine months, we developed the data warehouse layer that captures the bulk of existing SAP and legacy information, as well as new sales, logistics, and inventory data. This wasn't an insignificant achievement. From a volume perspective, we extract upward of 15 million records nightly and load 35 million records per night into the SAP platform. Currently, we're methodically building out process-based analytics that utilize this data.
For this effort, we've built more than 30 process-based analytic solutions to focus on several critical areas: finance/profitability; sales and marketing; procurement; customer and supplier contract management; and distribution-center operations.
For example, we're in the process of more closely analyzing the accuracy of customer and supplier contracts and how they compare with the actual information we now retain in the BI system. These analytics are increasing the effectiveness and efficiency of contract compliance with both customers and suppliers. In operational analytics, we need a very clear inventory analysis of our more than 30 distribution centers. On the BI platform, we developed process-based analytics for inventory dating, inventory adjustment, and quality metrics, which allow us to determine how accurately we deliver the right product to the right customer.
Keeping Score
In addition, we're starting to deliver the first iteration of scorecards and
dashboards. This effort is in the initial stages, but we hope to eventually
provide corporate performance-management alerts that isolate critical key
metrics for all business processes.
In the three years since implementing the BI platform, the number of named users with access to the system has grown to 1,500, including about 300 who use it daily. Although user acceptance and adoption continues to climb, achieving growth hasn't been easy. In many instances, BI isn't mandatory, and we see varying levels of participation. Additionally, many users are accustomed to the old way of reporting and analyzing from the legacy-data warehouse. We're trying to extend SAP BI much farther into the organization, and we're slowly transitioning people away from other existing data sources.
To help drive user acceptance and adoption, we spend a good deal of time on training and support. Initially, we allowed many of the BI users to build their own queries, but we discovered that this freedom sometimes complicates the ways users gather and analyze data. To improve ease of use, we've optimized the performance of our BI solutions and simplified the query-development and execution processes. For example, as part of the sales-process-based analytic solutions, we reduced the number of queries from approximately 700 to 45, and we improved query performance, in certain instances by more than 300%.
Moving forward, our next major BI initiative will address the need for more operational, or just-in-time, analytics. As with most BI implementations, analytics or reports are provided next-day. Our goal is to move data more quickly across the environment, providing even greater benefits for the business.
Indeed, we've made significant strides with our BI effort. The executive committee—initially consisting of the McKesson Pharmaceutical CIO, CFO, the SVP of marketing, and the SVP of business-process reengineering—set the direction. The committee has evolved based on business requirements; the BI team continues to deliver accordingly. Of course, it's not possible to satisfy every business-user request. We remain focused on our goals: providing process-based analytic solutions for the business to increase productivity, close profit leaks, and improve processes.
It's this business-intelligence focus that has enabled McKesson Pharmaceutical to deliver significant benefits to the company's bottom line.
Brian Hickie is VP of business intelligence with McKesson Pharmaceutical.
Sidebar: Outward-Facing BI
Like many of our corporate counterparts, we at PHH Arval are using
business-intelligence and dashboard technology to differentiate ourselves in the
market. But unlike our competitors, we've launched our initial foray into the BI
arena as a customer-service initiative, rather than as an internal analytics
application.
As the second-largest provider of outsourced commercial fleet-management services, we manage company car programs for nearly a third of the Fortune 500 companies as well as for hundreds of government agencies. We estimate that our PHH InterActive Dashboard application—introduced in the summer of 2005—will save our clients as much as three work days each month in downloading and analyzing fleet data.
As a strategic partner, PHH handles all the processes involved in managing a fleet of business vehicles. We design and tailor ways for company-car drivers to order the vehicles they need, then deliver the vehicle to the driver. We provide a variety of lease options to the company, and issue drivers charge or credit cards to use at the fuel pump. We ensure that appropriate maintenance gets done, and when bad luck happens, we manage the accident-response process. We take care of properly registering and reregistering the vehicles; and because the cars are titled to PHH, once the lease expires, we can sell the cars on behalf of the client. Every one of these processes generates a tremendous amount of information that can help our clients manage their fleets more efficiently.
PHH InterActive Dashboard allows clients to access and analyze fleet trends by providing three years of summary information on key performance indicators and fleet expenses. It gives them access to fleet billing, inventory, and fuel-purchase information, as well as information about vehicle orders and used-vehicle sales options. Clients can monitor their fleet expenses, which are summarized and presented in a format that can be incorporated into their budget-analysis process. For example, trend-analysis reports show the average price per gallon consumed by the fleet over a period of as long as 24 months.
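The sketch below illustrates the kind of calculation behind such a trend report, smoothing the fleet's average price per gallon over a trailing three-month window; the data and column names are invented, not the Dashboard's actual computation:

```python
# Hedged sketch of a fuel-price trend analysis: monthly average price per
# gallon across the fleet, smoothed over a trailing window. Figures invented.

import pandas as pd

fuel = pd.DataFrame({
    "month": pd.period_range("2005-01", periods=6, freq="M"),
    "gallons": [41_000, 39_500, 42_200, 40_800, 43_100, 44_000],
    "spend":   [71_750, 71_100, 78_070, 77_520, 86_200, 92_400],
})

fuel["avg_price"] = fuel["spend"] / fuel["gallons"]         # $/gallon
fuel["trend"] = fuel["avg_price"].rolling(window=3).mean()  # 3-month trend
print(fuel[["month", "avg_price", "trend"]])
```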
We introduced PHH InterActive in 1998 as a comprehensive enterprise information and asset-management tool. At the same time, we developed an extensive data warehouse, which integrates information from several systems to provide the level of reporting clients needed. While pleased to have the information, they were often overwhelmed by its sheer volume. This was the catalyst for the Dashboard—a collaborative effort with clients. As CIO, I attended many client meetings and worked with our Client Technology Advisory Board. Once we identified their requirements, I worked with the developers to build the prototype. We went through several stages of prototyping.
In terms of vendor selection, we were already using the Brio enterprise reporting toolset, but hadn't upgraded to Hyperion's Performance Suite after Hyperion's acquisition of Brio. Once we determined we wanted to upgrade, we brought in the leading BI vendors and narrowed the field to a short list of providers. In the end, we went with Hyperion. For us, in addition to providing a comprehensive enterprise toolset, it was the path of least resistance—we wouldn't have to redo all our existing reporting, and we would continue to have an integrated solution.
We took an incremental approach. We didn't try to solve all our problems with the first release. We released the beta at the end of August and the second version in December. With each iteration of the Dashboard, we've received valuable feedback from customers. Such input helps us deliver critical requirements incrementally and steadily.
The second release focused on used-vehicle information. Customers are interested in understanding how well those vehicles perform when they come out of service to determine resale value. In our third release—slated for this month—we'll show summary stats about maintenance and accidents, and correlate them with resale value.
Eventually, we're looking to use the tool internally. But the big bang for the buck that has justified the implementation is how the Dashboard has separated us from the pack in our market.
Tim Talbot is senior VP and CIO at PHH Arval.
Sidebar: Travelocity's Mamie Jones on BI
Mamie Jones, senior VP of strategic sourcing at Travelocity, Sabre's online
travel-reservations site, spoke recently with Optimize senior managing editor
Paula Klein about Travelocity's BI efforts.
Q: Sabre has projected a dramatic 40% rise in revenue this year for Travelocity. Will BI play a role in achieving that growth?
A: IT plays a very big role in our business since we're a technology-centric company. We also know that as we increase the personalization of our marketing and target campaigns to be more relevant to customers, booking rates can run as much as eight to 12 times those of standard methods [of outreach]. Our data warehouse plays a key role in assuring that our merchandising efforts pay off.
Q: What are your specific BI strategies?
A: We're moving from an enterprise data warehouse to an active data warehouse, which should give us more real-time data, not just historical data. It will help us make better, more accurate, and automated business decisions and offer more timely updates to consumers. We want to use data to improve our products and learn more about which promotions, experiences, and services customers find valuable. Our merchandising objective is to increase customer value and offer a richer online experience. We also want to increase suppliers' value through all of our channels. This vision is evolutionary; it's not one project, but an ongoing effort.
Q: How far along is the data warehouse?
A: We've done some benchmarking and we're in the process of planning the architecture, the design, and the capacity. We're building this internally based on our current Teradata platform. We think we're out in front on this. Having current data will help us retain customers by building loyalty and satisfaction. It also leads to better predictive modeling.
Additional Information:
Tim Stanley is senior VP and CIO of Harrah's Entertainment.
©CMP Media LLC. All Rights Reserved. A United Business Media company. This article was originally published in Optimize magazine. Reprinted by permission.
The Challenges of Customer Data Integration
By Robert Lerner
Current Analysis
One of the major challenges facing companies today is developing a single, consistent, and complete view of their customers across all applications, databases, and customer touch points. The challenge is significant, because developing such a complete view of customers is not an easy task, but it is crucial to a company's success and to its ability to comply with any number of state, federal, and international government regulations.
Certainly, most companies have an intuitive grasp of the significance of
obtaining such a view of their customers. But many companies fail to understand
its overall importance and far too many underestimate the difficulty of getting
this view.
Indeed, it is surprising that not every company appreciates the importance of a
single, consistent, and complete view of the customer. (In fact, among the
companies that do understand the importance, not every one of them recognizes
that they lack such a view today.) Without a single, consistent, and complete
view of its customers, a company cannot begin to understand its customers - and
it cannot begin to serve its customers well.
Consider, for example, a company that has duplicate customer records. In all likelihood, some customer interactions with the company will be associated with some records and not with others. The duplicate records complicate the effort to track customer actions such as customer buying habits as well as customer interactions with various customer touch points (e.g., the Web, call center, etc.). The company will also have difficulty determining the total value of the customer. As a result, the company's ability to serve and support the customer may be limited.
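The sketch below shows, with invented records, why exact-match keys miss duplicates and why even a crude normalized match key catches some of them; production CDI tools use far more sophisticated probabilistic matching:

```python
# Hedged sketch of the duplicate-record problem: two records for the same
# customer that an exact key match would miss. Matching on a normalized name
# plus postal code is a naive heuristic, shown only for illustration.

def normalize(record: dict) -> tuple:
    """Crude match key: lowercased, punctuation-free name + postal code."""
    name = "".join(ch for ch in record["name"].lower() if ch.isalnum())
    return (name, record["postal_code"])

records = [
    {"id": 1, "name": "Robert Smith",  "postal_code": "20855"},
    {"id": 2, "name": "robert smith.", "postal_code": "20855"},  # duplicate
    {"id": 3, "name": "Roberta Smith", "postal_code": "20855"},  # distinct
]

seen: dict[tuple, int] = {}
for rec in records:
    key = normalize(rec)
    if key in seen:
        print(f"record {rec['id']} looks like a duplicate of record {seen[key]}")
    else:
        seen[key] = rec["id"]
```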
In fact, the company's ability to retain the customer could be jeopardized. Customers are easily put off by customer support representatives who do not have a complete history of their interactions with the company. Moreover, many customers have ceased to do business with a company after having been bombarded with marketing messages that the company intended to communicate to different individuals but were, in fact, directed to the same individual because of duplicate customer records.
But companies also incur potentially significant costs because of duplicated customer records and other customer data problems. While the costs of retaining a single customer can be significant, these costs can skyrocket if the company has to retain the same customer more than once because of duplicate customer records.
However, problems like these are only the tip of the iceberg for companies without a single, consistent, and complete view of their customers. Without such a view, expensive applications such as CRM, ERP, and the like will not only fail to live up to expectations but will require much more of an investment than the company anticipated. CRM applications, for example, can be a boon to companies, but only if the customer data fed into them is consistent, accurate and reliable. Inconsistent customer data (data containing multiple customer records for the same customer, etc.) can lead CRM applications to present the company with an erroneous idea of who the company's customers are and their overall value to the company.
If the goal of a CRM application is to help companies maximize the customer experience at each stage of interaction - and to facilitate deep, long-term, and profitable relationships with their customers - then feeding the application incomplete, inconsistent and duplicate customer data can defeat this goal and increase the costs of maintaining both the customer and the application. Now consider a company's problem if it relies on the same customer data used in its CRM application for a variety of applications (such as front and back office applications, data warehouses, etc.). In this case, the problem is magnified and the costs associated with the problem grow exponentially.
And in an era of increasingly strict compliance standards, a single, consistent, and complete view of the customer is critical to keeping a company in line with regulatory or industry standards. Indeed, the degree to which a company has a comprehensive view of its customers can determine the degree to which it can comply with state and federal government and international regulations.
In recent years, and especially since 9/11, governments have enacted regulations that essentially require a company to know who its customers are. Regulations such as OFAC, the USA PATRIOT Act, HIPAA, Gramm-Leach-Bliley, state and federal Do Not Call legislation, and so forth depend, at their core, on a company's ability to know its customers. This requires a single, consistent, and complete view of the customer.
HIPAA (the Health Insurance Portability and Accountability Act), for example, requires hospitals, physicians, and managed care companies to adopt patient information, privacy, and security standards. It also allows patients to access medical records, make corrections to the information contained within them, and monitor how this information is used or disseminated. But unless healthcare institutions have high-quality, complete patient records, they cannot begin to protect the privacy of a patient's medical information or provide the patient complete access to his or her records. A lack of data-related safeguards could open hospitals, physicians, and so forth to a variety of liabilities.
Gramm-Leach-Bliley, on the other hand, is designed to regulate the sharing of information about customers who purchase financial products or services from financial institutions. As with HIPAA, the absence of a single, reliable view of the customer hinders a company's ability to control how it shares customer information, and opens it up to fines and worse.
The risks associated with an inaccurate, limited view of the customer should be apparent. The question now is how to get a single, consistent, and complete view. To date, companies have used a variety of methods that have offered them only limited success. A typical IT environment complicates the issue: few companies can standardize on a single vendor's applications, and most have multiple customer data sources and multiple customer touch points. Indeed, how does a company go about integrating all this customer information, information that is often incompatible (due to standards, application incompatibility, format, etc.), into a consistent, complete view of its customers, and create a single view that can be shared by all of the applications and touch points?
It seems an impossible task, but it can be done effectively with the right kind of technology. Emerging technologies, such as Customer Data Integration (CDI) solutions, that emphasize data quality are the best way to approach the single view of the customer. The next article will explore some of the ways that companies have handled customer data and recommend an effective CDI solution.
Additional Information:
Robert is a Senior Analyst for Data Warehousing and Application Infrastructure at Current Analysis. Robert is responsible for covering the competitive landscape for the Data Quality, Enterprise Portals, and Content Management markets, where he focuses on significant technology, product, and service developments impacting the evolution and emergence of these technologies. He can be reached at rlerner@currentanalysis.com.
Enhancing Value Through Data Mining
SAS Institute Inc. Article
In a recent HBS Working Paper, HBS professor Max Bazerman and colleagues explore how biases and human psychology impede policy-making efforts that could vastly improve people's lives.
Why is it that the U.S. federal government allows local communities to give tax dollars to wealthy sports team owners rather than to create better benefits for citizens? Why are organ-donor programs constrained to the point where thousands of Americans die needlessly each year? Why did the South African government take a stand against an effective AIDS treatment drug?
The inability of government to make wise tradeoffs—give up small losses for much larger gain—has been investigated by HBS professor Max Bazerman and his research colleagues for years. Much of this work used economic science and political science to explain drivers behind the crafting of public policy. Now Bazerman and coauthors Jonathan Baron and Katherine Shonk are looking into the psychology of decision making to provide a fuller explanation. Their paper, "Enlarging the Societal Pie through Wise Legislation: A Psychological Perspective," has been accepted for publication in Perspectives on Psychological Science later this year.
What they found was that psychological science does indeed help explain how governmental decision making is influenced by such forces as parochialism, nationalism, and dysfunctional competition, while also providing tools that foster rational decision making.
Consider this hypothetical question posed in the paper:
A. If you die in an auto accident, your heart will be used to save another person's life. In addition, if you are ever in need of a heart transplant, there will be a 90 percent chance that you will get the heart.
B. If you die in an auto accident, you will be buried with your heart in your body. In addition, if you are ever in need of a heart transplant, there will be a 45 percent chance that you will get the heart.
Which would you prefer? Most people choose Option A; the benefits of the tradeoff are quite clear. Yet the U.S. government, yielding to what psychologists term omission bias, the "irrational preference for harms of omission over harms of action," follows an organ-donation program that favors Option B.
We asked Bazerman to discuss the research and its implications for policymakers.
Manda Salls: Your description of the "irrational preference for harms of omission over harms of action" is fascinating. Why is it, for example, that most people agree organ donation makes sense, but they don't donate?
Max Bazerman: Collectively, most Americans (and, more broadly, humans) believe that organ donation makes sense. But most people accept the default—the status quo. It is not that people are deciding in large numbers not to donate. Rather, they are not thinking about it. In other countries (e.g., Belgium), the default is that unless you specify that you do not want to donate your organs, you become a viable donor at death. In the U.S., unless you actively decide to donate, you are not likely to be a donor. Thus, the default that society imposes dramatically affects donor rates. As a result of the U.S. system, 6,000 people die each year who might not have, given a change in the default.
Q: Your research shows that given a choice, people prefer to receive less, provided others receive the same amount, than to receive a higher amount when others would receive even more. Is this a product of hyper-competitiveness in society?
A: The competitiveness of society plays some role. However, the bigger issue is that we often use the outcomes of others to assess our own. When you receive a 5 percent pay increase and want to know whether to be happy about it, you find out the increases of your colleagues to give meaning to the 5 percent.
People do not choose less for themselves. If I ask people whether they would prefer a) $7 for themselves and for another person or b) $8 for themselves and $10 for the other person, people choose "b." However, when people are simply given "a" or "b," "a" makes them happier. Basically, in choosing between options, people are not that focused on social comparisons. But, in evaluating the outcomes they receive, they are very focused on comparisons.
Q: In the discussion of dysfunctional competition you talk about the harm of using tax dollars to build new sports facilities to retain professional sports teams. Doesn't this type of spending bring income to a city? Why should people be wary of this type of investment?
A: Yes, it may be true that Boston (or any other city) benefits by keeping its football team or baseball team. But, another city without a team also would benefit from taking away Boston's team. And, as a result, cities compete with taxpayer dollars to lure sports teams to their cities. The net result is that we make welfare payments from taxpayers to rich owners of sports teams. Society would be better off with those scarce tax dollars being used for schools, hospitals, etc. The U.S. government should be involved in stopping cities from dysfunctionally competing with each other through these welfare payments.
Q: In this working paper it is stated that, ". . . nationalism may be the last type of prejudice to be widely tolerated." Please elaborate.
A: It is no longer socially acceptable to proclaim males as intellectually superior to females or whites as intellectually superior to blacks or Hispanics. Society has largely accepted such stereotypes as offensive, inappropriate, and leading to the lack of consideration of people as unique. But, American society continues to value Americans as dramatically more important than citizens of other nations. This is true in terms of jobs, the value of life, and across many realms. Like our other prejudices, nationalism is an example of valuing "our" group over "other" groups based on arbitrary group membership.
Q: How should leaders, particularly in the political arena, use this research to make better decisions? It seems like the public and many politicians are unwilling or unable to see beyond the immediate effects of their actions.
A: Our article, and the book that we published in 2001 (You Can't Enlarge the Pie), focuses on wise strategies that reasonable members of both major political parties should see as superior to the current state. We are currently in a political environment where beating the other side is valued over making wise decisions. We hope that our work provides an outline for leaders who want to make wise decisions, rather than simply beat the other party.
Q: What other research are you working on?
A: I am working on a book with (HBS professor) Deepak Malhotra entitled Negotiation Genius, which focuses on the strategies that we all can adapt to be seen as negotiation geniuses within our organizations. We expect it to be published by Bantam in 2007. I am also developing my research with Dolly Chugh on bounded ethicality and bounded awareness, themes that have been at the core of my recent publications in Harvard Business Review.
Additional Information:
Manda Salls is a content developer for Baker Library.
Max H. Bazerman is the Jesse Isidor Straus Professor of Business Administration at Harvard Business School.
All materials copyright of the Harvard Business School Working Knowledge.
Some of the following information is adapted from the guidebook, Field Guide to Leadership and Supervision.
Much of what managers and supervisors do is solve problems and make decisions. New managers and supervisors, in particular, often solve problems and make decisions by reacting to them. They are "under the gun," stressed, and short of time. Consequently, when they encounter a new problem or a decision they must make, they react with a response that seemed to work before. With this approach it is easy to get stuck in a circle of solving the same problem over and over again. Therefore, as a new manager or supervisor, get used to an organized approach to problem solving and decision making. Not all problems can be solved, and not all decisions made, with the following rather rational approach; however, these basic guidelines will get you started. Don't be intimidated by the length of the list of guidelines. After you've practiced them a few times, they'll become second nature to you, enough that you can deepen and enrich them to suit your own needs and nature.
(Note that it might be more your nature to view a "problem" as an "opportunity." If so, you might substitute "opportunity" for "problem" throughout the following guidelines.)
1. Define the Problem
This is often where people struggle. They react to what they think the problem is. Instead, seek to understand more about why you think there's a problem.
Defining the problem (with input from yourself and others):
Ask yourself and others the following questions:
Defining complex problems:
Verifying your understanding of the problems:
Prioritize the problems:
Understand your role in the problem:
2. Look at Potential Causes for the Problem
3. Identify Alternatives for Approaches to Resolve the Problem
4. Select an Approach to Resolve the Problem
When selecting the best approach, consider:
5. Plan the Implementation of the Best Alternative (this is your action plan)
6. Monitor Implementation of the Plan
Monitor the indicators of success:
7. Verify If the Problem Has Been Resolved or Not
One of the best ways to verify if a problem has been solved or not is to resume normal operations in the organization. Still, you should consider:
Additional Information:
Written by Carter McNamara, MBA, PhD. Used by permission.
Dave Crandon covers several of his main points in our webcast Capturing Hidden Opportunities: Achieving the Next Level in Performance Management.
Although many companies perform well, only a few achieve truly superior performance (Collins, 2001). Most find their successes limited by a lack of the organizational alignment and cohesive action that superior performance requires.
In many cases, the primary culprits that undermine performance are flawed measures and measurement-driven processes, core elements in the Human Performance Technology (HPT) model (Van Tiem, Moseley, & Dessinger, 2004). Indeed, these measures and processes are too often the weak links in HPT, which aims to apply scientific and organized knowledge to improve individual and organizational performance (Stolovitch, 2000). They undermine effective HPT by failing to focus organizational attention on the most important issues: those that offer the greatest performance leverage.
This article presents four principles by which executives can ensure their organizations' measurement systems and measurement-driven processes support organizational alignment and cohesive action. These principles have been derived both from a review of the academic research literature and from implementation of systems of measures in a variety of companies. To be able to generate superior performance, companies need to (1) implement a decision-oriented model of controllable performance, (2) estimate the performance potential for all elements of the model, (3) assign responsibility for all the drivers of performance, and (4) develop a common language of performance that relates operational performance measures and managerial responsibilities to corporate performance objectives, at all levels in an organization.
Building the Right Management Information and Using It Well
Although executives recognize that good management information is essential for strong performance, few recognize that the biggest obstacles to developing the information and using it effectively are not the high cost and long time frame that such efforts entail, though both are substantial. The greatest obstacles lie in the complex judgments that executives must make to create the vocabulary and processes for managing performance. These judgments make the difference between an organization that pursues excellence with well-focused, cohesive action and one in which fragmented decision making limits success.
As illustrated in Figure 1, most executives start by defining a framework for structuring raw data, deciding which disciplines to track, what outcomes to monitor, and for which business units, individuals, customers, and suppliers. Then, after realizing that their data provide an incomplete picture of performance, they make decisions about how to augment it. For example, they employ analytic methodologies, such as activity-based costing and transfer pricing, to manufacture information that will support a richer, more complete understanding of performance. They also direct the acquisition of third-party data to develop a better understanding of their customers, markets, and competitors. Finally, they specify the processes by which people and business units will work together to achieve corporate objectives. These include a broad array of processes for managing people (e.g., assigning responsibilities, setting goals, and evaluating performance), making decisions, and executing.
Figure 1: The Development and Use of Management Information
We are not the first to make such critiques, of course. Luminaries such as Jack Welch (2005), Peter Drucker (1999), and Hope and Fraser (2003) have criticized companies' measurement systems and budgeting processes, around which the vast majority of firms build their most important target-setting and accountability processes. Among other things, these critics have noted that most companies' budgeting processes are rife with politics and are prone to motivate managers to strive only for mediocrity. These processes also take too long, are too expensive, and produce budget targets that can become obsolete too quickly. The challenge for executives is to make the judgments that will produce a well-aligned organization and that will guide decision makers to focus on those situations that offer the best opportunities for strengthening performance.
Principles for Making Performance-Related Information Decisions
Although the challenges are formidable, companies can ensure that individual and business unit objectives, skills, and resources are well aligned with corporate performance objectives by acting on four basic principles.
Principle 1: Develop a Decision-Based Model of Performance
Develop a decision-based model of controllable performance and apply it throughout the organization. This performance model should meet three important criteria:
The model should be decision-centric. That is, it should be composed of factors about which people make decisions. Almost by definition, the components of the model should be operational in nature, addressing the managerial disciplines about which most line personnel make decisions. These include issues such as customer acquisition and retention, productivity, efficiency, and risk control. Performance models that focus heavily on financial factors are seldom decision-centric and frequently cause two types of problems. First, they force decision makers to translate too often between financial terms and operational terms, and then back again. These translations complicate and slow down both communication and decision making. Second, because financial measures typically summarize outcomes of myriad operational decisions, they can easily disguise the reasons for performance and weaken decision making.
The model should reflect all major dimensions of performance. Performance models that are too narrow risk producing behavior that undermines success. The most commonly cited example of a narrow model is one that focuses too much on sales volume and not enough on value or customer retention. The refrain "We'll make it up in volume" unfortunately illustrates the broader reality of too many companies, where individuals can be spectacularly successful in achieving their goals and maximizing their compensation, while undermining organizational success.
The model must distinguish controllable aspects of performance. Failure to do so often leads to a misunderstanding of how performance can be managed, which in turn can lead to poor decisions.
Channel profitability measures offer a common example of a failure to focus on controllable performance. Too often, channel managers are evaluated at least in part on the bottom lines of their units, despite the fact that they can influence very few of the factors that drive the bottom line. In their efforts to produce good earnings, they can too often manage the disciplines over which they do have control (e.g., customer service, staffing levels and skills) in a manner that produces near-term profit improvement at the cost of longer-term customer dissatisfaction.
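As a concrete illustration of these three criteria, the sketch below (Python, with invented metrics) tags each element of a hypothetical performance model by dimension, operational character, and controllability, and flags elements, such as a channel manager's bottom line, that fail the criteria. The structure and the metrics are assumptions for illustration, not the authors' instrument.

from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    dimension: str       # e.g., acquisition, retention, productivity, risk
    operational: bool    # decision-centric rather than a financial summary
    controllable: bool   # can the measured manager actually influence it?

model = [
    Metric("new customers per month", "acquisition",  True,  True),
    Metric("customer retention rate", "retention",    True,  True),
    Metric("transactions per FTE",    "productivity", True,  True),
    Metric("channel bottom line",     "financial",    False, False),
]

# Flag model elements that violate the criteria before rollout.
for m in model:
    if not (m.operational and m.controllable):
        print(f"Review '{m.name}': not decision-centric or not controllable")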
Principle 2: Estimate Performance Potential
Estimate performance potential for all aspects of the decision model. Managers must answer the question, "How well should we be able to perform?" for each performance metric and for each individual and business unit measured.
Consider the first essential step in using performance measurement information: comparing actual performance with something, such as a hurdle rate, a plan, or historical performance. Companies routinely compare actual performance to planned performance to assess the accomplishments of individuals and to make decisions about compensation and advancement. But a comparison with plan has little value for determining whether to invest in building a line of business. For that, performance is more appropriately compared to a hurdle rate or some other organizational objective, to determine whether a unit's performance justifies further investment of resources, time, and money.
Although these two comparisons are necessary, they are not sufficient. They do nothing to indicate where the application of effort can produce the best returns. This requires an understanding of how well individuals and units are doing in comparison to how well they ought to be able to perform or, in other words, their potential.
To illustrate, consider this question: Is a 92% customer retention rate good performance for a business unit that has a goal of 90% and that achieved 89% in the prior year? When compared to these two benchmarks, 92% retention sounds very good. But what if other similar units are routinely achieving 95% customer retention? Suddenly, 92% retention does not seem to be quite so strong a performance.
In sum, the decision support value of any performance measure is determined by what it is compared to. And the comparison of actual and potential performance is the key to good decision support value.
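A minimal sketch of these comparisons, using the retention example above plus one invented peer benchmark; the unit data and the gap arithmetic are illustrative assumptions only.

units = [
    {"name": "Unit A", "actual": 0.92, "plan": 0.90, "prior": 0.89},
    {"name": "Unit B", "actual": 0.94, "plan": 0.93, "prior": 0.92},
]
peer_potential = 0.95  # retention routinely achieved by similar units

for u in units:
    vs_plan = u["actual"] - u["plan"]
    vs_prior = u["actual"] - u["prior"]
    gap = peer_potential - u["actual"]  # the comparison that reveals opportunity
    print(f"{u['name']}: {vs_plan:+.0%} vs plan, {vs_prior:+.0%} vs prior year, "
          f"{gap:.0%} short of peer potential")

Unit A beats both its plan and its prior year, yet the only comparison that shows where effort will pay off is the three-point gap to what peers routinely achieve.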
Principle 3: Assign Responsibility for Controllable Aspects
Clearly, the effectiveness of the processes for assigning responsibilities, setting goals, and evaluating performance, which ensure that people focus on the right issues and strive to achieve the right targets, depends on application of the first two principles. Too many companies fall short of achieving their full performance potential because of self-created decision-making blinders. These blinders result from the use of performance models that are incomplete or that focus too much on financial outcomes at the expense of the factors about which individuals make decisions. They also result from the use of goals and compensation programs that allow individuals to be judged successful not only when their performance falls short of potential, but even when their individual performance undermines organizational success.
Decision makers throughout an organization must not only be able to answer the question, "Where are our best opportunities to improve performance?" They must also be able to answer the question, "Who is responsible for capturing each of these opportunities?"
Principle 4: Develop a Common Performance Language
Develop a common language of performance that relates operational performance and individual responsibilities to corporate performance objectives. This is essential to ensure organizational alignment and cohesive decision making.
Few companies have achieved the common language that creates bridges between constituencies with very different needs. We see this shortcoming in several types of situations in which organizational constituencies pursue objectives that appear to conflict. When such differences cannot be bridged, decision making suffers, often producing organizational turmoil and fragmented action.
Well-aligned organizations need a performance language that ensures a shared understanding of the interrelationships between the myriad disciplines that, in combination, create organizational success. This language must relate the operational factors about which managers make decisions to corporate performance objectives, for all operational metrics and for all units and people measured.
Case Study
Consider one firm's experience. A regional president of retail banking for a top 10 bank was preparing for the annual planning and goal-setting process. Recognizing that the bank's approach for setting his goals would require his region to increase earnings by more than 10% for the coming year, this executive was eager to find new ways to achieve his target. Although he and his team had built a strong sales culture and had an impressive growth record, he was concerned that his team was pushing the limits of aggressive selling. He was determined to take a step back from his traditional approach, "sell more and do it more efficiently," to see whether other actions might produce the value he needed.
To this end, he asked his team to identify the region's best performance improvement opportunities. The team's job was diagnostic in nature: to examine the ways in which the branch network worked to accomplish the region's performance objectives, and to determine how "smarter management" might complement the bank's traditional "brute force" approach to earnings improvement. The team had several tools at its command for completing this assessment, including an extensive database of operational information about customers and channel performance, plus well-established methodologies for measuring profitability.
Approach
Following Principle 1, the team drew on the work of Grant and Schlesinger (1995) and elected to use a model of performance with three major dimensions: acquiring more customers, earning more from the customers, and keeping customers longer (see Figure 2). The team determined that this model was substantially complete because all aspects of financial performance could be related to one of the dimensions in this model. They selected one metric to represent each of the performance dimensions and used these three metrics to evaluate the performance of managers at all levels in the region's hierarchy.
Having developed this complete, albeit high-level, model of performance for each branch, the team proceeded to apply the second principle: estimating the potential performance for each of the three metrics for each branch. They decided to estimate potential by looking within, using the performance of stronger branches to represent an achievable level of performance for other branches.
Figure 2: Sources of Value
After using a sophisticated statistical routine to identify clusters of like branches, they confronted the challenge of selecting specific benchmarks. What should they use to represent potential? Should it be set to be the performance level of the best branch in each cluster? Or would the 75th percentile or even the average be more realistic? The president and his management team determined that their best initial approach for estimating potential should be to use the average performance for each metric within each cluster. Then they would look for the branches whose performance fell short of average and judge that these branches should be able to strengthen their performance up to at least the average levels of other branches in their clusters. They decided that branches that were performing better than average should, initially, be assumed to be achieving close to their reasonable levels of potential.
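The benchmarking rule the team settled on is simple enough to sketch. In the hypothetical Python below, potential for each branch is the average of its cluster, with above-average branches assumed to be at potential; the branch names, clusters, and retention rates are invented for illustration.

from statistics import mean

# branch -> (cluster, customer retention rate); all values invented
branches = {
    "Elm St":  ("affluent urban", 0.91),
    "Oak Ave": ("affluent urban", 0.96),
    "Main St": ("suburban",       0.88),
    "Mill Rd": ("suburban",       0.90),
}

by_cluster = {}
for name, (cluster, retention) in branches.items():
    by_cluster.setdefault(cluster, []).append(retention)
cluster_avg = {c: mean(vals) for c, vals in by_cluster.items()}

for name, (cluster, retention) in branches.items():
    # Above-average branches are initially assumed to be near potential.
    potential = max(retention, cluster_avg[cluster])
    gap = potential - retention
    print(f"{name}: actual {retention:.0%}, potential {potential:.0%}, gap {gap:.1%}")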
From their perspective, this approach made sense for two reasons. First, it estimated a level of potential that was demonstrably achievable, given the bank's current business practices. The team felt it would be very difficult for branch managers to argue that they should not be able to hit a level of performance that about half the branches in their clusters were already exceeding. The regional president would depend on this to keep the managers focused on how to achieve their potential, rather than arguing over methodology. Second, the explicit estimation of potential would lead to extremely valuable, constructive discussions among different constituencies about performance metrics, branch characteristics, and best practices. The executive felt that over time these discussions would generate ongoing performance improvement and an evolving understanding of achievable performance.
Following Principle 3, the team assessed how well each manager was achieving his or her branch's performance potential. Team members did this by comparing actual performance with potential performance for each metric, for each branch.
And finally, following Principle 4, the team translated differences between actual and potential operational performance into earnings, the basis for the president's performance goals. This translation into earnings created a common, opportunity-based language for discussing performance. The project team's results allowed them to identify where branches were "leaving the most money on the table" and to see which managerial disciplines offered the greatest leverage for improving the region's bottom line.
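Principle 4's translation step can be sketched the same way: multiply each operational gap by an assumed dollar value per point to express it in earnings. The per-point values and gaps below are invented; in practice they would come from the bank's profitability methodologies.

# Invented dollar value of closing one percentage point of each gap
value_per_point = {"retention": 40_000, "acquisition": 25_000}

branch_gaps = {  # gaps to potential, as fractions, from the prior step
    "Elm St":  {"retention": 0.025, "acquisition": 0.010},
    "Main St": {"retention": 0.010, "acquisition": 0.020},
}

for branch, gaps in branch_gaps.items():
    dollars = sum(value_per_point[m] * g * 100 for m, g in gaps.items())
    print(f"{branch}: ${dollars:,.0f} of earnings left on the table")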
Figure 3: Average Performance of Branch Clusters
Outcomes
By applying the four principles described, the team reached some very productive conclusions.
The region had some very good profit-improvement opportunities. The team concluded that "smarter management" could contribute at least 10% to 15% to the region's bottom line over the following year or two, allowing for some much-needed moderation and realignment of the bank's "sell more, and do it more efficiently" managerial style. The core of the "smarter management" would be to make sure that people focused on the most important issues (i.e., those that offered the greatest performance improvement leverage for individual branches and for the region) and to identify and share best practices among branches in the region.
Success would require that they remove self-created blinders that had misdirected attention and decision making. These blinders resulted primarily from performance evaluation processes that gave inadequate consideration to important differences between the market conditions in which business units and managers operated. The branch with the largest earnings improvement opportunity provides the best illustration of this blinder. This branch was one that the regional president had previously thought of as one of his best, a so-called "platinum performer." Over the previous few years the officers in this branch had been extraordinarily successful in beating their performance targets, and they had earned the highest possible incentive compensation. This branch's performance was stronger than at least 95% of the region's other branches, not only for the disciplines measured by the bank's incentive program, but also for the "opportunity metrics" that the team used when searching for untapped potential.
Yet, after applying the principles described above, the team came to realize that this branch's officers had, more than those in any other branch, "left money on the table." The team reached this conclusion by comparing the operational performance of this branch with that of similar branches ("affluent urban" branches). This group of branches, which served some of the bank's most affluent customers in growing markets, performed better than almost all other branches in the region (see Figure 3).
So all branches in this cluster looked very good in the region's across-the-board manager rankings. But among this group, the high-opportunity branch was the weakest by far (see Figure 4). Its customer retention rate was five percentage points behind that of its peers, and a much larger portion of its customers were unprofitable than was the case in other branches. Customers are unprofitable when they do not purchase products or services in volumes sufficient to cover the bank's costs of maintaining and servicing their accounts. The team recognized that, by comparing apples and oranges, regional managers had reached incorrect conclusions about performance and failed to note some situations with the potential for substantial performance improvement.
Success would require more thoughtful personnel management. The team concluded that there were too many cases in which successful managers, those who had captured most of the potential in their markets, had been judged to be underperformers. At the same time, too many weaker managers, who had "left lots of money on the table," had been given too much reward.
As illustrated in the top right cell in Figure 5, 4% of the region's managers were judged to be very weak, fourth-quartile performers, despite the fact that they were among the most successful at capturing the full potential in their markets. At the same time, another 4% of branch managers (bottom left cell) were judged to be top-tier performers, despite the fact that they were particularly weak in achieving their potential performance.
These situations occurred because the region's performance model was incomplete, focusing on sales but not market share or customer retention, and because goals did not reflect a good assessment of performance potential. Furthermore, the region had been weak in deploying people to those areas that could most benefit from their skills.
Figure 4: Performance of "Affluent Urban" Market Branches
Figure 5: Comparison of Branch Manager Performance Evaluations: Existing Rankings versus Opportunity-Based Rankings
Specifically, by failing to recognize each manager's success in achieving his or her potential in each of the disciplines represented by the customer-centric model, regional management had failed to identify and share best practices. Success would require better alignment of the regional president's objectives with those of his managers. As one team member described it, "Actions that produce good numbers for branches can too easily play havoc with regional earnings." Although team members had recognized this discontinuity, they had not understood its importance until they applied the principles described here.
In sum, after applying the four principles, the team concluded that the region had routinely "left money on the table" by failing to ensure that managers recognized and focused on the most important issues.
Conclusion
Companies that have adopted these principles have found opportunities to improve corporate performance by 5% to 10%, simply by ensuring that people focus on the right issues. Most have discovered these opportunities "hiding in plain sight," previously unrecognized because of blinders these companies had built into their goal-setting and performance evaluation processes. The four principles provide a systematic way of recognizing and removing such blinders.
Additional Information:
Reprinted with permission of the International Society for Performance Improvement. Copyright 2006, Volume 45, Number 2.
www.ispi.org
References
Drucker, P.F. (1999). Management challenges for the 21st century. New York: HarperBusiness.
Grant, A.W.H., & Schlesinger, L.A. (1995). Realize your customers’ full profit potential. Harvard Business Review, 73(5), 59-72.
Hope, J., & Fraser, R. (2003). Beyond budgeting: How managers can break free from the annual performance trap. Boston: Harvard Business School Press.
Merchant, K.A. (1990). The effects of financial controls on data manipulation and management myopia. Accounting, Organizations and Society, 15(4), 297-313.
Merchant, K.A., & Van der Stede, W. (2000). Ethical issues related to ‘results-oriented’ management control systems. Research on Accounting Ethics, 6, 153-169.
Stolovitch, H.D. (2000). Human performance technology: Research and theory to practice. Performance Improvement, 39(4), 7-16.
Van Tiem, D.M., Moseley, J.L., & Dessinger, J.C. (2004). Fundamentals of performance technology: A guide to improving people, process, and performance (2nd ed.). Silver Spring, MD: International Society for Performance Improvement.
Welch, J. (2005). Winning. New York: HarperBusiness.
David S. Crandon is a management consultant who has developed a methodology for helping clients build organizations that are better focused on identifying and capturing their best performance improvement opportunities. With this methodology, David has helped clients identify opportunities for substantial earnings improvement (5% to 15% through tactical action, and 10% and more through strategic action). David started his consulting career with McKinsey & Company and later cofounded the Treasury Services Corporation, a consulting and software business that delivered information technology solutions to address a broad range of strategic, financial, and marketing business needs. David may be reached at dcrandon@opportunitybrowser.com.
Kenneth A. Merchant, PhD, holds the Deloitte & Touche LLP Chair of Accountancy at the University of Southern California (USC). At USC he has served as both Senior Associate Dean for Corporate Programs, Marshall School of Business, and as Dean of Leventhal School of Accounting. He has also held faculty positions at Harvard Business School and the University of Maastricht (The Netherlands). Professor Merchant is the author of eight books, including Management Control Systems: Performance Measurement, Evaluation and Incentives (with W. Van der Stede) and numerous journal articles. He received his PhD from the University of California-Berkeley. He may be reached at kmerchant@marshall.usc.edu.
Make the most of your BI investment by centralizing expert resources
After investing in business intelligence (BI) solutions, wouldn't you do everything possible to ensure that your commitment to BI gets results? From strategic concerns like developing and updating your long-term BI vision to tactical issues like keeping up with software enhancements and providing user training, many organizations are developing BI Competency Centers to be the brain of their BI efforts.
"If structured effectively, a BI Competency Center can ensure that all the parts of an organization are moving forward together in a healthy, efficient way," explains SAS International professional services programs manager Dagmar Bräutigam.
While many of today's software vendors tout services, technologies, and methodologies for competency centers, most are narrow in focus. A BI Competency Center is not just a replacement for existing help desks that address product issues and periodically help acquire new software. It has to be about ensuring that the organization leverages its BI investment by helping users apply BI technology in decision making.
"While no single strategy fits all companies, SAS takes a much more holistic approach to competency centers. We believe they must address not only the tactics of how BI is maintained and deployed throughout an organization, but even more importantly, they must promote a long-term, sustainable information delivery strategy and ensure the technology is used throughout the organization to support the business," says Bräutigam.
Gartner Research Vice President Frank Buytendijk adds, "Organizations that are successful with BI realize that information is a critical resource and have created a culture in which information is shared and leveraged. Only then can there be true performance accountability."
So exactly what does a BI Competency Center do? While there is no one-size-fits-all solution, a BI Competency Center provides a central location for driving and supporting a company's overall information delivery strategy. It coordinates and complements an organization's existing efforts, while reducing redundancy and increasing effectiveness. This centralization ensures that information and best practices are shared through the entire organization so that everyone can benefit from successes and lessons learned.
Other functions that a BI Competency Center may address include:
BI Strategy: Make Sure You're Seeing the Forest
The most valuable BI Competency Center implementations are designed to balance the tactical needs of today with a strategy and vision for the future. The BI Competency Center, therefore, must define the BI strategy if one is not yet defined. Common questions to be answered include:
Staying competitive and meeting the pressure of regulatory compliance requirements demand that information sources and business and analytic intelligence be accessible to a growing number of information consumers across the enterprise and beyond. So investing in an information infrastructure is well worth the effort.
A leading insurance company in South Africa decided to implement an enterprisewide management information system. In the process, the company realized it would need a structure that would provide a sustainable environment for driving the use of business intelligence; to do so, it chose SAS as the end-to-end provider. To deliver BI information to approximately 700 users, it decided the best approach was to develop a BI Competency Center, in collaboration with SAS and consulting partner KPMG. "The collaboration between SAS and KPMG produced a complete solution that addressed people, process, infrastructure and culture and provided the foundation for the establishment of a BI Competency Center that ensured ongoing value to the customer," says Wayne Brider, manager of advanced technologies with EOH KPMG Consulting.
Dr. Stefanie Gerlach, senior program consultant for professional services programs at SAS, emphasizes: "It is vital that the competency center has executive sponsorship. The aim must be to align business intelligence goals across various functional areas, in support of the company strategy."
Ongoing Support Ensures Success
The development and deployment of the BI Competency Center is just the beginning. The ongoing services that the BI Competency Center offers users, as well as the services it receives from technology vendors, help create a well-rounded culture of, and reputation for, innovation.
Aiman Zeid, SAS Americas principal BI consultant, adds: "SAS is launching a global initiative to advocate the importance of a BI Competency Center. Our goal is to provide our customers with the technology, services and resources to develop and own their BI Competency Center. SAS customers will soon have access to a comprehensive framework of services provided by SAS Consulting and the SAS partner community. This collective effort combines SAS' leading BI technology and services with the expertise of SAS partners."
SAS also provides a full range of technical support, training, documentation and user group services that can help complete the BI Competency Center as a one-stop shop for business intelligence.
Additional Information:
Kelly LeVoyer is editorial director of sascom magazine.
SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. Copyright © 2005 SAS Institute Inc. Cary, NC, USA. All rights reserved.