
CHAPTER 2

Business Intelligence, Analysis and Reporting, Data Management

Business Intelligence
Business intelligence (BI) uses knowledge management, data warehousing, data mining and business analysis to identify, track and improve key processes and data, as well as to identify and monitor trends in corporate, competitor and market performance.

Analysis and Reporting
Business intelligence reporting and monitoring includes ad hoc and standardized reports, dashboards, triggers and alerts. Business analytics include trend analysis, predictive forecasting, pattern analysis, optimization, guided decision-making and experiment design.

Data Management
Data management ensures data integrity and availability through methodologies such as data warehousing, cleansing, profiling, stewardship, modeling and definition. Effective business decisions rely on data accuracy and reliability.

Knowledge Management
Knowledge management methodologies record and disseminate both explicit and tacit process and performance strategies and actions to identify best practices and innovative techniques and ideas.

Getting CRM right means integrating processes both within and across business functions to drive more effective customer interactions and unlock greater customer value. More mature areas such as campaign management, sales force automation, contact centers and e-commerce are adding advanced capabilities through analytics, business process management and knowledge management tools. Newer areas such as field service, marketing resource management, and sales asset management are broadening departmental capabilities and enabling CRM to reach new heights. Customer data integration (CDI), customer interaction hubs and customer experience management make the relationship visible and customer interactions cohesive throughout the organization. Customer value analysis and customer data mining enable more insightful customer interactions within the context of each interaction.


From Bettermanagement.com   

A six-part series of data management articles sponsored by DataFlux. As a leader in the data management market, DataFlux helps you turn data into a strategic information asset. Article 1 | Article 2 | Article 3 | Article 4 | Article 5 | Article 6

 

 

[1]

The Challenges of Data Management
By Robert Lerner
 

Poor data quality costs U.S. companies billions of dollars each year. A frequently quoted study published by The Data Warehousing Institute in 2002 put the total cost at more than $600 billion. This is a staggering amount of money, and it is likely even greater today, given the increase in business transacted in the U.S. since 2002 and the exponential growth in the amount of data produced every year (computers run non-stop, churning out terabytes of new data every day).

Taken at face value, however, that dollar figure conveys neither the kinds of data problems companies face each day nor a real sense of how those problems can impair a company's ability to function at peak levels. To change this situation and limit the losses due to poor data quality, companies need to be aware of the importance of their data and of the problems that can degrade its quality. They should also know that tools and strategies are available that can clean up their data and keep it clean and accurate in perpetuity.

In its most basic sense, high-quality data is essential to a company's ability to understand its customers. Customer data that is riddled with errors (incorrect addresses or other personal information, misspelled customer names, and so on), or that is inconsistent (lacking a single, standardized format), redundant (multiple records for the same customer), or outdated, will undermine that understanding. After all, how can a company understand a customer if it doesn't know where the customer lives or how to spell the customer's name? And if a company cannot understand its customers, it will have trouble serving them according to their needs, preferences, goals, and the like.

Equally important, companies will have limited success up-selling and cross-selling to customers without accurate, up-to-date customer information at their fingertips. They will have difficulty distinguishing high-value customers and segmenting customers for promotions and campaigns. Moreover, the absence of good-quality data increases the costs of acquiring and retaining customers. If a company has two records for a single customer, the cost of sending that customer a promotion doubles, while the duplicate mailing itself could irritate the customer and cost the company loyalty and goodwill.

However, customer data is only one part of the overall problem. Business data, sometimes called non-name-and-address data, is just as crucial to a company's health and success. Business data can be anything from an email address to a part number to a genome sequence. If a company doesn't have a correct email address for a customer, it may have trouble contacting that customer or directing a promotion to them if email is the customer's preferred method of contact. Or consider the case in which two digits of a part number are accidentally transposed. In a manufacturing setting, the transposition could delay the arrival of the correct part at the assembly line, which in turn could delay the assembly process. It could also distort the actual value of the company's inventories and, depending on the cost of the correct and incorrect parts, create variances in the company's books. And an incorrect genome sequence could negatively impact scientific research, drug discovery, or patient trials. (Please see The Challenges of Data Management, by Robert Lerner, Current Analysis. Our Server.)


 

[2]

Data Profiling: The Blueprint for Effective Data Management
By Robert Lerner

Let's suppose that we want to travel from Washington, DC, to La Jolla, California, to meet an old acquaintance. We decide to save some money and, instead of flying, we rent a car and begin driving. Of course, there is some time and cost involved in this venture, so we stuff our pockets with $20 (for tolls and miscellaneous expenses that might crop up) and take off.

Although we know our destination—and the approximate time our friend will be waiting for us—we don’t bother consulting a map. Why should we? Everyone knows that La Jolla is in California, and all we need to do is point the car in the appropriate direction and everything will come out right. We will meet our friend and save money at the same time.

Certainly, this is a ludicrous situation, and it is bound to fail or at least exceed our budget, especially if we turn down one dead end after another trying to wend our way to La Jolla. This is not to say that we won’t make it to La Jolla, only that there are quicker, cheaper, and more effective ways of traveling there.

Indeed, no sane traveler would embark on such a trip without a map, and yet data-driven projects of all kinds begin this very way. Organizations will decide on, say, a CRM application, and they will then go about implementing it without first consulting a roadmap of their data.

It is therefore not surprising that over half of all CRM implementations either fail or fail to live up to expectations, because many organizations attempt to implement applications without such a roadmap. To put it another way: Too many companies lack a complete and necessary understanding of their data. Without such an understanding, or data roadmap, organizations will spend more time, energy, and money than they should simply to achieve limited results from the application. As any IT manager knows, no enterprise application will ever deliver on its promises if the data populating it is of questionable quality.

To ensure the best results from any data-driven project, an organization should begin by making a thorough inspection of its data, noting all the problems or potential problems and assessing the time and effort needed to rectify these problems. While this can be done manually, a manual review process tends to be long, intensive, and costly. Furthermore, manual review is not only susceptible to human error, but it is also completely impractical for large organizations that have thousands, if not millions, of customer and product records. It is unnecessary as well, because of the data profiling technology that is now available.

A data profiling tool is designed to provide an organization with a thorough understanding (or roadmap) of its data. It can inspect the content and structure of the data and provide detailed information on its accuracy and completeness. It can also uncover areas in the data that are ambiguous and redundant. Ultimately, a data profiling tool provides information on whether the data is fit for the purpose for which it was—and is—intended.
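To make the idea concrete, here is a minimal sketch in Python of the kind of metrics a profiling tool reports; the records, field names, and metrics shown are invented for illustration, and a real profiling tool inspects far more (value ranges, referential integrity, business-rule violations, and so on).

from collections import Counter
import re

# Hypothetical customer records; in practice these would come from a database
# table or file, and a real profiling tool computes many more metrics.
records = [
    {"name": "Ann Smith", "phone": "202-555-0101",   "zip": "20001"},
    {"name": "ann smith", "phone": "(202) 555-0101", "zip": "20001"},
    {"name": "Bob Jones", "phone": "",               "zip": "2000"},
    {"name": "C. O'Neil", "phone": "303.555.0188",   "zip": "80202"},
]

def pattern(value):
    """Reduce a value to a character pattern, e.g. '202-555-0101' -> '999-999-9999'."""
    value = re.sub(r"[A-Za-z]", "A", value)
    return re.sub(r"[0-9]", "9", value)

for field in ("name", "phone", "zip"):
    values = [r[field] for r in records]
    filled = [v for v in values if v.strip()]
    completeness = len(filled) / len(values)          # how many values are populated
    distinct = len(set(filled))                       # how many unique values exist
    patterns = Counter(pattern(v) for v in filled)    # which formats actually occur
    print(f"{field}: completeness={completeness:.0%}, distinct={distinct}, "
          f"patterns={patterns.most_common(3)}")

Even this toy report surfaces the kinds of findings profiling delivers: a missing phone number, three different phone formats, and a ZIP code with too few digits.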

(Please see Data Profiling: The Blueprint for Effective Data Management. Our Server.)


 

[3]

The Data Quality Process

Organizations today have an ever-increasing amount of data and data sources at their fingertips. A large organization, for instance, will typically have numerous databases, data warehouses and data marts, as well as a variety of enterprise applications such as CRM, ERP, SCM, etc. It will also have a massive amount of unstructured data and a range of third-party data sources. A small organization may have less data and fewer data sources, but this is only a matter of degree, for it will also have a smaller staff to manage the data and its sources.

Organizations depend on information to be competitive in the market and to function smoothly and effectively. The real concern is that much of the information has errors of some sort in it (incorrect values, missing values, inconsistent values, etc.), and all too frequently the data sources—the applications, databases, etc.—are incompatible because each has its own data format and business rules. Such problems inhibit an organization's ability to leverage its data to its fullest, which ultimately impacts the quality of decisions based on the data.

Now, it is certainly possible for an organization to assess its data and to address manually whatever problems it discovers. For most organizations, however, this is not an efficient or cost-effective way of handling data quality issues. It is also unlikely that any manual effort, even if it could be completed in one's lifetime, would achieve the same results as a solid set of data quality tools.

The only practical, effective method of rectifying data problems is through the use of next-generation data quality tools and processes, which can do more than correct data errors and render disconnected information meaningful. Such data quality tools can also keep information clean and consistent on an ongoing basis. For this, an organization should consider a data management solution that includes a tightly integrated data quality tool set and processes that provide

  • cleansing,
  • parsing,
  • standardization,
  • matching and linking,
  • and householding.

The data quality process begins with a data profiling tool, which analyzes and assesses the quality of the organization's data. The results of the profiling step pinpoint the specific areas and problems that need to be corrected. Ideally, the data profiling tool should be tightly integrated with the data quality tool set, as this makes the process of discovering errors and business rules—and sharing those rules with the data quality tools—seamless and efficient. After the data profiling process, the data quality process can proceed, beginning with parsing.
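As a rough illustration of these steps, the following hypothetical Python sketch parses a raw name field, standardizes addresses, matches and links likely duplicates, and households the surviving records; the records, rules, and matching threshold are invented, and production data quality tools apply far richer logic.

from collections import defaultdict
from difflib import SequenceMatcher

# Invented raw records; a real data quality tool set applies far richer rules.
raw = [
    {"name": "SMITH, ANN",   "address": "12 Main Street, Apt 4"},
    {"name": "Ann Smith",    "address": "12 Main St Apt 4"},
    {"name": "John Smith",   "address": "12 main st, apt 4"},
    {"name": "Robert Jones", "address": "98 Oak Avenue"},
]

def parse_name(name):
    """Parsing: split a raw name field into first and last components."""
    if "," in name:
        last, first = [part.strip() for part in name.split(",", 1)]
    else:
        first, _, last = name.partition(" ")
    return first.title(), last.title()

def standardize_address(addr):
    """Standardization: one consistent casing and abbreviation scheme."""
    addr = addr.upper().replace(",", "")
    for long_form, short_form in (("STREET", "ST"), ("AVENUE", "AVE")):
        addr = addr.replace(long_form, short_form)
    return " ".join(addr.split())

cleaned = []
for rec in raw:
    first, last = parse_name(rec["name"])
    cleaned.append({"first": first, "last": last,
                    "address": standardize_address(rec["address"])})

def is_match(a, b):
    """Matching/linking: same standardized address and very similar names."""
    name_a, name_b = f"{a['first']} {a['last']}", f"{b['first']} {b['last']}"
    return (a["address"] == b["address"]
            and SequenceMatcher(None, name_a, name_b).ratio() >= 0.8)

unique = []                      # keep one record from each matched group
for rec in cleaned:
    if not any(is_match(rec, kept) for kept in unique):
        unique.append(rec)

households = defaultdict(list)   # householding: group survivors by address
for rec in unique:
    households[rec["address"]].append(f"{rec['first']} {rec['last']}")

for address, members in households.items():
    print(address, "->", members)

In this toy run, the two variants of Ann Smith are linked into one record, while John Smith at the same address survives as a separate member of the same household.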

(Please see The Data Quality Process, by Robert Lerner, Current Analysis. Our Server.)


 

[4]

Enhancing the Value of Data Through Integration and Enrichment
By Robert Lerner
 

To be successful, businesses require accurate, consistent, and timely information to make sound, productive decisions. Regardless of whether the data concerns customers, products, suppliers, employees, or whatever, information must be readily available to every person in the organization who needs it, even if they are located in different departments, divisions, subsidiaries, or even geographical regions.

Unfortunately, not every organization has easy access to information. In fact, even organizations with relatively accurate data can have data silos—data (in applications, departments, etc.) that is not shared with the rest of the organization.

Consider the case in which an organization’s call center application does not share data with its CRM application, perhaps because the technologies are from different vendors (and the data has different formats) or because the applications reside in different business units. Since the applications don’t share data, any new data (about customer interactions, updated customer information, new customers, etc.) arriving through the call center is unlikely to be available to the CRM application, and vice versa. Thus the overall value of the applications to the organization is diminished, since each application needs to have complete, accurate, and timely information about the customer – not bits and pieces – to fulfill its promise. As a result, the organization’s ability to understand and support its customers will suffer, while meaningful reporting and analysis across the applications will be difficult at best.

Multiply this situation across the organization and it is easy to see how silos of information can hurt an organization’s ability not only to know its customers but also to have real insight into its business.

However, there is a solution to this problem: data integration. To get the most from its data—and to ensure that it has the best foundation on which to make business decisions—an organization must integrate its data. In fact, data integration is the critical next step in the data management process, following data cleansing and standardization (if poor-quality data is integrated into an application, database, or other system, the value and effectiveness of that system will be undermined).

Of course, linking, matching, and standardization can be part of the data integration process, but they are not the entire process. In most instances, an organization will require a data integration tool to integrate data throughout the organization or into a data warehouse, application, or a repository such as a customer data master file.

Essentially, the data integration process of data management entails

  • pulling data from its sources,
  • integrating it (consolidating records, eliminating duplicates, etc.),
  • and then either pushing it back to the source systems or delivering it to some other target system.

Once the process is complete, the entire organization can operate on essentially the same data.
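A minimal sketch of that pull, integrate, and deliver flow might look like the following; the source systems, field names, and merge rule are hypothetical, and real data integration tools add transformation, survivorship, and delivery capabilities well beyond this.

# Hypothetical source systems: a call-center extract and a CRM extract.
call_center = [
    {"customer_id": "CC-17", "email": "ann@example.com",  "phone": "202-555-0101"},
    {"customer_id": "CC-42", "email": "bob@example.com",  "phone": ""},
]
crm = [
    {"customer_id": "CRM-9", "email": "ann@example.com",  "phone": "", "segment": "gold"},
    {"customer_id": "CRM-3", "email": "cara@example.com", "phone": "415-555-0133", "segment": "silver"},
]

def pull(*sources):
    """Pull: gather records from every source system."""
    for source in sources:
        yield from source

def integrate(records, key="email"):
    """Integrate: consolidate records that share a key, keeping the first
    non-empty value seen for each field (a very simple survivorship rule)."""
    consolidated = {}
    for rec in records:
        golden = consolidated.setdefault(rec[key], {})
        for field, value in rec.items():
            if value and not golden.get(field):
                golden[field] = value
    return list(consolidated.values())

def deliver(records, target):
    """Deliver: push the consolidated records to a target system (here, a list
    standing in for a data warehouse or the subscribing applications)."""
    target.extend(records)

warehouse = []
deliver(integrate(pull(call_center, crm)), warehouse)
for row in warehouse:
    print(row)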

(Please see Enhancing the Value of Data Through Integration and Enrichment. Our Server.)


 

[5]

Keeping on Top of Data
By Robert Lerner
 

Suppose that we have finally eliminated most, if not all, of our organization’s data problems. We began the process of eliminating our data problems by profiling our data, after which we cleansed it, integrated it, and finally enriched it. Now, imagine we completed this task at 5:00 on Friday afternoon. With nothing else to do, we turn out the lights, lock the doors, and prepare to enjoy our weekend, confident that no one will touch the data until Monday morning.

On Monday morning, we are the first ones in the office. Rested, we go about our work as we normally do, but we quickly discover that some of our applications are not delivering the results that we had anticipated. This is surprising, since there is no particular reason why the data – and by extension, the organization – shouldn’t be operating optimally. Soon, we discover the cause of our problems – our once pristine data now has errors in it. Without even touching it, the data has declined in both quality and usefulness, and it is now negatively impacting our organization.

Of course, this is an absurd tale, but it does highlight one of the central truths about data: data, even if it's left untouched, changes. The validity of data is always temporary, and change is as inevitable as the sun rising on Monday morning. Data changes, or decays, because people and things change. Over the course of this hypothetical weekend, any number of customers will have changed some aspect of their personal information (addresses, phone numbers, etc.); some will have changed some aspect of their household (married, divorced, added children), and others may have died or simply severed their relationship with the organization. Left unchecked, data quality falls back toward the levels that led the organization to implement a data management solution in the first place.

Consider the following statistics compiled by Dun & Bradstreet:

On a typical morning between 9:00 and 11:00:

  • 706 businesses will move
  • 578 businesses will change their phone numbers
  • 60 businesses will change their names
  • 40 businesses will shut down
  • 10 businesses will file bankruptcy
  • 1 business will change ownership

 

Without any intervention, and through no fault of any individual, the quality of an organization's data will begin to decline almost the instant it has been cleansed, integrated and enriched. But the problems impacting the quality of an organization's data do not originate only outside the organization. Consider the host of data problems that could crop up throughout the rest of the week, such as input errors, the integration of incompatible third-party data, and the repurposing of existing data. By Friday, our data is rife with problems, and its usefulness is being questioned.

(Please see Keeping on Top of Data. Our Server.)


 

[6]

How to Choose a Data Management Solution
By Robert Lerner
 

In the previous articles of this series, we discussed data management technology. With these articles in mind, we can now consider how to buy a data management solution, or what to look for when considering a data management solution.

The following is a discussion about some of the features and functions that an organization should consider when making a buying decision. However, we are assuming that the organization has already come to some understanding regarding the depth of its data problems and ultimately its goal in implementing a data management solution. We’re also assuming that medium- to large-sized organizations would most likely undertake this strategy, since small organizations may lack the resources to accomplish these goals effectively.

Ultimately, when choosing a data management solution, an organization should consider a range of issues that can lead to more useful, actionable data. These issues include data support, technology, international capabilities, methodology, architecture, platform, implementation, usability, delivery, vendor selection, and finally cost.

Data Support
Traditionally, data quality and data integration strategies have focused on customer data. As a result, many solutions provide only cursory data quality capabilities outside the realm of customer contact information.

Organizations should consider a data management solution that can address all data, not just names and addresses. Most organizations have a range of data besides names and addresses (products, inventory, suppliers, finance, etc.), and this data needs the same support that name-and-address data does. Organizations should therefore consider a solution that can handle any data, whether this data is product codes or statistical data, commodity codes (e.g., UNSPSC and eCl@ss) or gene sequences, or even proprietary data.
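One way to picture support for "any data" is a rule-driven validator whose checks are configured per data domain rather than hard-coded for names and addresses. The sketch below is a hypothetical Python illustration; the rules, codes, and records are invented, and commercial tools ship with reference data for standards such as UNSPSC.

import re

# Hypothetical validation rules for different data domains. The point is that
# the same engine can check part numbers, commodity codes, or gene fragments;
# only the configured rules change.
rules = {
    "part_number":  lambda v: bool(re.fullmatch(r"[A-Z]{2}-\d{5}", v)),
    "unspsc_code":  lambda v: bool(re.fullmatch(r"\d{8}", v)),
    "dna_fragment": lambda v: bool(v) and set(v) <= set("ACGT"),
}

records = [
    {"part_number": "AB-12345", "unspsc_code": "43211503", "dna_fragment": "ACGTTAGC"},
    {"part_number": "AB-1234",  "unspsc_code": "4321150",  "dna_fragment": "ACGTXAGC"},
]

for i, rec in enumerate(records, start=1):
    problems = [field for field, check in rules.items() if not check(rec[field])]
    print(f"record {i}:", "OK" if not problems else "failed: " + ", ".join(problems))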

(Please see How to Choose a Data Management Solution. Our Server.)


 

[7]

The Challenges of Customer Data Integration
 
One of the major challenges facing companies today is developing a single, consistent, and complete view of their customers across all applications, databases, and customer touch points.  The challenge is significant, because developing such a complete view of customers is not an easy task, but it is crucial to a company's success and to its ability to comply with any number of state, federal, and international government regulations.

Certainly, most companies have an intuitive grasp of the significance of obtaining such a view of their customers. But many companies fail to understand its overall importance, and far too many underestimate the difficulty of achieving it.
 
Indeed, it is surprising that not every company appreciates the importance of a single, consistent, and complete view of the customer.  (In fact, among the companies that do understand the importance, not every one of them recognizes that they lack such a view today.)  Without a single, consistent, and complete view of its customers, a company cannot begin to understand its customers - and it cannot begin to serve its customers well.

Consider, for example, a company that has duplicate customer records.  In all likelihood, some customer interactions with the company will be associated with some records and not with others.  The duplicate records complicate the effort to track customer actions such as customer buying habits as well as customer interactions with various customer touch points (e.g., the Web, call center, etc.).  The company will also have difficulty determining the total value of the customer.  As a result, the company's ability to serve and support the customer may be limited. 

In fact, the company's ability to retain the customer could be jeopardized. Customers are easily put off by customer support representatives who do not have a complete history of their interactions with the company. Moreover, many customers have stopped doing business with a company after being bombarded with marketing messages that the company intended for different individuals but that were, because of duplicate customer records, all directed to the same person. (Please see The Challenges of Customer Data Integration, by Robert Lerner, Current Analysis. Our Server.)

 


 

[8]

Emerging Issues: Master Data Management and Data Quality
By Robert Lerner
 

Master data management (MDM) is getting a lot of attention these days as people begin to understand just how fragmented and dispersed their data sources are. Essentially, MDM combines technology and services to manage master data and to build accurate, consistent, and timely information from across the organization. The idea of managing master data isn't new; organizations have been struggling with master data issues for years. In fact, it was once thought that ERP and similar solutions would provide a unified view of an organization's master data, but of course this hasn't happened.

Master data is best defined as mission-critical data, such as customer data, product data, bills of materials and so forth. It is not metadata or transactional data. Master data helps to classify transactional data, and it changes only when the organization changes (e.g., with the introduction of a new supplier or product line), whereas transactional data is generated by events such as a sale.

Many organizations continue to have trouble leveraging their master data to the fullest, because it is not consistent across the organization or because it exists in data silos. In large organizations, it is not uncommon to find multiple instances of ERP systems, and when the product data in one ERP system is not shared with other ERP systems (or with any system that should have access to it), it becomes difficult for the organization to understand and track the value of that data. Individual business units also have trouble accessing such data (if they can access it at all), even though it may be crucial to their success, while the organization itself doesn't really know which product data is correct—chances are that the product data is not identical across the various systems, since each system probably lacks information buried in one or more of the others.

But while these sorts of problems are not uncommon, the need to manage master data effectively is gaining a sense of urgency for a number of other reasons. The size and complexity of organizations are increasing, for example, while at the same time large organizations are becoming more global. All of this puts pressure on organizations to increase the number of systems or applications needed to run both the organization and its individual business units. Furthermore, mergers and acquisitions add to the problems (how do you integrate master data from disparate organizations?), as does regulatory compliance, which is forcing organizations to control their master data more effectively for reporting. As a result, organizations are experiencing difficulties understanding and properly valuing their customers, suppliers, and even partners; and they are facing problems controlling costs, executing effectively on business strategies, and complying effectively and cost-efficiently with Sarbanes-Oxley, Basel II, and other regulations.

MDM solutions, however, offer some real promise to these and other master data problems. In a sense, an MDM solution is not unlike a CDI (Customer Data Integration) solution, in that a CDI solution is designed to provide an accurate, timely, consistent enterprise-wide view of customer data. Indeed, CDI is a subset of MDM. Like a CDI solution, an MDM solution can leverage a central master data hub that serves as the “single version of the truth” for all of an organization’s master data. The hub manages the master data, integrating new data or updates, while synchronizing all new and updated data to an organization’s appropriate applications, systems, etc.
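As a rough sketch of the hub pattern just described, the hypothetical Python example below keeps a single master copy of each record and synchronizes every accepted update to the subscribing systems; the class and record shapes are invented, and a production MDM hub adds matching, survivorship, history, and governance.

class MasterDataHub:
    """A toy 'single version of the truth': one master copy of each record,
    with every accepted change synchronized to subscribing systems."""

    def __init__(self):
        self.master = {}        # key -> current master record
        self.subscribers = []   # callables notified on every change

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def upsert(self, key, updates):
        record = self.master.setdefault(key, {})
        record.update(updates)                  # fold the update into the master copy
        for notify in self.subscribers:         # synchronize the change outward
            notify(key, dict(record))

# Stand-ins for an ERP system and a CRM system that consume master data.
erp_view, crm_view = {}, {}
hub = MasterDataHub()
hub.subscribe(lambda key, rec: erp_view.update({key: rec}))
hub.subscribe(lambda key, rec: crm_view.update({key: rec}))

hub.upsert("PROD-100", {"description": "Widget, 10 mm", "supplier": "Acme"})
hub.upsert("PROD-100", {"supplier": "Globex"})   # a change arriving from one system

print(erp_view["PROD-100"] == crm_view["PROD-100"])   # True: both see the same master record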

(Please see Emerging Issues: Master Data Management and Data Quality, by Robert Lerner, Current Analysis. Our Server.)


 

[9]

A CDI Solution for the Rest of Us
By Robert Lerner
 
Organizations are looking for ways to unify and better understand their customer data—that is, they are looking for a solution that will help them achieve a single, consistent, and accurate view of their customer data across all of their applications and throughout the enterprise. This view is often called a 360-degree view of the customer.

Unfortunately, the technologies—and the methodologies—that many organizations have leveraged to achieve this view have offered only modest success. ETL, EAI, and EII tools have their uses, but they have proven inadequate for providing a true 360-degree view of a customer. An ETL tool, for example, can effectively load customer data into a repository, but it does little to ensure that the data being loaded is clean, accurate, and unique (i.e., without duplicate records for the same customer). It also lacks the ability to share consistent, accurate customer data with an organization's applications or to guarantee that the data in the repository or in those applications remains consistent and accurate. Of course, a small organization can standardize on a single vendor's applications, which would at least increase the consistency of its customer records, but this is impractical for enterprises that depend on multiple applications and data sources, and it doesn't ensure that the data remains accurate on an ongoing basis.

What an organization needs, therefore, is technology that is specifically designed to create the 360-degree view of the customer—technology that can ensure that customer data is clean, accurate, and unique, and that can share this data with all of the organization's applications, databases, data warehouses, and the like. To achieve a true 360-degree view of a customer, an organization requires a customer data integration (CDI) solution. By CDI, I mean the technology and services that enable a company to create a single, consistent, and complete view of a customer across all of its applications, databases, and other customer data sources.
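To illustrate what a 360-degree view looks like once the underlying records have been cleansed and matched, here is a hypothetical Python sketch that assembles one unified entry per customer from several touch points; the touch points, keys, and records are invented, and a real CDI solution would perform the cleansing and matching itself.

from collections import defaultdict

# Hypothetical extracts from three customer touch points. In this sketch the
# records already share a matched customer key; a real CDI solution would
# cleanse and match them first.
web_orders   = [{"cust": "C-001", "channel": "web",         "event": "order #8841"}]
call_center  = [{"cust": "C-001", "channel": "call center", "event": "billing question"},
                {"cust": "C-002", "channel": "call center", "event": "complaint"}]
store_visits = [{"cust": "C-001", "channel": "store",       "event": "in-store return"}]

profiles = {"C-001": {"name": "Ann Smith"}, "C-002": {"name": "Bob Jones"}}

# Assemble the 360-degree view: one entry per customer with every interaction attached.
view = defaultdict(lambda: {"profile": None, "interactions": []})
for source in (web_orders, call_center, store_visits):
    for rec in source:
        entry = view[rec["cust"]]
        entry["profile"] = profiles.get(rec["cust"])
        entry["interactions"].append((rec["channel"], rec["event"]))

for cust, entry in view.items():
    print(cust, entry["profile"]["name"], entry["interactions"])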

CDI is not a new concept, but it can be a slightly confusing one, given the diversity of CDI solutions and approaches that are currently available. Unfortunately, not all of these approaches are effective and few are suitable for most organizations. What I will offer below is one possible CDI solution. However, of the CDI solutions available today, this is one of the most effective and one that is applicable for almost any organization.

(Please see A CDI Solution for the Rest of Us.)


 

[10]

A Real-World CDI Implementation
By Robert Lerner
 
The first article in this series explained how a customer data integration (CDI) solution could provide an organization with greater insight into its customers and the overall importance of this insight.  The second article highlighted the features of a particular CDI solution; it also positioned this solution as perhaps the most effective CDI solution for the broadest range of organizations, regardless of size and industry.  The final article in this series provides a look at a successful implementation of this CDI solution. This case study demonstrates how the solution not only delivered the promised benefits but also offered a number of additional rewards. 

The company (hereafter referred to as the "Company") that implemented this CDI solution owns and operates upscale ski villages, golf resorts and beach resorts. Each year millions of people flock to its properties.  The competition for customers in this industry is strong, so the Company is constantly looking for ways to enhance the experience that it provides its customers. In addition, the Company began to look for  new and innovative services to offer to a select group of high-profile customers. 

Prior to implementing a CDI solution, the Company relied on CRM (Customer Relationship Management) and related technologies to enhance customer service levels.  However, while its CRM applications improved its ability to understand some of its customers, the applications never truly lived up to their promises. First, the Company had some underlying data problems that CRM is not designed to resolve. Second, each property of the Company had its own CRM application, and few of these applications were compatible with the other CRM applications in other properties. 

Compounding the Company's problems, each of the properties had a variety of applications or systems that captured customer information and transactions for specific purposes (e.g., food service, lodging, green fees) but did not (or could not) share this information with the other applications and properties of the Company.  The upshot is that the Company could not get a complete, or 360-degree, view of all of its customers, and this dramatically hindered the Company's ability to compete and to deliver new services that might attract certain customers. 

To address the problem of inconsistent enterprise customer data, the Company looked for a technological solution that could tackle each of these problems. The Company considered a number of technologies, including an application developed in-house, but it ultimately decided to implement a CDI solution, which would target the very problem it faced in understanding its customers. A variety of CDI implementations were considered before the Company decided on a solution similar to the one described in the second article: a solution that combined a customer data repository with a tightly integrated data quality solution, built on a service-oriented architecture (SOA).

The solution delivered the promised results. Once implemented, the customer data repository became the central source for the Company's customer reference data. The repository consolidated all the customer reference data from the Company's data sources and properties, after which it shared accurate, consistent, and timely views of this data with all of the Company's subscribing data sources, systems, etc. In essence, the repository eliminated the silos of customer data that had existed throughout the Company and throughout each property. Updates to the data gathered at one property—or new customer information gathered at a different property—became immediately available to every property throughout the Company. Because of the repository, the entire Company was on the same page concerning its customers.

(Please see A Real-World CDI Implementation.)




 


Copyright © 1997-2007 A & A Trading Enterprises. All rights reserved.