Quick guide to implementing Business Intelligence, Data Warehousing & BPM

Definitions and overview

Business Performance Management (BPM) establishes a framework for improving business performance by measuring key business characteristics, feeding those measures back into the decision-making process and using them to guide operations towards the organization's strategic goals. Other popular terms for this include: Enterprise PM (EPM), Corporate PM (CPM), Enterprise Information Systems (EIS), Decision Support Systems (DSS) and Management Information Systems (MIS).

BPM: A cycle of setting goals, monitoring performance and feeding the results back into new goals.

Business Intelligence (BI) can be defined as the set of tools that gives end users easy access to relevant information and the ability to analyze it to assist in decision making. More broadly, the “intelligence” is the insight derived from this analysis (e.g., trends and correlations).

BI: Tools to access and analyze data

Key Performance Indicators (KPIs) are strategically tailored enterprise measures used to monitor, predict and manage the performance of the organization. They form the basis of any BPM solution, and in an ideal world it should be possible to relate strategic KPIs to actual operational performance within the BI application.

KPIs provide a quick indication of the organization’s health and guide management to the operational areas that impact performance.
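Purely as an illustration of how a KPI links strategy to operational data, the sketch below computes a hypothetical on-time delivery KPI and compares it with a target; the data, column names and the 95% target are invented for the example, not drawn from any particular BPM product.

```python
# Hypothetical KPI: on-time delivery rate compared with a strategic target.
# Data, column names and the target value are invented for illustration.
import pandas as pd

orders = pd.DataFrame({
    "order_id":       [1001, 1002, 1003, 1004],
    "promised_date":  pd.to_datetime(["2024-01-10", "2024-01-11", "2024-01-12", "2024-01-15"]),
    "delivered_date": pd.to_datetime(["2024-01-09", "2024-01-13", "2024-01-12", "2024-01-14"]),
})

on_time_pct = (orders["delivered_date"] <= orders["promised_date"]).mean() * 100
target_pct = 95.0  # strategic target set by management

print(f"On-time delivery KPI: {on_time_pct:.1f}% (target {target_pct}%)")
print("Status:", "on track" if on_time_pct >= target_pct else "needs attention")
```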

In many companies, data analysis is complicated by the fact that data is fragmented across the business. This causes problems of duplication, inconsistent definitions, inaccuracy and wasted effort.

Data silos: Fragmented, departmental data stores, often aligned with specific business areas.

Data Warehousing (DWH) is often the first step towards BI. A data warehouse is a centralized pool of data structured to facilitate access and analysis.

DWH: A centralized / consolidated data store

The DWH is populated from various (heterogeneous) sources using an Extract, Transform & Load (ETL) or data integration tool. This update can be performed in regular periodic batches, as a one-off load, or even synchronized with the source data in (near) real time.

ETL: The process of extracting data from a source system, transforming (or validating) it, and loading it into a structured database.
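A minimal sketch of this ETL pattern, assuming a hypothetical sales.csv source file and a SQLite table as the structured target; the file, column names and validation rule are illustrative only, and a real deployment would normally use a dedicated ETL / data integration tool.

```python
# Minimal ETL sketch: extract from a CSV source, transform/validate, load into SQLite.
# File name, column names and the validation rule are hypothetical examples.
import csv
import sqlite3

def extract(path):
    # Extract: stream rows from the source file.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform/validate: standardize region codes and reject rows with a bad amount.
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # in practice, rejected rows would be logged for data quality follow-up
        yield (row["order_id"], row["region"].strip().upper(), amount)

def load(records, db_path="warehouse.db"):
    # Load: append validated records into the structured warehouse table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS fact_sales (order_id TEXT, region TEXT, amount REAL)")
    con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```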

A reporting (or BI) layer can then be used to analyze the consolidated data and create dashboards and custom reports. A modeling layer can be used to integrate budgets and forecasts.

As these solutions become more complex, the definitions of the systems and what they do become more important. This is known as metadata: the data that defines the actual data and its manipulation. Each part of the system has its own metadata defining what it does. Good management and use of metadata reduces development time, simplifies ongoing maintenance and provides users with information about the source of the data, increasing their confidence in and understanding of it.

Metadata: Data about data, describing how and where it is used, where it came from and what changes have been made to it.
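To make this concrete, here is one hypothetical way of recording metadata for a single warehouse column; the fields shown are illustrative rather than any standard metadata model.

```python
# Illustrative metadata record for one warehouse column (field names are hypothetical).
from dataclasses import dataclass

@dataclass
class ColumnMetadata:
    table: str            # where the data lives in the warehouse
    column: str
    source_system: str    # where it came from
    source_field: str
    transformation: str   # what changes have been made
    last_loaded: str      # when it was last refreshed

meta = ColumnMetadata(
    table="fact_sales",
    column="amount",
    source_system="orders_erp",
    source_field="ORD_AMT",
    transformation="converted to EUR; rows with invalid amounts rejected",
    last_loaded="2024-01-15T02:00:00",
)
print(meta)
```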

Commercial justifications

There is a clear commercial rationale for improving the quality of information used for decision making. A study by IDC found that the average payback period for a BI implementation was 1.6 years, that 54% of companies had a 5-year ROI of over 101%, and that 20% had an ROI of over 1,000%.

ROI of BI > 1,000% for 20% of organizations

There are now also regulatory requirements to consider. Sarbanes-Oxley requires U.S.-listed companies to disclose and monitor key risks and relevant performance indicators, both financial and non-financial, in their annual reports. A robust reporting infrastructure is important to achieve this.

SarbOx requires disclosure of financial and non-financial KPIs

Poor data quality is a common barrier to accurate reporting and informed decision making. A good data quality strategy that includes non-system issues, such as user training and procedures, can have a big impact. Consolidating data into a DWH helps ensure consistency and correct bad data, and it also provides an accurate measure of data quality so that it can be managed more proactively.

Data quality is essential, and a formal data quality strategy is important to continually manage and improve it.
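As a sketch of what such a data quality measure might look like in practice, the example below computes simple completeness and validity percentages on a hypothetical customer table; the columns and the email rule are assumptions made for the illustration.

```python
# Simple data quality measures on a hypothetical customer table:
# completeness (share of non-null values) and validity (values matching a rule).
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5],
    "email":       ["a@x.com", None, "c@x.com", "not-an-email", "e@x.com"],
    "country":     ["DK", "DE", None, "SE", "DK"],
})

completeness = customers.notna().mean() * 100                        # % non-null per column
valid_email = customers["email"].str.contains("@", na=False).mean() * 100

print("Completeness (%):")
print(completeness.round(1))
print(f"Valid email addresses: {valid_email:.1f}%")
```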

Recent research (PMP Research) asked a wide cross-section of organizations about their opinion on their data quality before and after a DWH implementation.

– “Don’t know” answers dropped from 17% to 7%

– “Bad” or “Very Bad” dropped from 40% to 9%

– Satisfactory (or better) increased from 43% to 84%

DWH deployments improve data quality.

Overview of Tools Market

At present, BI is considered a significant IT growth area, and as such everyone is trying to jump on the BI bandwagon:

ERP vendors offer BI solutions: SAP BW, Oracle Applications

CRM vendors do the same: Siebel Analytics

ETL vendors add BI capabilities: Informatica

BI vendors add ETL tools: Business Objects (BO) Data Integrator (DI), Cognos Decision Stream

Database vendors extend their BI & ETL tools:

Oracle: Oracle Warehouse Builder, EPM

Microsoft: SQL Server 2005 with Integration Services, Reporting Services and Analysis Services

Improved tools

As in all maturing markets, consolidation has taken place, with fewer suppliers now covering more functionality. This is good for customers, as greater standardization, better use of metadata and improved functionality are now readily available. Today's BI tools can satisfy even the most demanding customers' information requirements.

Thinking and tools have moved on: we can now build fast, business-focused solutions in small chunks, enabling companies to view their data, build knowledge, learn the opportunities of the new tools and refine their requirements throughout the project. Gone are the days of the massive data warehousing project that was obsolete before it was completed.

A typical DWH project should deliver useful results within 3 – 6 months.

Advice and best practices

Preliminary phase

A successful BI project is never finished; it should, wherever possible, evolve to meet the changing needs of the business. So the first 'wins' need to come quickly, and the tools and techniques need to be flexible, quick to develop and quick to implement.

Experience is crucial

We have often been brought in to rescue unsuccessful projects, and it is frightening how many fundamental mistakes are made through inexperience. A data warehouse is fundamentally different from your operational systems, and the correct design and infrastructure are essential to meeting business requirements.

Keep control in-house

We believe BI is too close to the business, and changes too quickly, to outsource. Expertise is required at the initial stages to ensure a solid infrastructure is in place (and that the best tools and methods are used). If there is insufficient experience internally, external resources may be useful at the initial stages, but this MUST include skills transfer to internal staff. The DWH can then grow and develop (with internal resources) to meet the changing needs of the business.

Ensure management and user buy-in

It may sound obvious, but internal knowledge and support are critical to the success of a DWH, yet 'reporting' is often a low priority and can easily be neglected unless it is supported at a high level within the business. It is common to find that knowledge of user requirements is limited. It is also true that requirements will change over time, both in response to changing business needs and as a result of the DWH implementation itself and the use of the new tools.

Strong project management

The complex and iterative nature of a data warehouse project requires strong project management. The relatively unquantifiable risk around data quality needs to be managed alongside changing user requirements. Plan for change and allow extra budget for the unexpected. Using rapid application development (RAD) techniques mitigates some of these risks by exposing them early in the project through prototyping.

End user education

Do not underestimate the importance of training when implementing a new BI / DWH solution. Trained users are 60% more successful in realizing the benefits of BI than untrained users. This training needs to cover specific data analysis techniques as well as how to use the BI tools. In the words of Gartner, it is “more critical to train users on how to analyze the data.” Gartner goes on to say that “… focusing only on BI tool training can triple the workload on the IT help desk and result in user disillusionment. A user trained on the BI tool who does not know how to use it in the context of his or her BI / DWH environment will not be able to get the analytical results he or she needs …”. Therefore, customized user training on your BI system and your data is very important.

Careful planning of training needs, and use of the various training media now available, can address this. Look for training options such as: structured classroom training (on or off site), web-based e-learning (CBT), on-the-job training and skills transfer, and customized training around your own solution and data.

Technical overview

Information portal: Allows users to manage and access reports and other information through the company's web portal. As users create and rely on more reports, the ability to easily find, manage and distribute them becomes more important.

Collaboration: The ability of the information portal to support communication between relevant people, centered on the information in the portal. This may take the form of discussion threads linked to reports or workflows around strategic goal achievement.

Guided analysis: The system guides users on where to look next during data analysis, taking knowledge out of people's heads and placing it in the BI system.

Security: Access to system functionality and data (both rows and columns) can be controlled down to the individual user level and based on the user's network logon.
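The sketch below illustrates the idea of row- and column-level security in application code; real BI tools enforce this declaratively against the network logon, and the user-to-entitlement mapping here is purely hypothetical.

```python
# Illustrative row- and column-level security filter (entitlements are hypothetical).
import pandas as pd

sales = pd.DataFrame({
    "region":  ["North", "South", "North", "West"],
    "revenue": [120, 95, 80, 60],
    "margin":  [30, 22, 18, 12],
})

# Entitlements keyed by network logon: which rows (regions) and columns a user may see.
ENTITLEMENTS = {
    "jsmith": {"regions": ["North"], "columns": ["region", "revenue"]},
    "cfo":    {"regions": ["North", "South", "West"], "columns": ["region", "revenue", "margin"]},
}

def secure_view(df, user):
    ent = ENTITLEMENTS[user]
    rows = df[df["region"].isin(ent["regions"])]   # row-level security
    return rows[ent["columns"]]                    # column-level security

print(secure_view(sales, "jsmith"))
```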

Dashboards and scorecards: Providing management with a high-level, graphical view of their business performance (KPIs), with easy drill-down to the underlying operational detail.

Ad-hoc reporting and data analysis: End users can easily extract data, analyze it (slice, dice & drill), present it formally in reports and distribute it.

Formatted / standard reports: Predefined, pixel-perfect, often complex reports created by IT. The power of end-user reporting tools and the data warehouse now makes these types of reports less technical and more business-focused.

Tight MS Office integration: Many users rely on MS Office software, so the BI tool must integrate seamlessly with these tools.

Write back: The BI portal should provide the ability to write back to the database to maintain reference data, targets, forecasts and workflow.

Business modeling / alerting: Built around centrally maintained data, with predefined, end-user-maintained business rules.

Real time: When the source data changes, the change is passed through to the user immediately, often via message queues.

Near real time: Source data changes are batched together and sent through at short intervals, say every few minutes; this requires special ETL techniques.

Batch processing: Source data is captured in bulk, say overnight, while the BI system is offline.
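As a sketch of the near-real-time micro-batching pattern mentioned above, the example below polls a source table for rows changed since a watermark and pushes them to the warehouse in small batches; the table names, columns and polling interval are illustrative assumptions.

```python
# Near-real-time ETL sketch: micro-batches driven by an "updated since" watermark.
# Table names, columns and the polling interval are illustrative assumptions.
import sqlite3
import time

def load_changes_since(source, target, watermark):
    """Copy source rows changed since the watermark into the warehouse."""
    rows = source.execute(
        "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    if rows:
        target.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
        target.commit()
        watermark = max(r[2] for r in rows)  # advance to the newest change seen
    return watermark

if __name__ == "__main__":
    source = sqlite3.connect(":memory:")
    source.execute("CREATE TABLE orders (order_id TEXT, amount REAL, updated_at TEXT)")
    source.execute("INSERT INTO orders VALUES ('A1', 99.0, '2024-01-15T10:02:00')")
    target = sqlite3.connect(":memory:")
    target.execute("CREATE TABLE fact_sales (order_id TEXT, amount REAL, updated_at TEXT)")

    watermark = "1970-01-01T00:00:00"
    for _ in range(2):          # in production this loop runs continuously
        watermark = load_changes_since(source, target, watermark)
        time.sleep(1)           # "every few minutes" in a real deployment
```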

Relational database versus OLAP (cubes, slice & dice, pivot)

This is a complex argument, but put simply, most things done in an OLAP cube can be achieved in the relational world, although they can be slower to execute and develop. As a rule of thumb, if you are already working in a relational database environment, OLAP should only be needed if analytical performance is a problem or you need specialist functionality such as budgeting, forecasting or 'what if' modeling. The leading BI tools provide seamless access to data in either relational or OLAP form, making this primarily a technology decision rather than a business decision.
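To illustrate that typical cube-style analysis can be expressed over flat, relational-style data, here is a small sketch of slicing and pivoting; the sales data and dimensions are invented for the example and do not reflect any particular tool.

```python
# Slice, dice and pivot on flat (relational-style) data, without an OLAP cube.
# The dimensions and figures are invented for the example.
import pandas as pd

sales = pd.DataFrame({
    "year":    [2023, 2023, 2024, 2024, 2024],
    "region":  ["North", "South", "North", "South", "North"],
    "product": ["A", "A", "A", "B", "B"],
    "revenue": [100, 80, 120, 60, 90],
})

# "Slice": restrict one dimension to a single member.
north = sales[sales["region"] == "North"]

# "Pivot": cross-tabulate two dimensions with an aggregated measure.
cube_view = sales.pivot_table(index="region", columns="year",
                              values="revenue", aggfunc="sum", fill_value=0)

print(north)
print(cube_view)
```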

Top down or bottom up approach?

The top-down approach focuses on strategic goals and the business processes and organizational structure needed to support them. This may produce the ideal business processes, but existing systems probably do not support them or provide the data needed to measure them. This can lead to a strategy that is never adopted because there is no physical delivery and the strategic goals cannot be measured.

The bottom-up approach takes the existing systems and data and presents them to the business to measure and analyze. This may not provide the best strategic information, because of the limits of the available data and its quality.

We recommend a compromise between the two approaches: build a pragmatic bottom-up solution as a means of getting accurate business measures and a better understanding of current processes, while performing a top-down analysis to understand what the business needs strategically. The comparison of what can be achieved today with what is strategically desired then provides the future direction for the solution, and if the solution is designed with change in mind, it should be relatively straightforward to build on the system foundations already in place.

Advanced business intelligence

The following describes some advanced BI requirements that organizations may want to consider:

An integrated BPM solution with business rules and workflow built in, so that the system can quickly guide the decision maker to the relevant information.

Collaboration and guided analysis to help guide the required action as a result of the information obtained.

More user-friendly Data Mining and Predictive Analytics, where the system finds relationships between unrelated datasets to find the ‘golden nugget’ of information.

More integration of BI information into front office systems, e.g. a gold-rated customer gets VIP treatment when they call in, or data profiling suggests that a customer may be about to churn, so they are offered an incentive to stay.

Increased use of Real-time data.

End-to-end data lineage automatically recorded by the tools. Better metadata management across the systems will allow users to easily see where the data came from and what transformations it has undergone, improving confidence in the data and reports. Systems will also be self-documenting, giving users more information and simplifying ongoing maintenance.

Integrated, real-time data quality as a means of measuring the accuracy of operational processes. This would provide cross-system validation and verify business process performance by monitoring the accuracy of the data, leading to better and more dynamic process modeling, re-engineering of business processes and thus efficiency gains.

Packaged analytical applications. Just as finance systems became packaged in the 80s and ERP (Enterprise Resource Planning) became packaged in the 90s, packaged BI may become the standard in this decade. Why build your own data warehouse, reports and dashboards from scratch when your business is similar to many others? Buy a packaged solution and use rapid implementation templates and tools to configure it to meet your exact needs. This rapid implementation capability then supports you as your business evolves.



BI for the masses:
As information becomes more critical to managing operational efficiency, more people need access to it. Now that BI tools can technically and cost-effectively give more people access to information, BI for the masses is a reality and can significantly improve a business. The increased presence of Microsoft in the BI space will also increase the use of BI and make it more attractive. BusinessObjects' acquisition of Crystal and the recent release of XI will also extend BI to more people inside and outside the organization: now everyone can have secure access to information.

Conclusion

The potential benefits of a BI / DWH implementation are enormous, but too many companies fail to realize them because of: lack of experience, poor design, poor choice and use of tools, poor data quality management, poor or no project management, limited understanding of the importance of metadata, no realization that a successful solution will inevitably evolve and grow, and limited attention to the importance of education. With all these areas to consider, using a specialist consulting firm like IT Performs makes significant sense.