Saturday, November 21, 2009

Learn BI



  • What is Business Intelligence?
    • BI & ROI
    • BI Environment
    • Business Modeling
  • Data Modeling
  • Dimensional Modeling
  • Star Schema
  • ETL & Data Warehouse
  • BI Reporting
  • ETL - Informatica:
    • Informatica Tutorial
  • OLAP:
    • What is OLAP?
    • OLAP - Practical Examples
  • Business Intelligence Tools:
    • BI Tools Directory
    • BI Reporting Tools Guide
  • KPI | Scorecards | Dashboards:
    • Key Performance Indicators
    • BI Dashboards
    • Dashboard Software
    • BI Scorecards
  • Data Mining:
    • What is Data Mining?

What is Business Intelligence?

Business Intelligence (BI) is an umbrella term for a collection of processes, tools and technologies that help an enterprise achieve more profit by considerably improving its productivity, sales and service. With the help of BI methods, corporate data can be organized and analyzed more effectively, and then converted into the useful knowledge needed to initiate a profitable business action. BI is thus about turning raw, collected data into intelligent information: knowing what data to collect and manage, and in what context, and then analyzing and re-arranging that data according to the relationships between the data items.

Importance of Business Intelligence:

A company's collected raw data is an important asset in which one can find answers to many of an organisation's critical questions, such as 'What was the net profit for a particular product last year, what will the sales be this year, and what are the key factors to focus on this year in order to increase sales?'. Hence the need for a well-planned BI system, which can lead to greater profitability by reducing operating costs, increasing sales and thereby improving customer satisfaction for an enterprise.

With the help of a Business Intelligence system, a company can improve its business or outperform its competitors by exploring and exploiting its data to learn about customer preferences, the nature of its customers, supply chains, geographical influences and pricing, and how to increase its overall business efficiency.

Business Intelligence & Return on Investment(ROI):

Business Intelligence enables us to take action based on the intelligence acquired through a BI strategy. If this knowledge or information is not put to proper use in the right direction, there is no point in analyzing and finding the intelligence in the first place. For example, suppose a company has implemented a BI system to analyze customer interests and requirements so that it can promote a particular product in the near future. All the analysis and knowledge management will be pointless, and a waste of investment, if the company does not have a proper plan to approach the right customer at the right time. Business Intelligence, then, is all about strategies for increasing business efficiency while vastly cutting down operating costs.

Implementing a Business Intelligence system in an organization requires a significant amount of money to be invested in building the system and its applications. It also requires skilled top-level managers to build a Return on Investment (ROI) model that analyzes the costs of implementing and maintaining these BI models and methods, so that the return on investment comes sooner.

Proper business action should be taken based on the strategies derived from these intelligence models. An erroneous model built on wrong assumptions can bring a loss far greater than the cost of building the entire Business Intelligence system itself. But once everything is done properly, in the way the organisation wants, the benefit that comes out of it is priceless.

BI Environment & Business Flow:

Business Intelligence is all about converting large amounts of corporate data into useful information, thereby triggering profitable business action with the help of the knowledge acquired through BI analysis. Implementing BI is a long process, and it requires a lot of analysis and investment. A typical BI environment involves business models, data models, data sources, ETL, the tools needed to transform and organize the data into useful information, a target data warehouse, data marts, OLAP analysis and reporting tools.

Setting up a Business Intelligence environment relies not only on tools, techniques and processes; it also requires skilled business people to carefully drive these in the right direction. Care should be taken in understanding the business requirements, setting up the targets, analyzing and defining the associated processes, determining what data needs to be analyzed, determining the source and target for that data, defining how to integrate that data for BI analysis, and gathering the tools and techniques needed to achieve this goal.

The following sections explain each of these areas in detail, and the sample figure shows a BI environment in its simplest form.

Sample Business Intelligence Environment

Business Modeling:

Business Modeling depicts the overall picture of a business: what it is about, what specific business problem it is intended to solve, and how information flows from source to destination. Business modeling involves business models and diagrams that present information graphically, so that the members of an organization can understand and communicate the business rules and processes effectively. Business process modeling, process flow modeling and data flow modeling are sub-categories of business modeling. Business modeling does not need to go into the details of a project, and it often hides the programming complexities required to achieve the task. Business modeling strategies and the underlying business models differ from one organization to another, depending on their needs and goals.

1. Business Process Modeling:

Business process modeling describes a group of related activities, or business processes. Business processes are represented visually as diagrams of simple boxes with arrows and text labels, better known as Business Process Models.

2. Process Flow Modeling:

Process Flow Modeling is used to graphically describe the various processes that happen in an organization and the relationships between them.

3. Data Flow Modeling:

Data Flow Modeling, which produces Data Flow Diagrams (DFDs), is a top-down approach that focuses on the flow of data between the various business processes and helps to capture and document the data movements within a business or organization.

Data Modeling:

Data Modeling is about representing a real-world set of data structures, or entities, and their relationships in the form of data models, as required for a database. Simply put, a data model is a visual representation of the database. Data modeling comes in various types and phases: conceptual, logical, physical, enterprise, relational and dimensional data modeling.

1. Conceptual Data Modeling:

Conceptual Data Modeling visualizes the overall structure of the database and provides high-level information about the subject areas or data structures of an organization; it does not contain much detail about attributes.

2. Logical Data Modeling:

Logical Data Modeling is an extension of Conceptual Data Modeling, and it includes almost all of the entities, attributes and their relationships. A logical data model does not contain attribute-specific physical details such as type or length; instead it defines and conveys business information and rules.

3. Physical Data Modeling:

A Physical Data Model includes data structures like tables, columns and properties, and the relationships between them.

4. Enterprise Data Modeling:

Enterprise Data Modeling is known as global business modeling, as it consolidates information across the enterprise.

5. Relational Data Modeling:

The Relational Data Model revolves around Entity-Relationship modeling, where entities (tables) are normalized to avoid possible redundancy; this is the preferred technique in OLTP systems.
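To make the normalization idea concrete, here is a minimal sketch (the `customer` and `orders` tables and their data are invented for illustration, not taken from the text) in which a customer's details are stored once and referenced by key, rather than repeated on every order:

```python
import sqlite3

# Illustrative normalized OLTP schema: customer details live once in
# `customer` and are referenced from `orders` by key, avoiding redundancy.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT,
    city        TEXT
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id),
    amount      REAL
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Ann', 'Boston')")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, 250.0), (11, 1, 100.0)])

# A join reassembles the full picture; the city is never duplicated.
rows = conn.execute("""
    SELECT c.city, SUM(o.amount)
    FROM orders o JOIN customer c ON o.customer_id = c.customer_id
    GROUP BY c.city
""").fetchall()
print(rows)  # [('Boston', 350.0)]
```

If the customer moved, only one row in `customer` would change; that is the redundancy-avoidance that normalization buys in OLTP.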

Dimensional Modeling:

A Dimensional Model comprises a fact table and many dimension tables, and is used for calculating summarized data. Since Business Intelligence reports measure facts (aggregates) across multiple dimensions, dimensional data modeling is the preferred modeling technique in a BI environment. A fact table contains various measures, or facts, like sales amount or loan amount, whereas a dimension table describes a particular entity, like time or state, against which the required facts are measured.

Star Schema:

In a Dimensional Data Model, the fact table is the centralized table, connected to the multiple dimensions related to that fact. This approach is known as the Star Schema design, and data warehouses and data marts are built on it. Since BI takes advantage of data displayed in the form of multi-dimensional cubes, the star schema approach makes it easy to analyze and produce complex reports by slicing and dicing the dimensions of interest. In the sample diagram shown below, the required fact, 'Loan Amount', can be calculated across various dimensions, such as the state, branch, time, product, loan officer and investor dimensions.
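As a sketch of the design, here is a tiny hypothetical star schema for the loan example, with a central fact table joined to two of the dimensions named above. The table layout and figures are invented for illustration:

```python
import sqlite3

# Minimal star schema: one fact table surrounded by dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_state (state_id INTEGER PRIMARY KEY, state TEXT);
CREATE TABLE dim_time  (time_id  INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_loan (
    state_id    INTEGER REFERENCES dim_state(state_id),
    time_id     INTEGER REFERENCES dim_time(time_id),
    loan_amount REAL
);
""")
conn.executemany("INSERT INTO dim_state VALUES (?, ?)", [(1, 'NY'), (2, 'NJ')])
conn.executemany("INSERT INTO dim_time VALUES (?, ?)", [(1, 2008), (2, 2009)])
conn.executemany("INSERT INTO fact_loan VALUES (?, ?, ?)",
                 [(1, 1, 500.0), (1, 2, 700.0), (2, 2, 300.0)])

# "Slicing" on the time dimension: total loan amount per state for 2009 only.
rows = conn.execute("""
    SELECT s.state, SUM(f.loan_amount)
    FROM fact_loan f
    JOIN dim_state s ON f.state_id = s.state_id
    JOIN dim_time  t ON f.time_id  = t.time_id
    WHERE t.year = 2009
    GROUP BY s.state
    ORDER BY s.state
""").fetchall()
print(rows)  # [('NJ', 300.0), ('NY', 700.0)]
```

Swapping the `WHERE` filter or the `GROUP BY` dimension is exactly the slice-and-dice operation the text describes; each dimension joins to the fact table by a single key.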

ETL Process:

ETL, an acronym for 'Extraction, Transformation and Loading', is a collection of processes for extracting the source data, transforming that data, and finally loading it into a data warehouse. Before the required data is loaded into the data warehouse, it must be transformed to meet the needs of the warehouse. This transformation involves several processes, such as data cleansing, data profiling, data type conversion, validating referential integrity, performing aggregation if needed, and denormalization or normalization.
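A toy sketch of the three stages might look as follows; the source rows and cleansing rules are invented, and real ETL tools do far more:

```python
# Extract: rows as pulled from a hypothetical source system.
raw_rows = [
    {"customer": " Ann ",  "sale": "250.00"},
    {"customer": "BOB",    "sale": "100.50"},
    {"customer": None,     "sale": "75.25"},   # dirty row: missing key
]

def transform(rows):
    """Cleanse names, convert sale amounts to numbers, drop invalid rows."""
    out = []
    for r in rows:
        if not r["customer"]:                 # data cleansing / validation
            continue
        out.append({"customer": r["customer"].strip().title(),  # standardize
                    "sale": float(r["sale"])})                  # type conversion
    return out

# Load: append the conformed rows to the target store.
warehouse = []
warehouse.extend(transform(raw_rows))
print(warehouse)
# [{'customer': 'Ann', 'sale': 250.0}, {'customer': 'Bob', 'sale': 100.5}]
```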

Data Warehouse:

A data warehouse is the centralized repository where an organization keeps all of its information for analysis. The data is collected from various sources for the purpose of analytical processing and reporting. It is non-volatile, and the warehouse is built on a dimensional data model. From the data warehouse, data can be extracted for reporting needs with the help of query tools, or many data marts can be built based on subject-area requirements.

Data Mart:

A data mart is subject-oriented: basically a subset of the data warehouse, built for the purpose of analyzing a particular line of business or department. It holds data specific to a particular subject area, like sales or purchasing. Data marts can be derived from a data warehouse or built directly from the source for the sole purpose of BI, and like data warehouses, data marts are constructed from dimensional data models.

What is OLAP?

OLAP, an acronym for 'Online Analytical Processing', is a technique by which the data sourced from a data warehouse or data mart is visualized and summarized to provide a multidimensional view across multiple dimensions. Generally, OLAP refers to OLAP tools (e.g., Cognos, Business Objects) that help accomplish these tasks. Since a data warehouse is designed using a dimensional data model, data is represented in the form of data cubes, enabling us to aggregate facts and slice and dice across several dimensions. OLAP tools also provide options to drill down from one level of a hierarchy to another.

For example, the sales amount can be calculated for a particular year, or it can be drilled down to the next levels of the hierarchy: month, week, day and so on. In the same way, data can be rolled up for summarization, from product to product group, product group to product sub-class, and product sub-class to product class. With this cube structure, data can be viewed from multiple angles, giving data analysts a greater insight into the data.
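The roll-up along the time hierarchy described above can be sketched as follows; the sales figures and the tuple-key representation of the hierarchy are invented for illustration:

```python
from collections import defaultdict

# Daily sales keyed by (year, month, day) -- the lowest hierarchy level.
daily_sales = {
    (2009, 11, 20): 120.0,
    (2009, 11, 21): 80.0,
    (2009, 12, 1):  50.0,
}

def roll_up(sales, levels):
    """Aggregate by truncating each key to its first `levels` components."""
    totals = defaultdict(float)
    for key, amount in sales.items():
        totals[key[:levels]] += amount
    return dict(totals)

monthly = roll_up(daily_sales, 2)   # roll up one level: month totals
yearly  = roll_up(daily_sales, 1)   # roll up further: year totals
print(monthly)  # {(2009, 11): 200.0, (2009, 12): 50.0}
print(yearly)   # {(2009,): 250.0}
```

Drilling down is the reverse movement: going from the yearly total back to the monthly or daily figures that compose it.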

There are many OLAP variants, or hybrids, such as MOLAP (Multidimensional OLAP), HOLAP (Hybrid OLAP), ROLAP (Relational OLAP) and DOLAP (Desktop or Database OLAP), available in the market; they can be used depending on the needs and requirements of an organization.

OLAP - Examples:

The topmost executives of an organization are interested in aggregated facts or numbers for decision-making, rather than in querying several (normalized) databases to get the data and doing the comparison themselves. OLAP tools visualize the data in an understandable format, such as scorecards and dashboards with Key Performance Indicators, enabling managers to monitor and take immediate action. In today's business world, OLAP plays a vital role in assisting decision makers in banking and finance, hospitals, insurance, manufacturing, pharmaceuticals and other fields, measuring facts across geography, demography, product and sales.

OLAP can be performed on data warehouses whether or not they undergo frequent updates. The following examples show how OLAP solves complex queries in which facts are measured across the dimensions the company is most interested in.

Comparison of sales (fact) of a product (dimension) over years (dimension) in the same region (dimension).

How many members (fact) have opened a savings account (dimension), in a USA branch (dimension), over a period (dimension)?

How many mortgage loans (fact) have been approved in fixed mortgage (dimension) or Adjustable Rate Mortgage (dimension) in New York City (dimension), over a period (dimension)?

What is the total sales value (fact) of a particular product (dimension) in a particular grocery store (dimension), over a period (dimension)?

What is the amount spent (fact) for a particular product promotion (dimension) in a particular branch (dimension) or in a particular city (dimension), over a period (dimension)?

What is Data Mining?

Data Mining is a set of processes for analyzing and discovering useful, actionable knowledge buried deep beneath large volumes of data stores or data sets. This knowledge discovery involves finding patterns or behaviors within the data that lead to some profitable business action. Data mining generally requires large volumes of data, historical as well as current, to explore for knowledge. Once the required amount of data has been accumulated from various sources, it is cleaned, validated and prepared for storage in the data warehouse or data mart. BI reporting tools capture the required facts from this data for use by the knowledge discovery process. Data mining can be accomplished with one or more of the traditional knowledge discovery techniques, such as Market Basket Analysis, Clustering, Memory-Based Reasoning, Link Analysis and Neural Networks.
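As a minimal illustration of one of these techniques, Market Basket Analysis in its simplest form counts how often items are bought together and expresses that as 'support'. The transactions below are invented:

```python
from itertools import combinations
from collections import Counter

# Each set is one invented customer transaction (a "basket").
transactions = [
    {"bread", "milk"},
    {"bread", "milk", "eggs"},
    {"milk", "eggs"},
    {"bread", "milk"},
]

# Count every pair of items that co-occurs in a basket.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Support = fraction of all transactions containing the pair.
support = {p: c / len(transactions) for p, c in pair_counts.items()}
best_pair = max(support, key=support.get)
print(best_pair, support[best_pair])  # ('bread', 'milk') 0.75
```

A high-support pair is the kind of pattern that might trigger a business action, such as shelving or promoting the two products together.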

Data Mining Life Cycle:

Find out the Business Problem: Suppose a company's current-year sales dropped by some percentage compared to the previous year. Using OLAP tools, the exact sales fact can be determined across several dimensions, like region and time.

Knowledge Discovery: Given this business problem, the various reasons for the decrease in sales have to be analyzed using one or more of the data mining techniques. Causes may include poor quality or service of the product, flaws in marketing schemes, low demand for the product, seasonal changes, regulations enforced by the government, pressure from competitors, and so on. The exact solutions for resolving the sales drop have to be found; this is what we call Knowledge Discovery here.

Implement the Knowledge: Based on the above discovery, proper actions should be taken to overcome the business problem.

Analyze the Results: Once the knowledge has been implemented, results need to be monitored and measured to find out the outcomes of that action.

OLAP vs Data Mining:

OLAP helps organizations find measures like a sales drop, productivity, service response time, inventory in hand and so on. Simply put, OLAP tells us 'what has happened', and data mining helps find out 'why it happened' in the first place. Data mining can also be used to predict 'what will happen in the future', with the help of data patterns available within the organization and in publicly available data.

For example, if a borrower with a bad credit and employment history applies for a mortgage loan, the application may be denied by the mortgage lender, since the borrower may default on the loan if it is approved. The lender comes to this decision based on previously mined historical data that follows a similar pattern.

Business Intelligence Tool Guide:

Business Intelligence tools help organizations improve their overall performance by helping them plan, track, monitor, analyze and report on business activities. These tools improve customer relationship management and can thereby increase a company's profitability significantly.

Below are a few guidelines that may help when working with BI tools.



  • How to install and set up the BI software?
  • How to get licenses and training from BI software vendors?
  • How to create users and administrators and assign privileges to users?
  • How to connect to different database servers from BI applications?
  • How to understand and work on BI data models or universes?
  • How to frame the select statement according to the business requirements?
  • How to select the tables that have to be used in the report?
  • How to select the columns that are required for reporting?
  • How to write the join condition (inner join, outer join, equi-join) to join different tables in a select statement?
  • How to write multiple select statements in a single report?
  • How to write the filters (null, in, equal to, greater than) required after the where clause?
  • How to create dimensions and facts?
  • How to drill up and drill down?
  • How to set prompts for users to enter values?
  • How to process the query and retrieve the results?
  • How to work on results?
  • How to modify field formats?
  • How to sort data?
  • How to create computed items like date, numeric and string functions?
  • How to create pivots?
  • How to add data?
  • How to create totals?
  • How to group data?
  • How to create charts?
  • How to create reports?
  • How to work on the report body?
  • How to work on report group headers?
  • How to work on the report header/footer?
  • How to work on the page header/footer?
  • How to design the report layout? How to use page breaks?
  • How to schedule, monitor, modify, delete, and refresh a job (report)?
  • How to write report design and report testing documents, test reports and get user acceptance?
  • How to distribute reports and results via email, printers, intranet servers, and the web?
  • How to export and import data?
  • How to track scorecards, balanced scorecards, forecasting, key performance indicators and dashboards?


Business Intelligence & Key Performance Indicators:

Key Performance Indicators, commonly referred to as KPIs, are the measurements identified as critical factors in achieving an organization's goals or mission. KPIs are identified to help drive a business towards its success, and they are associated with business activities like Customer Relationship Management (CRM), Supply Chain Analytics, or any other activity happening within the organization.

Requirements of a good KPI:

Many factors can relate to the success of a company, but not all of them can be chosen as indicators. Only those that are mission-critical, strictly adhere to the organizational goals and are accurately measurable should be selected as the company's KPIs. It is always better to keep the number of KPIs to a minimum, so that greater focus can be given to each indicator. The important factors to consider in selecting a KPI are as follows:



  • Measurable: A KPI should be quantifiable in terms of numbers.
  • Reflects the organizational goals: A KPI should drive the business towards success.
  • Actionable: It should help managers initiate some business action as a result of the analysis and measures led by the KPI.

Examples of KPIs:

A KPI may reflect regional sales by salesperson, supply chain statistics by supplier, productivity by unit, customer satisfaction, customer growth, or employee turnover. In any case, it should give high-level, real-time information to top-level managers, enabling them to concentrate on the company's success.
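As a small sketch of a measurable, goal-reflecting KPI, the snippet below computes quarter-over-quarter customer growth against a management target. The customer counts and the 8% target are invented for illustration:

```python
# Invented quarterly customer counts and a hypothetical growth target.
customers_by_quarter = {"Q1": 1000, "Q2": 1100, "Q3": 1210}
target_growth_pct = 8.0

def growth_pct(prev, curr):
    """Percentage growth from one quarter to the next."""
    return (curr - prev) / prev * 100.0

quarters = list(customers_by_quarter)
kpi = {}
for prev_q, curr_q in zip(quarters, quarters[1:]):
    g = growth_pct(customers_by_quarter[prev_q], customers_by_quarter[curr_q])
    kpi[curr_q] = {"growth_pct": round(g, 1),
                   "on_target": g >= target_growth_pct}   # actionable flag
print(kpi)
# {'Q2': {'growth_pct': 10.0, 'on_target': True},
#  'Q3': {'growth_pct': 10.0, 'on_target': True}}
```

The `on_target` flag is the actionable part: a `False` value is exactly the signal that should prompt a manager to investigate and act.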

What is a Dashboard in Business terms?

A Business Intelligence Dashboard visually represents key organizational performance data in a near-real-time, user-friendly manner that can be understood instantaneously. Technically speaking, a dashboard is a visual representation of the Key Performance Indicators (KPIs) of interest for managerial review, and it also enables managers to drill down further. A Business Intelligence Dashboard is similar in function to a car dashboard: it displays, and provides access to, powerful analytical systems and key performance metrics in a form that enables business executives to analyze trends and manage their areas of responsibility more effectively.

Features of Dashboard:

A typical web-based Business Intelligence Dashboard encompasses the following features:



  1. Web-based Interface: Managers can gain broad visibility into the real-time key measurements of a business with the help of this multi-window, intuitive and interactive interface.

  2. Role-based View: Executives can clearly track their organization's overall performance against its goals.
     Reports: Configurable user-level as well as management-level reports.

  3. Charting and Graphing: Dashboards are well known for their easy one-click charts and graphs, which give instant access to complex solutions.

  4. Pre-defined Performance Metrics: Dashboards are built with common pre-defined metrics by default, which makes it easy for business users to track the regular yet important performance metrics.



Benefits of using Business Intelligence Dashboard:

Dashboards quickly convert complex corporate data into a meaningful display of charts, graphs, gauges and other formats, communicated concurrently. A dynamic, intelligent dashboard allows managers to drill down into the data to go deeper into the analysis. It eliminates the need to go through several reports: in one shot, a dashboard gives a clear picture of how a company is performing in its critical areas.

Dashboard Software:

The following are a few of the well-known Business Intelligence dashboard products available on the market today.

Cognos ReportNet: A product from Cognos that helps to build, distribute, and view multi-object dashboards with graphical elements such as gauges, charts, and clickable pictures. Connects dashboards with underlying tabular data, letting people go from a high-level view to a deeper understanding.

Source: www.Cognos.com

MicroStrategy: Microstrategy Dashboards are designed to deliver maximum visual impact in a format optimized for quick absorption, using a combination of tables, graphics, gauges, dials and other graphical indicators, as well as conditional formatting, free-form labels, borders and background colors.

Source: www.microstrategy.com

BusinessObjects Dashboard Manager: Dashboard Manager lets one choose the level of detail they need - an overview, a specific chart or graph, or the underlying report.

Source: www.businessobjects.com

The Compliance Management Dashboard: A product from Hyperion that provides easy-to-use screens, meters, and stoplights that quickly convey critical information. It includes a data model with pre-built connectors to data sources, a set of compliance metrics, and a series of easy-to-use screens designed to convey the critical information needed by finance executives.

Source: www.hyperion.com

Scorecards:

Scorecards are similar to dashboards in that they provide easy-to-understand, summarized, at-a-glance data that tells managers and top officials about their company's present and past performance. Scorecards thus help monitor the Key Performance Indicators accurately and communicate goals and strategies across the organization in an efficient and elegant manner. In a Business Intelligence environment, scorecards allow managers to set metrics or targets and monitor their impact on every department.

Balanced Scorecards:

The Balanced Scorecard, a methodology created by Drs. Robert S. Kaplan and David P. Norton in 1992, is a management concept that helps managers at all levels monitor results in their key areas, including financial performance, customer knowledge, internal business processes and learning. It has been implemented in thousands of corporations, organizations and government agencies worldwide.

Business Intelligence Tools Directory:

Actuate
Business Objects
Cognos
HummingBird
Hyperion
Information Builders
MicroStrategy
ProClarity
Siebel

Wednesday, November 18, 2009

Implementing Fast-Start Failover in 10GR2 Dataguard Broker Environment


IMPLEMENTING FAST-START FAILOVER IN 10GR2 DATAGUARD BROKER ENVIRONMENT

Doc ID:
359555.1
Type:
BULLETIN

Modified Date:
06-MAY-2009
Status:
PUBLISHED
Product: Oracle 10GR2 DataGuard.
Minimum Version: 10.2.0.1.0
Abstract
----------
This paper covers the step-by-step procedure for implementing FAST-START FAILOVER, a new feature in DataGuard Broker 10GR2.
Fast-start failover allows the broker to automatically fail over to a synchronized
standby database in the event of loss of the primary database.
Fast-start failover quickly and reliably fails over the target standby database
to the primary database role, without requiring you to perform any
manual steps to invoke the failover.
Fast-start failover can be used only in a broker configuration and can be
configured only through DGMGRL or Enterprise Manager.
Here are the detailed steps to configure Fast-Start Failover using DGMGRL.
== Tested on the following configuration ==
Operating System - Windows XP
Oracle Release - 10.2.0.1.0
Primary Database - RAJ (SINGLE INSTANCE)
Standby database - STDBY (SINGLE INSTANCE)
Broker Tool - DGMGRL
Observer Node - rghatge-in
=== The following Pre-requisites must be met before Enabling Fast-Start Failover ===
a) The broker configuration must be running in maximum availability mode.
Standby Redo logs are must for MaxAvailability Mode.
Create the standby redo logs on the standby database.
Either of following ways can be used to set the Protection mode to - MAXAVAILABILITY.
DGMGRL> Edit configuration set Protection mode as MaxAvailability;
SQL> Alter database set standby database to maximize Availability;
b) Enable Flashback Database and set up a flash recovery area on both the
primary database and the target standby database.
Parameters of concern for Flashback Database ===>
Alter system set db_recovery_file_dest_size=[];
Alter system set db_recovery_file_dest=;
To turn on the flashback feature, ensure the database is mounted and issue the following command ==>
SQL> Alter database flashback on;
c) Install the DGMGRL command-line interface on the observer computer.
This requires you to install 10GR2 client side component or
10GR2 Enterprise/Personal Edition on observer Machine.
d) Configure the TNSNAMES.ORA file on the observer Machine so that the observer is able to connect to the
Primary database and the pre-selected target standby database.
== AFTER ABOVE PRE-REQUISITES ARE MET ===
PRIMARY Database (RAJ) ==
SQL> select protection_mode,protection_level from v$database;
PROTECTION_MODE PROTECTION_LEVEL
-------------------- --------------------
MAXIMUM AVAILABILITY MAXIMUM AVAILABILITY
SQL> select flashback_on from v$database;
FLASHBACK_ON
--------------------
YES
STANDBY Database (STDBY) ==
SQL> select protection_mode, protection_level from v$database;
PROTECTION_MODE PROTECTION_LEVEL
-------------------- --------------------
MAXIMUM AVAILABILITY MAXIMUM AVAILABILITY
SQL> select flashback_on from v$database;
FLASHBACK_ON
--------------------
YES
== TNSNAMES.ORA Entry on Observer Site ==
RAJ=
(DESCRIPTION=
(ADDRESS=
(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))
(ADDRESS=
(PROTOCOL=ipc)(KEY=PNPKEY))
(CONNECT_DATA=
(SERVICE_NAME=raj)
)
)
STDBY=
(DESCRIPTION=
(ADDRESS=
(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))
(ADDRESS=
(PROTOCOL=ipc)(KEY=PNPKEY))
(CONNECT_DATA=
(SERVICE_NAME=stdby)
)
)
== FAST_START FAILOVER SETUP ===
Creating DG Broker configuration using DGMGRL
DGMGRL>connect sys/sys123@RAJ
Connected.
DGMGRL> create configuration 'DGRAJ' as primary database is 'RAJ'
connect identifier is RAJ;
Configuration "DGRAJ" created with primary database "RAJ" <<< Primary added
DGMGRL> add database 'STDBY' as connect identifier is STDBY maintained
as physical;
Database "STDBY" added <<< Standby added
DGMGRL> show configuration
Configuration
Name: DGRAJ
Enabled: NO
Protection Mode: MaxAvailability <<< Protection Mode
Fast-Start Failover: DISABLED
Databases:
RAJ - Primary database
STDBY - Physical standby database
Current status for "DGRAJ":
DISABLED <<< Status still Disabled.
DGMGRL> enable configuration <<< Configuration enabled.
Enabled.
DGMGRL> connect sys/sys123@RAJ
Connected.
DGMGRL> show configuration;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: DISABLED <<<< Fast-Start Failover is disabled yet.
Databases:
raj - Primary database
stdby - Physical standby database
Current status for "DGRAJ":
SUCCESS <<< Status Success
Ensure "LogXptMode" property for both the Primary and Standby is set to 'SYNC'.
DGMGRL> show database verbose 'RAJ';
Database
Name: raj
Role: PRIMARY
Enabled: YES
Intended State: ONLINE
Instance(s):
raj
Properties:
InitialConnectIdentifier = 'raj'
LogXptMode = 'SYNC' <<<<
Dependency = ''
DelayMins = '0'
Binding = 'OPTIONAL'
MaxFailure = '0'
MaxConnections = '1'
ReopenSecs = '300'
NetTimeout = '180'
LogShipping = 'ON'
PreferredApplyInstance = ''
ApplyInstanceTimeout = '0'
ApplyParallel = 'AUTO'
StandbyFileManagement = 'MANUAL'
ArchiveLagTarget = '0'
LogArchiveMaxProcesses = '2'
LogArchiveMinSucceedDest = '1'
DbFileNameConvert = ''
LogFileNameConvert = ''
FastStartFailoverTarget = ''
StatusReport = '(monitor)'
InconsistentProperties = '(monitor)'
InconsistentLogXptProps = '(monitor)'
SendQEntries = '(monitor)'
LogXptStatus = '(monitor)'
RecvQEntries = '(monitor)'
HostName = 'rghatge-in'
SidName = 'raj'
LocalListenerAddress = '(ADDRESS=(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))'
StandbyArchiveLocation = 'D:\oracle\product\10.2.0\admin\RAJ\arch'
AlternateLocation = ''
LogArchiveTrace = '0'
LogArchiveFormat = 'raj%S%t%r.arc'
LatestLog = '(monitor)'
TopWaitEvents = '(monitor)'
Current status for "raj":
SUCCESS
DGMGRL> show database verbose 'STDBY';
Database
Name: STDBY
Role: PHYSICAL STANDBY
Enabled: YES
Intended State: ONLINE
Instance(s):
stdby
Properties:
InitialConnectIdentifier = 'stdby'
LogXptMode = 'SYNC' >>>>>
Dependency = ''
DelayMins = '0'
Binding = 'OPTIONAL'
MaxFailure = '0'
MaxConnections = '1'
ReopenSecs = '300'
NetTimeout = '180'
LogShipping = 'ON'
PreferredApplyInstance = ''
ApplyInstanceTimeout = '0'
ApplyParallel = 'AUTO'
StandbyFileManagement = 'MANUAL'
ArchiveLagTarget = '0'
LogArchiveMaxProcesses = '2'
LogArchiveMinSucceedDest = '1'
DbFileNameConvert = 'D:\oracle\product\10.2.0\oradata\RAJ, D:\oracle\product\10.2.0\oradata\STDBY\data'
LogFileNameConvert = 'D:\oracle\product\10.2.0\oradata\RAJ, D:\oracle\product\10.2.0\oradata\STDBY\data'
FastStartFailoverTarget = ''
StatusReport = '(monitor)'
InconsistentProperties = '(monitor)'
InconsistentLogXptProps = '(monitor)'
SendQEntries = '(monitor)'
LogXptStatus = '(monitor)'
RecvQEntries = '(monitor)'
HostName = 'rghatge-in'
SidName = 'stdby'
LocalListenerAddress = '(ADDRESS=(PROTOCOL=TCP)(HOST=rghatge-in)(PORT=1521))'
StandbyArchiveLocation = 'D:\oracle\product\10.2.0\oradata\STDBY\arch'
AlternateLocation = ''
LogArchiveTrace = '0'
LogArchiveFormat = 'raj%S%t%r.arc'
LatestLog = '(monitor)'
TopWaitEvents = '(monitor)'
Current status for "STDBY":
SUCCESS
DGMGRL> show configuration;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: DISABLED <<<< FAST_START Failover is not enabled yet.
Databases:
raj - Primary database
stdby - Physical standby database
Current status for "DGRAJ":
SUCCESS
DGMGRL> enable fast_start failover; >>>> Enabling FAST_START Failover.
Enabled.
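As a hedged aside (not shown in this transcript): enabling fast-start failover in 10gR2 normally requires a few prerequisites already in place, or the ENABLE command is rejected. A sketch of the usual checks and commands, reusing the names from this setup:

```
-- Prerequisites typically required before ENABLE FAST_START FAILOVER (10gR2).
-- Assumption: database names ('raj', 'stdby') are taken from this configuration.

-- Flashback Database ON for both databases, so the broker can later
-- reinstate the failed primary (run on each database while mounted):
SQL> ALTER DATABASE FLASHBACK ON;

-- SYNC redo transport and MaxAvailability protection mode:
DGMGRL> edit database 'raj' set property LogXptMode='SYNC';
DGMGRL> edit database 'stdby' set property LogXptMode='SYNC';
DGMGRL> edit configuration set protection mode as MaxAvailability;
```

In this walkthrough both databases already show LogXptMode = 'SYNC' and the configuration is in MaxAvailability, which is why the ENABLE succeeds immediately.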
=== EXTRACTS FROM BROKER LOG WHEN FAST-START FAILOVER WAS ENABLED ===
<< Drcraj.log >>
DG 2006-02-17-10:05:53 0 2 0 DMON: FSFO storing state flags=0x103, version=5, obid=0xfd89c4cf (4253664463), threshold=120
DG 2006-02-17-10:05:53 0 2 582629787 DMON: Fast-Start Failover (FSFO) has been enabled between:
DG 2006-02-17-10:05:53 0 2 582629787 Primary = "RAJ"
DG 2006-02-17-10:05:53 0 2 582629787 Standby = "STDBY"
DG 2006-02-17-10:05:53 0 2 582629787 DMON: EDIT_DRC: success
<< Drcstdby.log >>
DG 2006-02-17-10:05:52 0 2 0 drcx: FSFO storing state flags=0x101, version=4, obid=0x54f8108b (1425543307), threshold=120
DG 2006-02-17-10:05:52 0 2 0 drcx: FSFO storing state flags=0x101, version=4, obid=0xfd89c4cf (4253664463), threshold=120
DG 2006-02-17-10:05:53 0 2 0 drcx: FSFO storing state flags=0x103, version=5, obid=0xfd89c4cf (4253664463), threshold=120
DG 2006-02-17-10:05:54 0 2 0 DRCX: Start receiving metadata file: D:\ORACLE\PRODUCT\10.2.0\DB_1\DATABASE\DRRAJ2.DAT
DG 2006-02-17-10:05:54 0 2 0 DRCX: Receiving block #1, 86 blocks.
DG 2006-02-17-10:05:54 0 2 0 DRCX: End receiving metadata file: opcode EDIT_DRC
DG 2006-02-17-10:05:54 0 2 582629787 DMON: chief lock convert for resync
DGMGRL> show configuration verbose;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: ENABLED <<<<< FAST-START Failover Enabled.
Databases:
RAJ - Primary database
STDBY - Physical standby database
- Fast-Start Failover target
Fast-Start Failover
Threshold: 30 seconds <<< Default Threshold value
Observer: rghatge-in
Current status for "DGRAJ":
SUCCESS
DGMGRL> edit configuration set property FastStartFailoverThreshold=120;
Property "faststartfailoverthreshold" updated <<< Changing default value to 120 secs.
DGMGRL> show configuration verbose;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: ENABLED
Databases:
raj - Primary database
stdby - Physical standby database
- Fast-Start Failover target
Fast-Start Failover <<<< FAST_START Failover details
Threshold: 120 seconds <<<< New Threshold Value.
Observer: (none)
Current status for "DGRAJ":
Warning: ORA-16608: one or more databases have warnings <<< Warning because Observer is not started.
DGMGRL> show database verbose 'raj';
Database
Name: raj
Role: PRIMARY
Enabled: YES
Intended State: ONLINE
Instance(s):
raj
Properties:
InitialConnectIdentifier = 'raj'
LogXptMode = 'SYNC'
Dependency = ''
DelayMins = '0'
Binding = 'OPTIONAL'
MaxFailure = '0'
MaxConnections = '1'
ReopenSecs = '300'
NetTimeout = '180'
LogShipping = 'ON'
PreferredApplyInstance = ''
ApplyInstanceTimeout = '0'
ApplyParallel = 'AUTO'
StandbyFileManagement = 'MANUAL'
ArchiveLagTarget = '0'
LogArchiveMaxProcesses = '2'
LogArchiveMinSucceedDest = '1'
DbFileNameConvert = ''
LogFileNameConvert = ''
FastStartFailoverTarget = 'stdby'
StatusReport = '(monitor)'
InconsistentProperties = '(monitor)'
InconsistentLogXptProps = '(monitor)'
SendQEntries = '(monitor)'
LogXptStatus = '(monitor)'
RecvQEntries = '(monitor)'
HostName = 'rghatge-in'
SidName = 'raj'
LocalListenerAddress = '(ADDRESS=(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))'
StandbyArchiveLocation = 'D:\oracle\product\10.2.0\admin\RAJ\arch'
AlternateLocation = ''
LogArchiveTrace = '0'
LogArchiveFormat = 'raj%S%t%r.arc'
LatestLog = '(monitor)'
TopWaitEvents = '(monitor)'
Current status for "raj":
Warning: ORA-16819: Fast-Start Failover observer not started <<<<<<< Expected warning: the observer has not been started yet.
DGMGRL> show database verbose 'stdby';
Database
Name: stdby
Role: PHYSICAL STANDBY
Enabled: YES
Intended State: ONLINE
Instance(s):
stdby
Properties:
InitialConnectIdentifier = 'stdby'
LogXptMode = 'SYNC' <<<<<
Dependency = ''
DelayMins = '0'
Binding = 'OPTIONAL'
MaxFailure = '0'
MaxConnections = '1'
ReopenSecs = '300'
NetTimeout = '180'
LogShipping = 'ON'
PreferredApplyInstance = ''
ApplyInstanceTimeout = '0'
ApplyParallel = 'AUTO'
StandbyFileManagement = 'MANUAL'
ArchiveLagTarget = '0'
LogArchiveMaxProcesses = '2'
LogArchiveMinSucceedDest = '1'
DbFileNameConvert = 'D:\oracle\product\10.2.0\oradata\RAJ, D:\oracle\product\10.2.0\oradata\STDBY\data'
LogFileNameConvert = 'D:\oracle\product\10.2.0\oradata\RAJ, D:\oracle\product\10.2.0\oradata\STDBY\data'
FastStartFailoverTarget = 'raj'
StatusReport = '(monitor)'
InconsistentProperties = '(monitor)'
InconsistentLogXptProps = '(monitor)'
SendQEntries = '(monitor)'
LogXptStatus = '(monitor)'
RecvQEntries = '(monitor)'
HostName = 'rghatge-in'
SidName = 'stdby'
LocalListenerAddress = '(ADDRESS=(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))'
StandbyArchiveLocation = 'D:\oracle\product\10.2.0\oradata\STDBY\arch'
AlternateLocation = ''
LogArchiveTrace = '0'
LogArchiveFormat = 'raj%S%t%r.arc'
LatestLog = '(monitor)'
TopWaitEvents = '(monitor)'
Current status for "stdby":
Warning: ORA-16819: Fast-Start Failover observer not started
>>>> The above warning is because the Observer process is not started <<<<
=== STARTING OBSERVER ====
The observer is integrated into the DGMGRL client-side component and should run on a
different computer from the primary and standby databases.
It monitors the Data Guard Broker environment and initiates a failover when it
detects a problem with the primary.
To start the observer, only the primary database must be running; the target
standby database does not need to be running.
DGMGRL> connect sys/sys123@RAJ
Connected.
DGMGRL> start observer
Observer started <<<< Observer started.
The observer runs as a continuous foreground process, so it must be stopped from another session.
It persistently maintains information about the fast-start failover configuration
in a binary file created in the working directory from which the observer was started.
By default, the observer creates this file at startup and names it fsfo.dat.
This file contains connect descriptors to both the primary and the target standby databases.
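Because the observer blocks its own session, a second DGMGRL session is used to stop it; a minimal sketch (connect string reused from this walkthrough):

```
-- From a second DGMGRL session:
DGMGRL> connect sys/sys123@RAJ
DGMGRL> stop observer
```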
"Observer" status can also be checked from sqlplus using new columns
introduced in 10GR2 for v$database.
== RAJ (Primary database) ==
SQL> select fs_failover_observer_present,fs_failover_observer_host,fs_failover_threshold from v$database;
FS_F FS_FAILOVER_OBSERVER_HOST FS_FAILOVER_THRESHOLD
---- ------------------------- ---------------------
YES rghatge-in 120
=== STDBY (Standby database) ===
SQL> select fs_failover_observer_present,fs_failover_observer_host,fs_failover_threshold from v$database;
FS_F FS_FAILOVER_OBSERVER_HOST FS_FAILOVER_THRESHOLD
---- ------------------------- ---------------------
YES rghatge-in 120
=== Broker Configuration status after the Observer Started ====
DGMGRL> connect sys/sys123@RAJ
Connected.
DGMGRL> show configuration verbose;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: ENABLED <---- Fast Start Failover Enabled.
Databases:
raj - Primary database
stdby - Physical standby database
- Fast-Start Failover target
Fast-Start Failover
Threshold: 120 seconds <---- Threshold for Failover
Observer: rghatge-in <---- Host on which Observer is running.
Current status for "DGRAJ":
SUCCESS
DGMGRL> show database verbose 'raj';
Database
Name: raj
Role: PRIMARY
Enabled: YES
Intended State: ONLINE
Instance(s):
raj
Properties:
InitialConnectIdentifier = 'raj'
LogXptMode = 'SYNC'
Dependency = ''
DelayMins = '0'
Binding = 'OPTIONAL'
MaxFailure = '0'
MaxConnections = '1'
ReopenSecs = '300'
NetTimeout = '180'
LogShipping = 'ON'
PreferredApplyInstance = ''
ApplyInstanceTimeout = '0'
ApplyParallel = 'AUTO'
StandbyFileManagement = 'MANUAL'
ArchiveLagTarget = '0'
LogArchiveMaxProcesses = '2'
LogArchiveMinSucceedDest = '1'
DbFileNameConvert = ''
LogFileNameConvert = ''
FastStartFailoverTarget = 'stdby' <<<<<< Fast-Start Failover target for the primary
InconsistentProperties = '(monitor)'
InconsistentLogXptProps = '(monitor)'
SendQEntries = '(monitor)'
LogXptStatus = '(monitor)'
RecvQEntries = '(monitor)'
HostName = 'rghatge-in'
SidName = 'raj'
LocalListenerAddress = '(ADDRESS=(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))'
StandbyArchiveLocation = 'D:\oracle\product\10.2.0\admin\RAJ\arch'
AlternateLocation = ''
LogArchiveTrace = '0'
LogArchiveFormat = 'raj%S%t%r.arc'
LatestLog = '(monitor)'
TopWaitEvents = '(monitor)'
Current status for "raj":
SUCCESS
DGMGRL> show database verbose 'stdby';
Database
Name: stdby
Role: PHYSICAL STANDBY
Enabled: YES
Intended State: ONLINE
Instance(s):
stdby
Properties:
InitialConnectIdentifier = 'stdby'
LogXptMode = 'SYNC'
Dependency = ''
DelayMins = '0'
Binding = 'OPTIONAL'
MaxFailure = '0'
MaxConnections = '1'
ReopenSecs = '300'
NetTimeout = '180'
LogShipping = 'ON'
PreferredApplyInstance = ''
ApplyInstanceTimeout = '0'
ApplyParallel = 'AUTO'
StandbyFileManagement = 'MANUAL'
ArchiveLagTarget = '0'
LogArchiveMaxProcesses = '2'
LogArchiveMinSucceedDest = '1'
FastStartFailoverTarget = 'raj' <<<<<< Fast-Start Failover target for STDBY
StatusReport = '(monitor)'
InconsistentProperties = '(monitor)'
InconsistentLogXptProps = '(monitor)'
SendQEntries = '(monitor)'
LogXptStatus = '(monitor)'
RecvQEntries = '(monitor)'
HostName = 'rghatge-in'
SidName = 'stdby'
LocalListenerAddress = '(ADDRESS=(PROTOCOL=tcp)(HOST=rghatge-in)(PORT=1521))'
StandbyArchiveLocation = 'D:\oracle\product\10.2.0\oradata\STDBY\arch'
AlternateLocation = ''
LogArchiveTrace = '0'
LogArchiveFormat = 'raj%S%t%r.arc'
LatestLog = '(monitor)'
TopWaitEvents = '(monitor)'
Current status for "stdby":
SUCCESS
The observer triggers a fast-start failover when one of the following conditions
is detected on the primary database:
1) Instance Failure
2) Shutdown Abort
3) Offline Datafiles due to I/O error
4) Network disconnection
For simplicity of testing, SHUTDOWN ABORT was executed on the primary database
(RAJ) to initiate a fast-start failover.
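The test itself was just an abort of the primary instance, sketched below (connect string as used elsewhere in this walkthrough). After the instance goes down, the observer waits FastStartFailoverThreshold seconds (120 here) before initiating the failover:

```
-- On the primary (RAJ):
SQL> connect sys/sys123@RAJ as sysdba
SQL> shutdown abort
```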
<< Observer.log >>
14:48:43.12 Monday, February 20, 2006
Initiating fast-start failover to database "STDBY"...
Performing failover NOW, please wait...
Failover succeeded, new primary is "STDBY"
14:49:20.95 Monday, February 20, 2006
<< BROKER LOG EXTRACTS WHEN FAST-START FAILOVER TRIGGERED >>
Extracts from DG logs during Fast-Start Failover --
<< Drcstdby.log >>
0 2 0 Foreground: FSFO storing state flags=0x23, version=4, obid=0x9a155a3e (2585090622), threshold=120
0 2 0 Database Resource[IAM=PHYSICAL]: SetState READ-WRITE-XPTON, phase TEAR-DOWN, External Cond GRACEFUL-FAIL-OVER-PHYS_STBY, Target Site Handle 0x02001000
0 2 0 Executing SQL [ALTER SYSTEM REGISTER]
0 2 0 SQL [ALTER SYSTEM REGISTER] Executed successfully
0 2 0 Executing SQL [ALTER DATABASE RECOVER MANAGED STANDBY DATABASE FINISH FORCE]
0 2 0 SQL [ALTER DATABASE RECOVER MANAGED STANDBY DATABASE FINISH FORCE] Executed successfully
0 2 0 Executing SQL [ALTER DATABASE COMMIT TO SWITCHOVER TO PRIMARY WAIT WITH SESSION SHUTDOWN]
0 2 0 SQL [ALTER DATABASE COMMIT TO SWITCHOVER TO PRIMARY WAIT WITH SESSION SHUTDOWN] Executed successfully
0 2 0 Database Resource SetState succeeded
Executing SQL [ALTER DATABASE SET STANDBY DATABASE TO MAXIMIZE AVAILABILITY]
Set log transport destination: SetState ONLINE, phase BUILD-UP, External Cond GRACEFUL-FAIL-OVER-PHYS_STBY
Executing SQL [ALTER DATABASE OPEN]
<< Drcraj.log>>
DG 2006-02-20-15:09:38 0 2 0 RSM Warning: SCN to flashback to is greater than DB SCN. Flashing back to DB SCN of 844202 instead
DG 2006-02-20-15:09:38 0 2 0 Executing SQL [FLASHBACK DATABASE TO BEFORE SCN 844202]
DG 2006-02-20-15:09:46 0 2 0 SQL [FLASHBACK DATABASE TO BEFORE SCN 844202] Executed successfully
DG 2006-02-20-15:09:46 0 2 0 Physical RSM: Reinstatement... Converting old primary control file to physical standby control file
DG 2006-02-20-15:09:46 0 2 0 Executing SQL [alter database convert to physical standby]
DG 2006-02-20-15:09:47 0 2 0 SQL [alter database convert to physical standby] Executed successfully
DG 2006-02-20-15:09:47 0 2 0 Executing SQL [ALTER DATABASE FLASHBACK OFF]
DG 2006-02-20-15:09:47 0 2 0 Foreground: FSFO storing state flags=0x41, version=6, obid=0x9a155a3e (2585090622), threshold=120
DG 2006-02-20-15:09:48 0 2 0 SQL [ALTER DATABASE FLASHBACK OFF] Executed successfully
DG 2006-02-20-15:09:48 0 2 0 Executing SQL [ALTER DATABASE FLASHBACK ON]
DG 2006-02-20-15:09:51 0 2 0 SQL [ALTER DATABASE FLASHBACK ON] Executed successfully
DG 2006-02-20-15:09:51 0 2 582889523 Resource: RAJ (0x01013000) State: PHYSICAL-APPLY-ON
DG 2006-02-20-15:11:05 0 2 0 Executing SQL [ALTER DATABASE RECOVER MANAGED STANDBY DATABASE THROUGH ALL SWITCHOVER DISCONNECT USING CURRENT LOGFILE]
DG 2006-02-20-15:11:11 0 2 0 SQL [ALTER DATABASE RECOVER MANAGED STANDBY DATABASE THROUGH ALL SWITCHOVER DISCONNECT USING CURRENT LOGFILE] Executed successfully
DG 2006-02-20-15:11:11 0 2 0 INSV: All instances have replied for message
== After Fast-Start Failover completed ==
NEW PRIMARY ==>> STDBY
$sqlplus "sys/sys123@stdby as sysdba"
SQL> select open_mode,database_role from v$database;
OPEN_MODE DATABASE_ROLE
---------------- ----------------
READ WRITE PRIMARY <<<< Standby has assumed the primary role.
SQL> select fs_failover_status from v$database;
FS_FAILOVER_STATUS
---------------------
REINSTATE REQUIRED >>>> Indicates original Primary RAJ is not yet re-instated as Standby.
DGMGRL> connect sys/sys123@RAJ
Connected.
Error:
ORA-01034: ORACLE not available <<< Primary database is not yet available.
DGMGRL> connect sys/sys123@stdby
Connected.
DGMGRL> show configuration verbose;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: ENABLED
Databases:
RAJ - Physical standby database (disabled) <<<< original primary is still disabled.
- Fast-Start Failover target
STDBY - Primary database
Fast-Start Failover
Threshold: 120 seconds
Observer: rghatge-in
Current status for "DGRAJ":
Warning: ORA-16608: one or more databases have warnings
DGMGRL> show database verbose 'STDBY';
Database
Name: STDBY
Role: PRIMARY
Enabled: YES
Intended State: ONLINE
Instance(s):
stdby
Properties:
InitialConnectIdentifier = 'stdby'
LogXptMode = 'SYNC'
Current status for "STDBY":
Warning: ORA-16817: unsynchronized Fast-Start Failover configuration >>>>>>
DGMGRL> show database verbose 'RAJ';
Database
Name: RAJ
Role: PHYSICAL STANDBY
Enabled: NO
Intended State: ONLINE
Instance(s):
raj
Properties:
InitialConnectIdentifier = 'raj'
LogXptMode = 'SYNC'
Current status for "RAJ":
Error: ORA-16661: the standby database needs to be reinstated >>>>>>
=== At this stage, the original Primary database (RAJ) is mounted ===
$sqlplus "sys/sys123@RAJ as sysdba"
SQL> select fs_failover_status from v$database;
FS_FAILOVER_STATUS
---------------------
UNSYNCHRONIZED >>>>>
The observer automatically reinstates the original primary (RAJ) as the
current standby of the new primary (STDBY), using flashback logs.
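If the observer cannot reinstate the old primary automatically (for example, if it is not running at the time), reinstatement can also be done manually once RAJ is mounted; a hedged sketch using this configuration's names:

```
-- On the old primary (RAJ), bring the instance to the mount state:
SQL> startup mount

-- Then from DGMGRL, connected to the new primary:
DGMGRL> connect sys/sys123@stdby
DGMGRL> reinstate database 'RAJ';
```

Either way, reinstatement depends on Flashback Database having been enabled on RAJ before the failover, as seen in the FLASHBACK DATABASE entries in Drcraj.log above.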
== AFTER RE-INSTATING Original Primary Database (RAJ) ===
DGMGRL> show configuration verbose;
Configuration
Name: DGRAJ
Enabled: YES
Protection Mode: MaxAvailability
Fast-Start Failover: ENABLED
Databases:
RAJ - Physical standby database >>>>> Current Standby database
- Fast-Start Failover target
STDBY - Primary database <<<<<< Current Primary database
Fast-Start Failover
Threshold: 120 seconds
Observer: rghatge-in
Current status for "DGRAJ":
SUCCESS
sqlplus "sys/sys123@RAJ as sysdba"
SQL> select fs_failover_status from v$database;
FS_FAILOVER_STATUS
---------------------
SYNCHRONIZED >>>>>
sqlplus "sys/sys123@stdby as sysdba"
SQL> select fs_failover_status from v$database;
FS_FAILOVER_STATUS
---------------------
SYNCHRONIZED >>>>>
REFERENCES ==
Oracle® Data Guard Broker, 10g Release 2 (10.2)