Monday, December 29, 2014

OBIA - Naming Conventions for Oracle Business Analytics Warehouse Tables

Oracle Business Analytics Warehouse tables use a three-part naming convention: PREFIX_NAME_SUFFIX, as shown in the table below:

Part: PREFIX
Meaning: Shows Oracle Business Intelligence-specific data warehouse application tables.
Table Type: W_ = Warehouse

Part: NAME
Meaning: Unique table name.
Table Type: All tables.

Part: SUFFIX
Meaning: Indicates the table type.
Table Type (one of the following suffixes):

_A = Aggregate
_D = Dimension
_DEL = Delete
_DH = Dimension Hierarchy
_DHL = Dimension Helper
_DHLS = Staging for Dimension Helper
_DHS = Staging for Dimension Hierarchy
_DS = Staging for Dimension
_F = Fact
_FS = Staging for Fact
_G, _GS = Internal
_H = Helper
_HS = Staging for Helper
_MD = Mini Dimension
_PE = Primary Extract
_PS = Persisted Staging
_RH = Row Flattened Hierarchy
_TL = Translation Staging (supports multiple languages)
_TMP = Pre-staging or post-staging temporary table
_UD = Unbounded Dimension
_WS = Staging for Usage Accelerator

The table below lists the types of tables used in the Oracle Business Analytics Warehouse:

Table Type and Description:
Aggregate tables (_A)
Contain summed (aggregated) data.
Dimension tables (_D)
Star analysis dimensions.
Delete tables (_DEL)
Tables that store IDs of the entities that were physically deleted from the source system and should be flagged as deleted from the data warehouse.
Note that there are two types of delete tables: _DEL and _PE. For more information about the _PE table type, see the row for Primary extract tables (_PE) in this table.
Dimension Hierarchy tables (_DH)
Tables that store the dimension's hierarchical structure.
Dimension Helper tables (_DHL)
Tables that store many-to-many relationships between two joining dimension tables.
Staging tables for Dimension Helper (_DHLS)
Staging tables for storing many-to-many relationships between two joining dimension tables.
Dimension Hierarchy Staging tables (_DHS)
Staging tables for storing the hierarchy structures of dimensions that have not been through the final extract-transform-load (ETL) transformations.
Dimension Staging tables (_DS)
Tables used to hold information about dimensions that have not been through the final ETL transformations.
Fact tables (_F)
Contain the metrics being analyzed by dimensions.
Fact Staging tables (_FS)
Staging tables used to hold the metrics being analyzed by dimensions that have not been through the final ETL transformations.
Internal tables (_G, _GS)
General tables used to support ETL processing.
Helper tables (_H)
Inserted between the fact and dimension tables to support a many-to-many relationship between fact and dimension records.
Helper Staging tables (_HS)
Tables used to hold information about helper tables that have not been through the final ETL transformations.
Mini dimension tables (_MD)
Include combinations of the most queried attributes of their parent dimensions. The database joins these small tables to the fact tables.
Primary extract tables (_PE)
Tables used to support the soft delete feature. The table includes all the primary key columns (integration ID column) from the source system. When a delete event happens, the full extract from the source is compared with the data previously extracted into the primary extract table to determine whether a physical deletion was done in the Siebel application. The soft delete feature is disabled by default. Therefore, the primary extract tables are not populated until you enable the soft delete feature.
Note that there are two types of delete tables: _DEL and _PE. For more information about the _DEL table type, see the row for Delete table (_DEL) in this table.
Persisted Staging table (_PS)
Tables that source multiple data extracts from the same source table.
These tables perform some common transformations required by multiple target objects. They also simplify the source object to a form that is consumable by the warehouse and needed for multiple target objects. These tables are never truncated during the life of the data warehouse; they are truncated only during a full load and therefore persist the data throughout.
Row Flattened Hierarchy Table (_RH)
Tables that record a node in the hierarchy by a set of ancestor-child relationships (parent-child for all parent levels).
Translation Staging tables (_TL)
Tables that store names and descriptions in the languages supported by Oracle BI Applications.
Pre-staging or post-staging Temporary table (_TMP)
Source-specific tables used as part of the ETL processes to conform the data to fit the universal staging tables (table types _DS and _FS). These tables contain intermediate results that are created as part of the conforming process.
Unbounded dimension (_UD)
Tables containing information that is not bounded in transactional database data but should be treated as bounded data in the Oracle Business Analytics Warehouse.
Staging tables for Usage Accelerator (_WS)
Tables containing the necessary columns for the ETL transformations.
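
To get a feel for how this naming convention plays out in your own warehouse, you can group the W_ tables by suffix directly from the data dictionary. The query below is only a sketch: it assumes you are connected as the Oracle Business Analytics Warehouse schema owner, and it covers just a handful of the suffixes listed above.

-- Count Business Analytics Warehouse tables by type, derived from the name suffix
select table_type, count(*) as table_count
from (
    select case
             when table_name like 'W\_%\_DS' escape '\' then 'Dimension Staging (_DS)'
             when table_name like 'W\_%\_FS' escape '\' then 'Fact Staging (_FS)'
             when table_name like 'W\_%\_D'  escape '\' then 'Dimension (_D)'
             when table_name like 'W\_%\_F'  escape '\' then 'Fact (_F)'
             when table_name like 'W\_%\_A'  escape '\' then 'Aggregate (_A)'
             else 'Other'
           end as table_type
    from user_tables
    where table_name like 'W\_%' escape '\'
)
group by table_type
order by table_type;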


ORACLE BI APPS 11.1.1.8.1 – HOW TO ENABLE SOFT DELETE PROCESS

What is the issue?
During OBI Apps implementation and customization projects, we were confronted with an uncommon issue. Even though we had run many validation processes and obtained the agreed approvals in the Development, SIT, and UAT environments, some metric issues appeared after going live in the Production environment.
One of the issues was that a metric shown in an Oracle BI report was different from (normally greater than) the real number found in the data source (i.e. EBS). The reason is that some records were manually deleted in EBS but the deletions were not reflected in the Data Warehouse.
If you want to flag these records as deleted (soft delete) in the Data Warehouse, you must enable the related primary extract and delete mappings, because this feature is disabled by default.
About the Primary Extract and Delete Mappings Process
The primary extract mappings perform a full extract of the primary keys from the source system. Although many rows are generated by this extract, only the Key ID and Source ID information is extracted from the source table. The primary extract mappings load these two columns into staging tables that are marked with a _PE suffix.
The figure below provides an example of the beginning of the extract process. It shows the sequence of events over a two-day period during which the information in the source table has changed. On day one, the data is extracted from a source table and loaded into the Oracle Business Analytics Warehouse table. On day two, Sales Order number three is deleted and a new sales order is received, creating a disparity between the Sales Order information in the two tables.
The figure above shows the primary extract and delete process that occurs when day two's information is extracted and loaded into the Oracle Business Analytics Warehouse from the source. The initial extract brings record four into the Oracle Business Analytics Warehouse. Then, using a primary extract mapping, the system extracts the Key IDs and the Source IDs from the source table and loads them into a primary extract staging table.

The extract mapping compares the keys in the primary extract staging table with the keys in the most current Oracle Business Analytics Warehouse table. It looks for records that exist in the Oracle Business Analytics Warehouse but do not exist in the staging table (in the preceding example, record three), and sets the delete flag to Y in the Source Adapter, causing the corresponding record to be marked as deleted.

The extract mapping also looks for any new records that have been added to the source and that do not already exist in the Oracle Business Analytics Warehouse; in this case, record four. Based on the information in the staging table, Sales Order number three is physically deleted from the Oracle Business Analytics Warehouse, as shown in the following figure. When the extract and load mappings run, the new sales order is added to the warehouse.
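
In SQL terms, the comparison described above is an anti-join between the warehouse table and the primary extract staging table. The sketch below illustrates the idea using the sales order example; W_SALES_ORDER_F and W_SALES_ORDER_F_PE are hypothetical names that simply follow the naming convention described earlier.

-- Records that exist in the warehouse but no longer exist in the source
-- (in the example above, Sales Order number three)
select t.integration_id
from   w_sales_order_f t
left outer join w_sales_order_f_pe pe
  on  pe.datasource_num_id = t.datasource_num_id
  and pe.integration_id    = t.integration_id
where  pe.integration_id is null;

The _IdentifyDelete SQL shown later in this post uses exactly this pattern against W_PURCH_COST_F.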
The following diagram describes how the Primary Extract and Delete mappings interact with the database tables:
· The _Primary mappings perform a full extract of the primary keys from the EBS source system and load the result into the primary extract (_F_PE) table.
· The _IdentifyDelete mappings identify records deleted in the source by comparing the primary extract table (_F_PE) with the target table (_F) and load the results into a staging table (_F_DEL).
· The _SoftDelete mappings update the delete flag column to 'Y' on the target table (_F) for all records identified as deleted, driving from the staging area table (_F_DEL).



Enabling Soft Delete Process in Oracle BI Apps 11.1.1.8.1
To enable the Primary Extract and Delete mappings, you have to change the SOFT_DELETE_PREPROCESS data load parameter using Oracle BI Applications Configuration Manager (BIACM).
Here is the list of steps required:
1.       Log in to BIACM as the BI Applications Administrator user.
2.       Select the Manage Data Load Parameters link to display the Manage Data Load Parameter dialog.
3.       Select the Source Instance you need to configure and search for the SOFT_DELETE_PREPROCESS parameter as shown below. Click Search.


4.       Select the SOFT_DELETE_PREPROCESS Parameter Code and you will see all the OOTB Dimensions/Facts for which you can enable the soft delete process.
5.       Change the Parameter Value to 'Yes' for the Dimensions/Facts for which you want to enable the Primary Extract and Delete mappings, as shown below:
6.       By default, OBI Apps includes the filter on the DELETE_FLG field (DELETE_FLG='N'), but it is good to double-check.
You can check whether the filter is added in the Logical Table Source, as in the screenshot below:
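
Once the parameter is enabled and the next incremental load has completed, a quick way to confirm that the soft delete is working is to check the DELETE_FLG distribution on the affected fact table. This is only a sketch using W_PURCH_COST_F (the fact used in the example later in this post); substitute the table you enabled:

-- Rows with DELETE_FLG = 'Y' are the records that were physically deleted
-- in the source and soft deleted in the warehouse
select delete_flg, count(*) as row_count
from   w_purch_cost_f
group by delete_flg;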

Soft Delete feature for new facts
The instructions above apply only to the prebuilt facts in the Data Warehouse. For new facts, you need to review the OOTB Primary Extract and Delete mappings for reference and build similar custom mappings for those facts.
Below is the sample SQL code generated by the Primary Extract and Delete mappings of the main process that populates the Purchase Cost data from the EBS source into the W_PURCH_COST_F table:

_Primary:
select
    TO_CHAR(PO_DISTRIBUTIONS_ALL.PO_DISTRIBUTION_ID) C1_INTEGRATION_ID
from BI_ACCNT.PO_DISTRIBUTIONS_ALL PO_DISTRIBUTIONS_ALL
where (1=1)
and (PO_DISTRIBUTIONS_ALL.CREATION_DATE >= TO_DATE(SUBSTR('#BIAPPS.INITIAL_EXTRACT_DATE',0,19),'YYYY-MM-DD HH24:MI:SS'))

_IdentifyDelete:
insert into PMSAN3_DW.W_PURCH_COST_F_DEL
(DATASOURCE_NUM_ID, INTEGRATION_ID )
select
T.DATASOURCE_NUM_ID, T.INTEGRATION_ID 
from
PMSAN3_DW.W_PURCH_COST_F  T
left outer join PMSAN3_DW.W_PURCH_COST_F_PE  S
on T.DATASOURCE_NUM_ID =S.DATASOURCE_NUM_ID and
   T.INTEGRATION_ID =S.INTEGRATION_ID
where S.DATASOURCE_NUM_ID IS NULL
and S.INTEGRATION_ID IS NULL 
and T.DELETE_FLG = 'N' 
and T.CREATED_ON_DT > TO_DATE(SUBSTR('#BIAPPS.LAST_ARCHIVE_DATE',0,19),'YYYY-MM-DD HH24:MI:SS')
and exists
 (select 1
  from PMSAN3_DW.W_PURCH_COST_F_PE DS
  where T.DATASOURCE_NUM_ID = DS.DATASOURCE_NUM_ID
 )
               
_SoftDelete:
update PMSAN3_DW.W_PURCH_COST_F  T
set
 T.DELETE_FLG = 'Y'
,T.W_UPDATE_DT = SYSDATE
,T.ETL_PROC_WID = #BIAPPS.ETL_PROC_WID
where (T.DATASOURCE_NUM_ID, T.INTEGRATION_ID) IN
 (select D.DATASOURCE_NUM_ID, D.INTEGRATION_ID
  from PMSAN3_DW.W_PURCH_COST_F_DEL  D
 )

After you finish building the Primary Extract and Delete mappings, remember to apply the filter (DELETE_FLG='N') in the Logical Table Source.

Thursday, August 7, 2014

OBIA Customization - Configure SCD behavior for added custom dimension column

In Oracle Business Intelligence Applications, customization is defined as changing the preconfigured behavior to enable you to analyze new information in your business intelligence dashboards. For example, you might want to add a column to a dashboard by extracting data from the source field and storing it in the Oracle Business Analytics Warehouse in the X_<Column_Name> field.

For an added custom dimension column, you must configure its SCD behavior after reverse-engineering it into the ODI model so that the load plan can populate data into that column. The following steps describe how to configure this (a sample query illustrating the result follows the steps):
  1. In ODI Designer, modify the dimension datastore. In the Models view, expand the 'Oracle BI Applications' folder, Oracle BI Applications (Model), Dimension (Submodel), and Columns.
  2. Double-click the column whose SCD behavior you want to change.
  3. In the Description subtab's 'Slowly Changing Dimensions Behavior' drop-down list, select the column behavior. To implement Type I behavior, select Overwrite on Change. To implement Type II behavior, select Add Row on Change.
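
After a few subsequent loads, you can see the effect of the chosen behavior by querying the dimension. The sketch below uses a hypothetical dimension W_EXAMPLE_D and a hypothetical custom column X_CUSTOM_ATTR; EFFECTIVE_FROM_DT, EFFECTIVE_TO_DT, and CURRENT_FLG are the standard slowly changing dimension housekeeping columns on warehouse dimension tables. With Add Row on Change (Type II), each change to the column produces a new row for the same INTEGRATION_ID; with Overwrite on Change (Type I), the existing row is simply updated.

select integration_id,
       x_custom_attr,               -- hypothetical custom column
       effective_from_dt,
       effective_to_dt,
       current_flg
from   w_example_d                  -- hypothetical dimension table
where  integration_id = '12345'     -- hypothetical key value
order by effective_from_dt;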


Tin Dang

Tuesday, July 22, 2014

Oracle Business Intelligence Applications 11.1.1.8.1 - Notes when setting up the Data Lineage

When setting up Data Lineage on Oracle Business Intelligence Applications 11.1.1.8.1 following the documentation (http://docs.oracle.com/cd/E51479_01/doc.111181/e51483/datalineage.htm#CDEDDIGA), you may get errors like the ones below:

ODI-1217: Session DATALINEAGE_ETL_SDE_DL_OBIEE_BMM_HIERARCHY (44440500) fails with return code 7000. 
IOError: (2, 'ENOENT', 'E:\\software\\obiee\\Oracle_BI1\x08iapps\\DataLineage/rpd_text.txt') 

ODI-1217: Session DATALINEAGE_ETL_SDE_DL_OBIEE_SQL_PARSER (44447500) fails with return code 7000. 
IOError: (2, 'ENOENT', 'E:\\software\\obiee\\Oracle_BI1\x08iapps\\DataLineage/sql_query_list.txt') 

ODI-1217: Session DATALINEAGE_ETL_SDE_DL_FUSION_METADATA_EXTRACT (44543500) fails with return code 7000. 
OSError: [Errno 0] No such directory: 'E:\\software\\obiee\\Oracle_BI1\x08iapps\\DataLineage' 

ODI-1217: Session DATALINEAGE_ETL_SDE_DL_FUSION_TEMPORARY_LOAD (44703500) fails with return code 7000. 
ODI-1226: Step SDE_DL_FUSION_AM_Extract_Temporary.W_FUSION_PILLAR_AM_TMP fails after 1 attempt(s). 
ODI-1240: Flow SDE_DL_FUSION_AM_Extract_Temporary.W_FUSION_PILLAR_AM_TMP fails while performing a Loading operation. This flow loads target table W_FUSION_PILLAR_AM_TMP. 
ODI-1227: Task SrcSet0 (Loading) fails on the source FILE connection BIAPPS_DW_FILE. 
Caused By: java.sql.SQLException: ODI-40438: File not found: E:\software\obiee\Oracle_BI1\biapps\etl\data_files\src_files\BIA_11/AM_List.dsv 

ODI-1217: Session DATALINEAGE_ETL_SDE_DL_ODI_INTERFACE_HIERARCHY_DERIVE (44798500) fails with return code 7000. 
ODI-1226: Step INTERFACE_HIERARCHY fails after 1 attempt(s). 
ODI-1232: Procedure INTERFACE_HIERARCHY execution fails. 
Caused By: org.apache.bsf.BSFException: exception from Jython: 
Traceback (most recent call last): 
File "<string>", line 1, in <module> 
File "<string>", line 158, in process_interface_hierarchy 
IOError: (2, 'ENOENT', '$ORACLE_BI_HOME/biapps/DataLineage/adaptor_list.txt') 

ODI-1217: Session DATALINEAGE_ETL_SIL_DL_COMMON_INTERFACE_HIERARCHY_FACT (44799500) fails with return code 7000. 
ODI-1226: Step SIL_DL_COMMON_Interface_Hierarchy_Fact.W_INTERFACE_HIERARCHY_F fails after 1 attempt(s). 
ODI-1240: Flow SIL_DL_COMMON_Interface_Hierarchy_Fact.W_INTERFACE_HIERARCHY_F fails while performing a Loading operation. This flow loads target table W_INTERFACE_HIERARCHY_F. 
ODI-1227: Task SrcSet0 (Loading) fails on the source FILE connection BIAPPS_DW_FILE. 
Caused By: java.sql.SQLException: ODI-40438: File not found: E:\software\obiee\Oracle_BI1\biapps\etl\data_files\src_files\BIA_11/interface_hierarchy_output.txt 

ODI-1217: Session DATALINEAGE_ETL_SIL_DL_COMMON_ETL_SUMMARY (44804500) fails with return code 7000. 
ODI-1226: Step SIL_DL_COMMON_ETL_SUMMARY.W_ETL_SUMMARY_F fails after 1 attempt(s). 
ODI-1240: Flow SIL_DL_COMMON_ETL_SUMMARY.W_ETL_SUMMARY_F fails while performing a Loading operation. This flow loads target table W_ETL_SUMMARY_F. 
ODI-1227: Task SrcSet0 (Loading) fails on the source FILE connection BIAPPS_DW_FILE. 
Caused By: java.sql.SQLException: ODI-40438: File not found: E:\software\obiee\Oracle_BI1\biapps\etl\data_files\src_files\BIA_11/lineage_summary.txt 



If you have hit one of these errors, please ensure that the following configurations have been done (I tried them and the 'Data Lineage Extract and Load' load plan ran successfully):

- Use / instead of \ when setting the DL_HOME variable on the Data Lineage Extract and Load load plan
- Copy APP-INF from $ORACLE_BI_HOME/fsm/modules/oracle.setup/SetupLite.ear to $ORACLE_BI_HOME/biapps/DataLineage
- Update the configuration of BIAPPS_DW_FILE, changing the directory from $ORACLE_BI_HOME/biapps/etl/data_files/src_files/BIA_11 to $ORACLE_BI_HOME/biapps/DataLineage (the same value set for the DL_HOME variable on the Data Lineage Extract and Load load plan)
- Copy all source files from $ORACLE_BI_HOME/biapps/etl/data_files/src_files/BIA_11 to $ORACLE_BI_HOME/biapps/DataLineage
- Enable the SDE_DL_OBIEE_BMM_HIERARCHY and SDE_DL_FUSION_METADATA_EXTRACT steps in the Data Lineage Extract and Load load plan (they are disabled by default). Disable the SDE_DL_ODI_MAPPING_LIST_WIDS step if it is enabled (it is disabled by default; enabling it will cause the SDE_DL_ODI_INTERFACE_HIERARCHY_DERIVE step to run for a long time)
Tin Dang

Wednesday, June 11, 2014

Install OBI 11g on Linux

The following package groups need to be installed on the operating system before proceeding with the next steps:
  • GNOME Desktop Environment
  • Editors
  • Graphical Internet
  • Text-based Internet
  • Development Libraries
  • Development Tools
  • Server Configuration Tools
  • Administration Tools
  • Base
  • System Tools
  • X Window System

1.       Add host
[root@localhost ~]# vi /etc/hosts


[root@localhost ~]# vi /etc/sysconfig/network


2.       Edit the kernel parameters
[root@ora1 softs]# vi /etc/sysctl.conf
Add the following lines:
#Oracle
fs.suid_dumpable = 1
fs.aio-max-nr = 1048576
fs.file-max = 6815744
kernel.shmall = 2097152
kernel.shmmax = 536870912
kernel.shmmni = 4096
# semaphores: semmsl, semmns, semopm, semmni
kernel.sem = 250 32000 100 128
net.ipv4.ip_local_port_range = 9000 65500
net.core.rmem_default=4194304
net.core.rmem_max=4194304
net.core.wmem_default=262144
net.core.wmem_max=1048586 
Check the parameters again:
/sbin/sysctl -a | grep <param-name>
Apply the parameters just set:
/sbin/sysctl -p
               
[root@ora1 softs]# vi /etc/security/limits.conf
Add the following lines:
oracle              soft    nproc   2047
oracle              hard    nproc   16384
oracle              soft    nofile  1024
oracle              hard    nofile  65536
oracle              soft    stack   10240
Disable SELinux:
[root@ora1 softs]# vi /etc/selinux/config

SELINUX=disabled

3.       Install the packages required for the Oracle installation
# From Oracle Linux 5 DVD
cd /media/cdrom/Server
rpm -Uvh binutils-2.*
rpm -Uvh compat-libstdc++-33*
rpm -Uvh compat-libstdc++-33*.i386.rpm
rpm -Uvh elfutils-libelf*
rpm -Uvh gcc-4.*
rpm -Uvh gcc-c++-4.*
rpm -Uvh glibc-2.*
rpm -Uvh glibc-common-2.*
rpm -Uvh glibc-devel-2.*
rpm -Uvh glibc-headers-2.*
rpm -Uvh ksh*
rpm -Uvh libaio-0.*
rpm -Uvh libaio-devel-0.*
rpm -Uvh libgomp-4.*
rpm -Uvh libgcc-4.*
rpm -Uvh libstdc++-4.*
rpm -Uvh libstdc++-devel-4.*
rpm -Uvh make-3.*
rpm -Uvh sysstat-7.*
rpm -Uvh unixODBC-2.*
rpm -Uvh unixODBC-devel-2.*
rpm -Uvh numactl-devel-2*
rpm -Uvh compat-db-*
cd /
eject

Also install these packages:
elfutils-libelf-0.137-3.el5.i386.rpm
elfutils-libelf-devel-0.137-3.el5.i386.rpm
elfutils-libelf-devel-static-0.137-3.el5.i386.rpm
libXp

4.       Create the oracle user and groups
groupadd oinstall
groupadd dba
groupadd oper
groupadd asmadmin

useradd -g oinstall -G dba,oper,asmadmin oracle
passwd oracle

5.       Create a database user for the Repository Creation Utility connection
[root@ora1 data1]# su - oracle
[oracle@ora1 ~]$
[oracle@ora1 ~]$ sqlplus / as sysdba

SQL*Plus: Release 11.2.0.3.0 Production on Tue Sep 25 04:16:55 2012

Copyright (c) 1982, 2011, Oracle.  All rights reserved.


Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL>
SQL> create user obiadmin identified by Aa123456 account unlock;

User created.

SQL> grant sysdba to obiadmin;

Grant succeeded.

6.      Run the OBI Repository Creation Utility
[root@ora1 rcuHome]# cd /data2/softs/OBI/rcuHome/bin/
[root@ora1 bin]# ./rcu

7.      Install Oracle BI EE
[oracle@ora1 bin]# cd /data2/softs/OBI/bishiphome/Disk1/
[oracle@ora1 Disk1]# ./runInstaller
[oracle@ora1 Disk1]# cd /data3/
[oracle@ora1 Disk1]# mkdir OBI

8.      Start/Stop BI Services
[oracle@ora1 bin]# vi /data3/OBI/user_projects/domains/bifoundation_domain/servers/bi_server1/security/boot.properties
Create a boot.properties file in the directory /data3/OBI/user_projects/domains/bifoundation_domain/servers/bi_server1/security
Edit the file and add the following two lines:
username=weblogic
password=Password
Here, Password is the password that was created for the weblogic user.
Copy the boot.properties file into the directory /data3/OBI/user_projects/domains/bifoundation_domain/servers/AdminServer/security
This step is done so that you do not have to enter credentials while starting/stopping the services.
Scripts to stop/start Services:
Environment variable configuration file (for reference):
export ORACLE_BASE=/data3/OBI
export ORACLE_HOME=$ORACLE_BASE/Oracle_BI1
export TNS_ADMIN=$ORACLE_HOME/network/admin
export ORACLE_SID=orcl
export PATH=$ORACLE_HOME/bin:$PATH

All of the scripts used to start/stop the services:
[root@GS-DW-APP2 scripts]# ll
total 3084
-rwxr-xr-x 1 oracle oinstall      79 Feb 29  2012 1_Start_WebLogic.sh
-rwxr-xr-x 1 oracle oinstall      54 Feb 29  2012 2_Start_Opmn.sh
-rwxr-xr-x 1 oracle oinstall      97 Feb 29  2012 3_Start_BI_Server.sh
-rwxr-xr-x 1 oracle oinstall      96 Feb 29  2012 4_Stop_BI_Server.sh
-rwxr-xr-x 1 oracle oinstall      53 Feb 29  2012 5_Stop_Opmn.sh
-rwxr-xr-x 1 oracle oinstall      78 Feb 29  2012 6_Stop_WebLogic.sh
-rwxr-xr-x 1 oracle oinstall     175 Mar  2  2012 BIenv
-rw------- 1 root   root     3112133 Sep 19 09:51 nohup.out
drwxr-xr-x 2 oracle oinstall    4096 Aug 22 23:55 PROD
-rwxr-xr-x 1 oracle oinstall      87 Mar  2  2012 Start_OBI_Services.sh
-rwxr-xr-x 1 oracle oinstall      84 Mar  2  2012 Stop_OBI_Services.sh

[root@GS-DW-APP2 scripts]# cat 1_Start_WebLogic.sh
sh $ORACLE_BASE/user_projects/domains/bifoundation_domain/bin/startWebLogic.sh
[root@GS-DW-APP2 scripts]# cat 2_Start_Opmn.sh
$ORACLE_BASE/instances/instance1/bin/opmnctl startall
[root@GS-DW-APP2 scripts]# cat 3_Start_BI_Server.sh
sh $ORACLE_BASE/user_projects/domains/bifoundation_domain/bin/startManagedWebLogic.sh bi_server1
[root@GS-DW-APP2 scripts]# cat 4_Stop_BI_Server.sh
sh $ORACLE_BASE/user_projects/domains/bifoundation_domain/bin/stopManagedWebLogic.sh bi_server1
[root@GS-DW-APP2 scripts]# cat 5_Stop_Opmn.sh
$ORACLE_BASE/instances/instance1/bin/opmnctl stopall
[root@GS-DW-APP2 scripts]# cat 6_Stop_WebLogic.sh
sh $ORACLE_BASE/user_projects/domains/bifoundation_domain/bin/stopWebLogic.sh
[root@GS-DW-APP2 scripts]#
[root@GS-DW-APP2 scripts]# cat Start_OBI_Services.sh
nohup ./1_Start_WebLogic.sh &
nohup ./2_Start_Opmn.sh &
nohup ./3_Start_BI_Server.sh &
[root@GS-DW-APP2 scripts]# cat Stop_OBI_Services.sh
nohup ./4_Stop_BI_Server.sh &
nohup ./5_Stop_Opmn.sh &
nohup ./6_Stop_WebLogic.sh &

Just run the Start_OBI_Services.sh script to start all services,
and the Stop_OBI_Services.sh script to stop all services.
9.      Verify the services


- Access http://localhost:7001/em

To get started with OBI, you can refer to the tutorial here: