Tuesday, 30 December 2014

DAC Encryption Key Change


Today we will discuss how to create a new encryption key for the DAC repository. Sometimes you get an error while trying to start/stop the DAC server, with a description like: “Encryption key in authentication file doesn't match the repository. Please use the correct authentication file to connect.”

 To change the DAC encryption key

1.       In the DAC Client, on the Tools menu, select DAC Repository Management, and then select Change Encryption Key.
 


2.       Click Generate Random Encryption Key to generate an encryption key. The key is displayed in the Key field. Alternatively, you can enter a key in the Key field. The key must be at least 24 characters long.



 

3.       Distribute the updated authentication file (cwallet.sso) to all users that connect to this repository.
 
Enjoy & Stay Well !! :)

 

 

 

Tuesday, 16 December 2014

OBIEE 11g – Error in Importing Metadata: “The connection has failed”

If you get the error “The connection has failed” when you try to Import Metadata into the RPD, this post may help you resolve it. There could be any number of reasons behind this, but here is one of the solutions.

We need to create an Environment Variable called NLS_LANG.

Follow this procedure to set the NLS_LANG environment variable for Oracle databases.

1. Determine the NLS_LANG value.
   a. In the data warehouse database, run the command:
      SELECT * FROM V$NLS_PARAMETERS
   b. Make a note of the NLS_LANG value, which is in the format [NLS_LANGUAGE]_[NLS_TERRITORY].[NLS_CHARACTERSET].
      For example: American_America.UTF8
 
2. For Windows:
   a. Navigate to Control Panel > System and click the Advanced tab. Click Environment Variables.
   b. In the System variables section, click New.
   c. In the Variable Name field, enter NLS_LANG.
   d. In the Variable Value field, enter the NLS_LANG value that was returned in Step 1.
      The format for the NLS_LANG value should be [NLS_LANGUAGE]_[NLS_TERRITORY].[NLS_CHARACTERSET].
      For example: American_America.UTF8
 
3. For UNIX, set the variable as shown below:
      setenv NLS_LANG <NLS_LANG>
   For example: setenv NLS_LANG American_America.UTF8
   If your data is 7-bit or 8-bit ASCII and the Informatica Server is running on UNIX, then set:
      setenv NLS_LANG <NLS_LANGUAGE>_<NLS_TERRITORY>.WE8ISO8859P1
   CAUTION: Make sure you set the NLS_LANG variable correctly, as stated in this procedure, or your data will not display correctly.
 
 
    4. Reboot the machine after creating the variable.
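For reference, here is a minimal sketch of setting the variable from the shell; the exact value is whatever Step 1 returned, and which line you use depends on your shell (setenv for csh/tcsh, export for sh/bash/ksh):

# csh / tcsh
setenv NLS_LANG American_America.UTF8

# sh / bash / ksh (add to the profile of the user running the server)
export NLS_LANG=American_America.UTF8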
 
 
Enjoy and Stay Well !! :)


Monday, 15 December 2014

OBIEE 11g Error - NQSError: 13015

We created a dashboard prompt that uses a session variable. When we tried to change the value of the dashboard prompt as a different user, we got the error below:


[nQSError: 13015] You do not have the permission to set the value of the variable 'report_currency'



This error occurs if "Enable any user to set the value" is not checked on the session variable.

Open the RPD, open the particular session variable, and check the mentioned option.

 
I hope this solves the issue. Enjoy and Stay Well !! :)
 

Monday, 1 December 2014

OBIEE - Sub Folder Creation

Sometimes it's required to create a parent folder and some sub-folders under a subject area.
Suppose we need to create a folder structure like the one below:
 
 
Open the RPD and go to the specific subject area in the presentation layer. Here our parent folder is Project Details, and under it we will have sub-folders like Cost Center, Project, etc.
First, create a presentation table with a dummy column, where the column name will be as shown below.
 
Then, to turn the remaining presentation tables into sub-folders, set each child table's description to '->', which nests it under the parent table above it, as sketched below.
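The end result in the presentation layer looks roughly like this sketch (names from the example above; the '->' goes in the Description field of each child presentation table):

Project Details        <- parent presentation table (holds the dummy column)
    Cost Center        <- Description: ->
    Project            <- Description: ->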
 
Enjoy and stay well !! :)

Monday, 24 November 2014

Useful Linux Commands !!

Here are some important commands that will help us work with the Linux OS.

1. To change permissions of a folder: chmod -R 777 <folder_path>
2. To change the owner of a folder: chown -R <user_name>:<group_name> <folder_path>
3. To get the details about a user: id <user_name>
4. To restart the VNC server: service vncserver restart
5. To disable the firewall: service iptables stop
6. To extract a tar file: tar -xvf <tar_file_name>
7. To extract a jar file: jar xvf <jar_file_name>
8. To copy a file/folder to another server: scp -r <file_name/folder_name> <target_machine_IP>:<target_location>
9. To remove a file/folder: rm -rf <folder_name>
10. To edit a file: vi <file_name_with_location>
11. To save a file (from within vi): :wq!
12. To check a running process: ps -ef | grep -i "informatica"
13. To give display access control to any host: xhost +
14. To check RAM/CPU utilization: ps aux
15. To check memory and running processes: top
16. To check the free memory: free -m
17. To clear the cache from the server: echo 1 > /proc/sys/vm/drop_caches
18. To reboot the server: shutdown -r now
19. To see every process running on the system: ps -A (or ps -e)
20. To see every process except those running as root: ps -U root -u root -N
21. To see processes run by user sudi: ps -u sudi
22. To check the status of a specific port: netstat -nlp | grep 7001
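One note on item 17: writing to /proc/sys/vm/drop_caches requires root, and it is usually safer to flush dirty pages first. A variant I would suggest (assuming sudo and tee are available):

# flush dirty pages to disk, then drop page cache, dentries and inodes
sync && echo 3 | sudo tee /proc/sys/vm/drop_caches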



I will keep updating this post going forward...

Enjoy & Stay Well!! 

Tuesday, 18 November 2014

Informatica: Update Target Table Without Update Strategy


It's quite possible to update the target table without using the Update Strategy transformation. The Informatica session has an option to enable this.

Open the Informatica session, navigate to Properties, and set Treat source rows as to 'Update'.
 

 
Then, in the same session, navigate to the Mapping tab and set the target to "Update as Update" only.
 
 
 
 
But remember, it will always update all the columns of the target table with the data coming from the source; it is not possible to update only a single column using this option (see the sketch below).
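To picture what the Integration Service effectively does in this mode, the generated statement is an update of every connected column, keyed on the target's key ports. Roughly (a sketch with placeholder names, not the literal SQL Informatica issues):

UPDATE target_table
   SET col1 = <source value 1>,
       col2 = <source value 2>
 WHERE key_col = <source key value>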
 
Please put your comments !!
 

 

Informatica: Update Target Table with Non-Key Columns

 
The Update Strategy transformation performs update/insert/reject operations based on the key defined in the target table; the Integration Service checks the key column to do the update when an Update Strategy transformation is used in the mapping. But in a real-world project it may be required to update the target table based on a non-key column as well.
Let's take an example. TEST_DS is our source table, with the data below:

EMP_NUM   EMP_NAME
100       Sudi
200       Jayak
 
And TEST_D is our target table, with the data below:

ROW_WID   EMP_NUM   EMP_NAME    PHONE_NUMBER
1         100       Sudipta C   9901
2         100       Sudipta C   9900
3         200       JK          8000
4         200       JK          8001
 
Now our requirement is to update the TEST_D table as per the TEST_DS data. Notice that in the TEST_D table, EMP_NUM is not a key column, yet we need to use only the EMP_NUM column to update TEST_D. So our expected result should look like:
ROW_WID   EMP_NUM   EMP_NAME   PHONE_NUMBER
1         100       Sudi       9901
2         100       Sudi       9900
3         200       Jayak      8000
4         200       Jayak      8001
 
 
This can be handled using the “Update Override” property of the target table.
 
 
Let's create a simple mapping to update the TEST_D table using an Update Strategy transformation.


 
 
Now, open the target table and put in your own update statement to update the target table.



 
 
Syntax for writing the SQL statement:
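In an Update Override, target columns are referenced through the :TU qualifier. For our example (assuming the target ports are named as in the tables above), the override would look something like this:

UPDATE TEST_D
   SET EMP_NAME = :TU.EMP_NAME          -- column to update
 WHERE EMP_NUM = :TU.EMP_NUM            -- non-key column used as the match condition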
 

 
Create the Session and Workflow and run the mapping.
 

Wednesday, 29 October 2014

OBIA - Full & Incremental Load Mapping


It’s possible to configure either a Full- or an Incremental Load in Oracle BIA. If you look at the Informatica version of Oracle BIA, there are a few areas you will have to configure.
First you start with the Informatica Mapping. This will be one Mapping. It does not matter whether you run this Mapping Full or Incremental.

Let's take the ‘SDE_ORA_GLJournals’ Mapping as an example. In the Source Qualifier of the Mapping (or Mapplet), you will see a reference to the $$LAST_EXTRACT_DATE. If you run the Mapping with these settings, you will run an Incremental Mapping. This means that you only select the data which has been created or updated since the last ETL run.

Informatica - Source Qualifier - $$LAST_EXTRACT_DATE

The $$LAST_EXTRACT_DATE is a Parameter which you configure in the Data Warehouse Administration Console (DAC) and reference in Informatica.

DAC - Configure $$LAST_EXTRACT_DATE

According to the Oracle documentation, “@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP returns the minimum of the task's primary or auxiliary source tables' last refresh timestamp, minus the prune minutes.”
Make sure this Parameter is available in both the DAC (see above) as well as in the Mapping (or Mapplet).

Informatica - Variables and Parameters - $$LAST_EXTRACT_DATE

This way the Parameter can be used in the Extraction Mapping. If you reference a Parameter in the Extraction Mapping Query which isn’t declared, the Workflow will return an error and won’t complete.

So the steps are easy (an illustrative extraction filter follows the list):
1. Declare the $$LAST_EXTRACT_DATE Parameter in the DAC
2. Declare the $$LAST_EXTRACT_DATE Parameter in Informatica
3. Reference the $$LAST_EXTRACT_DATE Parameter in the Source Qualifier
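For illustration, the incremental filter in the Source Qualifier typically ends up looking something like this (the column name and the date format are assumptions based on the GL Journals example):

WHERE LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')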
As I said before, the same Mapping is used for the Incremental as well as the Full Load. If you want to run the two different loads, make sure there are two different Workflows which run the same mapping. The difference is in the Workflow definition: the Full Workflow uses the $$INITIAL_EXTRACT_DATE whereas the Incremental Workflow uses the $$LAST_EXTRACT_DATE.

Informatica - Workflow - SDE_ORA_GLJournals

If you edit the task which belongs to the Incremental-Workflow (‘SDE_ORA_GLJournals’), you will find the Source Qualifier with the extraction query and a reference to the $$LAST_EXTRACT_DATE-Parameter.
As you can see, the LAST_UPDATE_DATE is compared to the $$LAST_EXTRACT_DATE-Parameter.

After each ETL run, the LAST_EXTRACT_DATES (Refresh Date) per table are stored. You can check, update or delete these values as required (see picture below). If you decide to delete the Refresh Date, a Full Load will be performed the next time.

DAC - Refresh Dates

As stated earlier, the Full-Workflow is almost identical. The only thing is that there is a reference to the $$INITIAL_EXTRACT_DATE. The $$INITIAL_EXTRACT_DATE-Parameter is defined in the DAC. You define a date in the past. Just make sure that this date captures all the data you need.

DAC - Configure $$INITIAL_EXTRACT_DATE

Make sure this Parameter is available in both the DAC (see above) as well as in the Mapping (or Mapplet).

Informatica - Variables and Parameters - $$INITIAL_EXTRACT_DATE

This way the Parameter can be used in the Extraction Mapping. If you reference a parameter in the Extraction Mapping Query which isn’t declared, the Workflow will return an error and won’t complete.
How do you make sure that the $$INITIAL_EXTRACT_DATE-Parameter will be used when running a Full-Load?

Informatica - Workflow - SDE_ORA_GLJournals_Full

If you edit the task which belongs to the Full-Workflow (‘SDE_ORA_GLJournals_Full’), you will find the Source Qualifier with the extraction query and a reference to the $$INITIAL_EXTRACT_DATE-Parameter.
As you can see, the LAST_UPDATE_DATE is compared to the $$INITIAL_EXTRACT_DATE-Parameter.

At this point everything is in place to either run a Full-, or an Incremental Load.

Informatica - Workflows

You just have to tell the DAC to run either the ‘SDE_ORA_GLJournals_Full’ Workflow or the ‘SDE_ORA_GLJournals’ Workflow (incremental).

DAC - Task - SDE_ORA_GL_Journals

Check the Informatica Session Log when the ETL gives a result other than expected. It could be that the Workflows are incorrectly defined. You will see in the Session Log which Parameter is used and what the value of that Parameter is.

Tuesday, 28 October 2014

OBIEE 11g: How To Check the Log Files

You can check the different log files in Oracle Business Intelligence (OBIEE) version 11.1.1.x.
The right way to check them is via the Enterprise Manager (EM) Console page, but you can also review the files directly on disk.

EM Console

Log in to the URL
http://server.domain:7001/em
and navigate to:
Farm_bifoundation_domain > Business Intelligence > coreapplication > Diagnostics > Log Messages
These are all the available files:
Presentation Services Log
Server Log
Scheduler Log
JavaHost Log
Cluster Controller Log
Action Services Log
Security Services Log
Administrator Services Log

*************************************************************************
To check the files directly, you can use the following:

1) Admin Server logs
AdminServer-diagnostic.log
Directory:
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/AdminServer/logs
or
$DOMAIN_HOME/servers/AdminServer/logs
 
2) Managed Server logs
bi_server1-diagnostic.log

Directory:
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/bi_server1/logs
or
$DOMAIN_HOME/servers/bi_server1/logs/
 
3) Node Manager logs
Directory: WL_HOME/common/nodemanager
Example:
C:\OBI_11116\wlserver_10.3\common\nodemanager
 
4) BI Components logs
Directory: $<MW_HOME>/instances/instance2/diagnostics


5) OPMN: Oracle Process Manager and Notification Server
All files under the directory:
$<MW_HOME>/instances/instanceX/diagnostics/logs/OPMN/opmn
or
$ORACLE_INSTANCE/diagnostics/logs/OPMN/opmn
 

6) Enterprise Manager Log
emoms.trc

Directory:
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/sysman/log
or
$DOMAIN_HOME/servers/sysman/log
 
7) Miscellaneous log files
7.1) Installation logs.

7.2) Opatch.

7.3) Repository Creation Utility (RCU).

7.4) Upgrade Assistant (UA).

7.5) Map Viewer log files.

7.6) BI Publisher log file.

7.7) EmbeddedLDAP (can be used in conjunction with the access.logs, as above).

7.8) You can also review the Event Viewer log on Windows and the syslog on Linux or UNIX.

7.9) What Are Diagnostic Log Configuration Files?

7.10) Oracle Business Intelligence Mobile
 
 

 
Log files in detail

1. Admin Server logs

AdminServer-diagnostic.log

Directory:
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/AdminServer/logs
or
$DOMAIN_HOME/servers/AdminServer/logs

Example:
C:\OBIEE_11G\user_projects\domains\bifoundation_domain\servers\AdminServer\logs

If there is no Managed Server (for instance, if you did a Simple Install), you should review these log files:

access.log
AdminServer.log

Domain log file:
bifoundation_domain.log
 

2. Managed Server log

bi_server1-diagnostic.log

Directory:
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/bi_server1/logs
or
$DOMAIN_HOME/servers/bi_server1/logs/

Example:
C:\OBIEE_11g\user_projects\domains\bifoundation_domain\servers\bi_server1\logs

Other files:
access.log
bi_server1.log
bi_server1.out
bi_server1.out00001
 

3. Node Manager log files

WL_HOME/common/nodemanager
or
$<MW_HOME>/wlserver_10.3/common/nodemanager

Example:
C:\OBI_11116\wlserver_10.3\common\nodemanager

4. BI Components log files are under:

$<MW_HOME>/instances/instance2/diagnostics

In detail:

4.1) BI Server log files.

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBIServerComponent\coreapplication_obis1

To confirm the above path, review these lines in the 'component-logs.xml' file to see whether the default path has changed:

<log path="diagnostics/logs/OracleBIServerComponent/coreapplication_obis1/nqquery.log">
 ...
 ...
 <log path="diagnostics/logs/OracleBIServerComponent/coreapplication_obis1/nqserver.log">

The files to review are:
4.1.1) Queries run by BI
nqquery.log
4.1.2) BI Server log file
nqserver.log
4.1.3) Administration Tool
servername_NQSAdminTool.log (example, jjuan4_NQSAdminTool.log)
4.1.4) nQUDMLExec
servername_nQUDMLExec.log (example, jjuan4_nQUDMLExec.log)
4.1.5) Migration Utility
servername_obieerpdmigrateutil.log (example, jjuan4_obieerpdmigrateutil.log)
4.1.6) Repository log
<ORACLE_INSTANCE>/OracleBIServerComponent/coreapplication_obis1/repository
 
4.1.7) System logging level (for BISystem user).
http://docs.oracle.com/cd/E23943_01/bi.1111/e10540/planning.htm#BIEMG398

Using Administration Tool, in Tools->Options->Repository->System Logging Level to set the log level.

This option determines the default query logging level for the internal BISystem user. The BISystem user owns the Oracle BI Server
system processes and is not exposed in any user interface.

A query logging level of 0 (the default) means no logging. Set this logging
level to 2 to enable query logging for internal system processes like event
polling and initialization blocks.

See this link to see all the Query Logging Levels
4.1.8) Usage Tracking (Only if redirected to logs)
NQAcct.yyyymmdd.hhmmss.log

The STORAGE_DIRECTORY parameter in the Usage Tracking section of the NQSConfig.INI file determines the location of the usage tracking log files.
The NQSConfig.INI file is in this directory:
$<MW_HOME>/instances/instance2/config/OracleBIServerComponent/coreapplication_obis1


Example:
C:\OBIEE_11G\instances\instance2\bifoundation\OracleBIServerComponent\coreapplication_obis1\repository\SampleAppLite.rpd.Log
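For reference, the relevant block of NQSConfig.INI looks roughly like this (parameter names are from the shipped file; the values here are illustrative):

[USAGE_TRACKING]
ENABLE = YES;
DIRECT_INSERT = NO;
STORAGE_DIRECTORY = "<full directory path>";
CHECKPOINT_INTERVAL_MINUTES = 5;
FILE_ROLLOVER_INTERVAL_MINUTES = 30;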

4.2) Presentation Server log files

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBIPresentationServicesComponent\coreapplication_obips1

File to check:
sawlog0.log
4.2.1) Java Host
$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBIJavaHostComponent/coreapplication_obijh1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBIJavaHostComponent\coreapplication_obijh1

Files to check:
jh.log
jh-1.log
jh-2.log
 

4.3) Web Catalog Upgrade.

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBIPresentationServicesComponent\coreapplication_obips1

File to check:
webcatupgrade0.log

http://aseng-wiki.us.oracle.com/asengwiki/display/~mikhail_shmulyan/BIPS+Logging+Configuration
 

4.4) Scheduler log

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBISchedulerComponent/coreapplication_obisch1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBISchedulerComponent\coreapplication_obisch1

File to check:
nqscheduler.log
console~coreapplication_obisch1~1.log
4.4.1) iBot log files.
- You can review the log files using the Windows client tool Job Manager

- You can also review the following log files:
Agent-1-1.err
Agent-1-1.log
Review the instanceconfig.xml file under

$<MW_HOME>/instances/instance2/config/OracleBISchedulerComponent/coreapplication_obisch1

Example:
C:\OBIEE_11g\instances\instance2\config\OracleBISchedulerComponent\coreapplication_obisch1


to see the path where the iBot log files are created:

<iBots>
    <Log_Dir>C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBISchedulerComponent\coreapplication_obisch1</Log_Dir>
    <!--This Configuration setting is managed by Oracle Business Intelligence Enterprise Manager-->
    <Web_Server>jjuan2-es.es.oracle.com:9710</Web_Server>
</iBots>

By default, it is the same path as the Scheduler log but you can modify it.

The path is:

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBISchedulerComponent/coreapplication_obisch1
Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBISchedulerComponent\coreapplication_obisch1

4.5) Cluster Controller

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBIClusterControllerComponent/coreapplication_obiccs1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBIClusterControllerComponent\coreapplication_obiccs1

File to check:
nqcluster.log

4.6) ODBC

$<MW_HOME>/instances/instance2/diagnostics/logs/OracleBIODBCComponent/coreapplication_obips1

Example:
C:\OBIEE_11g\instances\instance2\diagnostics\logs\OracleBIODBCComponent\coreapplication_obips1
 

5. OPMN: Oracle Process Manager and Notification Server

All files under the directory:

$<MW_HOME>/instances/instanceX/diagnostics/logs/OPMN/opmn
or
$ORACLE_INSTANCE/diagnostics/logs/OPMN/opmn

Files:
debug.log
logquery~1.log
opmn.log
opmn.out
service.log
 

6. Enterprise Manager Log

emoms.trc

Directory:
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/sysman/log
or
$DOMAIN_HOME/servers/sysman/log
 

7. Miscellaneous log files

7.1) Installation logs

7.1.1) OBIEE Installation and Configuration ($ORACLE_HOME/bin/config.sh):
$<MW_HOME>/cfgtoollogs/oui

Examples:
C:\OBIEE_11g\Oracle_BI1\cfgtoollogs\oui

2011-12-23_02-58-20PM.log
install2011-12-23_02-58-20PM.log
install2011-12-23_02-58-20PM.out
installActions2011-12-23_02-58-20PM.log
installProfile2011-12-23_02-58-20PM.log
oraInstall2011-12-23_02-58-20PM.err
oraInstall2011-12-23_02-58-20PM.out

or
$<MW_HOME>/oracle_common/cfgtoollogs/oui

Examples:
C:\OBIEE_11g\oracle_common\cfgtoollogs\oui

install2011-12-23_02-58-20PM.log
installActions2011-12-23_02-58-20PM.log
installProfile2011-12-23_02-58-20PM.log
 
7.1.2) OraInventory files: Installation, Uninstallation and Patching. OraInventory log files:

Windows:
C:\Program Files\Oracle\Inventory\logs\

Linux and UNIX:
USER_HOME/oraInventory/logs/
Examples:
installActions2011-12-15_03-21-41PM.log
oraInstall2011-12-05_02-55-22PM.err
oraInstall2011-12-05_02-55-22PM.out
On Unix/Linux, the location of the oraInventory is defined by the content of oraInst.loc, at:

- /var/opt/oracle/oraInst.loc on Solaris, HP-UX and Tru64
- /etc/oraInst.loc on Linux and AIX

On Windows, the location of the oraInventory is defined by the value of the registry key
HKEY_LOCAL_MACHINE\Software\Oracle\inst_loc

or, if this value is not defined, at
C:\Program Files\Oracle\Inventory
7.1.3) Make.log: the file to check for install/relinking.
$ORACLE_HOME/install/make.log

7.2) Opatch

Files:
opatch*.log

Directory
$<MW_HOME>/Oracle_BI1/cfgtoollogs/opatch

Example:
OUI location : C:/OBIEE_11G/Oracle_BI1/oui
Log file location : C:/OBIEE_11G/Oracle_BI1/cfgtoollogs/opatch/opatch2010-09-24_16-52-41PM.log
Patch history file: C:/OBIEE_11G/Oracle_BI1/cfgtoollogs/opatch/opatch_history.txt
Other files are in the inventory directory. For example, on Windows:
C:/Program Files/Oracle/Inventory/logs

7.3) Repository Creation Utility (RCU)

Files:
biplatform.log
mds.log
rcu.log

If the RCU software is uncompressed in this directory:

D:\Trabajo\Docu\OBIEE\Software\11.1.1.6.0\Windows64

Check the following files, mainly the 'rcu.log':

D:\Trabajo\Docu\OBIEE\Software\11.1.1.6.0\Windows64\rcuHome\rcu\log\logdir.2012-02-23_10-17

biplatform.log
mds.log
rcu.log

7.4) Upgrade Assistant (UA)

$<MW_HOME>/Oracle_BI1/upgrade/logs

Example
C:/OBIEE_11g/Oracle_BI1/upgrade/logs

Files to check:
uaxxxx.log

Example:
ua2010-10-20-14-30-21PM.log

7.5) Map Viewer log files

$mapviewer_deploy_home/web.rar/WEB-INF/log

Example:
mapviewer_20.log in the directory

C:/OBIEE_11g/Oracle_BI1/bifoundation/jee/mapviewer.ear/web.rar/WEB-INF/log

Error messages are timestamped. You can also check the JavaScript console (Firefox, Chrome, or IE with developer tools).

7.6) BI Publisher log file

$<MW_HOME>/user_projects/domains/bifoundation_domain/servers/AdminServer/logs/bipublisher

For example:
C:\OBIEE_11G\user_projects\domains\bifoundation_domain\servers\AdminServer\logs\bipublisher\bipublisher.log

Details:
========
$<MW_HOME>/user_projects/domains/bifoundation_domain/servers

You will see an entry for the AdminServer and the managed server. Depending on where BI Publisher is installed, go to the

logs/bipublisher

folder.

BI Publisher Trace 32
========================
Go to FMW Control (EM)

http://server:7001/em

Farm_bifoundation_domain
  WebLogic Domain
    bi_cluster
    or
    AdminServer
  
    Right click and select Logs-Log Configuration
  
    In 'Log Levels' tab, search by: oracle.bi.bipssessionscache
  
    View: Runtime Loggers
    Search: All Categories
  
    oracle.bi
  
   oracle.bi.bipssessionscache  TRACE (32) FINEST
   oracle.bi.bipssessionscache.performance  TRACE (32) FINEST

7.7) EmbeddedLDAP (can be used in conjunction with the access.logs, as above).

EmbeddedLDAP.log
EmbeddedLDAPAccess.log

$DOMAIN_HOME/servers/server_name/data/ldap/log/

Example:
C:\OBIEE_11G\user_projects\domains\bifoundation_domain\servers\bi_server1\data\ldap\log\EmbeddedLDAP*.log

7.8) Event Viewer log (Windows) and syslog (Linux or UNIX)

You can also review the Event Viewer log on Windows and the syslog on Linux or UNIX. 

7.9) What Are Diagnostic Log Configuration Files?

Diagnostic log configuration files control output to diagnostic log files for Oracle Business Intelligence.
Log configuration files for Oracle Business Intelligence are stored in the following locations:

$ORACLE_INSTANCE/config/component_type/bi_component_name

For example, editing logconfig.xml is equivalent to configuring at

http://server.domain:7001/em

Farm_bifoundation_domain > Business Intelligence > coreapplication > Diagnostics > Log Configuration
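Finally, when troubleshooting from the command line it is often quicker to tail or search the key files directly. A sketch, assuming the instance2 layout used in the examples above and MW_HOME set to your Middleware home:

# follow the BI Server and Presentation Services logs
tail -f $MW_HOME/instances/instance2/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1/nqserver.log
tail -f $MW_HOME/instances/instance2/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1/sawlog0.log

# search all component logs for a specific error code
grep -ri "nQSError: 13015" $MW_HOME/instances/instance2/diagnostics/logs/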

 

Power BI: Show Last Data Refresh on Dashboard

 To show last data refresh on Power BI report follow the below steps. 1. Open Report Query Editor Mode. 2. Clink on Get Data -> Blank...