Steps to migrate EPMA Import Profiles from DEV2 region to CIT region using LCM


STEP 1: Exporting Import Profiles from Enterprise Performance Management Architect

  • To export Import Profiles from EPMA DEV2, log in to Workspace.
  1. Select Navigate -> Administer -> Shared Services Console

 


 

2. Expand Application Groups and select Foundation -> EPM Architect

 


 

 

3. Under Browse, expand Dimension Access -> Import Profiles (if the import profiles are for shared dimensions). To export import profiles for a particular application, expand Application Metadata, expand the application, and select Import Profiles.

 


 

4. Select the import profiles we wish to migrate and click Export, which is found at the bottom right of the screen.

 


 

5. Give the File System folder name and click Export.

 


 

6. We can see the exported file under File System.

 


 

7. We can see the status of the export under the Migration Status Report.

 


 

8. Once the status is complete, download the exported file from the File System to the local system.

 


 

STEP 2: Importing Import Profiles into Enterprise Performance Management Architect using LCM

  • To import the Import Profiles from the local system, log on to the EPMA Shared Services Console in CIT.

1. Right-click File System and select Upload.

 


 

2. Browse to the location where the file was saved when it was downloaded from DEV2. Select the zip file and click Finish.

 


3. After uploading, the file can be seen under File System. Right-click the file and select Import.

 

 


 

4. It prompts you to proceed with the import. Select OK.


 

5. Check the status of the import in the Migration Status Report.

 


 

6. Once the status is complete, the profile can be seen in the Artifact List under Import Profiles.

 


 

7. The migrated import profiles can now be executed to load dimensions.
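As a closing note, the same migration can be replayed from the command line once it has been defined: the Shared Services Migration Wizard lets you download a migration definition file, and the Lifecycle Management command-line utility can execute it. A minimal sketch, assuming a default EPM instance location and a definition saved as C:\lcm\ImportProfiles.xml (both paths are placeholders):

rem Replay the saved LCM migration definition (paths are placeholders)
cd /d C:\Oracle\Middleware\user_projects\epmsystem1\bin
Utility.bat C:\lcm\ImportProfiles.xml

Runs of the utility appear in the same Migration Status Report used in the steps above.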

 

Soumya Chandanam

DRM Consultant

Automate Metadata Load into Essbase Cubes Using MaxL Scripts


 

What is MaxL:

MaxL is the multi-dimensional database definition language (DDL) for Essbase Server. Using MaxL, you can easily automate administrative and query operations on Essbase Server.

A MaxL script contains a login statement and a sequence of MaxL statements, each terminated by a semicolon. When using the MaxL Script Editor to execute a MaxL script, the login statement is optional. Most MaxL statements begin with a verb and consist of grammatical sequences of keywords and variables. The MaxL Script Editor color-codes the elements of the MaxL syntax and provides an auto-complete feature that helps to build statements.

Sample MaxL:

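A minimal MaxL session, with the user name, password, and host as placeholders, looks like this:

________________________________________________________________________________

Login admin identified by password on localhost;
Display application all;
Logout;
Exit;

________________________________________________________________________________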

Using the MaxL Shell:

We can pass MaxL statements to Essbase Server using the MaxL Shell. The MaxL Shell command-line interface is installed with Administration Server in EASPATH\server\bin\essmsh.exe (EASPATH/server/bin/essmsh on UNIX).

Here, EASPATH is the directory in which Administration Services is installed.
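For example, a script saved on the local file system (the path below is only illustrative) can be run non-interactively from a command prompt:

essmsh C:\scripts\dimbuild.msh

Any arguments passed after the script name are available inside the script as positional variables ($1, $2, and so on), which is a handy way to avoid hard-coding passwords in the .msh file.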

Why MaxL:

After building an Essbase analytic model and importing the data, the next big task is to keep the Essbase data refreshed with each day's new data. It might also be necessary to add new dimension members to the outline (e.g., we might have to add today's date to the Date dimension before loading today's sales data).

MaxL is a scripting language that lets us automate these daily Essbase maintenance jobs with a batch script, instead of using the admin console every time to add a new member or execute a calculation on the database.

 

We can write MaxL scripts which are easy to customize and re-use. A MaxL script contains a login and a sequence of MaxL statements, each terminated by a semicolon.

MaxL statements begin with a verb, and consist of grammatical sequences of keywords and variables. A single MaxL statement looks similar to an English sentence; for example,

Create application Newsamp as Sample;

MaxL DDL statements:

MaxL DDL statements begin with one of a small set of verbs: alter, create, display, drop, execute, export, grant, import, query, and refresh. Shell keywords such as login, logout, spool, and exit round out most scripts.

Automating Essbase using these MaxL statements:

The following is an example MaxL script that imports dimensions (metadata) into a database using a rules file:

________________________________________________________________________________

/* maxl example */

Spool on to 'c:\output.txt';

Login admin password on localhost;

Import database 'Sample'.'Basic' dimensions from data_file 'C:\essbase\LoadFile.Product.csv' using local rules_file 'C:\essbase\product.rul' on error append to 'C:\essbase\dataload.err';

Logout;

Spool off;

Exit;

________________________________________________________________________________

In the above example, spool on is used to send the output of the MaxL statements to a file called output.txt at the specified location (the target directory must already exist).

We can also direct errors to one file and standard output to another by placing the following lines in the script:

Spool stdout on to 'output.txt';

Spool stderr on to 'error.txt';

LOGIN is used to establish a session with the Essbase Server without opening the EAS console.

IMPORT loads data or dimensions from text or spreadsheet data files, with or without a rules file.

LOGOUT is used to end the Essbase session.

The following is an example of a MaxL script, sent to Essbase via the MaxL Command Shell. This script creates a user, creates a filter, and then assigns the filter to the user. Note that all MaxL scripts must begin with a login to the Essbase system, which must be running.

________________________________________________________________________________

login admin identified by systempasswd on Esshost;

Create user Fiona identified by sunflower;

Create filter Sample.Basic.Diet read on '@idesc(Diet)';

Grant filter Sample.Basic.Diet to Fiona;

Logout;

Exit;

________________________________________________________________________________

Note: save the MaxL script file with a .msh extension.

.bat file to run a MaxL script automatically

________________________________________________________________________________

essmsh <file path>\<file name>.msh

________________________________________________________________________________
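A small wrapper .bat file, with all paths as placeholders and essmsh assumed to be on the PATH (or referenced by its full EASPATH location), can then be scheduled with Windows Task Scheduler:

________________________________________________________________________________

@echo off
rem Run the MaxL metadata load and append all output to a run log (paths are placeholders)
essmsh "C:\scripts\loaddata.msh" >> "C:\logs\maxl_run.log" 2>&1

________________________________________________________________________________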

AUTOMATING DRM EXPORTS AND LOADING INTO ESSBASE:

We can automate DRM exports using the Batch Client. For the Market dimension, a Batch Client export script produces a Market.csv file, which we can then import directly into EAS using MaxL.


 

A MaxL script like the dimension-build example above, pointed at Market.csv, automates loading the Market dimension into EAS.

We can combine these two steps in a single .bat file and run them together.

For example, the wrapper's first line runs the DRM Batch Client export; only if the DRM export file exists does it call the MaxL script, and if the file does not exist it writes a "file does not exist" message to the out.txt log file, as in the sketch below.
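All paths below are placeholders, and RunMarketExport.bat stands in for whatever wrapper invokes the DRM Batch Client export:

@echo off
rem Step 1: produce the DRM export via the Batch Client (wrapper name is hypothetical)
call C:\drm\scripts\RunMarketExport.bat
rem Step 2: load the Market dimension only if the export file was created
if exist "C:\drm\exports\Market.csv" (
    essmsh "C:\essbase\scripts\LoadMarket.msh"
) else (
    echo Market.csv does not exist >> "C:\drm\logs\out.txt"
)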

 

 

Srujana Reddy

Oracle DRM Consultant

DataSprouts Technologies Pvt Ltd.

 

 

 

Oracle DRM – Classic HFM Integration Process


Oracle Hyperion Financial Management is a financial consolidation and reporting application built with advanced Web technology and designed to be used and maintained by the finance team. It provides financial managers the ability to rapidly consolidate and report financial results, meet global regulatory requirements, reduce the cost of compliance and deliver confidence in the numbers.

Oracle Data Relationship Management is a master data management tool where master data is maintained, analyzed, and validated before moving throughout the enterprise. It is the recommended tool of choice for organizing the financial views of the chart of accounts, cost centers, and legal entities that are then used to govern ongoing financial management and consolidation for an enterprise.

One of the main advantages of maintaining master data in DRM rather than creating it in HFM is that DRM integrates with Hyperion and non-Hyperion systems alike, such as HFM, Essbase, Planning, and Planning and Budgeting Cloud Service (PBCS), and it has a roadmap to integrate with upcoming Oracle Cloud EPM applications. DRM integrates with both Classic HFM and EPMA HFM.


STEP 1: Initial Load of Metadata into DRM:

When DRM is integrated with HFM for the first time, along with the metadata load we also need to create metadata objects such as property definitions, property categories, node types, validations, etc. These metadata objects can be loaded into DRM using epma-app-template.xml through the Migration Utility. Along with this, we may also need to create certain custom properties per business requirements.

STEP 2: Importing Dimensions Using Import Profiles.

After loading the metadata objects into DRM through the Migration Utility, create import profiles to import data into DRM. After the import profiles, create the custom properties, property categories, validations, and node types that are used to export data to HFM.


STEP 3: Exporting Metadata from DRM.

DRM exports are created as hierarchy exports, and then an export book is created that combines the individual dimension exports into a single file.


STEP 4: DRM Export Automation

The DRM export process can be automated using the DRM Batch Client.

The DRM Batch Client is a command-line utility that allows access to various DRM operations in batch mode, including action scripts, imports, blends, exports (individual exports and book exports), and opening and closing versions.


STEP 5: Importing Metadata into the HFM Application.

Once the metadata is exported from DRM to a flat file or database tables, the next step is to import the metadata into the HFM application using a load task.

Create an application profile in the HFM client and create an application in the Workspace. Dimensions are loaded into the HFM application using the Load Metadata option under Load Tasks.

 

STEP 6: Automating the Metadata Load.

This process of loading metadata into the HFM application can be automated using Task Automation and then scheduled using the schedulers in the Workspace.

 

Oracle DRM – Creating Derived Properties Using JavaScript


Properties define the characteristics of a node. The property type “Derived” allows you to build dynamic business logic into your properties. Derived properties can simplify application maintenance and reduce the overall storage of your application by reducing or eliminating the need to store (override) additional values.

When you select “Derived” as the property type, a new drop-down appears with the available options “Formula” or “Script”. Prior to the 11.1.2.3 release of DRM, the legacy formulas were the only option.

Oracle introduced Dynamic Scripting in DRM 11.1.2.3. Dynamic scripting enables Data Relationship Management administrators to develop business logic for derived properties and validations using JavaScript. Dynamic scripts provide a more robust and better-performing alternative to formulas, using a standard scripting language. Scripts allow for better organization and less complexity of logic through the use of multiple statements, variables, and in-line comments. Dynamic scripts also provide support for advanced concepts such as looping and regular expressions. JavaScript offers significant performance improvement over the legacy formulas.

Fig: 1. Sample JavaScript which returns “Hello”:

return "Hello";

Fig: 2. Sample JavaScript to add two strings:

//JavaScript to add two strings

var myNodeDescr; //Declare variables
var myVersionDescr;

myVersionDescr = node.Version.PropValue("Core.VersionDescr"); //Assign values from DRM properties
myNodeDescr = node.PropValue("Core.Descr");

//Declare a function that concatenates the two descriptions
function add() {
  return myNodeDescr + " " + myVersionDescr;
}
return add();
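Dynamic scripts can also use regular expressions. The sketch below assumes the standard Core.Abbrev property holds the node name and uses a purely illustrative cleanup rule: it returns the name with any leading characters that Essbase rejects at the start of a member name stripped off.

//Illustrative derived property using a regular expression
var name = node.PropValue("Core.Abbrev"); //node name (property assumed)
//Strip leading @, + or - characters, which Essbase does not allow at the start of a member name
return name.replace(/^[@+\-]+/, "");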

Oracle DRM – Assigning validations through Action Script


Today we are going to see how we can assign validations through action scripts. Validations enable us to enforce business rules on versions, hierarchies, nodes, and properties. We can run validations in real-time mode, batch mode, or both. Real-time validations run at the time of modification and prevent changes from being saved if the action violates the rules being enforced; batch validations are run explicitly. Validations can also be assigned using action scripts.

Action Script

Action scripts enable us to process bulk sets of incremental changes in an automated fashion. Each record within a script represents a separate action to be performed and processed separately from other actions. We can group actions of different types in the same script. Action scripts are particularly useful when we need to perform the same set of actions for multiple versions, hierarchies, or nodes. We can load action scripts from flat files, generate them from transaction logs, or create them from node models. In the Script task group, we can load and run only one action script at a time.

 

Types

  • AssignVersionValidations: Ver, Validation List, ResultDescr
  • AssignHierValidations: Ver, Validation List, ResultDescr

 

Version validation using action script syntax

AssignVersionValidations Ver Validation List ResultDescr

 

Example:

AssignVersionValidations;Sob_Ess_Year;Essbase.MemberNameLength;Result
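If the same validation needs to be assigned to several versions, the script simply carries one row per assignment; the second version name below is purely illustrative:

AssignVersionValidations;Sob_Ess_Year;Essbase.MemberNameLength;Result
AssignVersionValidations;Sob_Plan_Year;Essbase.MemberNameLength;Result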

Selecting Validation

After creating validations, we can assign them to versions, hierarchies, domains, and nodes. Multiple validations can be assigned at the same time, and we can assign them using action scripts as well. Select the validation that we want to apply to a version.

Load Action Script

In the Script task group, we load the flat file that contains the action scripts. After loading, edit rows that display warning symbols to ensure proper processing.

Let’s take an example of loading the script in DRM; below is the script:

AssignVersionValidations;Sob_Ess_Year;Essbase.MemberNameLength;Result


Run the Script

When we run the script and assign the validation at the version level, the validation is inherited by all hierarchies and nodes within the version.


Check Assign Validations for the Version

We can check this by right-clicking the version and opening the Assign Validations tab.
