Come See Edgewater Ranzal at Kscope11

ODTUG Kscope11 is right around the corner. Kscope11 offers the chance for a full day EPM Symposium on Sunday, plus the opportunity to learn from experts in the EPM and BI fields on a wide range of topics.

Edgewater Ranzal will be well represented at the conference, with our associates delivering ten sessions covering Planning, DRM, EPMA, HFM, and FDM. The sessions we will be presenting at Kscope11 are summarized below. Each title links to an abstract for the presentation, providing additional details.

Session No. | Date | Time | Room | Presenter | Title
1 | 6/27/11 | 11:15 – 12:15 | 102C | Jeff Richardson | Calculation Manager: The New and Improved Application to Create Planning Business Rules
7 | 6/28/11 | 11:15 – 12:15 | 103C | Tony Scalese | Planning (or Essbase) and FDM and ERPi Equals Success!
10 | 6/28/11 | 4:30 – 5:30 | 101B | Chris Barbieri | Security and Auditing in HFM
11 | 6/29/11 | 8:30 – 9:30 | 103A | Patrick Lehner | Best Practices for Using DRM with EPMA
11 | 6/29/11 | 8:30 – 9:30 | 101B | Chris Barbieri | Getting Started with Calc Manager for HFM
12 | 6/29/11 | 9:45 – 10:45 | 101B | Chris Barbieri | Advanced Topics in Calc Manager for HFM
12 | 6/29/11 | 9:45 – 10:45 | 102C | John Martin | Have it Your Way: Building Planning Hierarchies with EPMA or Outline Load Utility
13 | 6/29/11 | 11:15 – 12:15 | 101B | Tony Scalese | Maximizing the Value of an EPM Investment with ERPi, FDM, & EPMA
17 | 6/30/11 | 8:30 – 9:30 | 101B | Tony Scalese | Taking Your FDM Application to the Next Level with Advanced Scripting
18 | 6/30/11 | 10:30 – 11:30 | 101B | Peter Fugere | IFRS Reporting Within Hyperion Financial Management

In addition to the presentations above, you can catch up with our experts at our booth in the Vendor Showcase.

We look forward to seeing you in Long Beach. If you haven’t already registered, you can do so here.

ORACLE HYPERION CALC MANAGER – Part 4 – Creating RuleSets

In Part 1 of this series, we introduced Calc Manager, providing a general overview and explanation of some new terms.  In the second post we walked through the development of a Planning rule that utilized a run time prompt.  Part 3 covered templates available with Calc Manager.

In this, the final post in this series, we’ll step through the creation of a ruleset.  Rulesets are equivalent to Business Rule Sequences in Hyperion Business Rules.

We’ll begin by logging on to Hyperion Workspace and navigating to Calc Manager.  Once in Workspace, the navigation path is:  Navigate -> Administer -> Calculation Manager.

Once in Calc Manager, you’ll land on the System View tab, which appears as follows:

 

Once again, I’ll use my EPMA-enabled Planning app based on Sample.Basic.

To create a new ruleset, right-click the “RuleSets” node under your Planning app and select New.  You’ll be prompted to give the ruleset a name; I’ll name mine Process_Application.  You can also change the app/database for the ruleset in this dialog box.

After I click OK,  the following screen loads:

You can display the rules available for your ruleset by expanding the tree until you see the rules for your database.

To add rules to the ruleset, simply drag and drop them onto the Ruleset Designer on the right side of the screen.

By default, the rules will run sequentially.  If you wish for rules to execute in parallel, select the RuleSet name within the RuleSet designer.  Check “Enable Parallel Execution” on the Properties tab at the bottom of the screen.

To run the ruleset, save, validate, and deploy it to your Planning application.

The series of posts we’ve put together this summer was designed to give users a basic understanding of how to work with Calculation Manager.  As with any new technology, it’s best to dive in and immerse yourself to speed through the learning curve, and Calculation Manager is no different.  Take the opportunity to experiment with the tool.  I think you’ll find the basics easy to learn, and before long you’ll be developing your own rules.

If you have any questions about Calc Manager, please leave a comment on any of the posts in this series, or reach out to me via email at jrichardson@ranzal.com.

ORACLE HYPERION CALC MANAGER – Part 3 – Working with Templates

In Part 1 of this series, we introduced Calc Manager, providing a general overview and explanation of some new terms.  In the second post in the series, we walked through the development of a Planning rule that utilized a run time prompt.  In this post, we’ll explore templates provided within Calc Manager.

As with the Rule Designer, which is a great tool to help less experienced developers build rules, templates provide a simple way to develop rules for basic tasks in Planning and Essbase, such as copying, clearing, exporting, allocating, and aggregating data.  In addition, you can design your own templates.

We’ll begin by logging on to Hyperion Workspace and navigating to Calc Manager.  Once in Workspace, the navigation path is:  Navigate -> Administer -> Calculation Manager.

Once in Calc Manager, you’ll land on the System View tab, which appears as follows:

Once again, I’ll use my EPMA-enabled Planning app based on Sample.Basic.

To use a predefined template, right-click “Rules” and select “New”.  Once you give the rule a name, the graphical designer launches.  In the “Existing Objects” window, you should find a list of the pre-existing templates.  The system templates are described below:

CLEAR DATA

In order to use the system template to Clear Data, drag and drop “Clear Data” from the System Templates to the Rule Designer.  This will then invoke a member selection window asking you to specify the data to clear.  Keep in mind that this template generates a calc script utilizing the CLEARBLOCK command as opposed to a CLEARDATA command.

In my sample app, I select “FY11” for the Years dimension and “Final” for the Version dimension.  The dropdown box for “Clearblock Option” defines which blocks are cleared; “All” is the default.  The code that is generated appears below.

FIX ("FY11","Final")
  CLEARBLOCK ALL;
ENDFIX
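
For comparison, CLEARBLOCK removes the data blocks themselves, while CLEARDATA only sets values to #MISSING and leaves the blocks in place.  If you wanted the latter behavior, you could write a similar script by hand; this is a minimal sketch (with “Sales” as a placeholder member), not what the template generates:

FIX ("FY11","Final")
  CLEARDATA "Sales";
ENDFIX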

COPY DATA

The Copy Data template helps to walk the calc developer through the process of copying data from one slice of the database to another.

In the remainder of the wizard, you select the “Copy From” member and the “Copy To” member.  The calc script generated follows:

FIX (@RELATIVE("Measures" , 0),@RELATIVE("Periods" ,0),@RELATIVE("Product" , 0),@RELATIVE("Market" , 0),@RELATIVE("Years" , 0),"Budget")
DATACOPY "Working" TO "Final";
ENDFIX

AMOUNT-UNIT-RATE

The Amount-Unit-Rate template allows the developer to build a calc script to solve for either an amount, unit, or rate, basically whichever is missing.  I’ve added a couple of measures to my application to facilitate the demo.  Using the member selection wizard, I’ve selected “Sales” as my amount, “Cases” as my unit, and “Revenue per Case”  as my rate.  The script generated by the template follows:

"Sales"(
  IF ("Sales" == #missing and "Cases" != #missing and "Revenue per Case" != #missing)
    "Sales" = "Cases" * "Revenue per Case";
  ELSEIF ("Sales" != #missing and "Cases" == #missing and "Revenue per Case" != #missing)
    "Cases" = "Sales" / "Revenue per Case";
  ELSEIF ("Sales" != #missing and "Cases" != #missing and "Revenue per Case" == #missing)
    "Revenue per Case" = "Sales" / "Cases";
  ELSE
    "Sales" = "Cases" * "Revenue per Case";
  ENDIF
)

ALLOCATIONS

Two types of allocation templates are provided within Calc Manager.  The first, Allocate Level to Level, allows you to allocate from one level to another.  In my Planning app, you would use this template to allocate marketing expenses from product family down to product using a driver like revenue.  This approach uses @ANCESTVAL to build the script.

The second template, Allocate Simple, allocates values based on a predefined relationship, such as Marketing->Market * Cases/Cases->Market.

Both templates walk the developer through the setup of the allocations, selecting members that are fixed throughout the process, offset members (if any), etc.
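
To give a feel for the level-to-level approach, here is a rough hand-written equivalent using @ANCESTVAL against my Sample.Basic-based app.  It is only a sketch (the member names and generation number are assumptions), not the exact code the template generates:

FIX ("Budget", "Working", @LEVMBRS("Market", 0), @LEVMBRS("Product", 0))
  "Marketing" (
    /* Pull the product-family (generation 2) Marketing value down to each
       level-0 product, prorated by the product's share of family Sales */
    "Marketing" = @ANCESTVAL("Product", 2, "Marketing") *
                  ("Sales" / @ANCESTVAL("Product", 2, "Sales"));
  )
ENDFIX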

AGGREGATION

The Aggregation template helps the developer create a script to aggregate the application.  The first screen of the wizard, pictured below, allows you to select members for the FIX statement of the aggregation; here you would limit the calc to a particular version, scenario, or your non-aggregating sparse dimension members.

The next screen prompts for dense dimensions to aggregate.  However, if dynamic calcs are properly utilized, this should not be necessary.

The third screen asks for the sparse dimensions to aggregate.  You should exclude any non-aggregating sparse dimensions from this selection.

Next, you’re prompted for partial aggregations of dense dimensions.  Again – if dynamic calcs are used properly, this should not be an issue.

In the final screen of the wizard, the developer selects settings for the script…

The code generated by Calc Manager follows:

SET AGGMISSG ON;
SET FRMLBOTTOMUP ON;
SET CACHE HIGH;
FIX (@RELATIVE("Years" , 0),"Working","Budget")
CALC DIM ("Product");
CALC DIM ("Market");
ENDFIX

Please note that this code is not optimized.  In this example, I would use the following:

AGG ("Product","Market");

The code as generated by Calc Manager will result in an extra pass through the database – the calc can be accomplished with a single pass.  Additionally, AGG can be used in place of CALC DIM if there are no formulas on the dimensions being calculated.  Generally speaking, stored formulas on sparse dimensions should be avoided due to performance issues.
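
Putting that together, the hand-tuned version of the generated aggregation (same FIX, one pass, no CALC DIM) would look like this:

SET AGGMISSG ON;
SET FRMLBOTTOMUP ON;
SET CACHE HIGH;
FIX (@RELATIVE("Years", 0),"Working","Budget")
  AGG ("Product","Market");
ENDFIX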

SET Commands

The next template walks the user through setting various SET commands for the calc.  This is a fairly straightforward exercise.

EXPORT DATA

This is another straightforward template that helps create a data export calc script.  You define the fixed members for the export, the delimiter, the #MISSING value, the export type (flat file or relational), etc.
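
For reference, a comparable export script written by hand with Essbase’s DATAEXPORT command might look roughly like this.  The file path, delimiter, options, and members below are placeholders, not output from the template:

SET DATAEXPORTOPTIONS
{
  DataExportLevel "LEVEL0";      /* export level 0 (input) data only */
  DataExportColFormat ON;        /* columnar output format */
  DataExportOverwriteFile ON;    /* replace the file if it already exists */
};
FIX ("FY11","Final","Budget")
  DATAEXPORT "File" "," "c:\exports\budget_export.txt" "#MI";
ENDFIX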

In the final part of this series, due for posting on August 13, we’ll walk through the creation of a ruleset.  If you have any questions before the next post, please leave a comment!

ORACLE HYPERION CALC MANAGER – Part 2 – Creating a Planning Rule

In Part 1 of this series we introduced Calc Manager, providing a general overview and explanation of some new terms.  In this post, we will walk through the development of a rule for Hyperion Planning using the graphical interface within Calc Manager.

Again, in order to access Calc Manager, log on to Hyperion Workspace.  Once in Workspace, the navigation path is:

Navigate->Administer->Calculation Manager.

Once in Calc Manager, you’ll land on the System View tab, which appears as follows:

 

For purposes of this demonstration, I have created an EPMA enabled Planning application from the Sample.Basic application that we all know and love.  When the Planning node is expanded, this is what I see:

First, to help illustrate functionality available in Calc Manager, I’m going to create a script component that contains my standard SET commands for the rule.  To create the script component, right-click “Scripts” and click “New”.  Give your script a name and click “OK”.  This will launch the Component Designer.

From here, you have two options.  If you know what your SET commands need to be, you’re free to type them in directly.  If you’d rather be prompted through the process, click the button at the top-left corner of the Component Designer window.  This launches a window with all of the calc functions and SET commands.  The following shot displays the function selection interface for SET commands.

For my purposes, I’m going to directly type my SET commands into the Component Designer.  Once complete, save and validate.

Think of script components as an easy way to reuse code: SET commands, standard cube aggregations, and the like.
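
As an illustration, a reusable SET-command component might contain nothing more than a handful of standard commands like these (the exact commands and settings will depend on your application):

SET UPDATECALC OFF;     /* turn off intelligent calculation */
SET AGGMISSG ON;        /* aggregate missing values */
SET FRMLBOTTOMUP ON;    /* bottom-up calculation for sparse formulas */
SET CACHE HIGH;         /* use the high calculator cache setting */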

Once we have saved the script component with our SET commands, it’s time to develop our rule.  To begin, right-click “Rules” under the database node and select “New”.  Give your rule a name and click “OK”.  This will launch the Rule Designer window.

In this example, I’ll create a rule that aggregates the cube, using a run time prompt for the Version dimension.

We can now begin to develop our rule.  First, we’ll select the script component for our SET commands that we developed earlier.  Simply drag this into the rule designer to the right of “Start”.  The Rule Designer window now looks like this:

Let’s take this opportunity to create our variable for the Version dimension run time prompt.  Go to the “Tools” menu and select “Variables”.  Once the Variable Navigator launches, expand the Planning, application, and database nodes.  Right-click your rule name and select “New”.  Once I populate the fields on the “Replacement” tab, my screen looks like this:

When complete, save the variable.  Now, back to our rule…

We’re going to specify members for our “Fix” statement.  To do this, select “Member Range” in the New Objects portion of the Rule Palette.  

To add to the rule, drag and drop to the right of the SET command script.  My screen looks like this:

Next, we’ll populate the members for our Fix statement.  I’ll start with Measures.  For my rule, I want to select all of the level 0 measures.  Once I click on the Value field for the Measures dimension, an Actions box appears. 

I select Function, which invokes the function selection window we saw earlier.  From the list I choose @LEVMBRS, which then prompts for the dimension and level number.

I select Measures from the drop-down box and enter “0” for the level name.  I repeat this process for all of my dense and non-aggregating sparse dimensions, with the exception of the Version dimension, which will be handled via the run time prompt.  For the Version dimension, select “Variable” in the Actions box.  Change the Category selection to “Rule” and this is what we see.

Highlight the variable and click OK.  My member range box looks like this:

Now, we’ll develop the script component to aggregate the Product and Market dimensions.  I’m going to drag a script from the “New Objects” portion of the Rule Palette into my member range.  The graphical display looks like this:

Again, I’m going to select a function (AGG in this case).  I then select Product and Market from the dimension selector.

Now, save and validate.  To deploy the rule to Planning, select Quick Deploy:

Once deployed, the rule can be run from Planning.

In this post, we’ve provided a walkthrough of developing a new rule using the graphical designer.  More experienced developers can code the calc directly in script mode.  To convert to script mode, select “Edit” and then “Script” from the menu.
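
For reference, in script mode a rule like the one built above might look roughly like this.  This is a sketch, not the exact code Calc Manager produces; the variable name ({Version_RTP}) is hypothetical, and the curly-brace reference assumes Calc Manager’s standard variable notation.

SET UPDATECALC OFF;
SET AGGMISSG ON;
FIX (@LEVMBRS("Measures", 0), @LEVMBRS("Periods", 0), @LEVMBRS("Years", 0),
     "Budget", {Version_RTP})
  AGG ("Product","Market");
ENDFIX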

In the next post, due by July 31, we’ll explore templates and ruleset creation.  In the meantime, please leave a comment if you have any questions!

ORACLE HYPERION CALC MANAGER – Part 1

With Oracle’s continued investment in the Hyperion tool set, there was a desire to centralize the development of calculations for HFM, Essbase, and Planning.  As a result, Oracle Hyperion Calculation Manager was born.  Calc Manager is a powerful tool for developing and administering rules for Planning and Essbase.  An intuitive graphical interface helps expedite the learning curve for people just beginning to dip their toes into the world of Oracle Hyperion Planning and Oracle Essbase.

Over the course of several posts this summer, I’ll explore Calc Manager functionality from the Essbase and Planning points of view.  For EPMA-enabled Planning applications, use of Calc Manager is required.   With version 11.1.1.3, Calc Manager can be used with Classic Planning apps as well.  However, the focus of my blog posts will be EPMA-enabled apps, as Classic Planning rides off into the sunset.

Calc Manager, a component of EPM Architect, is integrated into EPM Workspace, the standard entry point for many Hyperion applications.  To access Calc Manager, log into Workspace and select Navigate->Administer->Calculation Manager (see the screen shot below for the navigation path).  However, before we get too far into actually navigating the tool, we’ll need to get comfortable with the terminology within Calc Manager.

There are three types of objects within Calc Manager: components, rules, and rulesets.  Components are smaller pieces of a larger rule; SET commands, FIX statements, formulas, and so on are all examples of components.  I’ll explore this in much greater detail in a future post, but think of the standard SET commands that you use in all of your scripts: these can be saved separately as a script component and pulled into a new rule very easily.  Included below is a shot of the Component Designer with a sample of some standard SET commands.

Essentially, rules are the finished calc scripts, similar to Business Rules in the past.  Rules are used for modeling, allocations, aggregations, and the like.  Rules can be built using system templates; Oracle provides standardized templates for tasks such as clearing, copying, allocating, aggregating, and exporting data.  Again, these templates will be explored in additional detail in a future post.

Rulesets are similar to Business Rule Sequences under Hyperion Business Rules.  Rulesets can be used to launch rules sequentially or simultaneously depending on your logic requirements.

Now that we’ve covered the basic terminology related to Calc Manager, my next post, which should be online by July 4, will walk you through creating a rule for an EPMA-enabled Planning app.  In the meantime, if you have any questions, leave a comment!

Hyperion Financial Management: Zero View and Default View Settings

One of the most common areas of confusion in Hyperion Financial Management (HFM) has been the Scenario application settings for the Default View and Zero View.  Peering back into the history of Hyperion Solutions, you will find these settings have existed in all the consolidation products, such as Hyperion Enterprise’s “missing values” setting under Categories.  They are not new features, and they are fundamental to HFM.

ZeroViewForAdj and ZeroViewForNonadj

There are two ZeroView settings in HFM.  ZeroViewForNonadj applies to the <Entity Currency> Value dimension member.  The second, ZeroViewForAdj, applies to the <Entity Curr Adjs> Value dimension member.

The derived data that results from the ZeroView settings commonly appears in HFM as a slightly grey numeric value, as opposed to the black display of standard data.

Before we explain what it does, let’s talk about why we need it.

Part of Hyperion Financial Management’s financial intelligence is its Account Types.  These types help to manage data flowing through the chart of accounts and support variance reporting.

  • Revenue
  • Expense
  • Asset
  • Liability
  • Balance
  • Flow
  • GroupLabel
  • Currency
  • BalanceRecurring

The Revenue, Expense, and Flow account types are referred to as “flow” accounts and tend to support profit and loss reporting.  As such, they are supported by the View dimension, which allows data to be viewed, keyed, or loaded as YTD or Periodic amounts.  For example, periodic sales of $1000 entered in every month would let users view December as Periodic $1000 or YTD $12,000.

Example 1:

The ZeroView setting tells the HFM Scenario how to handle data for a flow account when no data is loaded to a period.  No data is not a zero; it is a blank, non-existent value.

Example 2:

If data was loaded from your source system for Jan as $1000, how should HFM calculate Feb?  There are two options for the ZeroView settings: YTD or Periodic.  The first option is the YTD setting for the ZeroViewForNonadj attribute.

With the ZeroView setting as YTD, HFM fills the following period with a derived value of zero YTD.

Example 3 ZeroView as YTD:

Recall that flow accounts can be viewed as Periodic or YTD, so what does Feb look like as Periodic?  If there is $1000 YTD in Jan and zero YTD in Feb, there must have been a change in periodic activity.  Therefore, the Feb Periodic value would be -$1000 to arrive at zero YTD.

Example 4 ZeroView as YTD:

The second option is to set the ZeroView setting to Periodic.  This setting addresses the same properties of flow-type accounts.  In this instance, the following period has a zero applied as a Periodic zero, which is the activity.  This setting is common on Budget and Forecast Scenarios.  The result is that the entry flows through all the periods, because the YTD value is derived from the Periodic activity.
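
To make the two settings concrete, suppose $1000 of Sales is loaded to Jan only (illustrative figures):

  • ZeroView as YTD: Jan is 1000 YTD / 1000 Periodic; Feb is derived as 0 YTD, which implies -1000 Periodic.
  • ZeroView as Periodic: Jan is 1000 YTD / 1000 Periodic; Feb is derived as 0 Periodic, so the 1000 carries forward as the YTD value for Feb and every later period.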

Example 5 ZeroView as Periodic:

Example 6 ZeroView as Periodic:

How does one determine which setting is correct for their application?  Typically, we analyze how the data load file is constructed.  For example, for flow accounts that may be adjusted to zero, are they included in the file?  Does a periodic version of the file supply zero items as the year-to-date negative offset?  The most common choice is ZeroView of YTD for Actual data loads.

A good example of why we would set the ZeroView to YTD for Actual is when data is reclassed from month to month or a new account is used.  In the example below, a different account is used in Feb than the account supplied in Jan.  With the ZeroView setting as YTD, HFM automatically derives the Feb amount on the original account as YTD zero.

Example 7 ZeroView as YTD:

If the Scenario’s ZeroView were set to Periodic, the application would show an incorrect Feb result of 2000.  The data would flow through all the periods, and in this case the Sales account should be zero for Feb.  The original account would require a -1000 Periodic entry (or a 0 YTD entry) to clear the Feb value on the Sales account.

Example 8 ZeroView as Periodic:

The ZeroViewForAdj setting controls how HFM journal entries behave for missing data and how they affect future periods.

For journal adjustments we see the same impact on the data within the <Entity Curr Adjs> Value member.  The YTD setting essentially reverses the journal in the following period by applying the YTD offset to the journal.

Example 9 ZeroView as YTD:

In cases where a prior period is being adjusted, the YTD setting will keep the YTD values of future periods, such as Feb, constant.  As an example, if the original value were 1000 in Jan and 1000 in Feb, the Feb YTD amount would be 2000.  A journal posted to Jan would impact the Jan results, but the Feb YTD amount of 2000 would not change.  In the example above, what is adjusted is the Periodic amount, from 1000 to 500.  Therefore, to see the impact in all periods, the journal would need to be created and posted in every period going forward.

With the ZeroView setting for journals as Periodic, the impact of the journal carries into the YTD results of future periods.  Because of this, a reversing journal entry is required to reverse the effect in a future period.

Example 10 ZeroView as Periodic:

Therefore, when viewing data within HFM in a YTD or Periodic view, the grey offset values are the result of your Scenario settings for ZeroViewForNonadj and ZeroViewForAdj.  These amounts impact your YTD and Periodic results in both local and translated amounts.

Default View

The Default View setting determines which View member is applied to “<Scenario View>”.  When you open a grid, form, or report, this is the View member that displays by default.

Because of this, it is important to choose a specific View member when writing reports or comparing data.  The “<Scenario View>” results will vary with the design of each Scenario.

The Default View is also a concern for rules in HFM, because this setting establishes the basis of calculations.

As an example, if there were a rule that simply called for “estimated bonus” to be a percent of “sales” once we attained our goal of $1000.00, the calculation would vary with the Default View of each Scenario.

In the example, the YTD Scenario will calculate a result because it is calculating Rules off the YTD value of $1500. The Periodic Scenario does not calculate the Rule because it is driven off the Periodic value of $500.

Careful attention to rules development, and use of the Dynamic subroutine in rules, can help minimize issues caused by differences in the Default View settings.

Special uses for Life Cycle Management (LCM)

In my previous post, I showed how to use LCM to back up or copy an entire planning application environment.  Here I’ll expand on that subject a bit by showing some other uses you may find handy.  This is by no means meant to be an exhaustive collection – just a few suggestions you may find useful and which may provoke ideas for other uses.

Copy single dimension from one app to another

This can be done for any dimension, including the standard Planning dimensions.  Here, to expand on the subject, we are also going to export the “Organization” dimension from one Planning app and import it into the “Entity” dimension in another.

Select the artifacts to export (no harm in copying everything).

Click thru the next screen to this one.

Since we need to change the dimension name, we must export to files, not directly to the other app.

Then click thru the remaining screens to execute the migration.

After the export finishes, go to the \Hyperion\Common\Import_export directory. Under the Username@Directory folder find the files you exported.

In the “info” directory, edit “listing.xml” changing all instances of “Organization” to “Entity”.

Now find the XML file for the dimension to be migrated with name change.

Rename to the target dimension name.

Now edit the file to change “Organization” to “Entity”.

In Shared Services->Application Groups->File System, open the extract and select the (newly renamed) Entity dimension.

Define Migration…

…and click thru the remaining screens to execute the migration.

Lights-out Operation

In Shared Services select the artifacts to be backed up and define migration.

We need to back it up to files so type in a folder name…

…and click thru the remaining screens until you get here.

Now, instead of clicking the Execute button, click “Save Migration Definition.”

You will get this screen…

…click “Save.”

Shared Services wants to save “MigrationDefinition.xml” where you tell it to.

You can give the file any name (I suggest using a naming convention to differentiate the operation being saved) and save it anywhere you want.

After saving the file you will get this…

…click “Close” and the backup definition will be saved.

Now look in the Automation folder where the xml file was saved.

The file has everything Shared Services needs to run the backup from the command line utility except the USERID and PASSWORD.

Edit the file in TextPad or another text editor and type in a user ID and password.

After the job runs, the password is automatically encrypted.

The job is run from an Oracle supplied process, “utility.bat.”

…and you pass it the path to the migration definition file you created above.

You should redirect the output to a log file so you have a record of success or failure.  The following message is an excerpt from that log; it lists the detailed log’s location and name, indicates whether the process succeeded or failed, and shows exactly where any failure occurred in the process.

I hope I’ve shown you enough to get you started using LCM.  It can certainly be a valuable tool, whether you want to do one-time tasks or perform lights-out operations such as regular backups.  The important thing to remember is to test it and see what, if any, problems you will have and either fix those or work around them.

Using Oracle’s Hyperion® Life Cycle Management

What is LCM?

LCM (Life Cycle Management) is a tool which can be used to migrate Hyperion applications, cubes, repositories, or artifacts across product environments and operating systems. It is accessed through the Shared Services Console.

Does it work?

After using LCM at a few clients, I think the answer is a definite YES, but expectations need to be set realistically:  yes, LCM has some very good and handy uses; but no, it is not necessarily going to be a painless, simple answer to your migration and/or backup needs.

What can I do with it?

You can use it for migrations:

  • One environment to another
  • One app to another (same SS environment)
  • Selected dimensions or other artifacts

And for backups/restores, including keeping two separate environments synchronized:

  • Selected artifacts
  • Lights-out

Products which can be migrated are:

  • Shared Services
  • Essbase
  • Planning
  • Reporting
  • HFM
  • The dimensions housed in EPMA

This blog is going to concentrate on using LCM for Planning application migrations, although, as you can see from the list above, it can be used for other products as well.

First I’ll show how a migration is done, using screen shots, to give a detailed look.  Then I’ll point out things to watch for, including things that will cause the migration to fail, with workarounds where possible.

To migrate an entire Planning application, you will need to copy four areas:

  1. Shared Services
  2. Essbase (for Planning, you only need the Essbase Global Variables; all app/DB-specific variables are migrated with the Planning application)
  3. Planning Application
  4. Reporting and Analysis (if applicable)

The order in which you export these is not important but when doing the import, the order is very important.

Some important considerations:

  • Target app can have different name from source
  • Source and destination plan types must match
    • Can be changed by editing the files
    • Target plan types must be in same order as source
  • Start year must be the same
    • Number of years doesn’t need to match
  • Base time period must be the same
  • Target app’s Currency settings must match Source
  • Standard Dimension names must match
    • Can be changed by editing the files

When exporting any application it is advisable to just export everything.  If necessary you can be selective on the import side.

Start the process by opening the Shared Services console and going to Application Groups -> Application (in this case, Shared Services under Foundation).

In the lower part of the screen, click “Select All” and then “Define Migration”

Now go through the screens:

Leave each field with an * and Choose “Next”

Type in a file name for the export.  It is advisable that you use a naming convention for this since you will end up with (possibly multiple) files for each application.

Review the destination options & click “Next.”

Finally, review the Migration summary and click “Execute Migration.”

NOTE:  If this process is going to be re-run in a lights-out environment you should instead choose the “Save Migration Definition” button.  I’ll discuss this more fully later on.

You will get this information screen.  Click Launch Migration Status Report to actually see the migration progress.

As long as the migration is running you will get a status of In Progress

Click Refresh to keep checking status (if desired) until you get a status of Completed or Failed.

All of the other applications can be exported this same way, each with slightly different screen sequences but generally the same process.

The primary differences will be for Planning and Essbase where, if there are other applications in the same Shared Services environment, they will be offered as possible targets for the export, in addition to the File System.  Selecting one of these will cause a direct migration from the source application to the selected target application.

After the exports are finished the LCM export files can be copied to the target server environment, if needed.  These export files can be found on the Shared Services server under \Hyperion\common\import_export\username@directory.

Copy the entire directory (in this example, Essadmin@Native Directory) to the Hyperion\common\import_export directory on the target server.

The import side is where things are more likely to be tricky.  Here you will reverse the process, selecting the export files in proper order (Shared Services, Essbase, Planning & Reporting) and importing them to whatever target is appropriate.

Start the process by logging in to the Shared Services console as the same username you used in the export process.  Under Application Groups–>File System, find the appropriate export files and click “Define Migration.”

Click through the screens, including the SS screen selecting the target application to import to.

On the destination option screen select Create/Update and increase the Max errors if desired (default = 100)…

…and run the migration.

For the Planning import select all to begin.

Click through the screens and select the planning application to import to.

And click through the remaining screens to execute the migration.

The Reporting migration is similar.  Select all the artifacts you want to import.

And go through the remaining screens to execute the migration.

In many cases, especially where you are keeping two identical environments in sync, these migrations should go smoothly and complete without error.  However, at other times, especially when doing an initial migration or one where the security will be much different from one to another, you may have to make several passes at the migration.  When even one item fails to migrate successfully, LCM will send back a status of “Failed”.  Click on that link in the status report and LCM will tell you what items failed to migrate.  All other items will usually have migrated successfully.   You will then have to figure out why the item failed and either fix the problem, work around the problem or ignore it and migrate the item another way.

Here are some things I’ve found which will cause you problems in using LCM:

  • In exporting a Planning application with many substitution variables, the export failed, refusing to export all of the variables.  This was worked around by exporting only the variables and then exporting everything except the variables.  Alternatively, you can adjust the group count/size settings, as well as the report and log file locations, in the migration.properties file.  The default settings are usually:
    • grouping.size=100[mb]
    • grouping.size_unknown_artifact_count=10000
  • Using “All Locations” in HBR will cause failure for those forms.
  • Essbase server names: if they are not the same in source and target, you will have to modify the import files with the target server name.
  • Report name length is limited to 131 characters, less the folder name.
  • Dim members “Marked for Delete” won’t migrate.  You will have to delete them using a SQL query if you want them migrated.
  • Form folders may get re-ordered on migration.  You can work around this by manually adding the folders to the target application in the proper order.  LCM will not reorder existing folders.
  • Doesn’t support parentheses ( ) in form names.  You won’t get an error indication in the export/import – the forms just won’t be there in the imported app.  You’ll have to rename the forms to get them to come over.
  • Member formulas need to be in Planning; if they exist only in Essbase, they don’t come over.  If this is a one-time migration, you can use the EAS migration utility to bring the outline over after the LCM migration.
  • You must manually delete Shared Services groups in the target app if you deleted them in the source app (or they will remain).
  • Reports – you must manually update the data source in the target.
  • Members don’t come over with certain special characters.
  • Doesn’t support Clusters; must use the outline as HBR location.
  • Global Variables with limits in their definition don’t work.

Well, now you should be able to use LCM and judge for yourself whether it is right for your application.  In another blog post I’ll show how to run LCM in a lights-out mode, and also how to modify the export files so you can do things like sharing dimension detail between Planning applications.

Enjoy!

Can FDM Do That?

Most everyone who uses or has seen a demo of Oracle Hyperion Financial Data Quality Management (“FDM”) knows the basic functionality that FDM provides: mapping, data loading, data validation, and financial statement certification.  But FDM is so much more than the basics.

FDM is a very open-architecture product that readily supports advanced customization.  As I highlighted in my last blog post, the workflow process (Import -> Validate -> Export -> Check) can be fully automated and executed on a scheduled basis using either FDM’s internal scheduling component or any Windows batch-based scheduling tool that an organization prefers.  But that’s just the tip of the proverbial iceberg.

Any organization that has recently experienced a Hyperion product upgrade – for example, System 9 to Fusion or Enterprise to HFM – knows the pain of revalidating years of financial data.  This exercise can easily take weeks.  Not only is this process time consuming, it’s tedious and often prone to error.  More importantly, data validation can be one of the biggest risks to a project.  The need to improve seems obvious. 

To address this opportunity, we developed a custom solution that leverages HFM and FDM’s advanced features, custom scripts and batch loading. The benefits are substantial – literally tens of thousands of data points can be validated (and re-validated) in minutes – with 100% data accuracy.  This process is easily extendable not only to other Oracle/Hyperion products like Planning & Essbase but potentially to other data stores.

The benefits of this process may be obvious but let’s take a moment to think about them:

  • 100% Data Accuracy – How valuable is this to your organization in the current economic and financial market climate?  The cost of restated financials is far too great to fathom – potential for government fines, reduced shareholder equity and even loss of one’s job.
  • Shorten Implementation Timelines – How nice would it be for your project to come in on time or early?  Using this solution, you can realistically trim weeks if not months out of a project timeline. 
  • Reduced Implementation Costs – Let’s face it, in this economy, every dollar needs to pay dividends.  Whether you choose to leverage a consultant, temp, intern, or internal resource to validate your data, there is a cost associated with it.  Reducing the time associated with this activity will reduce your project cost.

I invite you to check back often as I’ll continue to discuss “outside the box” solutions that can add significant ROI to your FDM investment.   

Contributed by:
Tony Scalese, FDM Design Lead
Hyperion Certified Consultant – HFM
Ranzal & Associates
ascalese@ranzal.com

Business Intelligence Technology Environment – Welcome to the Buffet

Business Intelligence Technology Environment or BITE is my own little tag line and acronym (maybe I should copyright it) to express the host of solutions available in the Business Intelligence application world today. (It could also be used as a verb to describe the plethora of poorly designed solutions… ahh but that is another story.)

My current blog series will be Oracle EPM/BI+ solution centric while remaining Oracle EPM/BI+ application agnostic (now dictionary.com is paying off). I hope that you will enjoy this real life approach to the process of decision making on software solutions interspersed with some genuine tips and tricks of the trade — some that you have seen before and some you have never imagined.

In other words, I hope that you will not find this blog to be represented by my newly coined acronym — BITE.

Rules of conduct while at the Buffet

First we need a definition. Yes, a definition! Don’t be afraid; definitions are a good thing: they keep us grounded, they set limits, and they determine whether we are true to our mission. I define BITE as the processes, software and goals needed to precisely solution the business data critical to the legal, accounting and business decision needs of a specific entity.

Inventive techno junkies, single-tool consultants and one-track sales people – CLOSE YOUR EYES / SHIELD YOUR COMPUTERS for this next statement or else you might go blind. “Precisely solution” in the definition of BITE includes the moral imperative of not misusing software for intent other than its design and picking software that fits the current business life cycle of a company. (Those of you with software misuse problems, I will be posting a number you can call to get help. Remember, the first step is admitting you have a problem.)

The application stack for EPM/BI+ includes HFM, Essbase (with all its add-on modules), Smart View, OBIEE, OBAW, FDM, DRM, ODI, and a few products you might not have heard about, or have heard about but never assessed for your purposes. NO, NO, No, no folks, this is not a software sales blog; it’s a solutions blog, and in our solutions toolbox we need to do more than use a single hammer creatively if we want to remain competitive from an efficiency and business life cycle standpoint.

The Personalities in the Buffet Line

Now that we have some parameters (and I know it was painful for you left-brainers) by which we can solution, we need some realistic company situations to solution. Let’s start with four companies, each different in their business life cycles, staff sizes, and demands for a BITE at success. You can email me if you will absolutely die without a very specific company example; however, I cannot boil the ocean here in this blog (small ponds are all that will be possible).

Our four companies need to be different so we can see the solutions at work. Let’s pick a manufacturer, a technology company, a retailer, and a commodity group. In my next installment we will outline the companies, their missions, their needs, and their resources.