We’ve Got You Covered: Producing Flat-File Extracts out of Cloud Data Management

As EPM Administrators or Implementation Specialists, we have all had that moment when someone comes to us and asks for the dreaded extract out of an Enterprise Performance Management (EPM) application.  Depending on the system combination (Hyperion Financial Management (HFM), Planning, etc.) and the file layout specifications, this can be tricky.  Layer in the concept of a Cloud application, and things have now gotten real!

In an on-premise installation of Financial Data Quality Management Enterprise Edition (FDMEE), we could use scripting within a “custom application” to build an end-to-end approach for delivering a flat-file extract for third party consumption.  With the release of version 19.06, Oracle has further enhanced this concept and brought it to Cloud Data Management (CDM). The Cloud application now provides the ability to design and produce a text file for downstream consumption in Oracle Cloud products (PCMCS, EPBCS, FCCS).  WHOA!

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 1

The Setup

I recently took the functionality for a spin, and this is what I discovered:  it’s crazy simple!

  1. Create a text file with your defined headers

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 2

  2. Create a target application and configure its settings

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 3

  3. Create an import format

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 4

  4. Create a Location & Data Load Rule (DLR)

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 5

  5. Create the desired Maps
  6. Run the Data Load Rule

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 6

It is that simple!  CDM produces a file that looks similar to this:

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 7

It can PIVOT!?

As crazy as it sounds, it can even pivot the data!  I find this extremely helpful, as it is a common request to have twelve months of data in column format.  CDM leverages the database’s PIVOT command for this process and creates the pivot file with ease and efficiency.

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 8
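To picture what that reshaping does, here is a minimal, purely illustrative sketch in Python/pandas of the row-to-column transformation.  CDM performs the equivalent step inside the database with SQL PIVOT, so none of the names or values below come from the product; they are placeholders only.

```python
# Illustrative only: CDM pivots rows to columns inside the database with SQL
# PIVOT; this pandas sketch just shows the shape of that transformation.
import pandas as pd

rows = pd.DataFrame({
    "Account": ["4100", "4100", "4100", "5200", "5200", "5200"],
    "Period":  ["Jan",  "Feb",  "Mar",  "Jan",  "Feb",  "Mar"],
    "Amount":  [100.0,  110.0,  95.0,   40.0,   42.0,   44.0],
})

# One row per account, one column per period -- the "months across the columns" layout.
pivoted = rows.pivot(index="Account", columns="Period", values="Amount")
print(pivoted)
```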

What does it do behind the scenes?

Behind the scenes, CDM appears to run a standard import and validation of the data, but it leverages a different set of workflow instructions.  Unmapped items do not block the process; they are simply left as blank fields in the output.

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 9

It also does not permanently store any data in the CDM repository unless you want it to.  The documentation can be easily misinterpreted because you will see “fish,” but no data is stored (more on this later).  A quick review of the process details log shows that all the work is done in the “tDataSeg_T” table.  This is the “temporary working” table of Data Management, and it is cleared after/before each new run for optimal performance.  Since the data is never moved out of this table, it is never retained.  Even the export process that produces the output file pulls from the temporary table.

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 10

A review of the documentation shows that there are three main supported processing types:

  • Simple (the option selected here) – Does all the work in tDataSeg_T and does not retain any data or archive maps. Be warned, though: it does retain the process details and “fish” status, which can look a bit strange.

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 11

  • Full No Archive – Data is retained in tDataSeg only after the import step. Data is deleted after the export.
  • Full – All data is retained. Full process flows are supported (check rules, drill down, etc.).

That’s great, but my file is stuck in the Cloud!

Not really…let’s think this through as a workflow process.  When using Cloud applications, we might have an automation wrapper or a larger workflow process.  If not, we are using the graphical user interface (GUI), and we can access the file in two ways:

  1. Data File Explorer
  2. Process Details -> Download File option

Wayne Paffhausen - Weve Got You Covered - 7-26-19 Image 12

If we are using a more automated approach, we just need to include additional steps to do the following (a rough sketch appears after the list):

  1. Monitor the data load rule for completion
  2. Verify the status of that completion (do not proceed forward if it failed; do something different)
  3. Confirm that the file was created
  4. Download the file that was created
  5. Continue the automated routine
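For script-based automation, a rough Python sketch of steps 1 through 4 might look like the following.  This is a sketch only: the endpoint paths, payload fields, and numeric status codes are assumptions based on the EPM Cloud Data Management REST API and should be verified against the REST API documentation for your release, and the rule name, periods, and credentials are placeholders.

```python
# Hypothetical sketch: run the data load rule that produces the extract, poll it
# to completion, then hand off to the download step. Endpoints, payload fields,
# and status codes are assumptions to verify against the EPM Cloud REST API docs.
import time
import requests

BASE = "https://<SERVICE_NAME>/aif/rest/V1"           # Data Management REST base URL
AUTH = ("svc_automation@example.com", "Password123")  # dedicated service account (placeholder)

# 1. Kick off the data load rule behind the flat-file target application.
job = requests.post(f"{BASE}/jobs", auth=AUTH, json={
    "jobType":     "DATARULE",
    "jobName":     "DLR_FlatFile_Extract",            # placeholder rule name
    "startPeriod": "Jan-19",
    "endPeriod":   "Dec-19",
    "importMode":  "REPLACE",
    "exportMode":  "STORE_DATA",
}).json()

# 2. Monitor the rule and verify the outcome (-1 = running, 0 = success here).
status = -1
while status == -1:
    time.sleep(30)
    status = requests.get(f"{BASE}/jobs/{job['jobId']}", auth=AUTH).json()["status"]

if status != 0:
    raise RuntimeError(f"Data load rule ended with status {status}; do something different")

# 3.-5. Confirm the output file exists, download it (e.g. via EPM Automate's
# downloadfile command or the migration REST API), and continue the routine.
```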

In Summary…

It is simple to produce a file using Data Management in the EPM Cloud products.  This is a welcome change that further enhances the product lines by delivering on client needs.  It allows us to build in the Cloud a simplified version of a solution that was previously available only on-premise.

If you need more information or have questions about this topic, email us at infosolutions@alithya.com.  Subscribe to receive notifications about new posts.  Follow Alithya on social media for the latest information about EPM, ERP, and Analytics solutions to meet your business needs.

Twitter  |  Linkedin  |  Facebook  |  Youtube

Alithya Leverages the Power of Oracle Hyperion FDMEE

One of the biggest challenges for every organization nowadays is to provide reliable data for a clear business outlook. This essential activity is more critical than ever now that the solutions for hosting data are increasingly varied, with multiple scenarios involving on-site hosting, Cloud, and hybrid solutions. However, there are solutions that allow companies to navigate efficiently and seamlessly among the different hosting options. Alithya Group (NASDAQ: ALYA, TSX: ALYA) (“Alithya”) is well positioned to advise its clients on this topic.

Efficient management of data requires solid know-how.

As companies attempt to develop long-term guidance in this area, Alithya ensures that its clients’ data hosted in different environments continues to be used effectively. Alithya’s Data Governance and Integration practice includes Data Integration specialists who help clients free up resources by leveraging FDMEE for data validation, and who maximize FDMEE through its financial data application review offering.

Alithya’s Tony Scalese published a book providing deeper understanding of FDMEE.

Banking on the numerous mandates entrusted to it by clients as a market-leading provider of Oracle Enterprise Performance Management Platform solutions, the company leverages the power of Oracle Hyperion Financial Data Quality Management, Enterprise Edition (FDMEE) to help organizations enhance the quality of internal controls and reporting processes. The extensive Alithya team specializing in FDMEE solutions counts among its ranks a widely recognized expert in the market, Tony Scalese, VP of Technology at Alithya and Oracle ACE, who published The Definitive Guide to Oracle FDMEE [Second Edition] in May 2019.

Connecting current on-premise and future Cloud solutions.

“As thought leaders, we are committed to providing essential resources to help clients enhance the quality of internal controls and reporting processes,” stated Chris Churchill, Senior Vice President at Alithya. “Our Data Governance and Integration practice aligns offerings with best practices and includes a team of dedicated experts as well as some of the most comprehensive resources in the industry.”

Sharing real-world FDMEE deployment strategies.

Tony Scalese’s keen interest in data integration, and his desire to share his extensive knowledge with as many interested parties as possible, led him to publish books on Oracle FDMEE. Following a very successful first edition in 2016, he has just launched the second edition of The Definitive Guide to Oracle FDMEE. Since many organizations are now considering or have begun migrating to the Cloud, the book provides a deeper understanding of FDMEE by covering such topics as batch automation, Cloud & hybrid integration, and data synchronization between EPM products.

“FDMEE can integrate not only with on-premise applications, but also Oracle EPM Software as a Service (SaaS) Cloud Service offering,” says Tony. “It provides the foundation for Cloud Data Management and Integrations which are embedded in each of the EPM Cloud Services.  A deep understanding of FDMEE ensures that integrations built on-premise or in the Cloud function well and stand the test of time.”

Retro Reboot #1: Set It & Forget It – Scheduling FDMEE Tasks

As with most nostalgic items, reboots are the next best thing. From video game consoles to television shows, they are all getting a modern facelift and a new prime-time seat on television.  I have jumped on that bandwagon to revitalize a previous post authored by Tony Scalese: Set it & Forget It – Scheduling FDM Tasks.

As with most reboots, there must be flair and alluring content to capture old and new audiences. Since Oracle Financial Data Quality Management Enterprise Edition (FDMEE) has been in the Enterprise Performance Management (EPM) space for a while and has moved into the Cloud, this is a great time for its reboot!

Oh Great…A Reboot. Now What?

Scheduling tasks in FDMEE has never been easier. Oracle provides several ways to do this for a variety of out-of-the-box activities.  Is there a report that you want to run and email every hour?  Or how about a script that needs to run hourly?  Or maybe batch-automation every 15 minutes?  No worries!  FDMEE can handle all of that with out-of-the-box functionality.

Let us pause for a moment and determine what is needed to make this happen:

  1. Is there a business case and justification for what is about to be scheduled?
  2. Who benefits and how will they be notified of the results?
  3. Is there a defined frequency for which the activity must take place?

Getting Started

First, understand that scheduling in FDMEE is built directly into the Graphical User Interface (GUI) anywhere you see the “SCHEDULE” button. Unlike its FDM predecessor, which required an independent utility to be installed and configured, having it available via the web removes some complexity.

A word of caution:  while this screen allows items to be scheduled, there isn’t a screen that shows what has already been scheduled.  To see that, access to Oracle Data Integrator (ODI) is needed, but more on this later.

Initially, the screen shows the types of schedules that can be created and their relevant inputs.

Retro Reboot Screen Shot 1

Below is a reference guide to outline FDMEE’s scheduling capabilities.

  • Simple – Inputs: TimeZone, Date, HH:MM:SS, AM/PM. Single run based on the specified inputs. Example: run on 08/02/2018 at 11 AM.
  • Hourly – Inputs: TimeZone, MM:SS. Repeatable run at the specified MM:SS each hour. Example: run every hour at the 22-minute mark.
  • Daily – Inputs: TimeZone, HH:MM:SS, AM/PM. Runs every day at the specified time. Example: run every day at 11 AM.
  • Weekly – Inputs: TimeZone, Day of the Week, HH:MM:SS, AM/PM. Runs on each specified day at the specified time. Example: run every Monday through Friday at 11 AM.
  • Monthly (day of month) – Inputs: TimeZone, Date, HH:MM:SS, AM/PM. Runs on the specified day of the month at the specified time. Example: run on the 2nd day of every month at 11 AM.
  • Monthly (week day) – Inputs: TimeZone, Iteration, Weekday, HH:MM:SS, AM/PM. Runs on the specified occurrence of the weekday at the specified time. Example: run every third Tuesday at 11 AM.

Why Does the Job Run Under My UserID?

That is because the system runs the job under the credentials of the user who created the schedule. What can go wrong with that, right?!  Well, if that user no longer exists or the password is changed, the existing jobs will no longer run.

The following considerations should be observed:

  1. Dedicate a service account, not tied to an employee, for server/automation actions.
  2. This account can be a “native” user; since the account is only used internally for EPM products, a domain account is not needed.
  3. Non-expiring passwords are best.

It is Scheduled…Now What?

After the item is scheduled, what really happens? The action executes at the scheduled time!  Actions can easily be monitored via the FDMEE Process Details screen.  Now all the possibilities of scheduling the following can be explored:

  1. Data Load Rules
  2. Script Executions
  3. Batch Executions
  4. Report Executions

Also, as mentioned earlier, there is no way to see the schedules inside of FDMEE. That information can be retrieved in a few ways.  The easiest way to see what is scheduled is to use the ODI Studio.

The ODI Studio provides details as seen in the screen shot below:

Retro Reboot Screen Shot 2

Any scheduled tasks will be listed under “All Schedules.” Simply double click them to obtain details related to that task.

Retro Reboot Screen Shot 3

Another effective option is to write a custom report that displays the information. My previous blog post, Easy Value with FDMEE Reports, provides further details of FDMEE report options and their value.  This approach allows a user-friendly report of scheduled tasks to be generated on demand.

Seriously … What Now?

By now, you may have noticed from the previous blog post Scheduling FDM Tasks – A Second Option by Tony Scalese that the upsShell process is quite handy.  It allows other tools to control FDM jobs…perhaps through a corporate scheduler.  Now that most organizations have a corporate scheduler, the equivalent FDMEE options below are worth learning:

  • executescript.bat / .sh – Executes an FDMEE custom script
  • importmapping.bat / .sh – Imports maps from a text file
  • loaddata.bat / .sh – Executes a Data Load Rule
  • loadhrdata.bat / .sh – Executes an HR Data Load Rule
  • loadmetadata.bat / .sh – Executes a Metadata Load Rule
  • runbatch.bat / .sh – Executes a defined batch
  • runreport.bat / .sh – Executes a defined report

*All files are stored in the EPM_ORACLE_HOME\products\FinancialDataQuality\bin\

In the example below, the command, when launched, executes a Data Load Rule for Jan-2012 through Mar-2012:

Retro Reboot Screen Shot 4

There still must be a better solution…right? Things to overcome:

  1. What happens if the scheduler is Windows-based and the server is Linux?
  2. How does a separate scheduling server communicate with EPM? Does it have to be installed on each EPM Server?
  3. How can we monitor and get details of a job once it is kicked off?

What Happens if You Don’t Want to Run the .BAT/.SH Files?

You’re in luck! With the introduction of new functionality to FDMEE, RESTful APIs are also now available.  With the RESTful APIs, not only can you execute a job, but you can also loop and monitor for the results.  This enhances the previous .BAT/.SH file routines and provides a cleaner and more elegant solution.

  • Running Data Rules – Execute a Data Load Rule
  • Running Batch Rules – Execute a Batch Definition
  • Import Data Mapping – Import maps
  • Export Data Mapping – Export maps
  • Execute Reports – Execute a Report

*URL construct: https://<SERVICE_NAME>/aif/rest/V1

The example below simply queries for a process:

Retro Reboot Screen Shot 5
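As a rough companion to that screenshot, a status query in Python might look something like the sketch below.  The endpoint path, the process ID, and the response field names are assumptions to confirm against the REST API documentation for your release, not the exact call shown above.

```python
# Hypothetical sketch: query a single Data Management process/job over REST.
# The /jobs/{id} path and the response fields are assumptions to verify in the docs.
import requests

BASE = "https://<SERVICE_NAME>/aif/rest/V1"
AUTH = ("admin_user", "Password123")   # placeholder credentials

PROCESS_ID = 1881                      # placeholder process id from Process Details
resp = requests.get(f"{BASE}/jobs/{PROCESS_ID}", auth=AUTH)
resp.raise_for_status()

details = resp.json()
print(details.get("jobStatus"), details.get("logFileName"))
```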

The Future…

As Oracle moves forward to enhance the RESTful APIs, many doors continue to open for FDMEE and tool scheduling. At Edgewater Ranzal, we fully embrace the RESTful concept and continue to evolve our solutions to utilize this functionality.  The result is improved support and flexibility for FDMEE and for the future of Oracle Cloud products.

Contact us at info@ranzal.com with questions about this product or its capabilities.

When FDM Isn’t an Option…Using Essbase to Map Data

There are times when you do not have the option of using FDM to do large data mapping exercises prior to loading data into Essbase. There are many techniques for handling large amounts of data mappings in Essbase; I have used the technique outlined here several times for large mappings, and it continues to exceed my expectations from a performance and repeatability perspective.

Typically, without FDM or some other ETL process, we would simply use an Essbase load rule to do a “mapping” or a replace. However, there are times when you need to do a mapping based on multiple members; for example, if account = x and cost center = y, then change account to z.
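To make the requirement concrete, the logic is essentially a lookup keyed on more than one dimension.  Here is a minimal, purely illustrative Python sketch of that idea; the member names are placeholders, and the actual technique described below is implemented with Essbase loads and a calculation rather than a script.

```python
# Illustrative only: "if account = x and cost center = y then change account to z"
# expressed as a lookup keyed on two dimensions. Member names are placeholders.
mapping = {
    ("0012013", "2CC"): "2",   # (source account, cost center) -> target account
    ("0012021", "2CC"): "2",
}

def map_account(account: str, cost_center: str) -> str:
    # Fall back to the original account when no multi-dimension rule applies.
    return mapping.get((account, cost_center), account)

print(map_account("0012013", "2CC"))   # -> "2"
print(map_account("0099999", "9CC"))   # -> "0099999" (left unmapped)
```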

Let’s first start with the dimensionality that is in play based on the example below: Time, Scenario, Type, NOU, Account, Total Hospital, and Charge Code

Dimension        Type     Members in Dimension   Members Stored
Time             Dense    395                    380
Scenario         Dense    13                     6
Type             Sparse   4                      4
NOU              Sparse   25                     18
Account          Sparse   888                    454
Total Hospital   Sparse   5523                   2103
Charge Code      Sparse   39385                  39385

You then need to be able to identify the logic of where the mapping takes place.  I want to keep the mapping data segregated from all other data, so I load it to a Mapping scenario (Act_Map).  I load a value of ‘1’ to the appropriate intersection, always at level 0.  Since the mapping applies to all Period detail, I load to a BegBalance member.  The client will then update this mapping file on a go-forward basis as new mapping combinations arise.

Here is a sample of the mapping file that gets loaded into Essbase:

NOU   STATUS   Revised DEPT   ACCT #   CDM       Data
SLJ   IP       2CC            2        0012013   1
SLJ   IP       2CC            2        0012021   1
SLJ   IP       2CC            2        0012062   1

Here is what it looks like when you do a retrieve.  For 4410CC->2600427->IP->67->SVM there is a value of 1, and for 4410CC->2600435->IP->67->SVM as well.

Essbase Mapping

The next step in the process is to load the actual data that ultimately needs to be mapped. I load this data based on the detail and dimensionality I have, again at level 0.  In my experience, the data is missing a level of detail (GL account for project-based planning, Unit/Stat for charge master detail, etc.), so this data gets loaded to a specific “No_Dimension” member (via a load rule) or to a generic member. Again, I load this data to a separate scenario as well (Act_Load).

In the example below you will see I am loading Account detail (67 & 68 in the above screenshot) to the Stat_Load member. The data comes across missing the account detail.

essbase mapping

The final step is to calculate the Actuals scenario based on the two scenarios above. You will see that after we run the calculation, Current Yr Actuals is calculated correctly, with the data residing where it should.

essbase mapping
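Conceptually, the calculation walks the level-0 intersections that carry a mapping flag in Act_Map and moves the value loaded against the placeholder member in Act_Load to the correct target member in Actuals.  A rough, purely illustrative Python sketch of that idea follows; in practice this is an Essbase calculation across the two scenarios, and the member names and values below are placeholders drawn from the screenshots.

```python
# Illustrative only: combine Act_Map flags with Act_Load data to build Actuals.
# Keys are simplified to (charge_code, account); the real cube has more dimensions.
act_map  = {("2600427", "67"): 1, ("2600435", "67"): 1}      # 1 = mapping applies
act_load = {("2600427", "Stat_Load"): 125.0,                  # data arrives without
            ("2600435", "Stat_Load"): 340.0}                  # the account detail

actuals = {}
for (charge_code, account), flag in act_map.items():
    value = act_load.get((charge_code, "Stat_Load"))
    if flag == 1 and value is not None:
        # Re-home the value from the placeholder member to the mapped account.
        actuals[(charge_code, account)] = value

print(actuals)   # {('2600427', '67'): 125.0, ('2600435', '67'): 340.0}
```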

Keeping all the data segregated in different scenarios allows you to easily clear data should anything be wrong with one of the loads, thereby keeping the other datasets intact. This process runs on the entire year in less than 2 minutes and not only performs the calculation but also does an aggregation for the Current Yr Actuals.

Come See Edgewater Ranzal at Kscope11

ODTUG Kscope11 is right around the corner. Kscope11 offers the chance for a full day EPM Symposium on Sunday, plus the opportunity to learn from experts in the EPM and BI fields on a wide range of topics.

Edgewater Ranzal will be well represented at the conference, with our associates presenting eight presentations covering Planning, DRM, EPMA, HFM, and FDM. The sessions that we will be presenting at Kscope11 are summarized below. Each title links to an abstract for the presentation, providing additional details.

Session No. | Date | Time | Room | Presenter | Title
1 | 6/27/11 | 11:15 – 12:15 | 102C | Jeff Richardson | Calculation Manager: The New and Improved Application to Create Planning Business Rules
7 | 6/28/11 | 11:15 – 12:15 | 103C | Tony Scalese | Planning (or Essbase) and FDM and ERPi Equals Success!
10 | 6/28/11 | 4:30 – 5:30 | 101B | Chris Barbieri | Security and Auditing in HFM
11 | 6/29/11 | 8:30 – 9:30 | 103A | Patrick Lehner | Best Practices for Using DRM with EPMA
11 | 6/29/11 | 8:30 – 9:30 | 101B | Chris Barbieri | Getting Started with Calc Manager for HFM
12 | 6/29/11 | 9:45 – 10:45 | 101B | Chris Barbieri | Advanced Topics in Calc Manager for HFM
12 | 6/29/11 | 9:45 – 10:45 | 102C | John Martin | Have it Your Way: Building Planning Hierarchies with EPMA or Outline Load Utility
13 | 6/29/11 | 11:15 – 12:15 | 101B | Tony Scalese | Maximizing the Value of an EPM Investment with ERPi, FDM, & EPMA
17 | 6/30/11 | 8:30 – 9:30 | 101B | Tony Scalese | Taking Your FDM Application to the Next Level with Advanced Scripting
18 | 6/30/11 | 10:30 – 11:30 | 101B | Peter Fugere | IFRS Reporting Within Hyperion Financial Management

In addition to the presentations above, you can catch up with our experts at our booth in the Vendor Showcase.

We look forward to seeing you in Long Beach. If you haven’t already registered, you can do so here.

Business Intelligence Technology Environment – Welcome to the Buffet

Business Intelligence Technology Environment or BITE is my own little tag line and acronym (maybe I should copyright it) to express the host of solutions available in the Business Intelligence application world today. (It could also be used as a verb to describe the plethora of poorly designed solutions… ahh but that is another story.)

My current blog series will be Oracle EPM/BI+ solution-centric while remaining Oracle EPM/BI+ application-agnostic (now dictionary.com is paying off). I hope that you will enjoy this real-life approach to the process of decision making on software solutions, interspersed with some genuine tips and tricks of the trade — some that you have seen before and some you have never imagined.

In other words, I hope that you will not find this blog to be represented by my newly coined acronym — BITE.

Rules of conduct while at the Buffet

First we need a definition. Yes, a definition! Don’t be afraid; definitions are a good thing: they keep us grounded, they set limits, and finally they determine if we are true to our mission. I define BITE as the processes, software, and goals needed to precisely solution the business data critical to the legal, accounting, and business decision needs of a specific entity.

Inventive techno junkies, single-tool consultants, and one-track sales people – CLOSE YOUR EYES / SHIELD YOUR COMPUTERS for this next statement or you might go blind. “Precisely Solution” in the definition of BITE includes the moral imperative of not misusing software for intent other than its design and of picking software that fits the current business life cycle of a company. (Those of you with Software Misuse problems: I will be posting a number you can call to get help. Remember, the first step is admitting you have a problem.)

The application stack for EPM / BI+: HFM, Essbase (with all its add-on modules), Smart View, OBIEE, OBAW, FDM, DRM, ODI, and a few products you might not have heard about or have heard about but never assessed for your purposes. NO, NO, no, folks, this is not a software sales blog; it’s a solutions blog, and in our solutions toolbox we need to do more than use a single hammer creatively to remain competitive from an efficiency and business life cycle standpoint.

The Personalities in the Buffet Line

Now that we have some parameters (and I know it was painful for you left-brainers) by which we can solution, we need some realistic company situations to solution. Let’s start with four companies, each different in their business life cycle, staff size, and demands for a BITE at success. You can email me if you will absolutely die without a very specific company example; however, I cannot boil the ocean here in this blog (small ponds are all that will be possible).

Our four companies need to be different to see the solutions at work. Let’s pick a manufacturer, a technology company, a retailer, and a commodity group. In my next installment, we will outline the companies, their missions, their needs, and their resources.