Troubleshooting Cloud Data Management Metadata Load Errors

In my last post, I highlighted a solution that was recently deployed for a customer that leveraged Enterprise Data Management Cloud Service (EDMCS), Financial Data Quality Management Enterprise Edition (FDMEE), and Cloud Data Management (CDM) to create an automated metadata integration process for both Planning and Budgeting Cloud Service (PBCS) and Financial Close and Consolidation Cloud Service (FCCS). In this post, I want to take a bit of a deeper dive into the technical build and share some important learnings.

Cloud Data Management introduced the ability to load metadata from a flat file to the Oracle EPM Cloud Services in the 17.11 patch. This functionality provides customers the ability to leverage a common platform for loading both data and metadata within the Cloud.  Equally important, CDM allows metadata to be transformed using its familiar mapping functionality.

As noted, this customer deployed both PBCS and FCCS. Within the PBCS application, four plan types are active while FCCS has the default two plan types.  A design decision was made for EDMCS to create a single custom application type that would store the metadata for both cloud applications.  This decision was not reached without significant thought as well as counsel with Oracle development.  While the pros and cons of the decision are outside the scope of this post, the choice to use a custom application registration in EDMCS ensured that metadata was input a single time but still fed to both cloud applications.  As a result of the EDMCS design decision, a single metadata file (per dimension) was supplied with properties necessary to support each plan type.

CDM leverages its 23 “dimensions” to store metadata information for processing. Exactly like data, metadata is imported using an import format into the CDM relational repository.  Each field from a metadata file is aligned to a CDM dimension field.  The CDM Account dimension always represents the target application member name, and the CDM Entity dimension represents the parent of the member.  All other fields can be aligned to any of the remaining 21 dimensions.  CDM Attribute dimensions can be utilized in the import and mapping process but are not available for exporting to the cloud application.  This becomes an important constraint, especially in a multi-plan-type deployment.  These 21 fields can be used to store any of the properties required to successfully load metadata to the target plan type.

Let’s consider this case study for a moment. The PBCS application has four plan types.  If a process were built to load all four plan types from a single CDM data load rule, the 21 available fields would allow no more than five plan-type-specific attributes or properties per plan type before we ran out of CDM fields/dimensions to store the relevant information.  This leads to an important design approach.  Instead of a single CDM data load rule to load all plan types, a data load rule was created for each plan type.  This greatly increased the number of metadata properties and attributes that could be loaded by CDM and ensured that future growth could be accommodated without a redesign of the integration process.

It is important to understand that CDM utilizes the Planning Outline Load Utility (OLU) to actually perform the metadata load to the cloud application. The OLU loads metadata using merge (yes, Planning experts, I realize that I am not discovering fire), which is important to understand, especially when processing multiple metadata loads for a single application.  When loading metadata, certain properties are application level.  I like to think of these as global.  Additionally, there are plan-type-specific attributes that can align (or not align) to the application-level value/setting.  I like to think of these as local.

Why is this important? Well, when loading a metadata file, if certain global properties are excluded from the metadata load file, the local properties (if specified) are used to default the global properties. Since metadata is loaded using merge, this only becomes problematic when a new member is being added to the outline.

In this particular example, an alternate hierarchy with shared members was specified in one of the plan types. The storage property of the alternates was obviously set as Shared; however, when attempting the metadata load, the following error was encountered:

A Base Member cannot be changed to a Shared Member.

After much investigation (details to follow), I discovered that the global property should also be included in the metadata load.

While CDM utilizes the OLU to load metadata, it does not provide as much verbosity in the error information as the PBCS web interface (which also uses OLU) when loading metadata. Below is an example of the error in the CDM process log.  As a tangent, I’d love to check the logs without needing to open a Service Request.  Maybe Oracle will build an enhancement that allows that in the future (hint, hint, wink, wink to my friends at Oracle).

Baha Mar - Error Handling 1

So where do I go from here? Well, what do we know about CDM loading metadata to the cloud application?  We know that CDM uses the OLU to load a flat file generated by CDM.  Bingo!  The metadata file output by CDM is a good starting point.  That file is in the Outbox of the CDM application and can be downloaded in several different ways – the CDM Import process (get creative, folks), CDM Process Details, or EPM Automate.  Now we have the metadata file and can test to determine if the error is caused by CDM or the metadata itself.  It’s all about ruling out variables.  So, we take the metadata file and import it manually within the PBCS web interface and are able to replicate the error.  But now we have an important new data point – the line number from the metadata file that is causing the error.

Baha Mar - Error Handling 2
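
As an aside, the metadata file retrieval described above is easy to script. Here is a minimal sketch driving EPM Automate from Python; the pod URL, credentials, and outbox file name are illustrative assumptions – grab the actual file name from CDM Process Details:

```python
import subprocess

# Illustrative values - substitute your pod URL, service account,
# and the actual outbox file name shown in CDM Process Details.
POD_URL = "https://planning-mycompany.pbcs.us2.oraclecloud.com"

def epmautomate(*args):
    """Run an EPM Automate command and raise if it fails."""
    subprocess.run(["epmautomate", *args], check=True)

epmautomate("login", "svc_integration", "password_file.epw", POD_URL)
# Pull down the metadata file that CDM generated in its outbox
epmautomate("downloadfile", "outbox/EDMCS_Account_Metadata.csv")
epmautomate("logout")
```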

Now that we have actionable information, we can review each property and start isolating and eliminating different variables. We determined that this error was only occurring for new alternate hierarchy parents being added to the outline.  As a test, we added the global storage property and voila, the metadata load completed successfully.  Face palm!
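
To make the fix concrete, here is an illustrative sketch of the load file shape. The member names are invented, and the headers merely follow the OLU convention for an Account dimension with a plan type named Plan1 – check the file exported by CDM for the exact labels in your application:

```
Before (fails when the new alternate hierarchy member is added):
Account,Parent,Data Storage(Plan1)
Alt_Revenue,Alt_Hierarchy,shared

After (loads successfully - the global Data Storage property is included):
Account,Parent,Data Storage,Data Storage(Plan1)
Alt_Revenue,Alt_Hierarchy,shared,shared
```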

Maybe this would have been obvious to folks with a lot of Planning experience, but there are plenty of folks learning the intricacies of Planning and Essbase (including our friends converting from HFM to FCCS), so I wanted to share a lesson learned in my journey of metadata integration using CDM.

CDM functionality for metadata represents two of the three primary operations of ETL – transform and load. In my next post, we’ll dive deeper into how the extract component of ETL was accomplished to provide a seamless end-to-end ETL solution for metadata.

Patch Today! Don’t Delay! Best Reasons to Upgrade Your EPM System

Putting off that upgrade to 11.1.2.4? Cloud not whetting your appetite for patches? Patch today. Don’t delay!

“But we’re going to the Oracle EPM Cloud soon!” you say. You should maintain your patches anyway. With the recurring maintenance, updates, and patches available to the EPM Cloud products, expect the on-premise patches to contain similar updates. An upcoming conversion to Oracle EPM Cloud products may benefit from running the latest on-premise codelines.

If you have an existing on-premise installation of Oracle EPM System, be sure to maintain the latest EPM System Patch Set Updates every 3 to 6 months. Here are a few great reasons why:

New Features

Patches often contain reactive bug resolutions to known issues; however, we have also been seeing new functionality released in patches for 11.1.2.4.

You Own It

You already pay for it! As long as your Oracle Maintenance contract is current (very likely if you are reading this article), you’re already paying for access to patches. Why leave them unapplied? You are running legacy code when the latest version costs you nothing additional. Windows XP was a great OS, but we’ve got to keep up with the times.

Supportability

Maximize your success by reducing time to resolution on your issues. Should you submit a support request to the vendor, the first line of response to a ticket is often about current patch levels. Once provided, the subsequent reply frequently contains a recommendation to apply the latest Patch Set Updates (PSUs) to see if that fixes the issue. Annoying? Perhaps you’re a pessimist. Or have just been remiss with your patching. I’ve certainly changed my mind on the matter and can better side with them. The reason? Supporting the latest codeline is more efficient and effective for the vendor. Your problem may have already been addressed in a code fix. They can better and more quickly support you if they are troubleshooting the current release instead of legacy code.

Stability

In older versions, patches seemed to come out on a haphazard schedule. Over the last few years, Oracle has streamlined EPM System patch releases – typically releasing Patch Set Updates (PSUs) quarterly, which are different from Patch Set Exceptions (PSEs). A PSU is a grouping of many PSEs – or a few more significant ones – that are regression tested collectively by the vendor and released under a singular patch. We’ve gained a much higher degree of confidence with this bulk model of PSUs. The organized release schedule and bug fixes are more dependable and greatly appreciated. The PSU model provides less ambiguity on which patches to apply and brings greater stability to all customers.

Upgrade

Maybe it’s bigger than patching. Are you not on version 11.1.2.4 of your EPM System? Compliance with enterprise IT requirements around browser versions and operating systems is often the impetus for an upgrade. But there are also plenty of compelling new software features, functions, conventions, and improvements in 11.1.2.4.

Operating System (OS) support for current platforms maximizes your investment and supportability. When 11.1.2.4 came out, many customers were forced to upgrade their older systems for compliance with the latest enterprise standards for server operating systems and/or client browser versions. Instead of being faced with an IT-mandated technology upgrade, an upgrade on the business’ schedule is preferred.

What Kind of Effort is Involved?

The comprehensive effort to bring a simple deployment (3-4 servers, no High Availability) up to the latest PSUs is typically less than a day per environment. That includes an analysis of existing patches, the patching itself as well as any prerequisites, and a post-check verification to confirm all patches applied are properly indicated in the corresponding inventories.

An initial patch application may take a little bit longer because there are often common prerequisites to address that don’t have to be handled with subsequent patching. There are also considerations like bringing WebLogic up to the latest patch level, as well as one-offs like the fixes for the Equifax-discovered vulnerabilities, that don’t happen frequently. Once you’ve got a solid base of primary critical patching, additional patching events are typically shorter.

Patching can be tricky. Documentation can often be ambiguous, whether through an unintended omission or assumed knowledge based on an implied experience or understanding of the product. Sometimes post-install instructions get skipped or SQL statements do not get executed properly as part of the patch. Less experienced resources typically patch only the EPMSystem11R1 Oracle Home; however, did you know that Oracle’s ADF framework also has an OPatch directory under oracle_common? Its patches are often prerequisites for the EPM PSUs. And what about Oracle Data Integrator (ODI) and Oracle HTTP Server (OHS)? They may also have applicable OPatches. Who knows what you’re missing? We do! Let’s button it up.
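
If you want a quick look at what is actually applied across all of those homes, OPatch itself can tell you. A rough sketch (the home paths are illustrative of a typical 11.1.2.4 layout; adjust for your environment):

```python
import subprocess
from pathlib import Path

# Illustrative Oracle Home locations for a typical EPM 11.1.2.4 install.
MIDDLEWARE_HOME = Path("/u01/Oracle/Middleware")
ORACLE_HOMES = [
    MIDDLEWARE_HOME / "EPMSystem11R1",  # core EPM System home
    MIDDLEWARE_HOME / "oracle_common",  # ADF/common framework
    MIDDLEWARE_HOME / "ohs",            # Oracle HTTP Server, if present
]

for home in ORACLE_HOMES:
    opatch = home / "OPatch" / "opatch"
    if not opatch.exists():
        print(f"-- no OPatch found under {home}, skipping")
        continue
    print(f"== Patch inventory for {home} ==")
    # lsinventory prints the patches recorded against this Oracle Home
    subprocess.run([str(opatch), "lsinventory", "-oh", str(home)], check=False)
```

Comparing that output across environments (and against the latest PSU readmes) is exactly the post-check verification mentioned above.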

Contact us for more details.

Laser Tag for Cloud Analytics

A friendly game of laser tag between out-of-shape technology consultants became a small gold mine of analytics simply by combining the power of Essbase and the built-in data visualization features of Oracle Analytics Cloud (OAC)! As a “team building activity,” a group of Edgewater Ranzal consultants recently decided to play a thrilling children’s game of laser tag one evening.  At the finale of the four-game match, we were each handed a score card with individual match results and other details such as who we hit, who hit us, where we got hit, and hit percentage based on shots taken.  Winners gained immediate bragging rights, but for the losers, it served as proof that age really isn’t just a number (my lungs, my poor collapsing lungs).  BUT…we quickly decided that it would be fun to import this data into OAC to gain further insight about what just happened.

Analyzing Results in Essbase

Using Smart View, a comprehensive tool for accessing and integrating EPM and BI content from Microsoft Office products, we sent the data straight to Essbase (included in the OAC platform) from Excel, where we could then apply the power of Essbase to slice the data by dimensions and add calculated metrics. The dimensions selected were:

  • Metrics (e.g. score, hit %)
  • Game (e.g. Game 1, Game 2, Total)
  • Player
  • Player Hit
  • Target (e.g. front, back, shoulder)
  • Bonus (e.g. double points, rapid fire)

With Essbase’s rollup capability, dimensions can be sliced by any one item or at a “Total” level. For example, the Player dimension’s structure looks like this:

  • Players
    • Red Team
      • Red Team Player 1
      • Red Team Player 2
    • Blue Team
      • Blue Team Player 1
      • Blue Team Player 2

This provides instant score results by player, by “Total” team, or by everybody. Combined with another dimension like Player Hit, it’s easy to examine details like the number of times an individual player hit another player or another team in total. You can drill into “Red Team Player 1 shot Blue Team” or “Red Team Player 1 shot Blue Team Player 1” to see how many times a player shot a team or an individual player. A simple Smart View retrieval along the Player dimension shows scores by player and team, but the data is a little raw. On a simple data set such as this, it’s easy to pick out details, but with OAC, there is another way!

Laser Tag 1

Even More Insight with Oracle Analytics Cloud (OAC)

Using the data visualization features of OAC, it’s easy to build queries against the OAC Essbase cube to gain interesting insight into this friendly folly and, more importantly, answer the questions everybody had: what was the rate of friendly fire, and who shot whom? Building an initial pivot chart was as simple as dragging and dropping Essbase dimensions onto the canvas (game number, player, and score) and coloring by our Essbase metric “Bad Hits” – a calculated metric built in Essbase to show when a player hit a teammate. We quickly discovered who had poor aim…

Laser Tag 2

Dan from the Blue team immediately stands out, as do Kevin and Wayne from the Red team!  This points us in the right direction, but we can easily toggle to another visualization that might offer even more insight into what went on. Using a couple of sunburst-type data visualizations, we can quickly tie together who was shooting and who was getting hit – filtered to the same team, weighted by score, and color coded by team color.

Laser Tag 3

It appears that Wayne and Kevin from the Red Team are pretty good at hitting teammates, but it is also now easy to conclude that Wayne really has it out for Kevin while Kevin is an equal opportunity shoot-you-in-the-back kind of teammate!

Reimagining the data as a scatter plot gives us a better look at the value of a player in relation to friendly fire. By dragging the “Score” Essbase metric into the size field of the chart, correlations are discovered between friendly fire and hits to the other team.  While Wayne might have had the highest number of friendly fire incidents, he also had the second highest score for the Red team.  The data shows visually that Kevin had quite a few friendly fire incidents, but he didn’t score as much (it also shows results that allow one to infer that Seema was probably hiding in a corner throughout the entire game, but that’s a different blog post).

Laser Tag 4

What Can You Imagine with the Data Driving Your Business?

By combining the power of Essbase with the drag-and-drop analytic capabilities of Oracle Analytics Cloud, discovering trends and gaining insight is very easy and intuitive. Even in a simple and fun game of laser tag, results and trends are found that aren’t immediately obvious in Excel alone.  Imagine what it can do with the data that is driving your business!

With Oracle giving credits for a 30-day trial, getting started today with OAC is easy. Contact us for help!

A Safe Step into the Cloud: The Argument for Account Reconciliation Cloud Service (ARCS)

Before forecasting models, before fancy dashboards and pretty reports, before a data point is even considered “Actual” comes the age-old question…

                “Does this number even look right?”

Bulls*#!

Account reconciliations – the means by which this question is answered – are a fundamental part of the financial close process. Imagine you are trying to build a sandcastle. Now imagine your “sand” is harvested from a cow pasture. You *could* continue to build this “sandcastle,” but you will likely finish with a pile of…bull-sand. In the same way, if your account balances and transactions have an integrity equivalent to “bull-sand,” this will inevitably lead to problems down the line.


The shift to the Cloud has complicated the decision-making process when considering new enterprise-wide application tools. The choice of whether to go with a known “on-premise” solution or take a bold step into Cloud solutions is a daunting one, particularly when considering moving high-visibility cycles such as forecasting or financial consolidations into this brave new world.

A Justified Recommendation

Take the measured move instead. If you feel hesitant to go “all-in” on Cloud offerings, here are four reasons why you should consider entering the Cloud through the arch of ARCS…the ARCSway (Get it?…archway…ARCSway…never mind – just keep reading…)

Safe Bet on a Strong Foundation

Oracle introduced Account Reconciliation Cloud Service (ARCS) as the “one stop shop” solution for managing and streamlining the reconciliation cycle in the Cloud back in 2016. While it’s not uncommon for some EPM products to lose functionality during their initial transition into the Cloud space, ARCS retains the “good bones” of its on-premise counterpart – Account Reconciliation Manager (ARM). ARCS builds upon the clever functionality and customizability of ARM, released in 2012, yet with the slick look and feel of the Oracle Cloud experience.

Since its release, ARCS has become the “golden child” of the reconciliation product family, receiving not only “first dibs” on refinement of existing capabilities, but also benefiting from the newest components such as Transaction Matching (note: this has separate licensing than the Reconciliation Compliance component of ARCS).  As the product continues to gain steam, this trend is expected to continue. Between utilizing the tried-and-true foundation of the ARM tool and having Oracle’s watchful eye, ARCS is a safe bet.

No Mistakes with Modularity

Unlike some applications, ARCS is easy to implement in pieces. While good design will certainly prevent future heartache, there are no decisions made on Day 1 of a project that cannot be modified or enhanced in the future:

  • Want to manually enter data for reconciliations today, but automatically load them from a source system tomorrow? We can do this.
  • Missing fields for additional detail you would like users to include? Can be ready for next period (or the current one even!)
  • Only want to rollout in one country to start? No problem – go ahead and make the other entities jealous!

While some changes are “cleaner” than others (I am looking at you, Profile Segments!), ARCS welcomes you to “test the waters” and see what works in your company without needing to go “all-in.” For example, a current client has a live ARM application that provides a viable solution for its reconciliation process needs given the initial project timeline and budget. Although the client wasn’t able to fully utilize the available functionality at the time, the modularity of the reconciliation tools (both ARM and ARCS) allows the opportunity for enhancements without punishing this design decision – we are now revamping the client’s auto-reconciliation setup to further streamline the process. For Partners, this means additional project phases; for clients, this means not biting off more than you can chew (win-win!).

Want to dive deeper into this topic?  Read the blog posts in the series Modularity in Account Reconciliation Cloud Service (ARCS): No Mistakes from “Day 1” to “Day 100.”

Fast Implementation Cycles and Rapid ROI

Relative to other EPM project lifecycles, ARCS is typically a quick implementation. As with all projects, there are certainly exceptions, but with Ranzal’s “Quick Start” methodology, we have stood up applications in just six weeks! A strong inventory of project “accelerators” – custom tools and scripts that Ranzal has developed based on common requests across multiple clients – allows sophisticated deployments in a timely manner. Couple this with the inherent time-saving benefits of Cloud technology (e.g. no infrastructure setup), and ARCS shines as the first step in a Roadmap, producing tangible metrics for evaluation (e.g. completion percentages per period, timeliness per Preparer/Reviewer, reconciliation accuracy) and giving users a taste of the Oracle Cloud experience in a short period of time.

You Don’t Have Anything Today and It’s Costing You

I know that may read like a presumptuous fear tactic, but hear me out:

Account reconciliations ARE being completed in your company – one way or another. Whether that means your CPAs are *click*click* clicking away on their keyboards to manually update Excel spreadsheets or – heaven forbid – actually printing out recons to hand sign, if you cannot name the system that is comprehensively handling your reconciliation cycle, it’s because there isn’t one.

And this is normal. But there are costs associated with this normalcy.

Reconciliation cycles aren’t sexy (well…personal taste…) and often have low visibility to upper management. And yet (!) the reconciliation process is often widespread across the company spanning business entities, departments, and corporate ladders (I see you, Mr/s. Director signing off on recons). ARCS is an attractive option when considering enterprise-wide Cloud solutions to “test run” because everyone can try it. A successful ARCS implementation paves the way for easier adoption of future projects – it gets everybody onboard.

Step Through the “ARCSway” and Ditch the Bulls*#!

The shift to the Cloud is disrupting the traditional market of on-premise EPM solutions. As you look at the new strategic options available to your company’s roadmap, consider ARCS as a “first step.” Of note, it is important to have an accurate understanding of the tool – ARCS is first and foremost a management tool, and although it can provide helpful information in troubleshooting account variances, it does not replace actually performing a reconciliation in an ERP system. Additionally, customizing reports can be difficult (unless you are familiar with BI Publisher), although the out-of-the-box reports and strong dashboarding capabilities largely make up for this limitation. All in all, I strongly recommend this product as an introduction to the new Oracle offerings. ARCS’ “low risk, high reward” nature provides real company value quickly while presenting you with a good picture of life in the Cloud. Now is the perfect time to ditch your “bull-sand” reconciliation process and update to a more solid foundation in the Cloud through the ARCSway.

Contact us today for details about a custom Cloud solution for your business needs.


New…and Cool Features in EDMCS

Previously on The Wonderful World of Enterprise Data Management Cloud Service (EDMCS), we highlighted some of the features offered in this new product (released on Jan 25, 2018), including packaged application adapters for PBCS/EPBCS and the visual cues provided in the user interface as you modify master data. With the 18.03 release, subtle but helpful features have been added, and this post shares details of those along with useful tips resulting from actual project work done with one of three clients selected for the EDMCS Early Adopter Program. This client has fully embraced Oracle EPM Cloud by utilizing EDMCS, Financial Consolidation and Close Cloud Service (FCCS), Planning and Budgeting Cloud Service (PBCS), Account Reconciliation Cloud Service (ARCS), and Cloud Data Management (CDM), along with on-premise Financial Data Quality Management Enterprise Edition (FDMEE).

Incremental Dimension Import

You now have two options available for import: Full (reset dimension) or Incremental. Full operates as expected and will completely erase and replace your dimension; all history will be lost. That is certainly a helpful feature when first seeding EDMCS, but one to be used carefully.

Incremental allows you to incrementally update property assignments during an import. Why is that important or useful? Because sometimes another system is the point of entry and feeds a new extract to EDMCS. Those extracts are typically complete data extracts (not incremental), so the incremental feature allows you to update new or changed property assignments without completely erasing your dimension.

*One important clarification – even when performing an incremental import, the complete set of node relationships must be imported as all parent-child relationships will be replaced. The “incremental” piece of the import applies to updating the changed property values of those node relationships.

New Features in EDMCS 1

Metadata Object Search

The search feature for objects (views, node sets, hierarchy sets, applications, etc.) has been expanded to include name and description. Enter your search text, and any object that contains the matching text in either the name or description is returned in the search results.

Node Search

Like the metadata object search, the node search in a viewpoint has been enhanced to include name and description. You do not need to toggle between name and description like in Data Relationship Management (DRM); instead, enter your search text and any node containing that text in the name or description is returned.

Request Load Files

EDMCS provides the capability to directly load Excel files, and a great way to start creating your request file is to use the download feature in EDMCS. Download a viewpoint (either the entire viewpoint or from the selected node and its descendants) to Excel, and you now have the basis for your request file: the complete “hierarchy” in Excel including parent, child, and all properties with the correct property labels as column headers in Excel. You can then modify the Excel file by adding an Action Code column and including the relevant node relationships and properties you wish to include in your request file.

*One note: If you need to modify top nodes in a request load file, you must include the Parent Node Type column. The column can be blank for the top nodes, but the column itself must be present. You can always modify top nodes interactively in a request.
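
Putting those pieces together, a request load file ends up looking something like the sketch below. The columns are paraphrased from the download format, and the property column and member names are invented for illustration, so start from an actual EDMCS viewpoint download to get the exact headers for your application:

```
Action Code | Node   | Parent      | Parent Node Type | Description
Add         | E1000  | TotalEntity | Entity           | East region entity
Update      | E2000  | TotalEntity | Entity           | Revised description
Delete      | E9999  |             |                  |
```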

Summary

While features like incremental import and improved search may seem minor, it’s often the little things that end up making a big difference when you put them all together. And it’s exciting to see this type of functionality being added so soon after the initial release.

Check back regularly for new updates and insights as EDMCS continues to evolve and mature – we’ll keep sharing! We’d love to hear your questions and observations related to EDMCS and how it fits into the ecosystem of Oracle EPM Cloud, so please comment below or contact us to share your experiences.

Try Oracle Cloud for Free

Easy Value with FDMEE Reports

Strolling into work sipping coffee, the realization soon hits that information is needed out of Financial Data Quality Management Enterprise Edition (FDMEE) for internal audit.  After logging in to Data Management, what happens? We freeze!  And the questions begin swirling in our heads:  How do we get data out of FDMEE?  What are the drivers needed to do that?  What tools are needed to write an FDMEE report, and where do we get them?

At this point, it is often easier to evaluate the existing reports within the application for what they lack, and then modify and/or update them to meet our specific needs, than to create a report from scratch.

A Variety of Report Options

FDMEE Reports does not equal Financial Reports. From within the application, there are numerous report options to choose from.  Most of these are updated reports from FDM Classic.  Reports are organized into groups that categorize common reports together and provide information on the following:

  1. Audit Reports display all transactions for all locations that compose the balance of a target account
  2. Check Reports provide information on the issues encountered when data load rules are run
  3. Base Trial Balance Reports provide detail on how source data is processed
  4. Listing Reports summarize metadata and settings (such as the import format or check rule) by the current location
  5. Location Analysis Reports provide dimension mapping by the current location
  6. Process Monitor Reports show locations and their positions within the data conversion process
  7. Variance Reports display source and trial balance accounts for one target account, showing data over two periods or categories
  8. Intersection Reports identify invalid HFM data load intersections

Below is a screen shot of the default FDMEE report groups:

FDMEE Reports 1

Getting Started

While the canned reports are a great start, creating custom reports allows more creativity and only requires the following:

  1. Microsoft Word (2010+)
  2. Oracle BI Publisher 11.1.1.7 or 11.1.1.9
  3. Working knowledge of SQL
  4. Working knowledge of the FDMEE database tables

First, if you do not currently have Microsoft Word installed, this process isn’t going to work.  After confirming your version of Word, download the BI Publisher software from Oracle (http://www.oracle.com/technetwork/middleware/bi-publisher/downloads/index-101746.html).

After installing the software, a BI Publisher toolbar becomes available in Word:

FDMEE Reports 2

This is where the good nerdy stuff happens!  You write a query, via SQL Developer or SSMS, that can then be dropped into FDMEE.  When you test/validate in FDMEE, the query produces an XML file containing the first 100 rows.  This XML file is what you bring into BI Publisher (via Word) to build your report.  Below is a screen shot of the FDMEE-generated download for Word:

FDMEE Reports 3
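
For the curious, the query side is plain SQL against the FDMEE repository. The sketch below (wrapped in Python so it can be tested outside FDMEE) shows the kind of process-status query you might drop into an FDMEE Query Definition; the table and column names come from the FDMEE schema, but verify them against the schema reference for your release, and the connection details are purely illustrative:

```python
import cx_Oracle  # assumes an Oracle-hosted FDMEE repository

# The kind of SQL you would paste into an FDMEE Query Definition.
# Verify table/column names against your release's schema reference.
PROCESS_STATUS_SQL = """
SELECT p.PARTNAME      AS LOCATION,
       c.CATNAME       AS CATEGORY,
       l.PERIODKEY     AS PERIOD,
       l.PROCESSSTATUS AS STATUS
FROM   TLOGPROCESS l
       JOIN TPOVPARTITION p ON p.PARTITIONKEY = l.PARTITIONKEY
       JOIN TPOVCATEGORY  c ON c.CATKEY = l.CATKEY
ORDER  BY p.PARTNAME, l.PERIODKEY
"""

# Illustrative read-only connection to the FDMEE schema
conn = cx_Oracle.connect("fdmee_read", "password", "dbhost:1521/EPMDB")
try:
    for row in conn.cursor().execute(PROCESS_STATUS_SQL):
        print(row)
finally:
    conn.close()
```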

And YES! FDMEE CAN Accept Inputs

FDMEE reports can accept many prompts.  The information can be user input or a selection from a drop-down, and it can be gathered/compiled from multiple smaller report queries or from out-of-the-box drop-downs.  Below is a sample FDMEE report with input parameters:

FDMEE Reports 4

Ample Value

Custom FDMEE Reports can be valuable in many ways.  For example, reports can be written to:

  1. Provide Data Compare analysis for data validation activities
  2. Track how many times an end user has exported data for a specific period
  3. Download the maps for a location to Excel
  4. List all the Journals posted by period and category
  5. List all map modification activity by date range
  6. List all locations and categories and provide the status of each POV

Each of the report styles listed above has provided valuable information to both auditors and administrators of the FDMEE application.  One of the most valuable is the report that permits quick data validations and reconciliations because it helps with COA conversions as well as upgrades to the EPM suite.  Here is a sample of a custom journal listing report:

FDMEE Reports 5

…and a custom FDMEE process monitor report:

FDMEE Reports 6

The Verdict

The possibilities and uses of FDMEE for supplemental reporting are not limited to trial-balance analysis, trending, or variance reports. Reports are often created to provide additional valuable information for auditors, data workflow analysis, or external and downstream systems.  In many cases, they are used to provide additional and supplemental detail to IT or Financial auditors.  The verdict:  FDMEE Reports adds easy value with variety and simplicity.

Contact us at info@ranzal.com with questions about this product.

Speed-of-Thought, JD Edwards Data-Driven Decisions

In a recent trip to a hardware store, my son asked me why there are so many different types of screwdrivers – regular, Phillips, hex, etc. This led to a discussion around the fact that different jobs require different tools, and even though you may be able to use a regular screwdriver on a Phillips head screw, the job would be harder and would take longer to complete. He compared this to a time when he tried to do his math homework with a crayon instead of a pencil. He completed the homework, but his teacher could not understand half of it.

What does that have to do with JD Edwards and analytics? Well….

JD Edwards, like many other ERP tools, is great at what it is designed to do: it gives companies of all sizes – across many industries – an integrated set of applications that spans all aspects of the business, including sales, resource planning, supply chain, financials, and others. But analyzing the data generated in JDE can be pricey, inconvenient, and time-consuming. This deficit is actually the result of one of JDE’s greatest strengths:  its flexible architecture that allows organizations to customize their deployments for their business needs. This flexibility means reporting and analytics solutions must be customized as well, and that’s where the price and time come in.

The traditional resolution for this problem is to build a data warehouse that allows for the integration of multiple sources into a dimensional model geared toward analytics and reporting. This solution, however, has its own implications.  The process that populates the data warehouse (extract, transform, and load, or ETL for short) can take several hours to complete, making it impossible for the business to have access to near real-time data. Also, data warehouses can take months, if not years, to deploy, requiring heavy IT involvement and continuous maintenance, driving the total cost of ownership (TCO) up.

By now you may be thinking “But I also heard of analytics tools that can connect directly to my JDE database without a data warehouse. What about those?” Good thought!  But it is flawed.  ERP database schemas are designed for fast insert and update of records as opposed to analysis and reporting. This causes performance issues when you need to go through multiple tables and joins – potentially waiting minutes, if not hours – just to find out how many products you sold last week.  Additionally, this type of querying can put an extreme amount of stress on your JDE database, slowing down the processes that are required to keep your business running.

As a result, most organizations end up spending more money and time than anticipated to puzzle together a high-maintenance solution with multiple tools just to generate even the most basic reports.

So… what is the solution? Click here for a webinar that addresses these issues with a solution co-developed with our partner Incorta, whose groundbreaking analytics solve the issues found in legacy operational reporting and traditional enterprise business intelligence tools, and see a demonstration of Ranzal’s new Incorta-powered templates for analyzing JDE data, all of which are available out-of-the-box.

This webinar offers a great understanding of a quick-deployment solution that addresses your organization’s specific needs and customizations while eliminating the cost and frustration associated with months and years of development time – using the power of Incorta and Edgewater Ranzal’s years of experience deploying reporting solutions for JDE organizations.

Cloud Data Management (CDM) and Financial Data Quality Management Enterprise Edition (FDMEE): A Case Study in Working Together

Why buy Financial Data Quality Management Enterprise Edition (FDMEE) when Cloud Data Management (CDM) is free?  As outlined in my recent white paper – FDMEE vs. Cloud Data Management – there are myriad factors that can drive the decision.  This blog post highlights how one customer gained a highly flexible and automated solution for data and master data management with an on-premise deployment of FDMEE in conjunction with Cloud Data Management.

This customer adopted a pure Cloud strategy as it relates to Enterprise Performance Management (EPM) procuring subscriptions to Planning and Budgeting Cloud Service (PBCS), Financial Close & Consolidation Cloud Service (FCCS), and Account Reconciliation Cloud Service (ARCS).  A diverse business, the customer has many unique operational systems with varying formats and charts of accounts.  So far, no reason why Cloud Data Management (CDM) can’t handle this requirement, right?  This is what CDM does – uses import formats and maps to consume and transform data – right?  Sure, but with caveats.  Notice that I used the word consume and not extract.  CDM does not provide the ability to link with on-premise systems to extract data.  Additionally, flat file data extracts that lack a consistent structure often cannot be natively consumed by CDM.

In this case, data needs to be loaded each day from numerous sources to support daily operational reporting.  The systems are a blend of on-premise, hosted, and Cloud applications.  The customer requirement dictated that any on-premise system should be connected directly to eliminate the need for a flat file extract to be generated daily.  Additionally, the hosted and Cloud applications are very industry specific and, in some cases, provided by very niche vendors.  The ability to modify extract formats was cost prohibitive or simply not supported.  As a result, several of these data feeds were not consumable by CDM without preprocessing/modification.

In light of the above requirements, the customer procured and deployed FDMEE on-premise.  The power of FDMEE allowed us to deploy a solution that connects directly to multiple on-premise systems as well as consumes the flat file extracts from hosted and Cloud applications, including Excel files (not in the required FDMEE/CDM format) and XML.  Because FDMEE on-premise supports scripting, we were able to greatly enrich the data integration cycle with full end-to-end automation: FTP downloading of hosted data, detection of data mapped to members not yet in PBCS or FCCS, dynamically setting substitution variables based on the processing day, running calculations in PBCS, and sending email status alerts to outline the success or failure of a data load cycle.
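
As one example, the substitution variable piece can be reduced to a scripted EPM Automate call driven by the processing day. A minimal sketch (variable names, pod URL, and credentials are illustrative):

```python
import subprocess
from datetime import date

# Illustrative values - your variable names and pod URL will differ.
POD_URL = "https://planning-mycompany.pbcs.us2.oraclecloud.com"
today = date.today()

def epmautomate(*args):
    subprocess.run(["epmautomate", *args], check=True)

epmautomate("login", "svc_integration", "password_file.epw", POD_URL)
# Point the current-period variables at the processing day across all cubes
epmautomate("setsubstvars", "ALL",
            "CurrMonth=" + today.strftime("%b"),
            "CurrYear=FY" + today.strftime("%y"))
epmautomate("logout")
```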

Although I am a huge FDMEE advocate, I recognize the value of Cloud Data Management and the benefits it provides in a case like this one.  This customer was one of just three participants in the Oracle Enterprise Data Management Cloud Service (EDMCS) early adopter program.  This means that they were able to use the software before it was generally available (GA).  To participate in this program, one must accept that certain features and functions may be absent from the software.  The program allows the customer (and partner) to offer Oracle development and product management valuable input about the software and, in some ways, drive which features are prioritized within the product roadmap.

EDMCS currently lacks native connections to FCCS, but this will change over time.  So how does CDM help with loading metadata to FCCS?  In a recent update to CDM, Oracle included the ability to import a flat file into CDM and load metadata to a registered target application such as PBCS or FCCS.  John Goodwin gives a detailed overview of the technical setup.

FDMEE and CDM have come together in this case to provide a fully automated data integration process and an automated master data integration process.  Within EDMCS, a Custom application type was created.  The required properties for FCCS were built and attached to the multiple dimensions being mastered, and flat file exports were generated for FCCS.  We knew we were going to use CDM to manage the master data load process, but we had a decision to make – do we leverage EPM Automate or FDMEE as our automation hub?

We chose FDMEE.  Why?  Simply because a lot of automation assets had already been developed in FDMEE that could readily be reused for this process including execution of EPM Automate commands, a framework for leveraging the REST API (for PBCS and FCCS), and email alerting.  Additionally, we found the capabilities of EPM Automate to be somewhat limited.

For example, when you execute a CDM data load rule from EPM Automate, the process ID associated with the execution is not returned.  Why is that important?  Because in the event of a failure, I’d want to download the process log and attach it to the email so the user has information to address the issue.  Could I use the ListFiles command of EPM Automate to get the process log? Possibly, but it doesn’t account for potential concurrency, and I am not doing my job as a consultant if I build a process that can’t handle concurrent operations.  For reasons such as these, we leveraged EPM Automate when possible and the REST API as needed, and we wrapped it all together with an FDMEE process that could be executed on a scheduled basis or on demand simply by using the Script Execution functionality.
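
To illustrate the difference, here is a rough sketch of running a Data Management rule through the REST API with Python’s requests library. The endpoint and payload follow the published Data Management REST API; the URL, credentials, rule name, and periods are illustrative, and the response fields should be confirmed against the REST documentation for your release:

```python
import time
import requests

BASE = "https://planning-mycompany.pbcs.us2.oraclecloud.com"
AUTH = ("svc_integration", "secret")  # illustrative credentials

# Kick off a Data Management (CDM) data load rule
resp = requests.post(f"{BASE}/aif/rest/V1/jobs", auth=AUTH, json={
    "jobType": "DATARULE",
    "jobName": "FCCS_ENTITY_METADATA",  # illustrative rule name
    "startPeriod": "Jan-18",
    "endPeriod": "Jan-18",
    "importMode": "REPLACE",
    "exportMode": "STORE_DATA",
})
resp.raise_for_status()
job = resp.json()
job_id = job["jobId"]  # the process ID that EPM Automate does not return

# Poll until the job completes (a status of -1 means still running)
while job["status"] == -1:
    time.sleep(15)
    job = requests.get(f"{BASE}/aif/rest/V1/jobs/{job_id}", auth=AUTH).json()

# With the process ID in hand, the matching process log can be retrieved
# and attached to a failure notification.
print(job_id, job["status"], job.get("logFileName"))
```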

Let’s review the end-to-end solution.  In EDMCS, metadata is maintained for PBCS and FCCS.  The metadata is extracted to a flat file (.csv) after maintenance is completed and saved to a network folder.  From FDMEE, the master data integration process is initiated to upload the metadata files to FCCS and PBCS.  Cloud Data Management data load rules are initialized to process the metadata extracts.  In the event of an error, the CDM process log is downloaded.  Finally, an email is generated to alert the administrator of the data integration process status.

There you have it – EDMCS, FDMEE, and CDM working in concert to provide a seamless and elegant solution to data and master data integration for a customer that adopted a Cloud EPM strategy.  If you want to learn how you can enhance your Oracle EPM integration processes, contact us and we’ll be happy to discuss your options.

Enterprise Data Management Cloud Service (EDMCS) – First Impressions

Continuing its momentum with Enterprise Performance Management (EPM) Cloud initiatives, Oracle recently released Enterprise Data Management Cloud Service (EDMCS). Here are some initial impressions of the application to provide fundamental information and spark discussion.

First, some background: these observations are based on an actual project with a client that was one of three selected for the EDMCS Early Adopter Program. This client is essentially going all-in on Oracle EPM Cloud, with Planning and Budgeting Cloud Service (PBCS), Financial Consolidation and Close Cloud Service (FCCS), and Account Reconciliation Cloud Service (ARCS). One on-premise component, Financial Data Quality Management Enterprise Edition (FDMEE), is also in the mix. The client quickly realized that its reporting structures in the Planning/Budgeting and Financial Close/Consolidation worlds, while not identical, were similar and contained a high degree of shared structures. The idea of maintaining these reporting structures in multiple tools did not make sense, leading the client to inquire about EDMCS. After an evaluation, Oracle selected them to participate in the early adopter program for EDMCS with Edgewater Ranzal as the implementation partner.

EDMCS = DRM in the Cloud, Right?

Well, not exactly, but that’s not necessarily the right question to ask. EDMCS is NOT a lift-and-shift of Data Relationship Management (DRM) to the Cloud. Yes, there are similar concepts and constructs in EDMCS that a DRM administrator will quickly grasp (Add/Insert/Delete/Remove of members, Shared members, properties, and node types to name a few). But EDMCS utilizes a different philosophy to manage your enterprise master data along with a different data model, all geared around effective master data management for EPM Cloud products. It’s crucial to adopt a new mindset as you embrace EDMCS and not be constrained by “this is how DRM did it.”

With EDMCS, you will immediately notice new functionality such as the capability to create an Enterprise Planning and Budgeting Cloud Service (EPBCS) or PBCS application in EDMCS, which provides the built-in connectors, properties, and validations for those target applications. Simply step through the Register Application wizard, specify your dimensions and plan types, and EDMCS will automatically build the rest for you. The built-in properties and validations enforce constraints and business rules to ensure no changes can be made that could break EPBCS/PBCS.

For other use cases, EDMCS provides the ability to create a custom application along with custom properties. As EDMCS matures, the number of packaged connectors, applications, and validations will surely increase.

So, the Data Model is Different?

The EDMCS data model is quite different from DRM. Understanding the EDMCS data chain is crucial to effective administration, especially given new concepts such as Viewpoints, Hierarchy Sets, and Node Sets.

Key data objects include:

  • Node Type – a collection of nodes and associated properties for your application
  • Hierarchy Set – defines the parent-child relationships of nodes
  • Node Set – defines a group of nodes available for a viewpoint. This may include all nodes in a hierarchy set or a subset of nodes
  • Viewpoint – the other data objects come together to provide the viewpoint, which is essentially the “hierarchy” you interact with to modify nodes, parent-child relationships, and properties

The diagram below, taken from the Oracle EDMCS Administration Guide, is a useful reference as you start to build out your EDMCS applications. Future blog posts will explore these key constructs in more detail.

EDMCS Figure 1

Does EDMCS include Data Relationship Governance (DRG)?

Not yet, but workflows, approvals, separation of duties, and other data governance goodness is on the roadmap for EDMCS. But fear not! EDMCS already provides a “request” mechanism. Modifications to master data can only be performed within the context of a request. Requests can include interactive changes through the UI or batch loading of changes through an Excel request file (think of request files like automator or action scripts, but easier to use and yes, in Excel!). Comments can be included with a request and, continuing with one of the strongest features of DRM, requests provide auditability by capturing the who/what/when/where of every change performed in EDMCS.

How is the User Interface?

One of my favorite features is the visual feedback EDMCS provides as you make changes within a request. As you add, insert, remove, delete, reorder, or modify a member, visual icons and highlights are displayed for that member in real-time to capture the action being performed on that member. You basically get a preview of the change before it’s committed. Changes to properties are visually highlighted and easy to spot. Validations are performed as the request is in Draft status and instantly flag any violations with error messages highlighting the problem node and issue.

Summary

Overall, EDMCS is an exciting entry into the EPM Cloud market and a foundational tool critical to maximizing your EPM Cloud investment. While DRM administrators will experience an adjustment period as they learn EDMCS due to the data chain and new terminology, they will be pleasantly surprised with the available functionality such as pre-packaged connectors and properties for PBCS/EPBCS, the use of requests (and did I mention you can load Excel files?!), and the real-time visual feedback as you modify and validate your master data.

Oracle Data Visualization for Strategic Analytics

The world of analytics and data visualization continues to change at a rapid pace. New tools, processes, and buzzwords (Cloud, anyone?) have penetrated our industry and can become overwhelming.  Most of these changes, though, are for the better – one of them being self-service data visualization. Solutions like Tableau, Power BI, and Qlik have been around for several years, earning reputations as leaders in the self-service, easy, and sexy exploration of data.

Oracle Business Intelligence Enterprise Edition (OBIEE), although a great tool and long-time market leader, lacks the ease of deployment, maintenance, and connectivity – and the freedom from “IT tyranny” over access to data – that business users crave, attributes already addressed by Tableau and others. For many years, I’ve listened to IT describe OBIEE as its enterprise Business Intelligence (BI) solution while business users use Tableau to connect to their own databases and spreadsheets because Tableau doesn’t require them to enter a ticket and potentially wait weeks for that new column to be added to their report.

Today, there is an Oracle tool – Oracle Data Visualization (DV), an Oracle Analytics Cloud (OAC) component – that exceptionally meets the needs of business users.  Because it is part of a suite of products, Data Visualization also provides enterprise and financial reporting capabilities, advanced analytics and big data, mobile access, what-if scenario modeling, ingestion, preparation and transformation of data, and yes – you guessed it – Cloud.  Some highlights include:

  • Dynamic visualizations that can be organized into stories to be shared across the organization
  • Consumer (drag and drop) style with easy uploads, mashups, and exploration
  • Mobile authoring and consumption, device agnostic, and with dynamic design optimization
  • Connections to dozens of different sources, including SaaS, relational databases, big data tools, NoSQL, and others through JDBC/ODBC
  • Cloud and desktop versions with identical features

The full Oracle Analytics Cloud suite is a comprehensive solution that offers analytics and reporting tools to effectively address business requirements as well as accommodate different users within an organization (analysts, consumers, admins, etc.). Also included with OAC is Business Intelligence Cloud Service (BICS), a tool that is essentially “OBIEE in the Cloud.”  Data Visualization and BICS complement each other and address different needs, summarized in the following:

BICS vs DV

With continued rapid changes inevitable in technology, the future of Oracle Analytics Cloud looks promising as additional components are added. Having seen and supported clients as they “take the plunge” into Cloud analytics, we see ever more clearly that Data Visualization is Oracle’s strategic future of analytics.

See an overview of the history of Oracle DV and OAC as well as a demonstration of the products’ main features and wide array of possible sources in this webinar: The Strategic Future of Analytics…Starring Oracle Data Visualization