Troubleshooting Cloud Data Management Metadata Load Errors

In my last post, I highlighted a solution that was recently deployed for a customer that leveraged Enterprise Data Management Cloud Service (EDMCS), Financial Data Quality Management Enterprise Edition (FDMEE), and Cloud Data Management (CDM) to create an automated metadata integration process for both Planning and Budgeting Cloud Service (PBCS) and Financial Close and Consolidation Cloud Service (FCCS). In this post, I want to take a bit of a deeper dive into the technical build and share some important learnings.

Cloud Data Management introduced the ability to load metadata from a flat file to the Oracle EPM Cloud Services in the 17.11 patch. This functionality provides customers the ability to leverage a common platform for loading both data and metadata within the Cloud.  Equally important, CDM allows metadata to be transformed using its familiar mapping functionality.

As noted, this customer deployed both PBCS and FCCS. Within the PBCS application, four plan types are active while FCCS has the default two plan types.  A design decision was made for EDMCS to create a single custom application type that would store the metadata for both cloud applications.  This decision was not reached without significant thought as well as counsel with Oracle development.  While the pros and cons of the decision are outside the scope of this post, the choice to use a custom application registration in EDMCS ensured that metadata was input a single time but still fed to both cloud applications.  As a result of the EDMCS design decision, a single metadata file (per dimension) was supplied with properties necessary to support each plan type.

CDM leverages its 23 “dimensions” to store metadata information for processing. Exactly like data, metadata is imported using an import format into the CDM relational repository.  Each field from a metadata file is aligned to a CDM dimension field.  The CDM Account dimension always represents the target application member name and the CDM Entity dimension represents the parent of the member.  All other fields can be aligned to any of the remaining 21 dimensions.  CDM Attribute dimensions can be utilized in the import and mapping process but are not available for exporting to the cloud application.  This becomes an important constraint especially in a multi-plan type deployment.  These 21 fields can be used to store any of the properties required to successfully load metadata to the target plan type.

Let’s consider this case study for a moment. The PBCS application has four plan types.  If a process were to be built to load all plan types from a single CDM data load rule, then we would not be able to have more than five plan type specific attributes or properties because we would not have enough CDM fields/dimensions to store the relevant information.  This leads to an important design approach.  Instead of a single CDM data load rule to load all plan types, a data load rule was created for each plan type.  This greatly increased the number of metadata properties and attributes that could be loaded by CDM and ensured that future growth could be accommodated without a redesign of the integration process.

It is important to understand that CDM utilizes the Planning Outline Load Utility (OLU) to actually perform the metadata load to the cloud application. The OLU loads metadata using merge (yes, Planning experts, I realize that I am not discovering fire), which is important to understand, especially when processing multiple metadata loads for a single application.  When loading metadata, certain properties are application level.  I like to think of these as being global.  Additionally, there are plan type specific attributes that can align (or not align) with the application-level value/setting.  I like to think of these as local.

Why is this important? Well, when loading a metadata file, if certain global properties are excluded from the metadata load file, the local properties (if specified) are utilized to default the global properties. Since metadata is loaded using merge, this only becomes problematic when a new member is being added to the outline.

In this particular example, an alternate hierarchy with shared members was specified in one of the plan types. The storage property of the alternates was obviously set as Shared; however, when attempting the metadata load, the following error was encountered:

A Base Member cannot be changed to a Shared Member.

After much investigation (details to follow), I discovered that the global property should also be included in the metadata load.

While CDM utilizes the OLU to load metadata, it does not provide as much verbosity in the error information as the PBCS web interface (which also uses OLU) when loading metadata. Below is an example of the error in the CDM process log.  As a tangent, I’d love to check the logs without needing to open a Service Request.  Maybe Oracle will build an enhancement that allows that in the future (hint, hint, wink, wink to my friends at Oracle).

Baha Mar - Error Handling 1

So where do I go from here? Well, what do we know about CDM loading metadata to the cloud application?  We know that CDM uses the OLU to load a flat file generated by CDM.  Bingo!  The metadata file output by CDM is a good starting point.  That file is in the Outbox of the CDM application and can be downloaded in several different ways – CDM Import process (get creative folks), CDM process details, or EPM Automate.  Now we have the metadata file and can test to determine if the error is caused by CDM or the metadata itself.  It’s all about ruling out variables.  So, we take the metadata file and import it manually within the PBCS web interface and are able to replicate the error.  But now we have an important new data point – the line number from the metadata file that is causing the error.
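For reference, pulling that outbox file down for offline testing only takes a couple of EPM Automate commands. Here is a rough Python wrapper sketch; the service URL, credentials, and file name are placeholders, so substitute the values (and the exact outbox file name) from your own pod.

```python
import subprocess

EPMAUTOMATE = "epmautomate"            # assumes EPM Automate is on the PATH
URL = "https://your-pod.oraclecloud.com"
USER = "svc_integration"
PASSWORD_FILE = "password.epw"         # encrypted password file created with "encrypt"
METADATA_FILE = "outbox/Account_Metadata.dat"   # hypothetical CDM outbox file name

def run(*args):
    """Run an EPM Automate command and fail loudly if it errors."""
    subprocess.run([EPMAUTOMATE, *args], check=True)

run("login", USER, PASSWORD_FILE, URL)
run("listfiles")                        # confirm the exact outbox file name first
run("downloadfile", METADATA_FILE)      # copies the file to the current directory
run("logout")
```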

Baha Mar - Error Handling 2

Now that we have actionable information, we can review each property and start isolating and eliminating different variables. We determined that this error was only occurring for new alternate hierarchy parents being added to the outline.  As a test, we added the global storage property and voila, the metadata load completed successfully.  Face palm!
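To avoid rediscovering this the hard way, a lightweight pre-load check can flag the condition before CDM ever runs. The sketch below scans a metadata extract for members whose plan type storage is Shared but whose global Data Storage column is blank; the column headers are illustrative, not the exact OLU headers from this project.

```python
import csv

# Illustrative column names - align these to your actual metadata file headers
GLOBAL_STORAGE = "Data Storage"
PLAN_STORAGE_COLUMNS = ["Data Storage (Plan1)", "Data Storage (Plan2)"]

with open("account_metadata.csv", newline="") as f:
    for row in csv.DictReader(f):
        plan_shared = any(
            (row.get(col) or "").strip().lower() == "shared"
            for col in PLAN_STORAGE_COLUMNS
        )
        global_blank = not (row.get(GLOBAL_STORAGE) or "").strip()
        if plan_shared and global_blank:
            # New shared members can fail with "A Base Member cannot be changed
            # to a Shared Member" if the global property defaults incorrectly
            print(f"Check member {row.get('Account', '?')}: plan type storage is "
                  f"Shared but the global Data Storage column is blank")
```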

Maybe this would have been obvious to folks with a lot of Planning experience, but there are plenty of folks learning the intricacies of Planning and Essbase (including our friends converting from HFM to FCCS), so I wanted to share a lesson learned in my journey of metadata integration using CDM.

CDM functionality for metadata represents two of the three primary operations of ETL. In my next post, we’ll dive deeper into how the extract component of ETL was accomplished to provide a seamless end-to-end ETL solution for metadata.

Patch Today! Don’t Delay! Best Reasons to Upgrade Your EPM System

Putting off that upgrade to 11.1.2.4? Cloud not whetting your appetite for patches? Patch today. Don’t delay!

“But we’re going to the Oracle EPM Cloud soon!” you say. You should maintain your patches anyway. With the recurring maintenance, updates, and patches available to the EPM Cloud products, expect the on-premise patches to contain similar updates. An upcoming conversion to Oracle EPM Cloud products may benefit from running the latest on-premise codelines.

If you have an existing on-premise installation of Oracle EPM System, be sure to maintain the latest EPM System Patch Set Updates every 3 to 6 months. Here are a few great reasons why:

New Features

Patches often contain reactive bug resolutions to known issues; however, we have also been seeing new functionality released in patches for 11.1.2.4.

You Own It

You already pay for it! As long as your Oracle Maintenance contract is current (very likely if you are reading this article), you’re already paying for access to patches. Why leave them unapplied? You are running legacy code when the latest version costs you nothing additional. Windows XP was a great OS, but we’ve got to keep up with the times.

Supportability

Maximize your success by reducing time to resolution on your issues. Should you submit a support request to the vendor, the first line of response to a ticket is often about current patch levels. Once provided, the subsequent reply frequently contains a recommendation to apply the latest Patch Set Updates (PSUs) to see if that fixes the issue. Annoying? Perhaps you’re a pessimist. Or have just been remiss with your patching. I’ve certainly changed my mind on the matter and can better side with them. The reason? Supporting the latest codeline is more efficient and effective for the vendor. Your problem may have already been addressed in a code fix. They can better and more quickly support you if they are troubleshooting the current release instead of legacy code.

Stability

In older versions, patches seemed to come out on a haphazard schedule. Over the last few years, Oracle has streamlined EPM System patch releases, typically issuing Patch Set Updates (PSUs) quarterly. PSUs are different from Patch Set Exceptions (PSEs): a PSU is a grouping of PSEs, or a smaller number of more significant PSEs, that is regression tested collectively by the vendor and released as a single patch. We’ve gained a much higher degree of confidence with this bulk model of PSUs. The release schedule and organization of bug fixes are more dependable and greatly appreciated. The PSU model leaves less ambiguity about which patches to apply and brings greater stability to all customers.

Upgrade

Maybe it’s bigger than patching. Are you not on version 11.1.2.4 of your EPM System? Compliance with enterprise IT requirements around browser versions and operating systems is often the impetus for an upgrade. But there are also plenty of compelling new software features, functions, conventions, and improvements in 11.1.2.4.

Operating System (OS) support for current platforms maximizes your investment and supportability. When 11.1.2.4 came out, many customers were forced to upgrade older systems to comply with the latest enterprise standards for server operating systems and/or client browser versions. An upgrade on the business’ schedule is far preferable to an IT-mandated technology upgrade.

What Kind of Effort is Involved?

The comprehensive effort to bring a simple deployment (3-4 servers, no High Availability) up to the latest PSUs is typically less than a day per environment. That includes an analysis of existing patches, the patching itself as well as any prerequisites, and a post-check verification to confirm all patches applied are properly indicated in the corresponding inventories.

An initial patch application may take a little bit longer because there are often common prerequisites to address that don’t have to be handled with subsequent patching. There are also considerations like bringing WebLogic up to the latest patch level, as well as one-offs like the fixes for the Equifax-discovered vulnerabilities, that don’t happen frequently. Once you’ve got a solid base of primary critical patching, additional patching events are typically shorter.

Patching can be tricky. Documentation can often be ambiguous, whether through an unintended omission or knowledge that is assumed based on implied experience with the product. Sometimes post-install instructions get skipped or SQL statements do not get executed properly as part of the patch. Less experienced resources typically patch only the EPMSystem11R1 Oracle Home; however, did you know that Oracle’s ADF framework has its own OPatch inventory under oracle_common? Those patches are often prerequisites. And what about Oracle Data Integrator (ODI) and Oracle HTTP Server (OHS)? They may also have applicable OPatches. Who knows what you’re missing? We do! Let’s button it up.
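As a quick illustration of what a patch-level check across those homes looks like, here is a rough sketch that loops over the Oracle Homes mentioned above and lists each one’s applied patches with OPatch. The home paths are examples only; substitute the locations from your own installation.

```python
import os
import subprocess

# Example Oracle Home locations - adjust to match your environment
MIDDLEWARE = r"D:\Oracle\Middleware"
ORACLE_HOMES = [
    os.path.join(MIDDLEWARE, "EPMSystem11R1"),   # EPM System
    os.path.join(MIDDLEWARE, "oracle_common"),   # ADF / common components
    os.path.join(MIDDLEWARE, "ohs"),             # Oracle HTTP Server, if installed
    os.path.join(MIDDLEWARE, "odi"),             # Oracle Data Integrator, if installed
]

for home in ORACLE_HOMES:
    if not os.path.isdir(home):
        print(f"Skipping {home} (not present in this deployment)")
        continue
    # Each Oracle Home ships its own OPatch; on Windows the executable is opatch.bat
    opatch = os.path.join(home, "OPatch", "opatch")
    print(f"=== Applied patches for {home} ===")
    subprocess.run([opatch, "lsinventory"],
                   env={**os.environ, "ORACLE_HOME": home},
                   check=False)
```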

Contact us for more details.

Laser Tag for Cloud Analytics

A friendly game of laser tag between out-of-shape technology consultants became a small gold mine of analytics simply by combining the power of Essbase and the built-in data visualization features of Oracle Analytics Cloud (OAC)! As a “team building activity,” a group of Edgewater Ranzal consultants recently decided to play a thrilling children’s game of laser tag one evening.  At the finale of the four-game match, we were each handed a score card with individual match results and other details such as who we hit, who hit us, where we got hit, and hit percentage based on shots taken.  Winners gained immediate bragging rights, but for the losers, it served as proof that age really isn’t just a number (my lungs, my poor collapsing lungs).  BUT…we quickly decided that it would be fun to import this data into OAC to gain further insight about what just happened.

Analyzing Results in Essbase

Using Smart View, a comprehensive tool for accessing and integrating EPM and BI content from Microsoft Office products, we sent the data straight to Essbase (included in the OAC platform) from Excel, where we could then apply the power of Essbase to slice the data by dimensions and add calculated metrics. The dimensions selected were:

  • Metrics (e.g. score, hit %)
  • Game (e.g. Game 1, Game 2, Total)
  • Player
  • Player Hit
  • Target (e.g. front, back, shoulder)
  • Bonus (e.g. double points, rapid fire)

With Essbase’s rollup capability, dimensions can be sliced by any one item or at a “Total” level. For example, the Player dimension’s structure looks like this:

  • Players
    • Red Team
      • Red Team Player 1
      • Red Team Player 2
    • Blue Team
      • Blue Team Player 1
      • Blue Team Player 2

This provides instant score results by player, by “Total” team, or by everybody. Combined with another dimension like Player Hit, it’s easy to examine details like the number of times an individual player hit another player or another team in total. You can drill into “Red Team Player 1 shot Blue Team” or “Red Team Player 1 shot Blue Team Player 1” to see how many times a player shot a team or an individual player. A simple Smart View retrieval along the Player dimension shows scores by player and team, but the data is a little raw. On a simple data set such as this, it’s easy to pick out details, but with OAC, there is another way!

Laser Tag 1

Even More Insight with Oracle Analytics Cloud (OAC)

Using the data visualization features of OAC, it’s easy to build queries against the OAC Essbase cube to gain interesting insight into this friendly folly and, more importantly, answer the questions everybody had: what was the rate of friendly fire and who shot who? Building an initial pivot chart by simply dragging and dropping Essbase dimensions onto the canvas including the game number, player, score, and coloring by our Essbase metric “Bad Hits” (a calculated metric built in Essbase to show when a player hit a teammate), we discovered who had poor aim…

Laser Tag 2
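For anyone curious how a metric like “Bad Hits” boils down, here is a rough pandas equivalent of the logic. This is not the actual Essbase calculation; the CSV layout and column names are invented purely for illustration.

```python
import pandas as pd

# Hypothetical flat extract of the score cards: one row per registered hit
# columns: game, player, player_team, target_player, target_team
shots = pd.read_csv("laser_tag_hits.csv")

# A "bad hit" is simply a hit where shooter and victim are on the same team
shots["bad_hit"] = shots["player_team"] == shots["target_team"]

# Friendly-fire count per player per game, the same slice as the OAC pivot
bad_hits = (shots[shots["bad_hit"]]
            .groupby(["game", "player"])
            .size()
            .rename("Bad Hits")
            .reset_index())
print(bad_hits.sort_values("Bad Hits", ascending=False))
```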

Dan from the Blue team immediately stands out, as do Kevin and Wayne from the Red team!  This points us in the right direction, but we can easily toggle to another visualization that might offer even more insight into what went on. Using a couple of sunburst-type data visualizations, we can quickly tie together who was shooting and who was getting hit – filtered to the same team and weighted by the score (and also color coded by team).

Laser Tag 3

It appears that Wayne and Kevin from the Red Team are pretty good at hitting teammates, but it is also now easy to conclude that Wayne really has it out for Kevin while Kevin is an equal opportunity shoot-you-in-the-back kind of teammate!

Reimagining the data as a scatter plot gives us a better look at the value of a player in relation to friendly fire. By dragging the “Score” Essbase metric into the size field of the chart, correlations are discovered between friendly fire and hits to the other team.  While Wayne might have had the highest number of friendly fire incidents, he also had the second highest score for the Red team.  The data shows visually that Kevin had quite a few friendly fire incidents, but he didn’t score as much (it also shows results that allow one to infer that Seema was probably hiding in a corner throughout the entire game, but that’s a different blog post).

Laser Tag 4

What Can You Imagine with the Data Driving Your Business?

By combining the power of Essbase with the drag-and-drop analytic capabilities of Oracle Analytics Cloud, discovering trends and gaining insight is very easy and intuitive. Even in a simple and fun game of laser tag, results and trends are found that aren’t immediately obvious in Excel alone.  Imagine what it can do with the data that is driving your business!

With Oracle giving credits for a 30-day trial, getting started today with OAC is easy. Contact us for help!

A Safe Step into the Cloud: The Argument for Account Reconciliation Cloud Service (ARCS)

Before forecasting models, before fancy dashboards and pretty reports, before a data point is even considered “Actual” comes the age-old question…

                “Does this number even look right?”

Bulls*#!

Account reconciliations – the means by which this question is answered – are a fundamental part of the financial close process. Imagine you are trying to build a sandcastle. Now imagine your “sand” is harvested from a cow pasture. You *could* continue to build this “sandcastle,” but you will likely finish with a pile of…bull-sand. In the same way, if your account balances and transactions have an integrity equivalent to “bull-sand,” this will inevitably lead to problems down the line.

sandcastle_collapsing_400px

The shift to the Cloud has complicated the decision-making process when considering new enterprise-wide application tools. The choice of whether to go with a known “on-premise” solution or take a bold step into Cloud solutions is a daunting one, particularly when considering moving high-visibility cycles such as forecasting or financial consolidations into this brave new world.

A Justified Recommendation

Take the measured move instead. If you feel hesitant to go “all-in” on Cloud offerings, here are four reasons why you should consider entering the Cloud through the arch of ARCS…the ARCSway (Get it?…archway…ARCSway…never mind – just keep reading…)

Safe Bet on a Strong Foundation

Oracle introduced Account Reconciliation Cloud Service (ARCS) as the “one stop shop” solution for managing and streamlining the reconciliation cycle in the Cloud back in 2016. While it’s not uncommon for some EPM products to lose functionality during their initial transition into the Cloud space, ARCS retains the “good bones” of its on-premise counterpart – Account Reconciliation Manager (ARM). ARCS builds upon the clever functionality and customizability of ARM, released in 2012, yet with the slick look and feel of the Oracle Cloud experience.

Since its release, ARCS has become the “golden child” of the reconciliation product family, receiving not only “first dibs” on refinement of existing capabilities, but also benefiting from the newest components such as Transaction Matching (note: this has separate licensing than the Reconciliation Compliance component of ARCS).  As the product continues to gain steam, this trend is expected to continue. Between utilizing the tried-and-true foundation of the ARM tool and having Oracle’s watchful eye, ARCS is a safe bet.

No Mistakes with Modularity

Unlike some applications, ARCS is easy to implement in pieces. While good design will certainly prevent future heartache, there are no decisions made on Day 1 of a project that cannot be modified or enhanced in the future:

  • Want to manually enter data for reconciliations today, but automatically load them from a source system tomorrow? We can do this.
  • Missing fields for additional detail you would like users to include? Can be ready for next period (or the current one even!)
  • Only want to rollout in one country to start? No problem – go ahead and make the other entities jealous!

While some changes are “cleaner” than others (I am looking at you, Profile Segments!), ARCS welcomes you to “test the waters” and see what works in your company without needing to go “all-in.” For example, a current client has a live ARM application that provides a viable solution for its reconciliation process needs given the initial project timeline and budget. Although the client wasn’t able to fully utilize the available functionality at the time, the modularity of the reconciliation tools (both ARM and ARCS) allows the opportunity for enhancements without punishing this design decision – we are now revamping the client’s auto-reconciliation setup to further streamline the process. For Partners, this means additional project phases; for clients, this means not biting off more than you can chew (win-win!).

Want to dive deeper into this topic?  Read the blog posts in the series Modularity in Account Reconciliation Cloud Service (ARCS): No Mistakes from “Day 1” to “Day 100.”

Fast Implementation Cycles and Rapid ROI

Relative to other EPM project lifecycles, ARCS is typically a quick implementation. As with all projects, there are certainly exceptions, but with Ranzal’s “Quick Start” methodology, we have stood up applications in just six weeks! A strong inventory of project “accelerators” – custom tools and scripts that Ranzal has developed based on common requests across multiple clients – allows sophisticated deployments in a timely manner. Couple this with the inherent time saving benefits of Cloud technology (i.e. lack of infrastructure setup, etc.), and ARCS shines as the first step in a Roadmap, producing tangible metrics for evaluation (ex. completion percentages per period, timeliness per Preparer/Reviewer, reconciliation accuracy, etc.) and giving users a taste of the Oracle Cloud experience in a short period of time.

You Don’t Have Anything Today and It’s Costing You

I know that may read like a presumptuous fear tactic, but hear me out:

Account reconciliations ARE being completed in your company – one way or another. Whether that means your CPAs are *click*click* clicking away on their keyboards to manually update Excel spreadsheets or – heaven forbid – actually printing out recons to hand sign, if you cannot name the system that is comprehensively handling your reconciliation cycle, it’s because there isn’t one.

And this is normal. But there are costs associated with this normalcy.

Reconciliation cycles aren’t sexy (well…personal taste…) and often have low visibility to upper management. And yet (!) the reconciliation process is often widespread across the company spanning business entities, departments, and corporate ladders (I see you, Mr/s. Director signing off on recons). ARCS is an attractive option when considering enterprise-wide Cloud solutions to “test run” because everyone can try it. A successful ARCS implementation paves the way for easier adoption of future projects – it gets everybody onboard.

Step Through the “ARCSway” and Ditch the Bulls*#!

The shift to the Cloud is disrupting the traditional market of on-premise EPM solutions. As you look at the new strategic options available to your company’s roadmap, consider ARCS as a “first step.” Of note, it is important to have an accurate understanding of the tool – ARCS is first and foremost a management tool, and although it can provide helpful information in troubleshooting account variances, it does not replace actually performing a reconciliation in an ERP system. Additionally, customizing reports can be difficult (unless you are familiar with BI Publisher), although the out-of-the-box reports and strong dashboarding capabilities largely make up for this limitation. All-in-all, I strongly recommend this product as an introduction to the new Oracle offerings. ARCS’ “low risk, high reward” nature provides real company value quickly while presenting you with a good picture of life in the Cloud. Now is the perfect time to ditch your “bull-sand” reconciliation process and update to a more solid foundation in the Cloud through the ARCSway.

Contact us today for details about a custom Cloud solution for your business needs.

Large Sandcastle

New…and Cool Features in EDMCS

Previously on The Wonderful World of Enterprise Data Management Cloud Service (EDMCS), we highlighted some of the features offered in this new product (released on Jan 25, 2018), including packaged application adapters for PBCS/EPBCS and the visual cues provided in the user interface as you modify master data. With the 18.03 release, subtle but helpful features have been added, and this post shares details of those along with useful tips resulting from actual project work done with one of three clients selected for the EDMCS Early Adopter Program. This client has fully embraced Oracle EPM Cloud by utilizing EDMCS, Financial Consolidation and Close Cloud Service (FCCS), Planning and Budgeting Cloud Service (PBCS), Account Reconciliation Cloud Service (ARCS), and Cloud Data Management (CDM), along with on-premise Financial Data Quality Management Enterprise Edition (FDMEE).

Incremental Dimension Import

You now have two options available for import: Full (reset dimension) or Incremental. Full operates as expected and will completely erase and replace your dimension. All history will be lost. Certainly, a helpful feature when first seeding EDMCS, but a feature to be used carefully.

Incremental allows you to incrementally update property assignments during an import. Why is that important or useful? Because sometimes another system is point-of-entry and is feeding a new extract to EDMCS. Those extracts are typically complete data extracts (not incremental), so the incremental feature allows you to update new or changed property assignments without completely erasing your dimension.

*One important clarification – even when performing an incremental import, the complete set of node relationships must be imported, as all parent-child relationships will be replaced. The “incremental” piece of the import applies to updating the changed property values of those node relationships.

New Features in EDMCS 1

Metadata Object Search

The search feature for objects (views, node sets, hierarchy sets, applications, etc.) has been expanded to include name and description. Enter your search text, and any object that contains the matching text in either the name or description is returned in the search results.

Node Search

Like the metadata object search, the node search in a viewpoint has been enhanced to include name and description. You do not need to toggle between name and description like in Data Relationship Management (DRM); instead, enter your search text and any node containing that text in the name or description is returned.

Request Load Files

EDMCS provides the capability to directly load Excel files, and a great way to start creating your request file is to use the download feature in EDMCS. Download a viewpoint (either the entire viewpoint or from the selected node and its descendants) to Excel, and you now have the basis for your request file: the complete “hierarchy” in Excel including parent, child, and all properties with the correct property labels as column headers in Excel. You can then modify the Excel file by adding an Action Code column and including the relevant node relationships and properties you wish to include in your request file.

*One note: If you need to modify top nodes in a request load file, you must include the Parent Node Type column. The column can be blank for the top nodes, but the column itself must be present. You can always modify top nodes interactively in a request.
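If you script that preparation step, it only takes a few lines. The sketch below uses pandas; the sheet layout and column names are assumptions based on the viewpoint download described above rather than a documented EDMCS format, so adjust them to match your own file.

```python
import pandas as pd

# Read the viewpoint workbook downloaded from EDMCS (sheet position is an assumption)
viewpoint = pd.read_excel("Account_Viewpoint.xlsx", sheet_name=0)

# Keep the relationship columns plus whatever properties you intend to update.
# "Parent Node Type" must remain in the file if top nodes are being modified.
columns_to_keep = ["Parent", "Name", "Parent Node Type", "Alias: Default"]
request = viewpoint[columns_to_keep].copy()

# Every row in a request load file needs an action; "Update" here is illustrative
request.insert(0, "Action Code", "Update")

request.to_excel("Account_Request.xlsx", index=False)
```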

Summary

While features like incremental import and improved search may seem minor, it’s often the little things that end up making a big difference when you put them all together. And it’s exciting to see this type of functionality being added so soon after the initial release.

Check back regularly for new updates and insights as EDMCS continues to evolve and mature – we’ll keep sharing! We’d love to hear your questions and observations related to EDMCS and how it fits into the ecosystem of Oracle EPM Cloud, so please comment below or contact us to share your experiences.

Try Oracle Cloud for Free

Cloud Data Management (CDM) and Financial Data Quality Management Enterprise Edition (FDMEE): A Case Study in Working Together

Why buy Financial Data Quality Management Enterprise Edition (FDMEE) when Cloud Data Management (CDM) is free?  As outlined in my recent white paper – FDMEE vs. Cloud Data Management – there are myriad factors that can drive the decision.  This blog post highlights how one customer gained a highly flexible and automated solution for data and master data management with an on-premise deployment of FDMEE in conjunction with Cloud Data Management.

This customer adopted a pure Cloud strategy as it relates to Enterprise Performance Management (EPM), procuring subscriptions to Planning and Budgeting Cloud Service (PBCS), Financial Close & Consolidation Cloud Service (FCCS), and Account Reconciliation Cloud Service (ARCS).  A diverse business, the customer has many unique operational systems with varying formats and charts of accounts.  So far, no reason why Cloud Data Management (CDM) can’t handle this requirement, right?  This is what CDM does – uses import formats and maps to consume and transform data – right?  Sure, but with caveats.  Notice that I used the word consume and not extract.  CDM does not provide the ability to link with on-premise systems to extract data.  Additionally, flat file data extracts that lack a consistent structure often cannot be natively consumed by CDM.

In this case, data needs to be loaded each day from numerous sources to support daily operational reporting.  The systems are a blend of on-premise, hosted, and Cloud applications.  The customer requirement dictated that any on-premise system should be connected directly to eliminate the need for a flat file extract to be generated daily.  Additionally, the hosted and Cloud applications are very industry specific and, in some cases, provided by very niche vendors.  The ability to modify extract formats was cost prohibitive or simply not supported.  As a result, several of these data feeds were not consumable by CDM without preprocessing/modification.

In light of the above requirements, the customer procured and deployed FDMEE on-premise.  The power of FDMEE allows a solution to be deployed that provides a direct connection to multiple on-premise systems as well as consume the flat file extracts from hosted and Cloud applications including Excel files (not in the required FDMEE/CDM format) and XML.  Because FDMEE on-premise supports scripting, we were able to greatly enrich the data integration cycle with full end-to-end automation including FTP downloading of hosted data, enhancement of the data integration cycle to detect data mapped to members not yet in PBCS or FCCS, dynamically setting substitution variables based on the processing day, running calculations in PBCS, and sending email status alerts to outline the success or failure of a data load cycle.
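To make one slice of that automation concrete, here is a hedged sketch of two of the smaller pieces: setting a substitution variable based on the processing day through EPM Automate, and sending a status email. This is plain Python rather than the actual FDMEE Jython event scripts, and the variable name, SMTP host, and addresses are placeholders.

```python
import datetime
import smtplib
import subprocess
from email.message import EmailMessage

# Set a substitution variable for the current processing day via EPM Automate
# (assumes an EPM Automate session has already been established with "login")
process_day = datetime.date.today().strftime("%d")
subprocess.run(
    ["epmautomate", "setsubstvars", "ALL", f"CurrentDay=Day{process_day}"],
    check=True,
)

def send_status_email(success: bool, detail: str) -> None:
    """Email the administrators the outcome of the data load cycle."""
    msg = EmailMessage()
    msg["Subject"] = f"Daily data load {'succeeded' if success else 'FAILED'}"
    msg["From"] = "fdmee@yourcompany.com"            # placeholder addresses
    msg["To"] = "epm-admins@yourcompany.com"
    msg.set_content(detail)
    with smtplib.SMTP("smtp.yourcompany.com") as smtp:   # placeholder SMTP host
        smtp.send_message(msg)
```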

Although I am a huge FDMEE advocate, I recognize the value of Cloud Data Management and the benefits it provides in a case like this one.  This customer was one of just three participants in the Oracle Enterprise Data Management Cloud Service (EDMCS) early adopter program.  This means that they were able to use the software before it was generally available (GA).  To participate in such a program, one must accept the absence of certain features and functions in the software.  In return, the program allows the customer (and partner) to offer Oracle development and product management valuable input about the software and, in some ways, drive which features are prioritized within the product roadmap.

EDMCS currently lacks native connections to FCCS, but this will change over time.  So how does CDM help with loading metadata to FCCS?  In a recent update to CDM, Oracle included the ability to import a flat file into CDM and load metadata to a registered target application such as PBCS or FCCS.  John Goodwin gives a detailed overview of the technical setup.

FDMEE and CDM have come together in this case to provide a fully automated data integration process and an automated master data integration process.  Within EDMCS, a Custom application type was created.  The required properties for FCCS were built and attached to the multiple dimensions being mastered, and flat file exports were generated for FCCS.  We knew we were going to use CDM to manage the master data load process, but we had a decision to make – do we leverage EPM Automate or FDMEE as our automation hub?

We chose FDMEE.  Why?  Simply because a lot of automation assets had already been developed in FDMEE that could readily be reused for this process including execution of EPM Automate commands, a framework for leveraging the REST API (for PBCS and FCCS), and email alerting.  Additionally, we found the capabilities of EPM Automate to be somewhat limited.

For example, when you execute a CDM data load rule from EPM Automate, the process ID associated with the execution is not returned.  Why is that important?  Because in the event of a failure, I’d want to download the process log and attach it to the email so the user has information to address the issue.  Could I use the ListFiles command of EPM Automate to get the process log? Possibly, but it doesn’t account for potential concurrency, and I am not doing my job as a consultant if I build a process that can’t handle concurrent operations.  For reasons such as these, we leveraged EPM Automate when possible and the REST API as needed, and we wrapped it all together with an FDMEE process that could be executed on a scheduled basis or on demand simply by using the Script Execution functionality.
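For illustration, here is roughly what the REST-based alternative looks like. The Data Management jobs resource returns a job (process) ID when a data load rule is executed, which is exactly the handle EPM Automate does not give us. Treat the URL, rule name, payload values, and response field names as placeholders and confirm them against the REST API documentation for your release.

```python
import time
import requests

BASE = "https://your-pod.oraclecloud.com/aif/rest/V1"   # placeholder pod URL
AUTH = ("svc_integration@yourdomain", "password")        # use a service account

# Kick off the Cloud Data Management data load rule
payload = {
    "jobType": "DATARULE",
    "jobName": "MetadataLoad_Entity",   # hypothetical rule name
    "startPeriod": "Jan-18",
    "endPeriod": "Jan-18",
    "importMode": "REPLACE",
    "exportMode": "STORE_DATA",
}
job = requests.post(f"{BASE}/jobs", json=payload, auth=AUTH).json()
job_id = job.get("jobId")   # the process ID needed to locate the process log

# Poll until the rule finishes (status field names/values vary slightly by release)
while True:
    job = requests.get(f"{BASE}/jobs/{job_id}", auth=AUTH).json()
    if job.get("jobStatus") != "RUNNING":
        break
    time.sleep(30)

if job.get("jobStatus") != "SUCCESS":
    # With the process ID in hand, the matching process log can be downloaded
    # and attached to the failure email sent from the FDMEE automation
    print(f"Rule failed; retrieve the log for process {job_id}")
```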

Let’s review the end-to-end solution.  In EDMCS, metadata is maintained for PBCS and FCCS.  The metadata is extracted to a flat file (.csv) after maintenance is completed and saved to a network folder.  From FDMEE, the master data integration process is initiated to upload the metadata files to FCCS and PBCS.  Cloud Data Management data load rules are initialized to process the metadata extracts.  In the event of an error, the CDM process log is downloaded.  Finally, an email is generated to alert the administrator of the data integration process status.

There you have it – EDMCS, FDMEE, and CDM working in concert to provide a seamless and elegant solution to data and master data integration for a customer that adopted a Cloud EPM strategy.  If you want to learn how you can enhance your Oracle EPM integration processes, contact us and we’ll be happy to discuss your options.

Enterprise Data Management Cloud Service (EDMCS) – First Impressions

Continuing its momentum with Enterprise Performance Management (EPM) Cloud initiatives, Oracle recently released Enterprise Data Management Cloud Service (EDMCS). Here are some initial impressions of the application to provide fundamental information and spark discussion.

First, some background: these observations are based on an actual project from working with a client who was 1 of 3 selected for the EDMCS Early Adopter Program. This client is essentially going all-in on Oracle EPM Cloud, with Planning and Budgeting Cloud Service (PBCS), Financial Consolidation and Close Cloud Service (FCCS), and Account Reconciliation Cloud Service (ARCS). One on-premise component, Financial Data Quality Management Enterprise Edition (FDMEE), is also in the mix. This client quickly realized its reporting structures between the Planning/Budgeting and Financial Close/Consolidation worlds, while not identical, were similar and contained a high degree of shared structures. The idea of maintaining these reporting structures in multiple tools did not make sense, leading the client to inquire about EDMCS. After an evaluation, Oracle selected them to participate in the early adopter program for EDMCS with Edgewater Ranzal as the implementation partner.

EDMCS = DRM in the Cloud, Right?

Well, not exactly, but that’s not necessarily the right question to ask. EDMCS is NOT a lift-and-shift of Data Relationship Management (DRM) to the Cloud. Yes, there are similar concepts and constructs in EDMCS that a DRM administrator will quickly grasp (Add/Insert/Delete/Remove of members, Shared members, properties, and node types to name a few). But EDMCS utilizes a different philosophy to manage your enterprise master data along with a different data model, all geared around effective master data management for EPM Cloud products. It’s crucial to adopt a new mindset as you embrace EDMCS and not be constrained by “this is how DRM did it.”

With EDMCS, you will immediately notice new functionality such as the capability to create an Enterprise Planning and Budgeting Cloud Service (EPBCS) or PBCS application in EDMCS, which provides the built-in connectors, properties, and validations for those target applications. Simply step through the Register Application wizard, specify your dimensions and plan types, and EDMCS will automatically build the rest for you. The built-in properties and validations enforce constraints and business rules to ensure no changes can be made that could break EPBCS/PBCS.

For other use cases, EDMCS provides the ability to create a custom application along with custom properties. As EDMCS matures, the number of packaged connectors, applications, and validations will surely increase.

So, the Data Model is Different?

The EDMCS data model is quite different from DRM. Understanding the EDMCS data chain is crucial to effective administration, especially given new concepts such as Viewpoints, Hierarchy Sets, and Node Sets.

Key data objects include:

  • Node Type – a collection of nodes and associated properties for your application
  • Hierarchy Set – defines the parent-child relationships of nodes
  • Node Set – defines a group of nodes available for a viewpoint. This may include all nodes in a hierarchy set or a subset of nodes
  • Viewpoint – the other data objects come together to provide the viewpoint, which is essentially the “hierarchy” you interact with to modify nodes, parent-child relationships, and properties

The diagram below, taken from the Oracle EDMCS Administration Guide, is a useful reference as you start to build out your EDMCS applications. Future blog posts will explore these key constructs in more detail.

EDMCS Figure 1

Does EDMCS include Data Relationship Governance (DRG)?

Not yet, but workflows, approvals, separation of duties, and other data governance goodness is on the roadmap for EDMCS. But fear not! EDMCS already provides a “request” mechanism. Modifications to master data can only be performed within the context of a request. Requests can include interactive changes through the UI or batch loading of changes through an Excel request file (think of request files like automator or action scripts, but easier to use and yes, in Excel!). Comments can be included with a request and, continuing with one of the strongest features of DRM, requests provide auditability by capturing the who/what/when/where of every change performed in EDMCS.

How is the User Interface?

One of my favorite features is the visual feedback EDMCS provides as you make changes within a request. As you add, insert, remove, delete, reorder, or modify a member, visual icons and highlights are displayed for that member in real-time to capture the action being performed on that member. You basically get a preview of the change before it’s committed. Changes to properties are visually highlighted and easy to spot. Validations are performed as the request is in Draft status and instantly flag any violations with error messages highlighting the problem node and issue.

Summary

Overall, EDMCS is an exciting entry into the EPM Cloud market and a foundational tool critical to maximizing your EPM Cloud investment. While DRM administrators will experience an adjustment period as they learn EDMCS due to the data chain and new terminology, they will be pleasantly surprised with the available functionality such as pre-packaged connectors and properties for PBCS/EPBCS, the use of requests (and did I mention you can load Excel files?!), and the real-time visual feedback as you modify and validate your master data.

Oracle Data Visualization for Strategic Analytics

The world of analytics and data visualization continues to change at a rapid pace. New tools, processes, and buzz words (Cloud, anyone?) have penetrated our industry and can become overwhelming.  Most of these changes, though, are for the better – one of them being self-service data visualization. Solutions like Tableau, PowerBI, and Qlik have been around for several years, earning reputations as leaders in the self-service, easy, and sexy exploration of data.

Oracle Business Intelligence Enterprise Edition (OBIEE), although a great tool and long-time market leader, lacks the ease of deployment, maintenance, and connectivity – and the freedom from “IT tyranny” over data access – that business users crave and that Tableau and others already deliver. For many years, I’ve listened to IT describe OBIEE as its enterprise Business Intelligence (BI) solution while business users use Tableau to connect to their own databases and spreadsheets because Tableau doesn’t require them to enter a ticket and potentially wait weeks for a new column to be added to their report.

Today, there is an Oracle tool – Oracle Data Visualization (DV), a component of Oracle Analytics Cloud (OAC) – that meets the needs of business users exceptionally well.  Because it is part of a suite of products, Data Visualization also provides enterprise and financial reporting capabilities, advanced analytics and big data, mobile access, what-if scenario modeling, ingestion, preparation and transformation of data, and yes – you guessed it – Cloud.  Some highlights include:

  • Dynamic visualizations that can be organized into stories to be shared across the organization
  • Consumer (drag and drop) style with easy uploads, mashups, and exploration
  • Mobile authoring and consumption, device agnostic, and with dynamic design optimization
  • Connections to dozens of different sources, including SaaS, relational databases, big data tools, NoSQL, and others through JDBC/ODBC
  • Cloud and desktop versions with identical features

The full Oracle Analytics Cloud suite is a comprehensive solution that offers analytics and reporting tools to effectively address business requirements as well as accommodate different users within an organization (analysts, consumers, admins, etc.). Also included with OAC is Business Intelligence Cloud Service (BICS), a tool that is essentially “OBIEE in the Cloud.”  Data Visualization and BICS complement each other and address different needs, summarized in the following:

BICS vs DV

With continued rapid changes inevitable in technology, the future of Oracle Analytics Cloud will likely be promising as additional components are added. After seeing and supporting clients as they “take the plunge” into Cloud analytics, it becomes clearer that Data Visualization is Oracle’s strategic future of analytics.

See an overview of the history of Oracle DV and OAC as well as a demonstration of the products’ main features and wide array of possible sources in this webinar: The Strategic Future of Analytics…Starring Oracle Data Visualization

Automating Enterprise Planning with EPBCS: A Case Study Featuring Sims Metal Management

Enterprise Planning and Budgeting Cloud Service

In using Enterprise Planning & Budgeting Cloud Service (EPBCS) to support annual budgeting and forecasting processes, organizations are choosing solutions that allow them to leverage the financials, projects, capital, and workforce business processes necessary to provide a driver-based solution that links expected intake to revenues and costs. In turn, they are able to more efficiently produce integrated income statements, balance sheets, and cash flow statements.

Featuring Jim Clark of Sims Metal Management, Our Special Guest

 Our August 16, 2017 webinar, featuring Jim Clark, Group Manager of FP&A at Sims Metal Management, takes a detailed look at how one organization automated enterprise planning to streamline processes and produce better results.

Within a real-world scenario, this means that whether using EPBCS out of the box or as a “hybrid” of OOTB with customized extensions, companies like Sims are able to adjust sales forecasts—throughout the year and through sales cycles—to better match the actual costs and needs in areas such as raw materials and labor.

A Better Approach To Performance Management

Using this integrated approach to Performance Management, companies are, in effect, bringing actual performance numbers, on a monthly basis, into their models.

As a result, changes and adjustments can be fine-tuned and incorporated into the mix.  Forecasts can be based more on actual numbers and less on assumptions, thus leading to a balance sheet that matches projections. From a planning perspective, companies can be more nimble and, ultimately, create their models with greater accuracy.

Whether you are participating live or via a recording, this webinar will illustrate how organizations like Sims are leveraging EPBCS in ways that allow them to: 

  • Gain insight to increase efficiency and improve outcomes
  • Better understand how organizations like yours can make standardization and centralization a top priority
  • See how an integrated solution works not just in theory, but actually in practice
  • Follow the processes to results that include improved accuracy and increased efficiency across the enterprise

For More Information

No matter where your team or your organization is along your EPBCS journey, this webinar is certain to provide you with valuable insight and context that can help you to implement changes that lead to greater efficiency and a more streamlined forecasting process overall.

Register for our “Automating Enterprise Planning with EPBCS: A Case Study Featuring Sims Metal Management ” webinar:

Missed the webinar? View Recording Here.

 

Accelerate Your Ride to the Cloud: Extending ERP with Oracle Profitability & Cost Management Cloud Service (PCMCS) for Standard Cost Rate Development

A common need among manufacturing organizations is improving the process of developing annual labor and overhead standards to use as input into standard cost rates for product cost and inventory valuation. In spite of the investments that have been made in ERP solutions, determining the updated direct labor rate and overhead rate components of a product standard cost for an upcoming fiscal year typically remains an offline, Excel-based exercise built on historical data pulled from the ERP.  The release of Oracle Profitability and Cost Management Cloud Service (PCMCS) in October 2016 provides a unique opportunity for manufacturers to ease, streamline, and document the process of generating the cost-per-direct-labor-hour or cost-per-machine-hour rates that are requisite in standard costing.

Background

Generally accepted accounting principles (GAAP) allow for one of multiple methods for the valuation of inventory to a manufacturer: Last-In, First-Out (LIFO); First-In, First-Out (FIFO); or a Weighted Average.

Because prices for labor and materials fluctuate throughout a year and inventory is built or drawn, it is difficult to track inventory on an on-going basis using these methods. Further, from a management perspective, it is more meaningful to separate the effects of price changes and inventory builds/draws from values associated with normal business.  Pricing decisions, incentive compensation and matching expenses to the physical flow of goods would all be adversely impacted by trying to constantly manage to these methods.

A common approach to achieve meaningful inventory and cost of goods sold values is to establish a “standard cost” for every product and then adjust the value of inventory on a separate line at year-end, to bring it to the GAAP basis.

This standard cost requires direct labor, direct material, and an amount representing the “absorption” of certain plant-related overhead costs into the inventory value.

There are two forms of overhead that must be included in the inventory value from a GAAP perspective: 1) Labor overhead and 2) Manufacturing overhead, sometimes called Indirect Overhead.

  1. Labor overhead represents the costs of direct labor resources above and beyond their direct hourly wage rate. This amount includes payroll taxes, retirement and health care benefits, workers’ compensation, life insurance and other fringe benefits.
  2. Manufacturing overhead includes a grouping of costs that are related to the sustainment of the manufacturing process, but are not directly consumed or incurred with each unit of production. Examples of these costs include:
  • Materials handling
  • Equipment Set-up
  • Inspection and Quality Assurance
  • Production Equipment Maintenance and Repair
  • Depreciation on manufacturing equipment and facilities
  • Insurance and property taxes on manufacturing facilities
  • Utilities such as electricity, natural gas, water, and sewer required for operating the manufacturing facilities
  • The factory management team

The most common first step for determining the value of overheads in inventory is to use a predetermined rate that represents a cost charge per direct labor hour or per machine hour. From product bills of material and routings, the total number of hours of labor or machine usage for a unit volume of production is known. The overhead cost rate per direct labor hour (or machine hour) multiplied by the number of hours required per unit of production yields the overhead cost per unit. In the example below, the ERP will calculate the cost per work center, but it is reliant on the Direct Labor and Overhead Rates to complete this process.

dp-image-1jpg
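As a simple worked illustration of that rate math (the figures here are invented for the example, not taken from the screenshot above):

```python
# Illustrative figures only
overhead_cost_pool = 1_200_000.00     # annual overhead assigned to a work center ($)
planned_direct_labor_hours = 40_000   # budgeted direct labor hours for that work center

# Predetermined overhead rate: $30.00 per direct labor hour
overhead_rate_per_dlh = overhead_cost_pool / planned_direct_labor_hours

# From the routing: hours of this work center required to build one unit
hours_per_unit = 0.5
overhead_cost_per_unit = overhead_rate_per_dlh * hours_per_unit   # $15.00 per unit

print(f"Overhead rate: ${overhead_rate_per_dlh:.2f}/DLH; "
      f"overhead in standard cost: ${overhead_cost_per_unit:.2f}/unit")
```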

The challenge comes when calculating the applicable pre-determined rate for overhead per direct labor hour or machine hour by the applicable cost or work center. PCMCS can assist with automating and updating this process.

A Better Solution: The Ranzal PCMCS Standard Cost Solution

PCMCS provides the ability to quickly and flexibly put the creation of multi-step allocation processes into the hands of business users. It also provides for the management of hierarchies without the need for external dimension management applications as well as standard file templates for data upload.  Further, a series of standard dashboard and report visuals augment the viewing and monitoring of results.  These capabilities allow organizations to quickly load and allocate expenses to applicable overhead cost pools and then merge those cost pools with applicable labor or machine hour values to obtain the relevant overhead rates.

PCMCS allows users to quickly select the cost centers or work centers that are applicable as sources to be included in the overhead rate:

dp-image-2jpg

Users then can easily select the targets for collecting these costs into relevant pools,

dp-image-3

as well as the operational metric to use to assign these overhead costs to their applicable pools.

dp-image-4

Finally, the collected cost pools are combined with the applicable direct labor or machine hour values to produce the overhead rates:

dp-image-5

Edgewater Ranzal is the leading implementation services provider of Oracle and Hyperion EPM solutions and has extensive experience with Hyperion Profitability and Cost Management (HPCM). Following the release of PCMCS, Ranzal will be announcing a Cloud service offering that leverages the power of the Cloud to provide an accelerated method of producing the required inputs for overhead allocation in standard costing.

More than just Standard Costing

Additionally, while PCMCS provides an excellent way to develop overhead rates for standard costing, it can simultaneously be utilized to determine allocations and costing valuations that leverage other methodologies for product and customer costing and profitability. Much has been written about the potential for inaccuracies if the standard cost basis of overhead allocation in product costing were used universally or exclusively for management analysis.  Overhead has become such a large portion of total cost that, in many cases, overhead rates can be three or four times higher than their respective direct labor rates.  This suggests a general lack of causality between overhead and direct labor hours, and it has led to the evolution of other costing methods.  Activity Based Costing is one such example, while simply allocating manufacturing variances to product lines is another.

PCMCS can be used to meet the requirements for both the externally reported methods and the management methods of product costing.

All of the Results in One Place

Determining the method by which overhead should be captured in the cost of different products in inventory is an important process because it moves a large dollar amount from expense to asset, usually temporarily but sometimes permanently, and this can impact profitability and share price.

For the purpose of valuing inventory for statutory reporting, the overhead rate method is considered acceptable and is widely used. It is therefore important that organizations find a way to develop and manage these cost valuations in a manner that is well documented, has a transparent methodology, and reduces the amount of time spent on the process.  However, it is not the only method that should be used when considering overhead in product and customer costing and profitability analysis.  Further, selling, general and administrative expenses (SG&A) represent another layer of cost that, while not part of standard inventory cost, should be considered in overall product costs from a management perspective.

To this end, the Edgewater Ranzal PCMCS Standard Cost solution will provide an opportunity to fulfill multiple needs in costing and profitability and will do so in a manner that will be faster and more user-friendly than what has previously been experienced.