A Cloud vs. On-Premise Comparison for Profitability: All You Need to Know

In a previous blog post, we discussed the history of Hyperion Profitability and Cost Management (HPCM) and which modules made it to the Cloud. If you are after a clearer comparison between Cloud and on-premise, the table below should fit the bill. Tables generally cannot provide all the needed context, yet they are, at times, the best starting point for understanding the benefits and capabilities of one solution compared to another. The PCMCS vs. HPCM table below is not exhaustive; if you have questions on any of the items covered, email us and we will provide further details.

[Table: PCMCS vs. HPCM comparison]

Choosing between on-premise and Cloud depends on which factors matter most to your organization once the overall licensing cost is set aside.

In the on-premise version of the software, allocations and data assignments cannot have “If” statements attached to them – a feature fundamental to supporting tax transfer pricing capabilities.

Cross-dimension mapping is another capability not available in HPCM. This mapping ensures the assignment of data sets to the same ID/name across multiple dimensions by using the “Same as Source but Different Dimension” option within PCMCS to support intercompany activities. This feature alone, or the lack of it, may significantly impact the design of an application and the overall complexity of allocation flows.

Features available in the Cloud but not yet released in on-premise solutions could tip the scale in favor of the Cloud option when all other aspects surrounding a Cloud implementation no longer appear to be as pressing. Out-of-the-box content such as overnight backups and full application and data restores at business users’ fingertips – not to mention the reporting and dashboarding included in the Cloud version – are all differentiators of a product that enables business users to control their allocation process and methodology from its inception.

While there may be exceptions to the trend where on-premise solutions can have advantages (modules not available in the Cloud, for example) and, therefore, represent the best option at a given moment in time, the reality is that the future is being developed in the Cloud and for the Cloud, and at some point the shift will most likely no longer be an option, but a necessity.

If you need help making a decision about an existing implementation, or you would like more details about HPCM vs. PCMCS to make a better-informed decision, email us at infosolutions@alithya.com. Our PCM Center of Excellence team is ready to share leading practices and industry-specific solutions that accelerate the ROI and expand the capabilities of your chosen software.

Using Subscriptions with EDMCS

As an earlier blog mentioned, the 18.07 release of Enterprise Data Management Cloud Service (EDMCS) delivered one eagerly anticipated piece of functionality: Subscriptions! And do not fear – these subscriptions are useful and do not involve a 1-year subscription to the Fruit of the Month Club (not that there’s anything wrong with that).

This blog post dives deeper into this new functionality, describes how it works, and highlights some lessons learned from utilizing Subscriptions with a current project involving multiple EDMCS Custom applications supporting multiple Profitability and Cost Management Cloud Service (PCMCS) applications.

Why Are Subscriptions Important?

Subscriptions are a huge step towards true “mastering” of enterprise data assets within a single master data Cloud platform. With EDMCS, it is important to build deployment-specific applications configured to the dimensionality requirements of the target applications to most effectively use the packaged adapters, validations, and integration capabilities. But in many cases, you also need to share common hierarchies across applications and avoid duplicative (that’s my big word for today) maintenance. After all, why have a master data management tool if you still must perform maintenance in multiple places? That’s just silly.

The answer to this dilemma is Subscriptions. By implementing Subscriptions, requests submitted to a primary viewpoint will automatically generate parallel subscription requests to subscribing viewpoints to automatically synchronize your hierarchy changes across EDMCS applications.

Note

This comment is important: “automatically generate parallel subscription requests.” EDMCS will not update a target, or subscribing, viewpoint behind the scenes with no visibility or audit trail of what has occurred. A parallel Subscription request is generated alongside the Interactive request and is visible in the Requests window, with the same full audit trail and details you find in an Interactive request. Even better, the Subscription request will generate an email and attach a Request File detailing the changes.

Nerdy Details

Views and Viewpoints

The first thing to really think about is the View and Viewpoint design of your EDMCS applications. Subscriptions are defined at the Viewpoint level, so you need to identify the source and target viewpoint for your business situation. With my current project, I have multiple EDMCS applications supporting multiple PCMCS applications. While the dimensionality is similar across the applications, the hierarchies vary, especially with the alternate hierarchies. So, it has been important to isolate the “common” or shared structures that should be synchronized across applications into their own viewpoint so that a subscription mechanism can be created.

Node Type Converters

You will likely need to create a node type converter. If the source and target viewpoints do not share a common node type, you must create a node type converter for subscriptions to work. In my situation, I had already created node type converters since I wanted to compare common structures across EDMCS applications, so the foundation was there to readily implement subscriptions.

Permissions

To create a Subscription, the creator must have (at a minimum) View Browser permission to the source view, View Owner permission to the target view, and Data Manager permission to the target application.

The Subscription assignee (this is the user who will “submit” the subscription request) must have (at a minimum) View Browser permission to the source view and Data Manager permission to the target application.

Creating a Subscription

Once the foundation is in place in terms of viewpoints, node type converters, and permissions, the actual creation of a subscription is easy.

Inspect the target viewpoint (the viewpoint that is to receive the changes from a source viewpoint via subscription), navigate to the Subscriptions tab, and click Edit. From there you can select the source viewpoint, the request assignee, and enable Auto Submit if needed. Save the subscription and you are all set.

  • Currently, there is no capability to edit an existing subscription. To effect a change, you must delete the subscription and add a new one.
  • Any validation errors for your subscription will appear on this dialog as well. These are documented nicely in the Oracle EDMCS administration guide.

[Image: subscription definition on the viewpoint’s Subscriptions tab]

Auto-Submit and Email Notifications

Emails will be generated and sent to the Request Assignee, whether Auto-Submit is enabled or not. The email will include details such as the original request #, the subscription request #, and how many request items were processed or skipped.

[Image: subscription request email notification]

Note

  • Remember, the subscription request will have a Request File attached to it. View the request file attachment to see details on why specific request items were skipped.
  • The request file is not attached to the email itself, only to the request in EDMCS.

Lessons Learned

As I mentioned earlier, the foundation is important to making subscriptions work. And it all boils down to design and ensuring the building blocks of that foundation are in place:

Design, Design, Design!

  • The importance of dimension, view, and viewpoint design cannot be overstated. For each dimension, evaluate the primary and alternate hierarchy content and identify what will be shared across dimensions or applications and what will be unique to each dimension and application.
  • Based on that analysis, carefully design your viewpoints to enable subscriptions across EDMCS applications for hierarchies that truly need “mastering.”
  • As early as possible, identify the EDMCS user population along with permission levels for applications and views. This is important to identify the appropriate “Request Assignee” for your Subscriptions. I recommend creating a security matrix identifying each user and the permissions each will have.
  • Without a clear and well-thought-out design, you will find yourself constantly re-doing your views and viewpoints which, in turn, will cause constant rework of your subscriptions. The “measure twice, cut once” adage certainly applies here!
  • I am a big proponent of standard, consistent naming conventions to improve usability and the end-user experience. The same holds true for Subscriptions. Consider using a standard naming convention for your viewpoints so it is clear which viewpoints have a subscription. It’s not obvious – unless you Inspect the viewpoint – that a subscription exists.
    • One approach I’ve been using is to name my source and target viewpoints identically with a special tag or symbol at the end of the target viewpoint name to indicate a subscription is present. I’m sure there are other and probably better ideas, but I find the visual cue to be helpful.
    • Perhaps in the future, Oracle will display subscription details when you hover over a viewpoint name (hint hint).

Node Type Converters

  • Ensure you have node type converters in place.
  • Make sure your node type converters are mapping all required properties.
    • I ran into an issue where updates to one property in my source viewpoint were not being applied to my target viewpoint via subscription requests, but all other property updates worked fine. The reason? I had recently modified my App Registration and added this property to a dimension’s node type. But my node type converter had already been created and wasn’t mapping or recognizing the new property. Once I updated my node type converter, the problem was solved.

Troubleshooting

  • The request files attached to subscription requests are a valuable troubleshooting tool. These Excel files include status codes and error messages that are extremely helpful in determining why your request was not auto-submitted.
  • Inspect the Subscriptions on your viewpoints. Any validation issues will be displayed and are easily addressed. Typical Subscription validation errors include:
    • The request assignee no longer has the correct permission levels
    • The viewpoint is no longer active
    • A node type converter is missing

Conclusion

I have been looking forward to the subscription functionality in EDMCS and am pleased with it so far. Subscriptions are easy to configure, can be set to auto-submit if desired, and generate emails that notify the requester a request has occurred and prompt action if the request was not submitted or request items were skipped. EDMCS Subscriptions are a big step forward in enabling true mastering of your enterprise data management assets!

Labor Budget Increases, Staffing Shortages Loom Large for Healthcare Execs in 2019; Set Expectations Now and Uncover Your Capabilities for an Enterprise-Based Labor Productivity Solution!

Two topics resound on healthcare websites and in related blog posts: (1) increased labor costs and (2) burnout and shortages of clinical staff. The Healthcare Finance article Labor Budget Increases, Staffing Shortages Loom Large for Healthcare Executives in 2019 highlights this exact topic.

This isn’t surprising considering access to healthcare for all has increased; therefore, there are more patients to see, which, in turn, requires more staff, which results in increased labor costs…see where I’m going here? It’s easy to see how this can quickly become a major concern for providers trying to analyze and keep up with demand.

Working with numerous healthcare clients makes it evident that not all healthcare companies are at the same point on the maturity scale when it comes to answering specific labor questions, providing and analyzing data, or even supporting a labor productivity solution. Edgewater Ranzal’s complimentary Healthcare Labor Productivity Assessment Workshop not only helps reset clients’ expectations, but also uncovers clients’ enterprise-based labor productivity solution capabilities.

Our solution utilizes Oracle Cloud or on-premise technology to help clients see an immediate return on investment just by analyzing contract agency usage statistics, providing detailed overtime analysis, and offering the ability to compare productivity against national standards that are loaded into the system. Additionally, we help clients align their labor productivity solutions with their planning/budgeting processes to improve budget detail and accuracy. Comprehensive experience with data integration – often a challenging task for clients – allows us to work with staff to bring all the required data elements together to create a cohesive picture of labor productivity details.

Take a look at our webinar recording of The Key Ingredients to Understanding Labor & Productivity to learn more about our solution to uncover best practices in addressing labor productivity in your organization.  Then contact Edgewater Ranzal’s Healthcare experts to answer specific questions about implementing a solution to help cut labor costs and provide data-rich analytics to your organization.

PCMCS…Yeah, FDMEE Can Do That!

Oracle Profitability and Cost Management Cloud Service and Oracle Financial Data Quality Management Enterprise Edition Working Together Better

Over the last year, we have been fielding more questions about, positioning, and aligning with Oracle’s new Cloud products. Some of the most common questions we are asked are:

  1. Has Edgewater Ranzal done that before?
  2. What “gotchas” have you encountered in your implementations and how have you addressed them?
  3. What unique offerings do you bring?

These are all smart questions to ask your implementation partner because the answers provide insight into their relevant experience.

Has Edgewater Ranzal done that before?

Edgewater Ranzal is an Oracle PCMCS thought leader and collaborates with Oracle as a Platinum partner to enhance PCMCS with continued development. To date, we’ve completed nearly 20 PCMCS (Cloud) implementations and almost 80 Oracle Hyperion Profitability and Cost Management (HPCM – on-premise) implementations spanning multiple continents, time zones, and industries. Our clients gladly provide references for us, which is a testament to our success and abilities. Additionally, we frequently have repeat clients and team up with numerous clients to present at various conferences to share their successes.

As a thought leader in the industry and for PCMCS, we sponsor multiple initiatives that deliver implementation accelerators, test the latest product enhancements prior to their release, and work in tandem with Oracle to enhance the capabilities of PCMCS.

Our Product Management team comprises several individuals. Specifically for PCMCS, Alecs Mlynarzek is the Product Manager and has published the following blog: The Oracle Profitability and Cost Management Solution: An Introduction and Differentiators. I am the Product Manager for Data Integration and FDMEE with several published blog posts related to FDMEE.

Now let’s explore some of the data integration challenges one might unexpectedly encounter and the intellectual property (IP) Ranzal offers to mitigate these and other data integration challenges that lurk.

What gotchas have you encountered in your implementations and how do you mitigate them?

We could go into great depth detailing the PROs of using FDMEE with PCMCS…but it is much more beneficial to instead share some of the less obvious discoveries we have made. Note that we work directly and continuously with Oracle to improve the product offering.

  • Extracting data via the FDMEE data-sync is challenging. PCMCS imposes threshold limits based on the size of the data cube and its configuration settings – 5,000,000 records and a 1GB file size – both of which are quite often reached. As a result, we have developed a custom solution for the data-sync routine.
  • Loading large datasets directly into PCMCS via DM (Cloud-based Data Management) can exhibit performance problems due to the server resources available in the Cloud. Functionality in on-premise FDMEE (scripting, Group-By, etc.) helps reduce the number of records going into the Cloud and therefore provides a performance gain.
  • Patching to the latest FDMEE patch set is crucial. Cloud applications (PCMCS, FCCS, E/PBCS) update monthly. As a result, we need to consistently check/monitor for FDMEE patches. These patches help ensure that canned integrations from Oracle are top-notch.


  • Executing two or more jobs concurrently via EPMAutomate is quite troublesome due to the workflows needed and how EPMAutomate is designed. As a result, we have invested considerable time in cURL and RESTful routines (see the sketch after this list). We discovered that the login/logout commands are tied to the machine, not the user process, so a logout from another executing run logs out all sessions.


  • EPMAutomate can be difficult to use. It requires a toolset on a PC or on-premise EPM servers – a “JumpBox” – as well as .BAT files or other scripted means. By using FDMEE, the natural ease of the GUI improves the end-user experience.
  • Loading data in parallel via FDMEE or DM can cause Essbase Load Rule contention due to how the automatic Essbase load rules are generated by the system. Oracle has made every effort to resolve this before the next Cloud release. Stay tuned… this may be resolved in the next maintenance cycle of PCMCS (18.10) and then the on-premise update of patch-set update 230.
  • We all know that folks (mainly consultants) are always looking to work around issues encountered and come up with creative ways to build/deliver new software solutions. But the real question that needs to be asked is: Should we? Since FDMEE has most of the solutions already packaged up, it is the best tool for the job. The value FDMEE brings far outstrips any home-grown solution.
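
To illustrate the concurrency workaround pattern (a simplified sketch, not Ranzal’s actual IP), the Python snippet below authenticates each REST call individually with basic authentication. Because no machine-wide session is ever created, two scheduled runs cannot log each other out the way concurrent EPMAutomate sessions can. The pod URL, credentials, endpoint, and rule names are illustrative placeholders.

```python
import requests
from requests.auth import HTTPBasicAuth

# Hypothetical pod URL -- substitute your own service URL.
BASE_URL = "https://example-epm.oraclecloud.com"

def run_rest_job(user, password, endpoint, payload):
    """Submit one job via REST using per-request basic authentication.

    Each call carries its own credentials, so there is no shared session
    token to invalidate: run A "logging out" cannot kill run B, which is
    the EPMAutomate behavior described above.
    """
    resp = requests.post(BASE_URL + endpoint, json=payload,
                         auth=HTTPBasicAuth(user, password))
    resp.raise_for_status()
    return resp.json()

# Two schedulers can safely invoke jobs side by side.
job_a = run_rest_job("domain.userA", "secretA", "/aif/rest/V1/jobs",
                     {"jobType": "DATARULE", "jobName": "RULE_A",
                      "startPeriod": "Jan-18", "endPeriod": "Jan-18",
                      "importMode": "REPLACE", "exportMode": "STORE_DATA",
                      "fileName": ""})
```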

What unique offerings do you bring?

At Edgewater Ranzal, we have started to take some of our on-premise framework and adapt it for PCMCS. Some of the key benefits and highlights we provide are:

  • FDMEE cannot execute PCMCS clears out-of-the-box, a complication when loading data. To combat this, we have added the functionality to the Ranzal IP catalog and can deploy it consistently for our clients via the RESTful functionality of PCMCS (a sketch of the pattern follows this list). Some of the items we have developed using REST are:
    • Import/export mappings
    • Execute data load rules or batch jobs from 3rd party schedulers
    • Refresh metadata in the Cloud
    • Augment EPMAutomate for enhanced flexibility
    • Execute business rules/clear POV commands as part of the FDMEE workflow
    • Execute stored procedures (PL/SQL) against DBaaS (see below)
    • Enhanced validation framework (see below)
  • We have redeveloped our Essbase Enhanced Validate to function with the PCMCS Cloud application. FDMEE on-premise can now validate all the mapped data prior to loading – great for making sure data is accurate before it reaches the Cloud.


  • The Edgewater Ranzal toolkit for FDMEE includes the ability to connect to other Cloud offerings for data movements, including DBaaS and OAC.

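
As a simple illustration of the REST pattern behind these utilities, the sketch below invokes a PCMCS clear from Python, as one might do from within an FDMEE workflow step. This is a hypothetical example: the endpoint path, application name, and POV name are placeholders, so consult Oracle’s PCMCS REST API documentation for the exact resource names and parameters.

```python
import requests
from requests.auth import HTTPBasicAuth

BASE_URL = "https://example-pcmcs.oraclecloud.com"  # hypothetical pod URL
AUTH = HTTPBasicAuth("domain.user", "password")     # illustrative credentials
APP = "BksML12"                                     # hypothetical PCMCS application

def clear_pov(pov_name):
    """Ask PCMCS to clear a point of view before FDMEE pushes new data.

    The resource path below is illustrative only -- verify it against the
    PCMCS REST API documentation before use.
    """
    url = f"{BASE_URL}/pcms/rest/v1/applications/{APP}/povs/{pov_name}/clear"
    resp = requests.post(url, auth=AUTH)
    resp.raise_for_status()
    return resp.json()

# Called from (for example) a BefLoad step so the clear becomes part of the
# standard FDMEE workflow rather than a manual action.
clear_pov("2018_Jan_Actual")
```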

Can FDMEE do that…and should FDMEE do that?

Yes, you should use FDMEE to load to PCMCS – it is out-of-the-box functionality! As you can see, FDMEE offers a lot of added benefits over DM (a feature comparison of the two will be discussed in a later blog and white paper). The current release of FDMEE, v11.1.2.4.220, provides product functionality enhancements and greater stability for integrations with most Cloud products. Suffice it to say, having Python scripting available and server-side processing for large files will greatly enhance your performance experience.


Contact us at info@ranzal.com with questions about this product or its capabilities.

An Exploration of the EDMCS REST API

Recently my team and I had the opportunity to implement Oracle’s newest offering – Enterprise Data Management Cloud Service (EDMCS). EDMCS, for those of you who are not familiar, provides a cloud-based solution for managing master data (also referred to as metadata) across the organization. Some like to refer to EDMCS as Data Relationship Management (DRM) in the Cloud, but the truth is, EDMCS is not DRM in the Cloud.

EDMCS is a completely new vision of what master data management can and should be. The architect of this new cloud offering is the same person who founded Razza Solutions, the company that developed the product now known as DRM. That is important to know because it ensures that the best of what DRM has to offer is brought forward. But, more importantly, it ensures that the lessons learned and the wish list of capabilities DRM should have had are at the forefront of the developers’ minds.

Ok, now let’s get back to the fun stuff. In the 18.05 patch for EDMCS, the REST API (v1) was exposed for public usage. The documentation for the REST API can be found here:

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/rest-endpoints.html

As I highlighted in the previous post Troubleshooting Cloud Data Management Metadata Load Errors, I had developed an automation routine to upload EDMCS extracts to both PBCS and FCCS using FDMEE and Cloud Data Management.  We had been eagerly awaiting the REST API for EDMCS to finalize this automation routine and provide a true end-to-end process that can be scheduled or initialized via a single action.

Let’s take a quick look back at the automation routine developed for this customer. After the metadata has been exported to a flat file from EDMCS, the automation would upload a copy to the PBCS and FCCS pods, launch Cloud Data Management data load rules which would process the EDMCS metadata extracts, run a restructure of the database after all dimensions had been loaded, and then send a status email alerting the administrator of the result.  While elegant, I considered this to be incomplete.

Automation, in my view, is a process that can be executed without user interaction. While an automation routine certainly has parameters that must be generally maintained, once those parameters are set/updated, the automation cycle should not be dependent on user input or action.  In the aforementioned solution, we were beholden to the fact that EDMCS exports had to be run interactively; however, with the introduction of the publicly exposed REST API in the 18.05 EDMCS patch, we are now able to automate the extract of metadata from EDMCS.  That means we can finally complete our fully automated, end-to-end solution for loading metadata.  Let’s review the EDMCS REST API and how we did it.

The REST API for EDMCS is structured similarly to other Oracle EPM REST APIs. By this, I mean that multiple REST commands may need to be executed to achieve a functional result. For example, when executing a Cloud Data Management data load rule via the Data Management REST API, the actual execution of the data load rule is handled by a POST call to the jobs function with the required payload (e.g. DLR name, start period, etc.). This call is just one portion of a functional requirement. To achieve an actual data load, a file may need to be uploaded to the cloud, the data load rule initialized, and the status of the data load rule retrieved. To achieve this functional result, three unique REST API executions would need to occur.
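
To make that concrete, here is a minimal Python sketch of the data load rule portion, assuming the Data Management REST API’s jobs resource. The pod URL, credentials, and rule name are illustrative, and the response fields and status codes should be verified against the Data Management REST documentation.

```python
import time
import requests
from requests.auth import HTTPBasicAuth

BASE_URL = "https://example-pbcs.oraclecloud.com"  # hypothetical pod URL
AUTH = HTTPBasicAuth("domain.user", "password")    # illustrative credentials

# Kick off the data load rule -- one of the several calls needed for a
# functional data load (file upload and status polling are the others).
payload = {
    "jobType": "DATARULE",
    "jobName": "EDMCS_ENTITY_DLR",   # hypothetical data load rule name
    "startPeriod": "Jan-18",
    "endPeriod": "Jan-18",
    "importMode": "REPLACE",
    "exportMode": "STORE_DATA",
    "fileName": "",
}
resp = requests.post(f"{BASE_URL}/aif/rest/V1/jobs", json=payload, auth=AUTH)
resp.raise_for_status()
job_id = resp.json()["jobId"]        # assumed response field

# Poll until Data Management reports a terminal status (-1 = still running).
while True:
    job = requests.get(f"{BASE_URL}/aif/rest/V1/jobs/{job_id}", auth=AUTH).json()
    if job["status"] != -1:
        break
    time.sleep(15)
print("Data load rule finished with status:", job["status"])
```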

To export metadata from EDMCS to a flat file using the REST API, the following needs to be executed:

  1. Get the dimension information for the EDMCS application from which metadata will be exported
  2. Execute an export of the dimension(s)
  3. Determine the status of the export
  4. Download the export to a flat file

Let’s explore each of these in a little more detail. First, we need to get the dimension IDs for the application from which we will be downloading metadata.  This is accessed from the applications function.

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-applications-get.html

When executing this function, the JSON object returned includes all applications that exist in EDMCS (including those archived), so the JSON needs to be iterated to find the record that relates to the application from which metadata needs to be exported. In this case, the name of the application is unique and can be used to locate the appropriate record. Next, we need to query the JSON object to get the actual dimension ID, which is used in subsequent calls to actually export the dimension.

Great, now we have the dimension ID. Next, we need to execute the REST API call to export the dimension.


https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-dimensions-dimensionid-export-download-post.html

You will notice that when you access this POST method, the dimension ID from the previous step is required:

/epm/rest/v1/dimensions/{dimensionId}/export/download

The JSON object returned from this execution contains minimal information. It simply provides the URL of the next required REST API call, which reports the status of the export.


With this information, we can check the status of the export using the jobRuns function:

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-jobruns-jobrunid-get.html

The JSON object returned here provides us the status of the export invoked in the prior step as well as a URL to the actual file to download, which is our last step in the process.


Once the export job is complete, the files can be streamed using the URL provided by the REST execution in the prior step.

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-files-temp-fileid-get.html
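
Pulling the four steps together, here is a condensed Python sketch of the full export flow against the resources documented above. The JSON field names (items, dimensions, links, status) are assumptions based on the response shapes described, as are the application and dimension names, so verify them against a live response before relying on this.

```python
import time
import requests
from requests.auth import HTTPBasicAuth

BASE_URL = "https://example-edmcs.oraclecloud.com/epm/rest/v1"  # hypothetical pod
AUTH = HTTPBasicAuth("domain.user", "password")                 # illustrative

def get_dimension_id(app_name, dim_name):
    # Step 1: list every application (archived ones included) and walk the
    # JSON until we find our application, then its dimension ID.
    apps = requests.get(f"{BASE_URL}/applications", auth=AUTH).json()
    for app in apps["items"]:
        if app["name"] == app_name:
            for dim in app["dimensions"]:
                if dim["name"] == dim_name:
                    return dim["id"]
    raise LookupError(f"{app_name}/{dim_name} not found")

def export_dimension(dim_id):
    # Step 2: kick off the export; the minimal response simply points at
    # the jobRun resource that tracks it.
    resp = requests.post(f"{BASE_URL}/dimensions/{dim_id}/export/download",
                         auth=AUTH)
    resp.raise_for_status()
    job_url = resp.json()["links"][0]["href"]

    # Step 3: poll the jobRun until the export reaches a terminal status.
    while True:
        job = requests.get(job_url, auth=AUTH).json()
        if job["status"] != "IN_PROGRESS":
            break
        time.sleep(10)

    # Step 4: stream the temp file referenced by the finished jobRun.
    file_url = job["links"][-1]["href"]
    return requests.get(file_url, auth=AUTH).content

dim_id = get_dimension_id("PCM_Common", "Entity")  # hypothetical names
with open("Entity.csv", "wb") as f:
    f.write(export_dimension(dim_id))
```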

And there you have it, a fully automated solution to download metadata to flat files from EDMCS. Those files are then provided to the existing automation routine and our end-to-end process is truly complete.

And for my next trick…let’s explore some of the different REST API tools that are available to help you in your journey with the EPM REST APIs.

 

Troubleshooting Cloud Data Management Metadata Load Errors

In my last post, I highlighted a solution that was recently deployed for a customer that leveraged Enterprise Data Management Cloud Service (EDMCS), Financial Data Quality Management Enterprise Edition (FDMEE), and Cloud Data Management (CDM) to create an automated metadata integration process for both Planning and Budgeting Cloud Service (PBCS) and Financial Close and Consolidation Cloud Service (FCCS). In this post, I want to take a bit of a deeper dive into the technical build and share some important learnings.

Cloud Data Management introduced the ability to load metadata from a flat file to the Oracle EPM Cloud Services in the 17.11 patch. This functionality provides customers the ability to leverage a common platform for loading both data and metadata within the Cloud.  Equally important, CDM allows metadata to be transformed using its familiar mapping functionality.

As noted, this customer deployed both PBCS and FCCS. Within the PBCS application, four plan types are active while FCCS has the default two plan types.  A design decision was made for EDMCS to create a single custom application type that would store the metadata for both cloud applications.  This decision was not reached without significant thought as well as counsel with Oracle development.  While the pros and cons of the decision are outside the scope of this post, the choice to use a custom application registration in EDMCS ensured that metadata was input a single time but still fed to both cloud applications.  As a result of the EDMCS design decision, a single metadata file (per dimension) was supplied with properties necessary to support each plan type.

CDM leverages its 23 “dimensions” to store metadata information for processing. Exactly like data, metadata is imported using an import format into the CDM relational repository.  Each field from a metadata file is aligned to a CDM dimension field.  The CDM Account dimension always represents the target application member name and the CDM Entity dimension represents the parent of the member.  All other fields can be aligned to any of the remaining 21 dimensions.  CDM Attribute dimensions can be utilized in the import and mapping process but are not available for exporting to the cloud application.  This becomes an important constraint especially in a multi-plan type deployment.  These 21 fields can be used to store any of the properties required to successfully load metadata to the target plan type.

Let’s consider this case study for a moment. The PBCS application has four plan types. If a process were built to load all plan types from a single CDM data load rule, then we could not have more than five plan-type-specific attributes or properties: four plan types times five properties would already consume 20 of the 21 available fields. This leads to an important design approach. Instead of a single CDM data load rule to load all plan types, a data load rule was created for each plan type. This greatly increased the number of metadata properties and attributes that could be loaded by CDM and ensured that future growth could be accommodated without a redesign of the integration process.

It is important to understand that CDM utilizes the Planning Outline Load Utility (OLU) to actually perform the metadata load to the cloud application. The OLU loads metadata using merge (yes, Planning experts, I realize that I am not discovering fire), which matters especially when processing multiple metadata loads for a single application. When loading metadata, there are certain properties that are Application level. I like to think of these as being global. Additionally, there are plan-type-specific attributes that can align (or not align) to the application-level value/setting. I like to think of these as local.

Why is this important? Well, when loading a metadata file, if certain global properties are excluded from the metadata load file, the local properties (if specified) are utilized to default the global properties. Since metadata is loaded using merge, this only becomes problematic when a new member is being added to the outline.

In this particular example, an alternate hierarchy with shared members was specified in one of the plan types. The storage property of the alternates was obviously set as Shared; however, when attempting the metadata load, the following error was encountered:

A Base Member cannot be changed to a Shared Member.

After much investigation (details to follow), I discovered that the global property should also be included in the metadata load.
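
For illustration, here is a trimmed-down, hypothetical version of the corrected load file. The column headers follow the Outline Load Utility format, the member names are invented, and “Data Storage (Plan1)” stands in for the plan-type-specific (local) property. Only the third column – the application-level (global) Data Storage property – was missing from the original file; with just the local column present, the new shared members failed with the error above, and adding the global column resolved it.

```
Entity, Parent, Data Storage, Data Storage (Plan1)
E1000, Total_Entities, Store, Store
E1000, Alt_Entities, Shared, Shared
```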

While CDM utilizes the OLU to load metadata, it does not provide as much verbosity in the error information as the PBCS web interface (which also uses OLU) when loading metadata. Below is an example of the error in the CDM process log.  As a tangent, I’d love to check the logs without needing to open a Service Request.  Maybe Oracle will build an enhancement that allows that in the future (hint, hint, wink, wink to my friends at Oracle).

[Image: the error as it appears in the CDM process log]

So where do I go from here? Well, what do we know about CDM loading metadata to the cloud application?  We know that CDM uses the OLU to load a flat file generated by CDM.  Bingo!  The metadata file output by CDM is a good starting point.  That file is in the Outbox of the CDM application and can be downloaded in several different ways – CDM Import process (get creative folks), CDM process details, or EPM Automate.  Now we have the metadata file and can test to determine if the error is caused by CDM or the metadata itself.  It’s all about ruling out variables.  So, we take the metadata file and import it manually within the PBCS web interface and are able to replicate the error.  But now we have an important new data point – the line number from the metadata file that is causing the error.

[Image: the same error surfaced by the PBCS web interface, including the offending line number from the metadata file]

Now that we have actionable information, we can review each property and start isolating and eliminating different variables. We determined that this error was only occurring for new alternate hierarchy parents being added to the outline.  As a test, we added the global storage property and voila, the metadata load completed successfully.  Face palm!

Maybe this would have been obvious to folks with a lot of Planning experience, but there are plenty of folks learning the intricacies of Planning and Essbase (including our friends converting from HFM to FCCS), so I wanted to share a lesson learned in my journey of metadata integration using CDM.

CDM functionality for metadata represents two of the three primary operations of ETL. In my next post, we’ll dive deeper into how the extract component of ETL was accomplished to provide a seamless end-to-end ETL solution for metadata.

A Safe Step into the Cloud: The Argument for Account Reconciliation Cloud Service (ARCS)

Before forecasting models, before fancy dashboards and pretty reports, before a data point is even considered “Actual” comes the age-old question…

                “Does this number even look right?”

Bulls*#!

Account reconciliations – the means by which this question is answered – are a fundamental part of the financial close process. Imagine you are trying to build a sandcastle. Now imagine your “sand” is harvested from a cow pasture. You *could* continue to build this “sandcastle,” but you will likely finish with a pile of…bull-sand. In the same way, if your account balances and transactions have an integrity equivalent to “bull-sand,” this will inevitably lead to problems down the line.


The shift to the Cloud has complicated the decision-making process when considering new enterprise-wide application tools. The choice of whether to go with a known “on-premise” solution or take a bold step into Cloud solutions is a daunting one, particularly when considering moving high-visibility cycles such as forecasting or financial consolidations into this brave new world.

A Justified Recommendation

Take the measured move instead. If you feel hesitant to go “all-in” on Cloud offerings, here are four reasons why you should consider entering the Cloud through the arch of ARCS…the ARCSway (Get it?…archway…ARCSway…never mind – just keep reading…)

Safe Bet on a Strong Foundation

Oracle introduced Account Reconciliation Cloud Service (ARCS) as the “one stop shop” solution for managing and streamlining the reconciliation cycle in the Cloud back in 2016. While it’s not uncommon for some EPM products to lose functionality during their initial transition into the Cloud space, ARCS retains the “good bones” of its on-premise counterpart – Account Reconciliation Manager (ARM). ARCS builds upon the clever functionality and customizability of ARM, released in 2012, yet with the slick look and feel of the Oracle Cloud experience.

Since its release, ARCS has become the “golden child” of the reconciliation product family, receiving not only “first dibs” on refinement of existing capabilities, but also benefiting from the newest components such as Transaction Matching (note: this has separate licensing than the Reconciliation Compliance component of ARCS).  As the product continues to gain steam, this trend is expected to continue. Between utilizing the tried-and-true foundation of the ARM tool and having Oracle’s watchful eye, ARCS is a safe bet.

No Mistakes with Modularity

Unlike some applications, ARCS is easy to implement in pieces. While good design will certainly prevent future heartache, there are no decisions made on Day 1 of a project that cannot be modified or enhanced in the future:

  • Want to manually enter data for reconciliations today, but automatically load them from a source system tomorrow? We can do this.
  • Missing fields for additional detail you would like users to include? Can be ready for next period (or the current one even!)
  • Only want to rollout in one country to start? No problem – go ahead and make the other entities jealous!

While some changes are “cleaner” than others (I am looking at you, Profile Segments!), ARCS welcomes you to “test the waters” and see what works in your company without needing to go “all-in.” For example, a current client has a live ARM application that provides a viable solution for its reconciliation process needs given the initial project timeline and budget. Although the client wasn’t able to fully utilize the available functionality at the time, the modularity of the reconciliation tools (both ARM and ARCS) allows the opportunity for enhancements without punishing this design decision – we are now revamping the client’s auto-reconciliation setup to further streamline the process. For Partners, this means additional project phases; for clients, this means not biting off more than you can chew (win-win!).

Want to dive deeper into this topic?  Read the blog posts in the series Modularity in Account Reconciliation Cloud Service (ARCS): No Mistakes from “Day 1” to “Day 100.”

Fast Implementation Cycles and Rapid ROI

Relative to other EPM project lifecycles, ARCS is typically a quick implementation. As with all projects, there are certainly exceptions, but with Ranzal’s “Quick Start” methodology, we have stood up applications in just six weeks! A strong inventory of project “accelerators” – custom tools and scripts that Ranzal has developed based on common requests across multiple clients – allows sophisticated deployments in a timely manner. Couple this with the inherent time saving benefits of Cloud technology (i.e. lack of infrastructure setup, etc.), and ARCS shines as the first step in a Roadmap, producing tangible metrics for evaluation (ex. completion percentages per period, timeliness per Preparer/Reviewer, reconciliation accuracy, etc.) and giving users a taste of the Oracle Cloud experience in a short period of time.

You Don’t Have Anything Today and It’s Costing You

I know that may read like a presumptuous fear tactic, but hear me out:

Account reconciliations ARE being completed in your company – one way or another. Whether that means your CPAs are *click*click* clicking away on their keyboards to manually update Excel spreadsheets or – heaven forbid – actually printing out recons to hand sign, if you cannot name the system that is comprehensively handling your reconciliation cycle, it’s because there isn’t one.

And this is normal. But there are costs associated with this normalcy.

Reconciliation cycles aren’t sexy (well…personal taste…) and often have low visibility to upper management. And yet (!) the reconciliation process is often widespread across the company spanning business entities, departments, and corporate ladders (I see you, Mr/s. Director signing off on recons). ARCS is an attractive option when considering enterprise-wide Cloud solutions to “test run” because everyone can try it. A successful ARCS implementation paves the way for easier adoption of future projects – it gets everybody onboard.

Step Through the “ARCSway” and Ditch the Bulls*#!

The shift to the Cloud is disrupting the traditional market of on-premise EPM solutions. As you look at the new strategic options available to your company’s roadmap, consider ARCS as a “first step.” Of note, it is important to have an accurate understanding of the tool – ARCS is first and foremost a management tool, and although it can provide helpful information in troubleshooting account variances, it does not replace actually performing a reconciliation in an ERP system. Additionally, customizing reports can be difficult (unless you are familiar with BI Publisher), although the out-of-the-box reports and strong dashboarding capabilities largely make up for this limitation. All-in-all, I strongly recommend this product as an introduction to the new Oracle offerings. ARCS’ “low risk, high reward” nature provides real company value quickly while presenting you with a good picture of life in the Cloud. Now is the perfect time to ditch your “bull-sand” reconciliation process and update to a more solid foundation in the Cloud through the ARCSway.

Contact us today for details about a custom Cloud solution for your business needs.


Oracle Data Visualization for Strategic Analytics

The world of analytics and data visualization continues to change at a rapid pace. New tools, processes, buzz words (Cloud anyone?) have penetrated our industry and can become overwhelming.  Most of these changes, though, are for the better – one of them being self-service data visualization. Solutions like Tableau, PowerBI, and Qlik have been around for several years, earning reputations as leaders in the self-service, easy, and sexy exploration of data.

Oracle Business Intelligence Enterprise Edition (OBIEE), although a great tool and long-time market leader, lacks ease of deployment, maintenance, and connectivity, as well as the freedom from “IT tyranny” over data access that business users crave – attributes already addressed by Tableau and others. For many years, I’ve listened to IT describe OBIEE as its enterprise Business Intelligence (BI) solution while business users use Tableau to connect to their own databases and spreadsheets because Tableau doesn’t require them to enter a ticket and potentially wait weeks for that new column to be added to their report.

Today, there is an Oracle tool – Oracle Data Visualization (DV), a component of Oracle Analytics Cloud (OAC) – that exceptionally meets the needs of business users. Because it is part of a suite of products, Data Visualization also provides enterprise and financial reporting capabilities, advanced analytics and big data, mobile access, what-if scenario modeling, ingestion, preparation and transformation of data, and yes – you guessed it – Cloud. Some highlights include:

  • Dynamic visualizations that can be organized into stories to be shared across the organization
  • Consumer (drag and drop) style with easy uploads, mashups, and exploration
  • Mobile authoring and consumption, device agnostic, and with dynamic design optimization
  • Connections to dozens of different sources, including SaaS, relational databases, big data tools, NoSQL, and others through JDBC/ODBC
  • Cloud and desktop versions with identical features

The full Oracle Analytics Cloud suite is a comprehensive solution that offers analytics and reporting tools to effectively address business requirements as well as accommodate different users within an organization (analysts, consumers, admins, etc.). Also included with OAC is Business Intelligence Cloud Service (BICS), a tool that is essentially “OBIEE in the Cloud.”  Data Visualization and BICS complement each other and address different needs, summarized in the following:

[Table: BICS vs. Data Visualization]

With continued rapid changes inevitable in technology, the future of Oracle Analytics Cloud will likely be promising as additional components are added. After seeing and supporting clients as they “take the plunge” into Cloud analytics, it becomes clearer that Data Visualization is Oracle’s strategic future of analytics.

See an overview of the history of Oracle DV and OAC as well as a demonstration of the products’ main features and wide array of possible sources in this webinar: The Strategic Future of Analytics…Starring Oracle Data Visualization

The True Power of Oracle’s Enterprise Planning Suite Unleashed at POET: A Case Study

“If you are going to change the world, you need a system to help get you there. For us, it was… about strategic opportunities. POET was at a crossroads. We needed a system that we could grow with and that could grow with us.”
Lezlee Herdina, Director of FP&A, POET

A privately held corporation headquartered in Sioux Falls, South Dakota, POET LLC is a U.S. biofuel company that specializes in the creation of bioethanol. The 1,900-employee company produces 1.8 billion gallons of ethanol annually and has been granted 90 patents in the U.S. and abroad.

In this webinar, Edgewater Ranzal’s Managing Director and HSF Practice Director Ryan Meester speaks with Lezlee Herdina, POET’s Director of FP&A, to give us a behind-the-scenes look into POET’s Enterprise Planning Solution journey, from realizing that significant change was needed to an extensive evaluation process to the ultimate solution and, finally, to the company’s enduring vision going forward.

The Right Tool for the Right Job (RTRJ)

While Lezlee and the POET team were open to the insights and recommendations generated by their Ranzal analysts, they also had some specific goals in mind from the outset:

  • Provide seamless integration of financial and operational data
  • Create a platform for process improvement, including greater automation in monthly processes to improve efficiency and increase time for value-added analysis
  • Achieve better communication, including ease of reporting
  • Reduce reliance on Excel models and associated version control issues
  • Improve data governance, with a clear data model and common definitions to facilitate planning and reporting processes
  • Integrate operational and financial dashboards for performance measurement
  • Use scenario analysis to drive M&A and strategic business decisions
  • Increase emphasis on the cash perspective
  • Understand when to use SmartView, Financial Reports, and Oracle Business Intelligence Enterprise Edition (OBIEE), and take full advantage of the strengths of each reporting tool
  • Keep it simple, and trust in the higher-level nature of HSF
  • Recognize the important nature of the user experience
  • Ensure that data integrations are seamless for the end-user

To learn more about the POET team’s initial and ongoing business challenges, the lessons learned, and the ultimate results, view a recording of the True Power of Oracle’s Enterprise Planning Suite Unleashed at POET webinar.

To learn more about our Enterprise Planning solutions, visit www.ranz.al/epbcs-webinars

Missed the webinar? View Recording Here.

Introducing Strategic Modeling: Oracle’s HSF Cloud Offering

It’s no secret that Oracle, along with the rest of the software industry, has been moving swiftly to the Cloud over the past few years.

Within Oracle’s Enterprise Performance Management platform, this transition began in 2014 with the launch of Planning and Budgeting Cloud Service (PBCS). In 2016, Oracle added out-of-the-box content to solve specific business challenges in areas like Workforce Planning and Capital Planning and bundled that with existing PBCS functionality to create Enterprise Planning and Budgeting Cloud Service (EPBCS).

Now in 2017, Oracle continues the impressive build-out of its Cloud offering with the introduction of Strategic Modeling, a new product that facilitates long-range planning and financial modeling in the Cloud.

As part of Oracle’s EPBCS bundle, the Strategic Modeling module is available at no additional cost to customers who already have purchased EPBCS. While Strategic Modeling is “new” in the sense that this functionality has never before existed in its Cloud offering, the module actually has its roots in Oracle’s Hyperion Strategic Finance (HSF) product and will look similar to HSF for those who are familiar with its on-premise capabilities.


Robust. Flexible. Versatile.

Strategic Modeling provides customers with a multitude of out-of-the-box functionalities built to simplify and enhance financial modeling processes.

In particular, it adds robust capabilities in the area of Balance Sheet and Cash Flow forecasting that were either missing or limited in prior incarnations of the EPBCS suite. It also adds scenario-modeling capabilities that can be created and driven by end users on the fly as they are responding to questions or requests from Senior Management.

While Strategic Modeling provides significant incremental capabilities to EPBCS, it also leverages many common components of the suite to make the user experience consistent. For example, it provides both a Web and an Excel user interface allowing users to create models and report in the tools with which they are already comfortable. The user interface, navigation, data integration and reporting capabilities are the same across all components of EPBCS, which provides a seamless user experience across the platform.

Strategic Modeling Capabilities

Organizations often try to build strategic plans in Excel or by extending the time horizon of their budgeting applications. All too often, these patchwork solutions fall short primarily because they make use of tools that were not intended for strategic planning.

Strategic Modeling, on the other hand, was built specifically for this purpose and provides numerous out-of-the-box capabilities to address challenges in this area. The following is a sampling of some of the key pre-built capabilities that organizations can utilize to streamline and enhance their strategic planning processes:

  • Fully Integrated Financial Statements: The standard template that ships with Strategic Modeling contains an integrated Income Statement, Balance Sheet and Cash Flow Statement. The Balance Sheet will automatically balance based on the cash position as derived on the Cash Flow Statement. The circularity inherent in Interest Income/Expense calculations (i.e. as Cash increases, Interest Income increases, which leads to more Cash) is automatically solved for by the system; a simplified illustration follows this list.
  • What-if Capabilities: Strategic Modeling greatly facilitates the process of creating on-the-fly scenarios and comparing them against the base case or other scenarios. Scenario Modeling capabilities are designed for end users to create and run and provide immediate results to key strategic questions.
  • Robust Funding Capabilities: Strategic Modeling has a Funding Routine that allows users to specify a prioritized order of how cash deficits will be funded (e.g. Sell Marketable Securities, Draw upon a Revolver or Issue Commercial Paper, Issue Long-Term Debt or Equity) and how cash surpluses will be utilized (e.g. Pay Down Debt, Share Buyback, Dividends, Buy Marketable Securities).
  • Debt Scheduler Utility: A guided user interface allows for easy input of the parameters for a new debt issuance. Strategic Modeling then will calculate and perform all of the accounting for items such as Accrued Interest, Amortization of Debt Issue Costs and Current Portion of Long-Term Debt.
  • Integrated Enterprise Planning: Strategic Modeling is part of Oracle’s EPBCS Suite and as such benefits from tight integration with the other processes (Financials, Capital, Projects, Workforce) and the reporting cubes of EPBCS. This integration allows users to quickly and easily move data from their detailed budgets and operating plans to their strategic plans within Strategic Modeling. Once the strategic plan is complete, data can be easily exported as higher-level targets to seed the budget process.
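
To see why the interest circularity mentioned in the first bullet needs “solving” at all, consider a deliberately simplified model (my own illustration, not Strategic Modeling’s actual funding logic). Let interest accrue at rate $r$ on the average cash balance, and let $F$ be all non-interest cash flow for the period. Ending cash then depends on itself:

$$C_{\text{end}} = C_{\text{beg}} + F + \frac{r}{2}\left(C_{\text{beg}} + C_{\text{end}}\right)$$

Rather than requiring iterative calculation the way an Excel model would, the circular reference collapses algebraically:

$$C_{\text{end}} = \frac{\left(1 + \frac{r}{2}\right)C_{\text{beg}} + F}{1 - \frac{r}{2}}$$

Strategic Modeling resolves this class of circularity – including its interaction with the funding routine – automatically.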

Whether you already own EPBCS and are interested in deploying the Strategic Modeling module, you are an existing HSF customer looking to move to the Cloud, or you are new to Oracle and its Cloud offerings, Strategic Modeling has a compelling use case for all organizations looking to improve their modeling and strategic planning capabilities.

Register for our “Take Your Strategic Planning to New Heights” webinar: Please join us for our webinar on July 18, 2017, to see a demo of Strategic Modeling in action and to learn more about its capabilities — register here.

Missed the webinar? View the recording here.