Labor Budget Increases and Staffing Shortages Loom Large for Healthcare Execs in 2019: Set Expectations Now and Uncover Your Capabilities with an Enterprise-Based Labor Productivity Solution

Two topics resound across healthcare websites and related blog posts: (1) increased labor costs and (2) burnout and shortages of clinical staff. The Healthcare Finance article “Labor Budget Increases, Staffing Shortages Loom Large for Healthcare Executives in 2019” highlights exactly these concerns.

This isn’t surprising: access to healthcare has expanded, so there are more patients to see, which requires more staff, which drives up labor costs…see where I’m going here? It’s easy to see how quickly this becomes a major concern for providers trying to analyze staffing and keep up with demand.

Working with numerous healthcare clients makes one thing evident: not all healthcare organizations are at the same level of maturity when it comes to answering specific labor questions, providing and analyzing data, or supporting a labor productivity solution. Edgewater Ranzal’s complimentary Healthcare Labor Productivity Assessment Workshop not only helps reset clients’ expectations, but also uncovers their capabilities for an enterprise-based labor productivity solution.

Our solution utilizes Oracle Cloud or on-premise technology to help clients see an immediate return on investment by analyzing contract agency usage statistics, providing detailed overtime analysis, and comparing productivity against national standards loaded into the system. Additionally, we help clients align their labor productivity solutions with their planning and budgeting processes to improve budget detail and accuracy. Comprehensive experience with data integration – often a challenging task for clients – allows us to work with staff to bring all the required data elements together into a cohesive picture of labor productivity.
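To make the kinds of metrics involved concrete, here is a minimal sketch of the calculations behind contract agency usage, overtime analysis, and a productivity comparison against a loaded benchmark. The field names, sample values, and benchmark are illustrative assumptions, not the data model of the actual solution.

```python
# Minimal sketch of common labor productivity metrics: overtime and contract
# labor as a share of total worked hours, and worked hours per unit of
# service compared to a loaded benchmark. All names and values are
# illustrative assumptions.

def labor_metrics(dept):
    total_hours = (dept["productive_hours"]
                   + dept["overtime_hours"]
                   + dept["contract_hours"])
    hours_per_unit = total_hours / dept["units_of_service"]
    return {
        "overtime_pct": round(dept["overtime_hours"] / total_hours * 100, 1),
        "contract_pct": round(dept["contract_hours"] / total_hours * 100, 1),
        "worked_hours_per_unit": round(hours_per_unit, 2),
        # Index > 1.0 means more worked hours per unit than the benchmark.
        "productivity_index": round(hours_per_unit / dept["benchmark_hours_per_unit"], 2),
    }

med_surg = {"productive_hours": 9200, "overtime_hours": 650,
            "contract_hours": 400, "units_of_service": 1150,
            "benchmark_hours_per_unit": 9.2}
print(labor_metrics(med_surg))
```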

Take a look at our webinar recording, The Key Ingredients to Understanding Labor & Productivity, to learn more about our solution and best practices for addressing labor productivity in your organization. Then contact Edgewater Ranzal’s Healthcare experts with specific questions about implementing a solution that helps cut labor costs and delivers data-rich analytics to your organization.

The Oracle Profitability and Cost Management Solution: An Introduction and Differentiators

What is Oracle Profitability and Cost Management?

Organizations with world-class finance operations can generally close in a minimal number of days (2-3 in an ideal organization) and run frequent, efficient budget and forecast cycles while also working through ‘what if’ scenario analyses along the way. These organizations often deliver in-depth profitability and cost management reports at the fund, project, product, and/or customer level, completing the picture of an accurate close cycle.

Oracle offers packaged options in support of all these finance processes, but the focus of this post will be Profitability and Cost Management (PCM).

One of the most painful and time-consuming processes for any business entity is PCM analysis. The reasons cost allocation processes take so long are too many to count – model complexity, data granularity, driver metric availability, rigid allocation rules, delays in implementing allocation changes, and results that are almost impossible to justify. Instead of dwelling on the negatives, let’s focus on what can be done to alleviate the pain and energize the cost accounting department: give it access to meaningful, accurate data and empower users with the flexibility to perform virtually unlimited “what if” analysis.
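To ground the discussion, here is a minimal sketch of the operation at the heart of any such model: spreading a source cost pool across targets in proportion to a driver. The department names, driver choice (square footage), and amounts are made-up examples; HPCM/PCMCS performs this kind of step at scale, across many rules and dimensions.

```python
# A single driver-based allocation step: spread a cost pool across targets
# in proportion to a driver metric. Names and values are illustrative only.

facilities_cost = 500_000.0                                              # source cost pool
square_feet = {"Manufacturing": 60_000, "Sales": 25_000, "R&D": 15_000}  # driver metric

def allocate(cost_pool, driver_values):
    total_driver = sum(driver_values.values())
    return {target: cost_pool * value / total_driver
            for target, value in driver_values.items()}

print(allocate(facilities_cost, square_feet))
# {'Manufacturing': 300000.0, 'Sales': 125000.0, 'R&D': 75000.0}
```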

The PCM Journey

The initial Profitability and Cost Management product, like almost all Oracle EPM offerings, was released on-premise in July 2008 and is known as Oracle Hyperion Profitability and Cost Management (HPCM). Ten years later, HPCM continues to deliver an easier way to design, maintain, and enhance allocation processes with little to no IT involvement, as it has since launch, but with a greater focus on flexibility and transparency. HPCM was intended to be a user-driven application in which finance teams are involved from defining the methodology all the way through the steps needed to execute day-to-day processing. Any cost or revenue allocation methodology is supported, while graphical traceability and allocation balancing reports support any query, from top-level analysis down to the most granular detail available in the application.

There are three HPCM modules available on-premise today. Each was designed and developed for a different allocation methodology or level of complexity:

  1. Simple allocations – Detailed Profitability (a.k.a. single-step allocations. Example: from Accounts and Departments, allocate data to the same Accounts, new target Departments, and granular Products/SKUs based on driver metric data. This module allows for a very high degree of granularity, with dimensions of more than 100k members, but it does not cater to complex driver calculations or to allocations requiring more than one stage).
  2. Average- to high-complexity allocations – Standard Profitability (a.k.a. multi-step allocations of up to 9 iterations/stages, allowing for reciprocal allocations. Example: allocations from accounts and departments to channels, funds, and other departments, with results from previous steps redistributed onto Products, Customers, etc. Driver metric complexity is achievable with this module, and custom-generated drivers are available as well, but there are limitations on driver data granularity, granularity of allocated data, and overall hierarchy sizing).
  3. High-complexity allocations – Management Ledger (unlimited number of steps, a high number of complex drivers, custom driver calculations, custom allocations, more granularity, and increased flexibility in defining and expanding the allocation methodology). This is the last module added to the HPCM family and the only one available as a SaaS Cloud offering.

The Cloud is Your Oyster

In 2016, Oracle introduced the Cloud version of HPCM: Profitability and Cost Management Cloud Service (PCMCS).  PCMCS is a Software as a Service (SaaS) offering, and as with many of Oracle’s Cloud offerings, PCMCS includes key improvements that are not available in the on-premise version, and enhancements are made at a much faster pace.

There is currently no indication that the other two HPCM modules – Detailed and Standard Profitability – will make their way to the Cloud, since the increased allocation complexity and hierarchy sizing supported by the Management Ledger module cater to most, if not all, potential requirements.

The core strengths of the Management Ledger module included with the PCMCS SaaS subscription are its ease of use and flexibility to change, enabling finance users to define and update allocation rules and methodologies via a point-and-click interface. While it is advisable to perform the initial setup with support from an experienced service provider, the maintenance and expansion of PCMCS (Management Ledger) models can, in most cases, be achieved solely with functional resources. “What-if” scenario creation and analysis has never been easier. Users can not only copy data and allocation methodologies between scenarios, but also update data sets and allocation steps independently of a standard scenario, generating as many simulation models as they need and gaining increased insight for decision making.
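As a rough illustration of the “what if” pattern described above, the sketch below copies a standard scenario, changes a single driver assumption, and compares the resulting allocation shares. The scenario contents and driver names are invented for the example; in PCMCS this is done through the web interface rather than code.

```python
# Copy a standard scenario, tweak one driver assumption, and compare the
# resulting allocation shares. Scenario contents are invented for the example.
import copy

standard = {"drivers": {"Manufacturing": 60_000, "Sales": 25_000, "R&D": 15_000}}

what_if = copy.deepcopy(standard)          # independent copy of the data set
what_if["drivers"]["R&D"] = 30_000         # simulate R&D taking more floor space

def shares(drivers):
    total = sum(drivers.values())
    return {name: round(value / total, 3) for name, value in drivers.items()}

print("Standard:", shares(standard["drivers"]))
print("What-if: ", shares(what_if["drivers"]))
```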

Standard Profitability models perform allocations in Block Storage Databases (BSO). While BSO applications are great for complex calculations and reciprocal allocation methodologies, they have the disadvantage of being limited in terms of structure or hierarchy sizing. This hierarchy restriction is not as pressing in Aggregate Storage Option (ASO) type applications, which is the technology used by Management Ledger. The design considerations for a Standard Profitability model are also significantly more rigid when compared with the Management Ledger module, which has no limitations regarding allocation stages, allocation sequencing, or a maximum number of dimensions per each allocation step.

Detailed Profitability models heavily leverage a database repository, while any connected Essbase applications are used solely for reporting purposes. Initial setup and future changes beyond simply adding new hierarchy members require specialized database management skills, and single-step allocation models are not as widely applicable. Complex allocation methodologies may require using Detailed Profitability models in conjunction with Management Ledger, but these situations are the exception rather than the rule.

Why Should You Choose Oracle Profitability and Cost Management?

One of the key strengths of HPCM since its release, and now included in PCMCS, is transparency – the ability to identify and explain any value resulting from the allocation process with minimal effort. Each allocation rule or allocation step is uniquely identified, so users can navigate via the embedded, out-of-the-box balancing report to the desired member intersection and open it with a point-and-click action in Excel (using Smart View) for further analysis and investigation. The out-of-the-box program documentation reports identify the setup of each rule and can be searched quickly by account, department, segment code, or any other dimension available in the application.

The execution statistics reports delivered as part of the PCMCS offering let users quickly see which allocation process is taking longer than expected, identify opportunities for overall process improvement, or simply monitor performance over time. These two out-of-the-box reports – execution statistics and program documentation – are the most heavily used reports during application development, troubleshooting, and particularly when new methodologies are developed. Users can quickly search through them, use them to keep track of methodology changes, and rely on them as documentation for training new team members.

Performing mass updates to existing allocation rules has never been faster. PCMCS contains a menu that allows end users to find and replace specific member name references in their allocations for each individual data slice, allocation step, or an entire scenario. A quick turnaround of such maintenance tasks results in an increased number of iterations through different data sets, giving the cost accounting team more time to perform in-depth analysis rather than waiting for system updates.

PCMCS-embedded analytics and dashboarding functionality is also a significant differentiator, enabling end users to create and share dashboards with the rest of the application’s users through the common web interface and without IT support. Reports created in PCMCS are available immediately, without time-consuming initial setup or migrations between environments followed by further security setup tasks.

A comparison of On-Prem vs Cloud will be available in a future post, so please subscribe below to receive notifications for PCMCS-related blog updates.

Automation in Account Reconciliation Cloud Service (ARCS): At Its Finest

In the previous post, Redesign in Account Reconciliation Cloud Service (ARCS): From the Ground Up, I showed you how to rebuild ARCS down to the Profile Segments to speed things up. This time we’re slowing everything down…

So grab a glass of wine and throw on your Marvin Gaye vinyl because we’re getting it on with automation. Oh yeahhh…

The sexiest topic in account reconciliations (didn’t think you’d ever see that sentence, did ya?) is consistently automation. Yes, ARCS provides a central repository. Yes, ARCS is auditable. YES, ARCS shows a traceable workflow throughout the reconciliation cycle. All of these features are highly useful and absolutely prerequisites for an enterprise-worthy solution, but if you want to really grab people’s attention in a design session, start talking about the things they won’t have to do. ARCS provides both out-of-the-box functionality and customizable tools that help preparers focus on high-importance reconciliations rather than spending time on low-value-add or monotonous items.

Automation occurs in two areas: outside of ARCS (e.g. data feeds) and within ARCS (e.g. auto reconciliations and rules). Setting up the former enhances the latter. Either Cloud Data Management (CDM) or Financial Data Quality Management Enterprise Edition (FDMEE) can be used to load data to ARCS, albeit in different manners, but how this is accomplished is beyond the scope of this post. This data can be sourced from a variety of general ledgers and sub-ledgers/subsystems including Financials Cloud, E-Business Suite (EBS), PeopleSoft, JD Edwards, and even *gasp* Excel (…if we have to…). By automating these data feeds directly from the source, management can be confident in the validity of the data (accurate, current, and free of manual “massaging”), and with scheduling, administrators have one fewer task to worry about. The latest application data is up-to-date by the time the office doors open. Additionally, data refreshes can occur multiple times throughout the reconciliation cycle without concern for loss of work: ARCS will only update reconciliations with differences from the last data load and will change the workflow status if data has been modified and needs to be looked at again.
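For a sense of what such a scheduled feed can look like, below is a sketch that drives the EPM Automate command-line utility from Python: log in, upload a balance file, run a Data Management load rule, and log out. The connection details, file name, rule name, and parameter values are assumptions for illustration; check the EPM Automate documentation for your pod before relying on the exact syntax.

```python
# Sketch of a nightly balance feed using the EPM Automate utility.
# URL, credentials, file, and rule names are placeholders; verify command
# parameters against the current EPM Automate documentation.
import subprocess

def epmautomate(*args):
    cmd = ["epmautomate", *args]
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)          # raise if any step fails

epmautomate("login", "svc_recon_loads", "password_file.epw",
            "https://example-arcs.epm.us2.oraclecloud.com")
epmautomate("uploadfile", "gl_balances_sep2017.csv", "inbox")
epmautomate("rundatarule", "GL_TO_ARCS", "Sep-17", "Sep-17",
            "REPLACE", "REPLACE_DATA", "gl_balances_sep2017.csv")
epmautomate("logout")
```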

Within ARCS, the “bread and butter” for gaining efficiencies in the reconciliation cycle is the out-of-the-box auto reconciliation method property on the Profiles. This sets the conditions under which a reconciliation will automatically change its workflow status to “closed,” allowing preparers to focus on the remaining “open” reconciliations that require attention. Which conditions are available for selection depends on the Format type. Furthermore, this field can easily be updated after the fact: using the Actions pane, the property can be applied to a mass of Profiles based on custom filtering.
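As a plain-language illustration (not ARCS internals), the test behind a Balance Comparison auto-reconciliation method boils down to something like the check below; the tolerance and field names are assumptions.

```python
# Illustrative condition only: "close automatically when source and
# subsystem balances match within tolerance." Values are examples.

def auto_close(source_balance, subsystem_balance, tolerance=0.0):
    return abs(source_balance - subsystem_balance) <= tolerance

recon = {"id": "100-1000", "source": 125_000.00, "subsystem": 125_000.00}
status = "Closed" if auto_close(recon["source"], recon["subsystem"]) else "Open"
print(recon["id"], "->", status)
```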

Automation in ARCS 1

[Screenshot 10a: The “Set Attribute…” functionality from the Actions pane is a powerful tool that can be used to make mass updates from the user interface.]

 

Automation in ARCS 2

[Screenshot 10b: In this example, the “Set Attribute…” functionality can be used to make updates to the Auto Reconciliation Method property for all Profiles, selected Profiles, or Profiles that fit customized criteria.]

The “Set Attribute” functionality is a powerful tool for making changes across multiple Profiles within the ARCS user interface. In many instances, this is a preferable alternative to extracting the Profiles to a text file to modify offline. Screenshots 10a – 10b show how it can be used to update the Auto Reconciliation Method attribute specifically, but there are a plethora of other attributes that can be updated in this manner.

The last piece of the automation trinity is customized rules. Similar to custom attributes, rules can be added in a variety of places within your reconciliations to further enhance and streamline the process for both end users and application administrators. Attributes, formats, profiles, and even specific transaction types (e.g. on Subsystem Adjustments, but not on Source System Adjustments) can contain separate sets of rules.

Automation in ARCS 3

[Screenshot 11a: Rules can be added at a Format level.]

 

Automation in ARCS 4

[Screenshot 11b: Different Rule resolutions will be available depending on where the rule is created. This screenshot, for example, shows the options for rules created at a Format level.]

 

Automation in ARCS 5

[Screenshot 11c: Rules can be added at a specific transaction type. In this screenshot, any rules created here would only affect Subsystem Adjustments and would not affect System Adjustments.]

 

Automation in ARCS 6

[Screenshot 11d: Different Rule resolutions will be available depending on where the rule is created. This screenshot, for example, shows the options for rules created at a specific transaction type level.]

Thus, rules can be used for anything from sweeping, application-wide changes down to transaction-by-transaction differences, as seen in Screenshots 11a – 11d. If ARCS is a suit, then rules are custom tailoring; they are made to fit your company’s specific needs.

The most common rule I see relates to auto-submission (as opposed to auto reconciliation). The out-of-the-box auto reconciliation methods previously discussed are set on Profiles and can be used to “close” a reconciliation for the period if the criteria are met. However, sometimes a reconciliation still needs review, such as when it is considered higher risk or only during certain periods in the fiscal year. Customized rules can dynamically determine which reconciliations can skip the preparer and be assigned directly to the reviewer, and which can be automatically “closed” for the month (i.e. without approval by a preparer or reviewer). Tailoring rules in this manner still reduces preparers’ workloads while giving management the confidence that the higher-priority reconciliations are being reviewed – the best of both worlds!
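The decision such a rule encodes can be pictured like the sketch below: balanced, low-risk reconciliations close automatically, balanced high-risk ones skip the preparer and go straight to the reviewer, and everything else follows the full workflow. The risk ratings and threshold are illustrative assumptions, not ARCS configuration.

```python
# Illustrative routing logic for an auto-submit rule. Risk ratings and the
# balance threshold are assumptions for the example.

def route(recon):
    balanced = abs(recon["difference"]) < 0.01
    if balanced and recon["risk"] == "Low":
        return "Auto-close"
    if balanced and recon["risk"] == "High":
        return "Skip preparer, assign to reviewer"
    return "Full workflow (preparer then reviewer)"

recons = [{"id": "100-1000", "difference": 0.00, "risk": "Low"},
          {"id": "100-2000", "difference": 0.00, "risk": "High"},
          {"id": "100-3000", "difference": 512.37, "risk": "Low"}]
for r in recons:
    print(r["id"], "->", route(r))
```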

No Mistakes with Modularity from “Day 1” to “Day 100”

So, there you have it: the four main manifestations of ARCS’ modularity. While nothing will replace proper planning, ARCS does not permanently punish any application decisions you (or your partner) have made in the past. The tool is able to grow with your company and accommodate your needs as they arise. There’s no reason to pick “today” or “tomorrow” – have them both.

Am I right? Am I off my rocker? You tell me! Let me know in the comments below whether ARCS (or ARM! We haven’t forgotten you…) has been able to accommodate the changes that come with your company’s growth.

If you like what you’ve read, please consider sharing this article through social media. And let me know in the comments what topic(s) you would like to see covered in future posts.

*Screenshots taken from the patch 1806 release.

Redesign in Account Reconciliation Cloud Service (ARCS): From the Ground Up

We talked about adding new scope in New Scope in Account Reconciliation Cloud Service (ARCS): Add-Ons and modifying your application inside (i.e. changing reconciliation methods) and outside of ARCS (i.e. new data feeds) in Modifications in Account Reconciliation Cloud Service (ARCS): Tweaking and Tuning.

Today, we’re going to tear it down and rebuild from the ground up.

Let me start with this: redesign IS possible. ARCS does not permanently punish any design decisions made on “Day 1,” but not all changes are equal in complexity, nor can all changes be made without consequence. A successful implementation ensures that the application design is sound for today and that a well-laid roadmap is in place for tomorrow. Many “one-off” changes can be made directly to a deployed reconciliation (i.e. only within a single period) or permanently going forward (i.e. to the profile). The “catch” is the key property set on a profile or reconciliation – the Account ID. The Account ID represents the granularity at which the reconciliation is performed, such as [Business Unit]-[Account] or [Entity]-[Natural Account]-[Subaccount].

ARCS From the Ground Up 1

[Screenshot 6: The Account ID is a unique identifier for the reconciliation.]

The Account ID is fundamental to the reconciliation, as indicated by the asterisks (i.e. “*”) in Screenshot 6. Changing it in any way will break the Prior Reconciliation “link” with previously completed instances of the reconciliation.

But let’s push that idea one step further – what if I want to change the key properties themselves – that is to say – change the actual Profile Segments? The Profile Segments determine the name (ex. from “Company” to “Business Unit”), number (ex. from 2 to 3 segments), and even type of values (ex. setting up the Business Unit segment to always be an integer) that are viable for use when setting up an Account ID. Therefore, if this was set up incorrectly or if the granularity at which reconciliations are performed has changed since the initial implementation, then redesigning the Profile Segments may become a requirement.
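Conceptually, the Profile Segments act as the schema for every Account ID, as in the small sketch below; the two-segment configuration and values are examples only.

```python
# Profile Segments as the "schema" for Account IDs: segment names, count,
# and value types are configuration, and every profile must conform to them.
# This two-segment setup is an example, not a real configuration.

segments = [{"name": "Business Unit", "type": int},
            {"name": "Natural Account", "type": int}]

def build_account_id(values):
    if len(values) != len(segments):
        raise ValueError(f"Account ID requires {len(segments)} segments")
    for segment, value in zip(segments, values):
        segment["type"](value)      # raises if the value does not fit the type
    return "-".join(str(v) for v in values)

print(build_account_id([100, 1000]))    # "100-1000"
```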

ARCS even makes this type of redesign possible, but at a cost. An administrator needs to first delete all Profiles; only then will the application allow a modification to the Profile Segments in the Configuration card.

ARCS From the Ground Up 2

[Screenshot 7a: Unable to modify the Name of Profile Segment 1 which is currently named “Company.” The field appears grayed out. This is because Profiles are currently using these Profile Segments.]

ARCS From the Ground Up 3

[Screenshot 7b: After removing the Profiles, Profile Segment 1 can now be modified. In the example, Profile Segment 1 is renamed to “Business Unit.”]

While Screenshots 7a & 7b show that this is possible, there are repercussions. As with changing Account IDs, this change will break any links to previously completed reconciliations. Additionally, any existing mappings in external integration solutions such as Cloud Data Management or FDMEE, or references to Profile Segments in customized attributes or rules, may be affected. This type of redesign should only be done after carefully considering all options.

Other common questions relate to redesigning an attribute, typically the system attributes such as Process or Account Type. This is a straightforward change as it relates to updating the property on the Profiles; however, it is important to note that any reference to any existing artifact (i.e. an artifact can be a format, a custom attribute, an attribute member, etc.) within ARCS will prevent the deletion of said artifact. As an example, if the Account Type structure requires redesigning, but there is a reference to any of the members (such as in a historical period), then these members cannot be deleted without first removing the references. This can be tedious when there are multiple years of reconciliations to consider.

ARCS From the Ground Up 4

[Screenshot 8: When trying to remove the Custom Attribute named “PLACE CUSTOM ATTRIBUTE HERE,” ARCS prevents this deletion and cites which artifact is using the Custom Attribute. In this example, the Bank Reconciliation format is using this Custom Attribute – thus, it cannot be deleted.]

Unlike many system messages, ARCS actually provides useful troubleshooting information, as seen in Screenshot 8. However, it still may not be worth it to you to make this change retroactively. A recommended work-around is to “archive” artifacts that will not be used going forward by renaming them with “Old” or “Hist,” and then to create a separate artifact for future use.

ARCS From the Ground Up 5

[Screenshot 9: A work-around to deleting previously used artifacts is to rename them and then use a new artifact going forward. In this example, the suffix “- Old” is added to this Custom Attribute to indicate that it is no longer in use.]

Previous uses of the artifact such as in completed reconciliations will update to reflect the name change. In the example provided in Screenshot 9, this custom attribute for historical periods will be updated with the “– Old” suffix to indicate to ARCS administrators that it is no longer in use but was used historically.

ARCS is a flexible application solution that allows for nearly any change to be made, though the effort and complexity will vary. While sound design can prevent many issues, it should be a comfort to know that there is “wiggle room” if the requirements change in the future.

Join me in the last post of the ARCS modularity series – a real crowd pleaser: Automation in Account Reconciliation Cloud Service (ARCS): At Its Finest

*Screenshots taken from the patch 1806 release.

The Data Governance Triple Crown

A few weeks ago, those who follow horse racing witnessed a historic event. The racehorse Justify captured the Triple Crown by winning the Belmont Stakes following earlier victories in the Kentucky Derby and Preakness Stakes. Justify became only the 13th horse in history to capture the Triple Crown, and the second to do so in the last four years (American Pharoah captured the honor in 2015). Interesting side note: both Justify and American Pharoah were trained by Bob Baffert. Why does that matter? Because he’s a fellow Arizona native and University of Arizona alumnus, that’s why! Bear Down!

While it may be a stretch, the concept of a “triple crown” of sorts has been on my mind recently as it relates to recent Oracle Enterprise Performance Management (EPM) projects I’ve been working on involving Oracle Data Relationship Management (DRM) and Data Relationship Governance (DRG). Many people are familiar with the DRG module of the DRM product, but when the tool is coupled with two other critical components, you are well on your way to capturing the Data Governance Triple Crown.

1.    Tool – Data Relationship Governance

As you may know, DRG is a module of the DRM product and provides a governance framework for maintaining your DRM master data. DRG includes functionality such as workflows, approvals, email notifications, and separation of duties (to prevent someone from approving their own request). Workflows are often structured around dimension maintenance and may include requests like “Add Account,” “Update Account,” or “Move Account.” The workflow then guides the requester to select tasks and complete fields on a data entry form. Once submitted, the request enters optional enrichment stages where additional detail and context are added before the request is finally committed and the relevant DRM structures are updated.

Here are just a few of the key features in DRG:

  • Requests can be entered interactively or via bulk upload files
  • Documents (such as supporting request documentation, emails, or policies) can be attached to requests
  • Comments/supporting narrative can be included
  • Requests can be pushed back to a prior stage, approved, or rejected
  • Requests can generate email notifications to approvers and/or participants in a workflow request
  • Requests can include validations, calculated fields, and conditional criteria to enter or bypass specific stages in the workflow

While I could go on and on about DRG, I’ve noticed a DRG implementation is most effective when paired with two other components.

2.    Process – Data Governance Program

In my experience, DRG implementations are most successful when bundled into a broader data governance program. Data governance programs bring together the Tool (DRG), the People (data stewards, data specialists, data governance council), and the Process (process flows, metrics, and standards).

Key facets to an effective data governance program include:

  • Executive sponsorship
  • Data Governance Council
  • Clear Roles and Responsibilities
  • Standards (metrics, definitions, process flows)
  • Authority and Accountability

Data governance programs are not easy! The change management aspect of implementing effective data governance cannot be overstated. There will be natural resistance, pushback, and challenges to any type of change, and data governance initiatives are no exception. Data governance implementations require patience and perseverance, and at times even a bit of the “carrot and stick” approach. As a result, we have seen the following steps as crucial to getting your data governance program off the ground:

    1. Define Charter Team and Responsibilities
    2. Define the Mission Statement
    3. Define the High-Level Scope
    4. Define the Terminology and Standards
    5. Define the Current State Overview
    6. Define the Future State Vision
    7. Define the Draft Phased Approach
    8. Prepare the Project Charter
    9. Present the Project Charter for Executive Approval
    10. Ensure Executive Support

There is much more to a data governance program than can be covered in this blog, but I hope you appreciate the importance of People and Process in a data governance initiative and do not focus only on the Tool.

3.    Integration – DRM to External Systems

The third and final component of effective data governance, after the Tool and Process, is integration to external systems. This allows DRM to truly become the master data hub in your company’s ecosystem and to systematically push master data (which could include trees/hierarchies, base members, mappings, or all of the above) to both upstream and downstream systems.

By leveraging DRM’s robust integration capabilities and adding custom SQL or ETL integration as needed, you can produce master data in various forms (flat files, SQL tables, web services, external commits) for consumption by external applications. These integrations can be run on demand or on a schedule.
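The shape of such an integration is simple even when the plumbing is not; the sketch below writes a parent/child hierarchy extract to a flat file for a downstream system to pick up. The hierarchy rows and layout are invented for illustration; real exports are configured in DRM itself or in the surrounding SQL/ETL layer.

```python
# Example of a parent/child extract that a DRM export (or surrounding ETL)
# might produce for a downstream system. Rows and layout are illustrative.
import csv

account_hierarchy = [
    # (parent, child, description)
    ("Total_Accounts", "Assets", "Total Assets"),
    ("Assets", "1000", "Cash"),
    ("Assets", "1100", "Accounts Receivable"),
    ("Total_Accounts", "Liabilities", "Total Liabilities"),
    ("Liabilities", "2000", "Accounts Payable"),
]

with open("account_hierarchy.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["Parent", "Child", "Description"])
    writer.writerows(account_hierarchy)

print(f"Wrote {len(account_hierarchy)} hierarchy rows for downstream consumption")
```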

Summary

So there you have it. Three critical components to effective data governance: a good tool (DRG), a robust process (data governance program), and automated integration (with DRM as the hub).

Are any of these components effective on their own? Certainly. Each adds value in its own right and can be implemented standalone. But when all three components are implemented in conjunction, the whole is definitely greater than the sum of the parts. Each component presents its own set of challenges and requires close collaboration with both technical and business personnel at a customer, and executive sponsorship and buy-in is absolutely vital to managing and overcoming the inevitable change management challenges. It ain’t easy, but like the saying goes, nothing worthwhile ever is, right?

I’d love to hear your thoughts on this topic along with any best practices, lessons learned, or battle scars earned along the way. Feel free to connect with me on LinkedIn or Twitter.

Modifications in Account Reconciliation Cloud Service (ARCS): Tweaking and Tuning

In the last post, New Scope in Account Reconciliation Cloud Service (ARCS): Add-Ons, we discussed how ARCS sets you up to easily add on additional scope to your existing application and scale your solution. However, not all changes are brand new. Clients are often concerned with being pigeonholed based on their “Day 1” decisions. A common question I am asked during a design session is “Can I manually enter this reconciliation today, but create new feeds to automatically load the data tomorrow?” The answer is a resounding YES, and it provides clear added value to the next phase of any ARCS (or ARM) project. It can be a viable project strategy to set up reconciliations using an Account Analysis format on “Day 1” and change to a Balance Comparison format when automated data loads are built on “Day 100.”

Modifications in ARCS 1

[Screenshot 5a: Reconciliation 100-1000 is setup with a Balance Comparison format in Sep 2017.*]

Modifications in ARCS 2

[Screenshot 5b: The previous period’s reconciliation can be viewed in the Prior Reconciliations tab.*]

Modifications in ARCS 3

[Screenshot 5c: Reconciliation 100-1000 was previously setup with an Account Analysis format in Aug 2017. The format of a profile can be changed while maintaining the Prior Reconciliations link.*]

Depending on how this change is made, it is even possible to keep the modified reconciliation “linked” to the previously completed reconciliations even though the Format has changed, such as in Screenshots 5a – 5c. The ease with which ARCS allows you to change Reconciliation Methods (via Formats) gives you the flexibility to not bite off more than you can chew in the beginning of a project.

Changing Reconciliation Methods is often related to new integrations. Moving from manually “fat fingering” data to directly loading general ledger and sub-ledger balances through Financial Data Quality Management Enterprise Edition (FDMEE) or Data Management, combined with the built-in auto-reconciliation tools, can bring a “quality of life” change for end users as well as added confidence in the data’s integrity. It is always a best practice to pull data from the source. Creating the integration from the general ledger is typically part of the initial scope. The usual candidates for building additional feeds after the first project phase are the sub-ledgers related to fixed assets, accounts receivable, and accounts payable. However, the most “bang for your buck” as it relates to which integrations to build depends on your line of business and specific company requirements.*

*Note that adding multiple general ledger feeds introduces additional complexities beyond the scope of this article. Please consult with your Oracle partner before adding to your application.

In some cases, the greatest efficiencies in your existing reconciliation process are gained by utilizing the power of ARCS Transaction Matching. This module is better suited to handling massive data volumes at a transactional level. For example, instead of reconciling the balance sheet’s intercompany balances in ARCS Reconciliation Compliance only at the end of the month, an enhancement could be to perform daily matching in ARCS Transaction Matching to clear up issues in real time as they arise, simplifying the month-end reconciliation. ARCS Transaction Matching is a powerful supplement to an existing reconciliation system and continues to receive special attention from Oracle, as seen with the major release of new functionality in Patch 1805.

Just as there are many ways your company can change, ARCS can be modified to match your needs even in a live application. However, sometimes changes are more fundamental than a bit of tweaking such as in an acquisition or the introduction of a new, company-wide general ledger. Or, perhaps, you are just not satisfied with the solution design. Join me in the next post as we discuss the dangerous topic of redesign in ARCS – what is possible…and what it costs.

In the next post, Redesign in Account Reconciliation Cloud Service (ARCS): From the Ground Up, learn how redesign IS possible in ARCS.

*Screenshots taken from the patch 1806 release.

New Scope in Account Reconciliation Cloud Service (ARCS): Add-Ons

This post follows last week’s post Modularity in Account Reconciliation Cloud Service (ARCS): No Mistakes from “Day 1” to “Day 100.”

Out-of-the-box, ARCS makes it easy to “oh, and this!” when adding new scope. The obvious example is monthly maintenance. Reconciliation Administrators and Power Users can build new Profiles to deploy for future months (or even the current month) with relative ease. With the “Copy” feature, previously created Profiles can serve as ready-to-use templates and reduce the manual effort involved in building a Profile from scratch.

New Scope in Account Reconciliation Cloud Service (ARCS) - Add-Ons 1

[Screenshot 1: The Copy function from the Actions drop-down list can be used to duplicate existing Profiles*]

Copying existing Profiles, as seen in Screenshot 1, is intuitive, built-in functionality. This makes ARCS “Quick Starts” a popular project option when tight on a budget – the Partner will be contracted to create a limited subset of Profiles and the Client can then use these as a starting point to build out the rest, saving on the Build Phase effort.

Another common post-project addition is Custom Attributes. As companies become more familiar with how their end users utilize the tool, new Custom Attributes can be added for reporting purposes (such as filtering or sorting in dashboards), providing information, or collecting feedback. Beyond the three system attributes of Process, Account Type, and Risk Rating, typical Custom Attributes include source system names, supplemental detail such as cost center or department, or even more dynamic fields such as auto-populating metadata descriptions. Furthermore, where these are placed within a reconciliation changes the nature of what detail is being provided or collected. Custom Attributes can be placed at a reconciliation’s summary level, on each individual transaction, and even on the specific Action Plans within each transaction. Additionally, they can be inherited from a Format or set for individual Profiles. What information is useful or relevant to end users will change depending on the granularity.

New Scope in Account Reconciliation Cloud Service (ARCS) - Add-Ons 2

[Screenshot 2: Custom Attribute on the Summary tab*]

New Scope in Account Reconciliation Cloud Service (ARCS) - Add-Ons 3

[Screenshot 3: Custom Attribute on a Transaction*]

New Scope in Account Reconciliation Cloud Service (ARCS) - Add-Ons 4

[Screenshot 4: Custom Attribute on an Action Plan*]

The variety of locations within the reconciliation where these Custom Attributes can be placed, as seen in Screenshots 2 – 4, and the ease with which they can be added give your company the flexibility to determine what information should be presented and where.

ARCS provides a plethora of tools to grow the application with your company and add-on to your “Day 1” implementation. But what if you like what you have built, and just want to tweak it?  Perhaps you want to move from “fat fingering” to fully integrating with your ERP source systems? The next post, Modifications in Account Reconciliation Cloud Service (ARCS): Tweaking and Tuning, discusses how ARCS can be modularly modified, keeping what you have…but better.

*Screenshots taken from the patch 1806 release.

Patch Today! Don’t Delay! Best Reasons to Upgrade Your EPM System

Putting off that upgrade to 11.1.2.4? Cloud not whetting your appetite for patches? Patch today. Don’t delay!

“But we’re going to the Oracle EPM Cloud soon!” you say. You should maintain your patches anyway. With the recurring maintenance, updates, and patches available to the EPM Cloud products, expect the on-premise patches to contain similar updates. An upcoming conversion to Oracle EPM Cloud products may benefit from running the latest on-premise codelines.

If you have an existing on-premise installation of Oracle EPM System, be sure to maintain the latest EPM System Patch Set Updates every 3 to 6 months. Here are a few great reasons why:

New Features

Patches often contain reactive bug resolutions to known issues; however, we have also been seeing new functionality released in patches for 11.1.2.4.

You Own It

You already pay for it! As long as your Oracle Maintenance contract is current (very likely if you are reading this article), you’re already paying for access to patches. Why leave them unapplied? You are running legacy code when the latest version costs you nothing additional. Windows XP was a great OS, but we’ve got to keep up with the times.

Supportability

Maximize your success by reducing time to resolution on your issues. Should you submit a support request to the vendor, the first line of response to a ticket is often a question about current patch levels. Once provided, the subsequent reply frequently contains a recommendation to apply the latest Patch Set Updates (PSUs) to see if that fixes the issue. Annoying? Perhaps you’re a pessimist. Or have just been remiss with your patching. I’ve certainly changed my mind on the matter and now side with them. The reason? Supporting the latest codeline is more efficient and effective for the vendor. Your problem may have already been addressed in a code fix. They can better and more quickly support you if they are troubleshooting the current release instead of legacy code.

Stability

In older versions, patches seemed to come out on a haphazard schedule. Over the last few years, Oracle has streamlined EPM System patch releases – typically releasing Patch Set Updates quarterly, which are different from Patch Set Exceptions (PSEs). A PSU is a grouping of smaller PSEs, or a handful of more significant ones, that the vendor regression tests collectively and releases under a single patch. We’ve gained a much higher degree of confidence with this bulk model of PSUs. The organization of the release schedule and bug fixes is more dependable and greatly appreciated. The PSU model leaves less ambiguity about which patches to apply and brings greater stability to all customers.

Upgrade

Maybe it’s bigger than patching. Are you not on version 11.1.2.4 of your EPM System? Compliance with enterprise IT requirements around browser versions and operating systems is often the impetus for an upgrade, but there are also plenty of compelling new software features, functions, conventions, and improvements in 11.1.2.4.

Operating System (OS) support for current platforms maximizes your investment and supportability. When 11.1.2.4 came out, many customers were forced to upgrade their older systems for compliance with the latest enterprise standards for server operating systems and/or client browser versions. Rather than facing an IT-mandated technology upgrade, it is preferable to upgrade on the business’s schedule.

What Kind of Effort is Involved?

The comprehensive effort to bring a simple deployment (3-4 servers, no High Availability) up to the latest PSUs is typically less than a day per environment. That includes an analysis of existing patches, the patching itself as well as any prerequisites, and a post-check verification to confirm all patches applied are properly indicated in the corresponding inventories.
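As a rough illustration of that post-check step, the sketch below asks each Oracle Home’s OPatch for its inventory and flags any expected patch numbers that are missing. The home paths and patch numbers are placeholders, and OPatch may additionally require ORACLE_HOME and a JDK to be set per Oracle’s documentation.

```python
# Post-patch verification sketch: list each Oracle Home's inventory with
# "opatch lsinventory" and flag expected patches that are not registered.
# Paths and patch numbers are placeholders for your environment.
import subprocess

oracle_homes = ["/u01/Oracle/Middleware/EPMSystem11R1",
                "/u01/Oracle/Middleware/oracle_common"]   # ADF/shared components
expected_patches = {"12345678", "23456789"}               # placeholder PSU numbers

for home in oracle_homes:
    result = subprocess.run([f"{home}/OPatch/opatch", "lsinventory"],
                            capture_output=True, text=True)
    applied = {patch for patch in expected_patches if patch in result.stdout}
    missing = expected_patches - applied
    print(f"{home}: missing {sorted(missing) if missing else 'none'}")
```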

An initial patch application may take a little bit longer because there are often common prerequisites to address that don’t have to be handled with subsequent patching. There are also considerations like bringing WebLogic up to the latest patch level, as well as one-offs like the fixes for the Equifax-discovered vulnerabilities, that don’t happen frequently. Once you’ve got a solid base of primary critical patching, additional patching events are typically shorter.

Patching can be tricky. Documentation can often be ambiguous, whether through an unintended omission or assumed knowledge based on implied experience with the product. Sometimes post-install instructions get skipped or SQL statements do not get executed properly as part of the patch. Less experienced resources typically patch only the EPMSystem11R1 Oracle Home; however, did you know that Oracle’s ADF framework also has an OPatch directory under oracle_common? Those patches are often prerequisites, yet they are easy to miss. And what about Oracle Data Integrator (ODI) and Oracle HTTP Server (OHS)? They may also have applicable OPatches. Who knows what you’re missing? We do! Let’s button it up.

Contact us for more details.

Laser Tag for Cloud Analytics

A friendly game of laser tag between out-of-shape technology consultants became a small gold mine of analytics simply by combining the power of Essbase and the built-in data visualization features of Oracle Analytics Cloud (OAC)! As a “team building activity,” a group of Edgewater Ranzal consultants recently decided to play a thrilling children’s game of laser tag one evening.  At the finale of the four-game match, we were each handed a score card with individual match results and other details such as who we hit, who hit us, where we got hit, and hit percentage based on shots taken.  Winners gained immediate bragging rights, but for the losers, it served as proof that age really isn’t just a number (my lungs, my poor collapsing lungs).  BUT…we quickly decided that it would be fun to import this data into OAC to gain further insight about what just happened.

Analyzing Results in Essbase

Using Smart View, a comprehensive tool for accessing and integrating EPM and BI content from Microsoft Office products, we sent the data straight to Essbase (included in the OAC platform) from Excel, where we could then apply the power of Essbase to slice the data by dimensions and add calculated metrics. The dimensions selected were:

  • Metrics (e.g. score, hit %)
  • Game (e.g. Game 1, Game 2, Total)
  • Player
  • Player Hit
  • Target (e.g. front, back, shoulder)
  • Bonus (e.g. double points, rapid fire)

With Essbase’s rollup capability, dimensions can be sliced by any one item or at a “Total” level. For example, the Player dimension’s structure looks like this:

  • Players
    • Red Team
      • Red Team Player 1
      • Red Team Player 2
    • Blue Team
      • Blue Team Player 1
      • Blue Team Player 2

This provides instant score results by player, by “Total” team, or by everybody. Combined with another dimension like Player Hit, it’s easy to examine details like the number of times an individual player hit another player or another team in total. You can drill in to “Red Team Player 1 shot Blue Team” or “Red Team Player 1 shot Blue Team Player 1” to see how many times a player shot a team or an individual player. A simple Smart View retrieval along the Player dimension shows scores by player and team, but the data is a little raw. On a simple data set such as this, it’s easy to pick out details, but with OAC, there is another way!
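As a simplified stand-in for what that rollup does, the snippet below aggregates flat score-card rows up the Player hierarchy to team and “All Players” totals. The scores are made-up sample data, not the actual match results.

```python
# Flat score-card rows rolled up a Player hierarchy to team and total levels,
# mimicking the Essbase aggregation described above. Scores are sample data.
from collections import defaultdict

team_of = {"Wayne": "Red Team", "Kevin": "Red Team",
           "Dan": "Blue Team", "Seema": "Blue Team"}
rows = [("Game 1", "Wayne", 5200), ("Game 1", "Kevin", 4100),
        ("Game 1", "Dan", 4700), ("Game 1", "Seema", 1500)]

player_total, team_total = defaultdict(int), defaultdict(int)
for game, player, score in rows:
    player_total[player] += score
    team_total[team_of[player]] += score

print(dict(player_total))
print(dict(team_total), "| All Players:", sum(team_total.values()))
```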

Laser Tag 1

Even More Insight with Oracle Analytics Cloud (OAC)

Using the data visualization features of OAC, it’s easy to build queries against the OAC Essbase cube to gain interesting insight into this friendly folly and, more importantly, answer the questions everybody had: what was the rate of friendly fire and who shot who? Building an initial pivot chart by simply dragging and dropping Essbase dimensions onto the canvas including the game number, player, score, and coloring by our Essbase metric “Bad Hits” (a calculated metric built in Essbase to show when a player hit a teammate), we discovered who had poor aim…

Laser Tag 2

Dan from the Blue team immediately stands out, as do Kevin and Wayne from the Red team! This points us in the right direction, but we can easily toggle to another visualization that might offer even more insight into what went on. Using a couple of sunburst-type data visualizations, we can quickly tie together who was shooting and who was getting hit – filtered to the same team and then weighted by score (and also color-coded by team color).

Laser Tag 3

It appears that Wayne and Kevin from the Red Team are pretty good at hitting teammates, but it is also now easy to conclude that Wayne really has it out for Kevin while Kevin is an equal opportunity shoot-you-in-the-back kind of teammate!

Reimagining the data as a scatter plot gives us a better look at the value of a player in relation to friendly fire. By dragging the “Score” Essbase metric into the size field of the chart, correlations are discovered between friendly fire and hits to the other team.  While Wayne might have had the highest number of friendly fire incidents, he also had the second highest score for the Red team.  The data shows visually that Kevin had quite a few friendly fire incidents, but he didn’t score as much (it also shows results that allow one to infer that Seema was probably hiding in a corner throughout the entire game, but that’s a different blog post).

Laser Tag 4

What Can You Imagine with the Data Driving Your Business?

By combining the power of Essbase with the drag-and-drop analytic capabilities of Oracle Analytics Cloud, discovering trends and gaining insight is very easy and intuitive. Even in a simple and fun game of laser tag, results and trends are found that aren’t immediately obvious in Excel alone.  Imagine what it can do with the data that is driving your business!

With Oracle giving credits for a 30-day trial, getting started today with OAC is easy. Contact us for help!

A Safe Step into the Cloud: The Argument for Account Reconciliation Cloud Service (ARCS)

Before forecasting models, before fancy dashboards and pretty reports, before a data point is even considered “Actual” comes the age-old question…

                “Does this number even look right?”

Bulls*#!

Account reconciliations – the means by which this question is answered – are a fundamental part of the financial close process. Imagine you are trying to build a sandcastle. Now imagine your “sand” is harvested from a cow pasture. You *could* continue to build this “sandcastle,” but you will likely finish with a pile of…bull-sand. In the same way, if your account balances and transactions have an integrity equivalent to “bull-sand,” this will inevitably lead to problems down the line.

[Image: a collapsing sandcastle]

The shift to the Cloud has complicated the decision-making process when considering new enterprise-wide application tools. The choice of whether to go with a known “on-premise” solution or take a bold step into Cloud solutions is a daunting one, particularly when considering moving high-visibility cycles such as forecasting or financial consolidations into this brave new world.

A Justified Recommendation

Take the measured move instead. If you feel hesitant to go “all-in” on Cloud offerings, here are four reasons why you should consider entering the Cloud through the arch of ARCS…the ARCSway (Get it?…archway…ARCSway…never mind – just keep reading…)

Safe Bet on a Strong Foundation

Oracle introduced Account Reconciliation Cloud Service (ARCS) as the “one stop shop” solution for managing and streamlining the reconciliation cycle in the Cloud back in 2016. While it’s not uncommon for some EPM products to lose functionality during their initial transition into the Cloud space, ARCS retains the “good bones” of its on-premise counterpart – Account Reconciliation Manager (ARM). ARCS builds upon the clever functionality and customizability of ARM, released in 2012, yet with the slick look and feel of the Oracle Cloud experience.

Since its release, ARCS has become the “golden child” of the reconciliation product family, receiving not only “first dibs” on refinements to existing capabilities, but also the newest components such as Transaction Matching (note: this is licensed separately from the Reconciliation Compliance component of ARCS). As the product continues to gain steam, this trend is expected to continue. Between the tried-and-true foundation of the ARM tool and Oracle’s watchful eye, ARCS is a safe bet.

No Mistakes with Modularity

Unlike some applications, ARCS is easy to implement in pieces. While good design will certainly prevent future heartache, there are no decisions made on Day 1 of a project that cannot be modified or enhanced in the future:

  • Want to manually enter data for reconciliations today, but automatically load them from a source system tomorrow? We can do this.
  • Missing fields for additional detail you would like users to include? Can be ready for next period (or the current one even!)
  • Only want to rollout in one country to start? No problem – go ahead and make the other entities jealous!

While some changes are “cleaner” than others (I am looking at you, Profile Segments!), ARCS welcomes you to “test the waters” and see what works in your company without needing to go “all-in.” For example, a current client has a live ARM application that provides a viable solution for its reconciliation process needs given the initial project timeline and budget. Although the client wasn’t able to fully utilize the available functionality at the time, the modularity of the reconciliation tools (both ARM and ARCS) allows the opportunity for enhancements without punishing this design decision – we are now revamping the client’s auto-reconciliation setup to further streamline the process. For Partners, this means additional project phases; for clients, this means not biting off more than you can chew (win-win!).

Want to dive deeper into this topic?  Read the blog posts in the series Modularity in Account Reconciliation Cloud Service (ARCS): No Mistakes from “Day 1” to “Day 100.”

Fast Implementation Cycles and Rapid ROI

Relative to other EPM project lifecycles, ARCS is typically a quick implementation. As with all projects, there are certainly exceptions, but with Ranzal’s “Quick Start” methodology, we have stood up applications in just six weeks! A strong inventory of project “accelerators” – custom tools and scripts that Ranzal has developed based on common requests across multiple clients – allows sophisticated deployments in a timely manner. Couple this with the inherent time-saving benefits of Cloud technology (no infrastructure setup, for example), and ARCS shines as the first step in a roadmap, producing tangible metrics for evaluation (e.g. completion percentages per period, timeliness per preparer/reviewer, reconciliation accuracy) and giving users a taste of the Oracle Cloud experience in a short period of time.

You Don’t Have Anything Today and It’s Costing You

I know that may read like a presumptuous fear tactic, but hear me out:

Account reconciliations ARE being completed in your company – one way or another. Whether that means your CPAs are *click*click* clicking away on their keyboards to manually update Excel spreadsheets or – heaven forbid – actually printing out recons to hand sign, if you cannot name the system that is comprehensively handling your reconciliation cycle, it’s because there isn’t one.

And this is normal. But there are costs associated with this normalcy.

Reconciliation cycles aren’t sexy (well…personal taste…) and often have low visibility to upper management. And yet (!) the reconciliation process is often widespread across the company spanning business entities, departments, and corporate ladders (I see you, Mr/s. Director signing off on recons). ARCS is an attractive option when considering enterprise-wide Cloud solutions to “test run” because everyone can try it. A successful ARCS implementation paves the way for easier adoption of future projects – it gets everybody onboard.

Step Through the “ARCSway” and Ditch the Bulls*#!

The shift to the Cloud is disrupting the traditional market of on-premise EPM solutions. As you look at the new strategic options available for your company’s roadmap, consider ARCS as a “first step.” It is important to have an accurate understanding of the tool: ARCS is first and foremost a management tool, and although it can provide helpful information when troubleshooting account variances, it does not replace actually performing a reconciliation in an ERP system. Additionally, customizing reports can be difficult (unless you are familiar with BI Publisher), although the out-of-the-box reports and strong dashboarding capabilities largely make up for this limitation. All in all, I strongly recommend this product as an introduction to the new Oracle offerings. ARCS’ “low risk, high reward” nature provides real company value quickly while giving you a good picture of life in the Cloud. Now is the perfect time to ditch your “bull-sand” reconciliation process and move to a more solid foundation in the Cloud through the ARCSway.

Contact us today for details about a custom Cloud solution for your business needs.

[Image: a large sandcastle]