EDMCS and Data Governance – Part 3

Welcome to Part 3 – the finale – of the blog series “EDMCS and Data Governance!”

Part 1 provides an introduction and primer for data governance workflows in Enterprise Data Management Cloud Service (EDMCS), which were introduced in the 19.02 release.

Part 2 discusses Workflow Stages in greater detail and dives into the brains of EDMCS workflows – the Approval Policy. Approval policies at different levels of the data chain are explained, and we conclude by building a sample workflow at the dimension level.

In Part 3, I’ll attempt to tie a bow around everything and offer some parting thoughts.

Recap

As I continue to explore and learn about collaborative workflows in EDMCS, these are the key points that come to mind:

  • Emphasize the Fundamentals – No matter what tool you are using, People and Process are extremely important in any data governance solution along with strong executive sponsorship and robust change management.
  • Build the Foundation – get the client comfortable with the tool and content before you introduce workflows. A strong foundation (your applications, dimensions, views, and viewpoints) is needed before you start the plumbing and wiring (workflows).
  • Brush up on Security – I haven’t discussed security extensively in this blog series, but the Oracle EDMCS User Guide does a nice job describing security requirements for assigning and approving workflow requests. Note that security enhancements have been introduced along with workflows. A new “Submitter” permission is now available to go along with Owner, Data Manager, and Browser. And permissions can be assigned at the Application, Dimension, Hierarchy Set, and Node Type levels.
  • Ponder the Approval Policy – this is the most interesting one to me. As we discussed in Part 2, approval policies can be defined at 4 points in the data chain (see Figure 1). With the inheritance and inter-dependencies of approval policies across the data chain along with the actions each policy can govern, it is critical to efficiently design your approval policies up front.

o   For example:

  • Suppose your client requires a final “audit” type of approval across the board for any type of request for any dimension. Or they always require an upfront “gatekeeper” type of approval to make sure the request is justified and complete before it continues down the approval chain. These would be good candidates for an approval policy at the Application level, and it would avoid having to define duplicative approval policies at lower levels in the data chain.
  • Will your application contain dimensions that do not need data governance workflows? Then Application level approval policies should be avoided.
  • Say you want to limit and govern the actions of a specific group so it can only work with existing nodes (insert, remove, update). An approval policy at the Hierarchy Set level is probably best.

o   Overall, I believe approval policies at the dimension level are a good place to start. Then, as the workflows evolve and requirements become clearer, you can determine whether there are common factors across all dimension approval policies that can be consolidated at a higher level (an Application-level approval policy), or specific subsets of actions that need to be broken out to a lower level (a Node Type or Hierarchy Set-level approval policy).

o   All of which brings up another interesting point: effective approval policy design directly ties into effective viewpoint design. Think about it – you can define the set of Allowed Actions (Add, Insert, Move, etc.) at a Viewpoint level. Which means what? Special-purpose maintenance views are likely required to support certain approval policies, especially those at the Node Type or Hierarchy Set levels.

Figure 1 – Approval Policies and Data Chain

EDMCS and Data Governance – Part 3 - Image 1

How do EDMCS Workflows Compare with DRM/DRG?

I was reluctant to include this section at first because, in general, I don’t like comparing Data Relationship Management (DRM) and EDMCS. Yes, they are both master data management tools, and yes, they share some common concepts and terminology. But overall, the two products are so different in terms of philosophy, deployment design, and underlying architecture that comparing them is often less than helpful.

However, with data governance and collaborative workflows, I feel there is enough commonality that it is worth highlighting a few items. So here goes:

Workflow Design
  • DRM/DRG: Based on workflow models and workflow tasks; tasks are linked to specific actions (Add Leaf, Add Limb, Insert, Move, etc.)
  • EDMCS: Based on approval policies; the approval policy level (Application, Dimension, Node Type, Hierarchy Set) determines the context and scope of the actions governed

Workflow Stages
  • DRM/DRG: Use a Submit stage, a Commit stage, and optionally, one or more Enrich and/or Approve stages
  • EDMCS: Use a Submit stage and an (implied) Commit stage; approval policies determine the approval stages (sequential vs. parallel, number of approvers); requests can be re-assigned for collaboration prior to Submit

User Interface (UI)
  • DRM/DRG: Form-based design
  • EDMCS: No forms; requesters and approvers interact directly with the viewpoints

Approval Options
  • DRM/DRG: Supports Approve, Reject, and Push Back; supports comments, narrative, and attachments
  • EDMCS: Supports Approve, Reject, and Push Back; supports comments, narrative, and attachments

Escalations
  • DRM/DRG: Requests can be escalated based on defined intervals
  • EDMCS: Requests can be escalated based on defined intervals

Separation of Duties
  • DRM/DRG: Workflows can be configured to prevent a submitter from approving their own request
  • EDMCS: Workflows can be configured to prevent a submitter from approving their own request

Email Notifications
  • DRM/DRG: Generates email notifications
  • EDMCS: Generates email notifications

Other
  • DRM/DRG: Supports conditional workflows and splitting of requests based on pre-defined criteria
  • EDMCS: Conditional workflows and request splitting are not yet supported
I’m curious whether Oracle will introduce a form-based UI for workflows. Part of me would very much like to see that, so you could present a clean user interface to approvers, hide unnecessary details, and display special instructions and messages; but part of me does not. One of my favorite features of EDMCS is the visual highlighting of pending request changes and the “shopping cart” of request items displayed prior to submitting a request. I would hate to lose that by going to a forms-based workflow UI, but perhaps there is a solution that combines the best of both worlds.

Conclusion

Well, that’s it: an initial look at workflows and approval policies in EDMCS. I’m excited to see how this functionality evolves and expands over time. Talk to you next time!

And don’t forget to follow me on Twitter (@kblackEPM) and check out these links for more information:

EDMCS and Data Governance – Part 2

Welcome to Part 2 of the blog series “EDMCS and Data Governance!”

Part 1 provides an introduction and primer for data governance workflows in Oracle Enterprise Data Management Cloud Service (EDMCS), which were introduced in the 19.02 release. This exciting feature addresses a major gap in EDMCS as the product continues to rapidly evolve and mature.

In Part 2, we dive into the details of how to configure workflows. This process revolves around the concept of an “approval policy.” Interestingly, approval policies can be configured at different points of the EDMCS data chain and cascade or inherit to affect downstream points of the data chain.

Workflow Stages

Before we dive into approval policies, let’s discuss EDMCS workflow stages a bit more. They are similar in concept to Data Relationship Governance (DRG) workflow stages. See Figure 1 for an overview:

Figure 1 – EDMCS Workflow Stages
EDMCS and Data Governance – Part 2 - Image 1
  1. Submit (or Assign) Request – A request is initially created as you do today. But wait…there’s more! You can Submit the request to immediately move the request into the Approve stage OR you can Assign the request to colleagues to collaborate on the request together. When the request is ready, it is submitted to move to the Approve stage.
  2. Approve Request – The approver(s) have 3 choices:
    • Approve – the request is approved and moves forward (thanks Captain Obvious!).
    • Push Back – like DRG, the request is pushed back to the submitter for clarification or changes, who then updates and resubmits the request.
    • Reject – like DRG, the request is denied and closed. Think of “reject” as the RAID of the data governance world – it kills requests dead.
  3. Commit Request – once fully approved, the request is auto-committed and closed. EDMCS has now been updated.

Approval Policies

Now for approval policies. Approval policies can be configured at 4 levels:

  1. Application
  2. Dimension
  3. Node Type
  4. Hierarchy Set

It is important to note that each data chain object can contain one, and only one, approval policy. However, approval policies have a cascading impact so that multiple approval policies can work in concert to govern and control exactly what you want. Yes, you heard that right:  Approval Policy Inheritance – it’s not just for properties anymore!

The types of actions governed by an approval policy depend on the data chain object it is configured with – see Figure 2 below:

Figure 2 – Approval Policies and Data Chain

EDMCS and Data Governance – Part 2 - Image 2

As you can see, policies defined at the Application or Dimension level govern all actions (add, delete, insert, remove, move, etc.) while policies defined at the Node Type or Hierarchy Set level govern a subset of actions. Why is this important? Because it means you need to carefully design what types of actions you want to govern and who will perform them. If I define an approval policy at the Hierarchy Set level and then submit a request that Adds 3 accounts, how many approvers are required for the request? A big ZERO! Since I requested “add” actions and only have an approval policy at the Hierarchy Set level, no applicable approval policy exists to govern the request.

Putting It All Together

Let’s walk through an example.

  1. Define Approval Policy

First, I will define an approval policy for the Account dimension. To do this, Inspect either the application or default viewpoint and access the Account dimension from the Definition tab. From there, click the Policies tab.

Here you will see the Approval policy for the Account dimension. Click on the Approval link to inspect the approval policy.

EDMCS and Data Governance – Part 2 - Image 3

The General tab will display basic information about the approval policy. You can edit the approval policy name and description if necessary.

EDMCS and Data Governance – Part 2 - Image 4

The Definition tab is where the magic happens. Select Edit to update the following parameters:

  • Enabled – click this check box to enable the approval policy.
  • Approval Method – select Serial or Parallel.
  • One Approval Per Group – if using Serial approvals, this will automatically be set to “True.” If using Parallel approvals, you can select one approval per group or define a Total Required # of approvers.
  • Include Submitter – enable this to allow the submitter to also be an approver (the submitter’s approval will be automatically granted). If “separation of duties” is required for your company, do not enable this.
  • Reminder Notification – the # of days that will elapse before reminder emails are sent.
  • Approval Escalation – the # of times a reminder occurs before an escalation email will be sent.
  • Approval Groups – select user(s) and/or group(s) to be included in the approval process. When using Parallel approvals, the order of approval groups does not matter. When using Serial approvals, the order of approval groups does matter – you need to list the approval groups in the order that approvals should be executed.

With my example approval policy, I am using serial approvals, 2 approval groups (a Planning group and GL group), a reminder interval of 5 days, and an escalation interval of 2 reminders.

EDMCS and Data Governance – Part 2 - Image 5

  2. Submit Request

Now we’re cooking with gas. It’s time to submit a request. I will submit a request to my default Account viewpoint that includes 1 add, 1 property update, and 1 move. Here is the request in Draft status:

EDMCS and Data Governance – Part 2 - Image 6

Did you notice something new? Look at the Actions button next to Submit. This is where you can assign the request to another user and collaborate with them to finish up the request.

EDMCS and Data Governance – Part 2 - Image 7

EDMCS and Data Governance – Part 2 - Image 8

  3. Approve the Request

After the request is submitted, it is considered “in flight” because it has been submitted, but not yet approved/committed. And look! EDMCS now offers a nice Activity page on the home screen displaying the status of various workflow requests:

EDMCS and Data Governance – Part 2 - Image 9

First, the users in the Planning Approvers group will receive an email notifying them that they have been “invited to approve a request” (it’s very polite):

EDMCS and Data Governance – Part 2 - Image 10

As mentioned earlier, an approver has 3 choices: Approve, Reject, or Push Back. Reject and Push Back are available under the Actions dropdown. Here are the dialog windows that will be displayed for those actions (note the comment field is required):

EDMCS and Data Governance – Part 2 - Image 11

Otherwise, the approver will click the Approve button and see this:

EDMCS and Data Governance – Part 2 - Image 12

And then the same process will continue with the GL Approvers group since I am using Serial approvals. Once again, an approver can reject, push back, or approve. Once approved, the request is committed and closed.

Congratulations! You have now completed your very first data governance workflow request in EDMCS!

Conclusion

This blog post should provide more detail and clarity on workflows, workflow stages, and approval policies. In the third and final post of this series, I’ll offer a recap and some closing thoughts. Talk to you then.

Read the next post in this EDMCS blog series:  EDMCS and Data Governance – Part 3

And don’t forget to follow me on Twitter (@kblackEPM) and check out these links for more information:

OPA! The Future of Cloud Integration – Important Updates Are Coming

Much to the chagrin of Product Management, I often abbreviate Cloud Data Management to CDM.  Why do they not like that I do this?  Well, there is a master data management tool for Customer data that, as you can guess, uses the same acronym.  While I understand the potential confusion, since I’m telling you up front, there should be no confusion when I use CDM throughout this post.

I recently had the opportunity to meet with Oracle Product Management and Development for FDMEE/CDM to get a preview of what’s coming to the product and offer feedback for additional functionality that would benefit the user community.  We generally get together about once a year; however, it’s been a bit longer than that since our last meeting, so I was excited to hear what interesting things Oracle’s been working on and what we may see in the product in the future.

Now, no good Oracle roadmap update would be complete without a safe harbor reminder.  What you read here is based on functionality that does not yet exist.  The planned features described may or may not ever be available in the application – at the sole discretion of Oracle. No buying decisions should be made based on the information contained in this post.

Ok, now that we have that out of the way, let’s get into the fun stuff.  There are a number of enhancements coming and planned, but today I am going to focus on two significant ones:  performance and ground-to-cloud integration.

Performance Enhancements

We’re all friends here, so we can be honest with each other.  CDM (and FDMEE) isn’t an ETL tool in the truest sense of the word. It is not designed to handle the massive data volumes that more traditional ETL tools can and do.  You might think to yourself, “Thanks for the info there, Tony, but we all know that,” and you wouldn’t be wrong, but I like to set the stage a bit.

If you know the history of FDMEE, you know that it was originally designed to integrate with Hyperion Enterprise and then HFM.  Essbase and Planning became targets later.  Integrating G/L data is far different from integrating the more operational data that is often needed by targets like EPBCS and PCMCS.  While CDM (and FDMEE) can technically handle the volume of this more granular data, the performance of those integrations is sometimes less than optimal.  This dynamic has plagued users of CDM for years.  It has only been exacerbated when integrations are built without a deep understanding of how to tune CDM (and FDMEE) processes to achieve the highest level of performance within the constructs of the application. As CDM has grown in popularity (owing to the growth of Oracle EPM Cloud), the problem of performance has become more visible.

To address performance concerns, Oracle is planning to support 3 workflow methods:

  • Full – No change from legacy process
  • Full, No Archive – Same workflow as today, but data is deleted from the data table (tDataSeg) after a successful export.  This means the data table will contain fewer rows and should allow new rows to be added faster (inserts during the workflow process).  The downside of this method is that drill through is not available.
  • Simple – Same workflow as today but data is never moved from the staging/processing table (tDataSeg_T) to the data storage table (tDataSeg).  This is the most expensive (in terms of time) action in the workflow process so eliminating it will certainly improve performance. The downside is that data can never be viewed in the Workbench and Drill Through is not available.

Oracle has begun testing and has seen performance improvements in the range of 50% on data sets as large as 2 million rows.  Achieving that metric required the full complement of the new Data Integrations features (i.e., Expressions) to be utilized. That said, this opens up a world of possibility for how CDM can potentially be used.

If you have integrations that are currently less than optimal in terms of performance, continue monitoring for this enhancement.  If you need assistance, feel free to reach out to us to connect with our team of data integration experts.

On-Premise Agent

Ground-to-cloud integration is one of the most important capabilities to consider when implementing Oracle EPM Cloud.  As Oracle EPM Cloud has evolved, so too has the complexity of the solutions deployed within it, which has steadily increased the complexity of the integrations needed to support those solutions.  While integration with on-premises systems has always been supported through EPM Automate, this requires a flat file to be generated by the system from which data will be sourced. The file is then loaded to the cloud and processed by CDM.  This is very much a push approach to data integration.
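
To make that push pattern concrete, here is a minimal sketch of what a flat-file push might look like from a command prompt using EPM Automate. The URL, credentials, file name, and data load rule name are all placeholders, and the exact arguments vary by release and target service:

epmautomate login serviceadmin MyPassword https://example-epm.oraclecloud.com
epmautomate uploadfile GL_Actuals_Jan.txt inbox
epmautomate rundatarule GL_ACTUALS_LOAD Jan-19 Jan-19 REPLACE STORE_DATA GL_Actuals_Jan.txt
epmautomate logout

The source system (or a scheduler wrapped around it) has to produce the file and drive this sequence, which is exactly why this is a push model rather than a pull model.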

The ability of the cloud to pull data from on-premises systems simply did not exist. For integrations with this requirement, FDMEE (or some other application) was needed. Well as the old saying goes, the only thing constant is change.

Opa! – a common Greek emotional expression. It is frequently used during celebrations.  Well it’s time to celebrate because Oracle will soon (CY19) be introducing an on-premises agent (OPA) for CDM!

This agent will allow a workflow to be initiated from CDM that communicates back to the on-premises system, initializes an extract, and then uploads the extract to the cloud, where it is natively imported by CDM.  This approach is similar to how the FDMEE SAP adaptor currently works.  From an end-user perspective, you click Import on the Data Load Workbench and, after some time, data appears in the application. What’s happening in the background is that the adaptor initializes an extract from SAP and writes the results to a flat file, which is then imported by the application. OPA will function in an almost identical way.

OPA is a lightweight Java utility, installed on local systems, that requires no additional software (other than Java). It will support both Microsoft Windows and Linux operating systems. Like all Oracle on-premises utilities (e.g., EPM Automate), password encryption will be supported. The only ports required to be opened are 80 (HTTP) or 443 (HTTPS).  A customer can then use an externally facing web server to redirect to an internal port for the agent to receive the request.  This is necessary only if the customer wants to run the agent on a port other than 80 or 443 and does not want to open that port on the enterprise firewall.  If the customer wants to run the agent on port 80 or 443 and either of those ports is open, then no firewall action would be required.

The on-premises agent will have native support for Oracle EBS and PeopleSoft GL – meaning the queries are prebuilt by Oracle.  Additionally, OPA will support connecting to on-premises relational data sources.  Currently, Oracle, SQL Server, and MySQL drivers are bundled natively, but additional drivers can be deployed as needed, meaning systems such as Teradata will be able to be leveraged as data sources.

OPA will also provide the ability to execute scripts (currently planned for Java, but discussions for Groovy and Jython are in flight) before and after the on-premises extract process.  This is similar to how the BefImport and AftImport event scripts are currently used in FDMEE.  This will allow the agent to perform pre- and post-processing, such as running a stored procedure to populate a data view from which CDM will source data.

The pre and post events of OPA really open up a world of opportunity and lay the foundation for CDM to support scripting.  How, you might ask?  In v1.0, OPA is intended to provide a mechanism to load on-premises data to the cloud.  But in theory, CDM could make a call to OPA at the normal workflow events (of FDMEE) and, instead of waiting for a data file, simply wait for an on-premises script to return an execution code.  This construct would eliminate the security concerns that prevented scripting from being deployed in CDM, as the scripts would execute locally instead of in the cloud.

The OPA framework is really a game changer and will greatly enhance the capability of CDM to provide Oracle EPM Cloud customers a true “all cloud” deployment.  I am thrilled and can’t wait to get my hands on OPA for beta testing.  I’ll share my updates once I get through testing over the next couple of months.  I’ll also be updating the white paper I authored back in December of 2017 once OPA is released to the general public.  Stay tuned folks and feel free to let out a little exclamation about these exciting coming enhancements…OPA!

EDMCS and Data Governance – Part 1

Ahh… February. An interesting month with a variety of happenings. From the significant – Black History Month and President’s Day – to the exciting – the Super Bowl…well, sometimes. From the romantic – Valentine’s Day – to the silly – that tenacious groundhog trying to find his shadow…AGAIN. Not to mention that spring is just around the corner and brings us the glorious event known as “March Madness!”

Why am I babbling about February? <segue> Because it is also the month that introduced Data Governance and Collaborative Workflows with the release of Enterprise Data Management Cloud Service (EDMCS) v19.02. <segue>

As we continue this journey to Enterprise Performance Management (EPM) Cloud, the addition of Data Governance to EDMCS is a major step forward, especially for those of us who have worked with the classic on-premise solutions (Data Relationship Management (DRM) and Data Relationship Governance (DRG)) and who have been awaiting a similar offering in EDMCS to support our Cloud clients. From what I’ve seen so far, a major gap between DRM/DRG and EDMCS has been addressed with this release.

In this blog series, I’d like to further explore Data Governance in EDMCS. At a high level, this is how I see this series unfolding:

  • Part 1 will provide the foundation, background, and basic concepts for EDMCS and Data Governance
  • Part 2 will get more into the “techy” stuff and dive deeper into Approval Policies and Security
  • Part 3 will provide a recap and closing thoughts/lessons learned

So, with that said, onto Part 1…

Prerequisites

Before diving head first into configuring Data Governance and collaborative workflows in EDMCS, there are a few things to consider.

  • Don’t forget people and process. I’m a big believer that people and process are just as (and usually much more) important as the tool. Please refer to this blog post for a quick read on this: The Data Governance Triple Crown.

I believe the same tenets apply to EDMCS, and it’s important to start thinking about a formal data governance program that includes a charter, executive sponsorship, roles & responsibilities, metrics, and much more. Data governance can be a challenging cultural shift for many organizations, requiring strong change management to handle the inevitable resistance. This is where a formal data governance framework can help.

  • Establish the foundation. As with building a house, it’s important to lay a solid foundation before you install the wiring and plumbing. Build your EDMCS application(s) and dimensions, and populate your primary and alternate hierarchies first. Get the client comfortable with the tool and the content. Then you can start to layer in the workflows.
  • Start to identify the “who” (e.g., the people involved and the roles they will play: Who will be submitting requests? Who will be approving? Who will do both?).
  • Start to think about the “what.” What applications/dimensions/hierarchies will be governed? What are the use cases and typical scenarios that require data governance? Start to collaboratively mock up and storyboard some typical workflows with the client to visualize how the workflows will function. And don’t try to build a workflow for every possible scenario. Start with the big hitters and low hanging fruit first. You can always add more workflows later.

What’s Included in EDMCS Workflows?

Are you wondering what EDMCS includes as far as data governance functionality? In summary, EDMCS supports:

  • Two types of roles – submitters and approvers
  • Separation of duties – workflows can be configured to prevent submitters from approving their own requests
  • The “four eyes” principle: EDMCS data governance adheres to the principle that requests must be approved by at least two people
  • Default application views and maintenance views: workflows can work with both types of views
  • Subscriptions: workflows can be triggered by Subscription requests
  • Email-based notifications
  • Serial and Parallel approvals:
    • Serial approval means a sequential order of approvals is required. For example, Approver #2 can’t approve until Approver #1 approves, Approver #3 can’t approve until Approver #2 approves, and so on.
    • Parallel approval means the approvals can occur in any order and at the same time.
    • With either method, all approvals must occur before the request is committed.
  • Configuration of Reminder and Escalation intervals
  • Multiple Workflow Stages:
    • Submit – initiate the request and add/edit/delete line items in the request. Note that with the 19.02 release, you can also attach documents and insert comments at the line item level. These enhancements are helpful to attach policies, supporting details, and other documentation related to the workflow request.
    • Approve – similar to DRG, an approver can approve, push back, or reject a request. Pushing back will send the request back to the submitter for additional changes. Rejecting will close the request and end the workflow.
    • Commit (implied) – once the request is fully approved, it is committed, hierarchies are updated, and the request history can be viewed like any other request.
  • Approval Policies – this is really the brains of how workflows are configured in EDMCS, and the next blog post covers this in greater detail. But here is a screenshot of the Approval Policy screen showing the available options:

Kevin Black - EDMCS and Data Governance - Part 1 - 3-8-19 Image 1

Conclusion

I hope you found this blog post helpful as an introduction to EDMCS and data governance, and that you will keep reading as the rest of the series is posted. Please contact me with any questions and comments!

And don’t forget to follow me on Twitter (@kblackEPM) and check out/subscribe to my blog (along with the blogs authored by my very talented colleagues at Alithya).

Read the next post in this EDMCS blog series:  EDMCS and Data Governance – Part 2

https://ranzal.blog/author/kblackranzal/

https://ranzal.blog/

Interested in better understanding EDMCS, the RESTful API, and Cloud Data Management? Be sure to check these excellent blog posts by Tony Scalese, aka FDM Guru: https://ranzal.blog/author/ascalese/

Looking for an outstanding resource for all things master data-related and more? Look no further!  https://datarestless.com/

Oracle Announces Removal of Support for Transport Layer Security Protocol 1.0 and 1.1; How Does that Affect Me?

Oracle has announced that as of May 3, 2019, the use of Transport Layer Security Protocols 1.0 and 1.1 will no longer be supported.  Communications to Cloud products will only be supported with TLS1.2.

The announcement was made in the following February What’s New communications from Oracle:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 1

The WHAT has come; now WHO is affected?

There are many ways to connect to the Cloud, so to better understand them, let’s break down the more popular ways of connecting and the common technology that these tools use for their connection, HTTPS:

  1. EPMAutomate
  2. Web Browsers
  3. cURL / PowerShell
  4. Financial Data Quality Management, Enterprise Edition (FDMEE)

EPM Automate is pretty much a done deal.  If there are issues or fixes needed, Oracle will release an update to go along with the Cloud deployment.  Keep an eye on the What’s New pages as well as the notification displayed when running EPM Automate itself.

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 2

Recent versions of both Internet Explorer and Firefox support TLS 1.2 out-of-the-box.  It might not be enabled based on IT policies, but the functionality is present and easy to check.

Internet Explorer > Tools > Internet Options > Advanced
Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 3

Firefox > about:config > security.tls.version
Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 4a

In the Firefox security.tls.version settings, a value of 1 corresponds to TLS 1.0 and a value of 4 corresponds to TLS 1.3 (3 is the value for TLS 1.2).

Now, if you have ventured out into custom scripting (EPM Automate doesn’t count in this situation) and have fully embraced REST, then cURL and PowerShell might need some tweaks as well.  This is the real reason why Oracle has started to outline and share information with the end-user community.

As a result, these solutions will need to be updated and retested.  For this purpose, Oracle has stated that you can request early access, via Oracle Support, to a TLS 1.2-only POD for testing.  I highly recommend this, as it has provided some great insight for Alithya.  We were also able to pass along our findings to Oracle early to help streamline the patching process for FDMEE; more on this later.

cURL scripts will need to be updated to use the “--tlsv1.2” option when invoked.
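
For example, a call that previously relied on default protocol negotiation would be invoked along these lines (a minimal sketch; the URL, user, and password are placeholders):

curl --tlsv1.2 -X GET -u user@example.com:MyPassword "https://example-epm.oraclecloud.com/interop/rest/"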

For PowerShell, you will need to add the following line in your scripts:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
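
Placed at the top of a script, that line forces the subsequent web requests in the session to negotiate TLS 1.2. Here is a minimal sketch of how it might sit in front of a REST call; the URL and credentials are placeholders:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
# Build a basic-auth header for the REST call (placeholder credentials)
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("user@example.com:MyPassword"))
Invoke-RestMethod -Method Get -Uri "https://example-epm.oraclecloud.com/interop/rest/" -Headers @{ Authorization = "Basic $token" }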

The thing that really got me excited: FDMEE!

The last topic that Oracle mentions is the use of on-premise FDMEE.  If you are like me, an FDMEE fanatic, then you’ll know that this caused all the triggers in my brain to start firing.  All the things that I do in FDMEE will need to be tested to make sure they comply and work.  The things I use in my daily activities are:

  1. JSON-based RESTful API calls in Jython scripts (see the sketch after this list)
  2. Target Application Registrations to Cloud Applications
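
For the first item, the sketch below shows roughly what such a Jython call looks like; the host name, credentials, and resource path are placeholders, and it is this kind of HTTPS request that must be able to negotiate TLS 1.2 once the older protocols are retired:

import httplib, base64
# Placeholder EPM Cloud host and credentials
creds = base64.b64encode("user@example.com:MyPassword")
conn = httplib.HTTPSConnection("example-epm.oraclecloud.com")
conn.request("GET", "/interop/rest/", None, {"Authorization": "Basic " + creds})
resp = conn.getresponse()
print resp.status, resp.reason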

I quickly shot off an Oracle Support ticket to get myself a TLS 1.2 POD.  Oracle responded in a relatively short time and stated that my POD was ready and that I had it for roughly two weeks of testing.  Without any changes to my virtual lab, I attempted to connect to see what would happen.  Sure enough, I received an error:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 5

To get around this error, I also imported an LCM of a previous Cloud application and tested what a set of custom Jython scripts making JSON/RESTful API calls would produce; I received similar errors:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 6

…as well as the out-of-the-box Refresh Metadata & Refresh Members options:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 7

I confirmed with my colleagues in Development that this was the expected result when TLS is not at the right level and the appropriate patches have not yet been applied.  Knowing this, I also tested with my browser option disabled and received the same result.  Now that I knew I had a good starting point, I was off to the races to figure out how to continue.

Unfortunately, the links that Oracle provided in the What’s New announcement appear to be broken and not public.  As a result, I had to create an SR to gain access to the information.  After I received them and did some light reading, I was able to formulate a patch strategy, apply the necessary patches, apply the registry updates, and test again.

This time I was able to run successful tests of both FDMEE scripts and Oracle adaptor connections to the Cloud.

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 8

Great… Now what do we do?

Patching the environments was not always an easy task.  It took quite a bit of time to complete because multiple products needed updates.  Most of them weren’t standard EPM products (HFM, Planning, etc.):  WebLogic, JRockit, JDK, OHS, and so on all needed to be updated, and because these are the building blocks on which the EPM suite runs, they created update dependencies in the EPM products we used.

Oracle has stated that this goes into effect on May 3rd, which is right around the corner.  Alithya, an Oracle Platinum Partner, is here to help you assess your current EPM installation and build that patch plan.

Even if you don’t use the Cloud today but are thinking about moving to the Cloud at some point, it is important to make sure your environment is ready and that you have the necessary support.

For more information, contact us at infosolutions@alithya.com.

Out-of-the-Box Features: Profitability and Cost Management Cloud Service (PCMCS) – Intelligence and Dashboarding: Analysis Views and Scatter Analysis

PCMCS Out-of-the-Box (OOTB) Features:  2. Intelligence and Dashboarding – Analysis Views and Scatter Analysis

Suppose two teams of consultants with similar experience and prestige both guarantee that they can deliver an application implementation of the highest quality: one at a higher cost but in a shorter timeframe, and the other at a lower cost but over a longer timeframe.  All other considerations being equal, should I save money, or should I save time?

A few days ago, I released my first blog post on PCMCS, covering Rule Balancing reports usage and customization. This post builds on that first post to cover intelligence capabilities, some of which are only available in the Cloud version of the PCM software.

There are 6 menu options when accessing the Intelligence menu within PCMCS.

  1. Analysis Views
  2. Scatter Analysis
  3. Profit Curves
  4. Traceability
  5. Queries
  6. Key Performance Indicators

This post covers the first two menu options to explain how to set up Analysis Views and how to use Scatter Analysis.

Analysis Views

Analysis Views are the first set of reports available to end users within the PCMCS user interface.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 7

These views represent a way to predefine and save intersections of members for future review.  The selections within Analysis Views are open to all dimensions within the PCMCS application at various levels within the hierarchies. This is the first step you need to take towards building or defining a dashboard for your PCMCS application.

If you cannot create or edit an analysis view, then you need to reach out to your PCMCS administrator in order to review and adjust your security settings.

The example Analysis Views for this post are based on the “Demo Bikes” application (BksML30) that can be deployed with a few clicks in your PCMCS instance.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 8

A data slice is a combination of rows and columns along with the page selection, which, in this case, is the Period dimension.

Any dimension that is not specified in any of the 3 areas (row, column, page) will be read at top level and will be displayed in the settings menu.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 9

The Add Filter section allows you to filter the columns based on specific numerical values. In this case, the columns are represented by the Product dimension selections.

To create an analysis view, click on the plus (+) sign on the main menu. The three tabs displayed allow you to define a name and description as well as the setup for row and column dimensions. You cannot select more than one dimension for either rows or columns.

Within the Row dimension selection, you can leverage different formulas applicable to the hierarchies within PCM such as Children of member, Member and children, Level 0 descendants, etc.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 10

Columns do not have options for member formulas beyond the usage of User preferences.

The row dimension will allow you to display further information such as generation or level details. For example, for the Product dimension, we can display the generation 3 and 4 information alongside the level 0 members, allowing us to expand our analysis to different product categories, or types.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 11

Selecting new members within an Analysis View will not impact the original data definition. If you choose to display data for any month other than the one that was set up and saved in the Analysis View, you can do so because the Page parameter is open to end-user modifications. If, however, you want to update and store a selection change within the Analysis View, you must perform such an update via the Edit menu instead of simply selecting a new parameter on the screen in view mode.

You may need to utilize the concept of period ranges when using Analysis Views in order to dynamically reference specific members of your Period dimension.

Defining a current period for the application is mandatory in order to be able to create formulas dependent on time. This action is available via the Application menu by selecting the Edit application option and navigating to the Dimension settings tab. This is where you can define the Current Period and the Current Year for your PCMCS application.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 13

These settings will be applied when using the “Single…” or “Current” selection options within Analysis Views. Single (-1) Level 0 selection represents, in this case, the month of May, since the current Period selection for the PCMCS application is June. The Single (-1) Level 1 selections return Q1, since June is in Q2.

Scatter Analysis

Scatter Analysis graphs will compare one member’s values against another member’s values. The two members selected must be within the same dimension. Your PCMCS Demo application may not have any sample Scatter Analysis graphs. However, you can create one by leveraging the Analysis Views at your disposal.

You can launch Analysis Views from within Scatter graphs.

Note that saved Scatter Analysis cannot be reused or referenced in dashboards. You should use this section to create graphs for ad-hoc use outside of the dashboarding capability.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 14

If you need to include Scatter Analysis within your dashboards, there is a corresponding item in the list of available components when creating a dashboard.

You can select an existing Analysis view, but you must reselect your X-axis and Y-axis dimension references.

Alex Mlynarzek - Analysis Views and Scatter Analysis - 2-28-19 - Image 15

Conclusion:  PCMCS Intelligence – Analysis Views and Scatter Analysis

While there are many alternative reporting solutions to use in conjunction with PCMCS applications, assuming that both time and money are of the essence in any project implementation, it is safe to conclude that using the PCMCS OOTB reporting features is both cost effective and efficient. The Intelligence screens shared in this post are included in the PCMCS subscription cost, and any end user of a PCMCS application with the right level of access can take charge and build the desired reports, saving them in a location accessible to their peers, while spending no time on iterations of reporting requirements and data validations.

The PCMCS OOTB reporting features support not only troubleshooting, but also detailed analysis and reporting within one screen.  Such capabilities should not be ignored as they will surely add meaningful insight into finance teams’ day-to-day use of PCMCS.

If you need advice and guidance on how to leverage the PCMCS reporting capabilities for existing or future applications, reach out to our team of PCMCS experts at infosolutions@alithya.com.

The remaining intelligence menus will be covered in subsequent posts over the next few weeks. If you are interested in receiving notifications of such posts, subscribe to notifications.

Out-of-the-Box Features: Profitability and Cost Management Cloud Service (PCMCS) – Rule Balancing Reports

PCMCS Out-of-the-Box (OOTB) Features:  1. Rule Balancing Reports

The other day, I was thinking about the time when I used to study finance, and specifically about a course regarding interest and how it represents the value of time. What is the cost, or value, of one’s time? Is it high, resulting in a higher interest rate per period, or is it low, resulting in a low interest rate per period? How much time am I willing to spend working in order to get that new car? How much time do I have before that competitor outruns me and snatches that market share from me?

This was how I started thinking about various out-of-the-box features (OOTB). Such features are often key in deciding whether to acquire a software/service/product because the one resource that we constantly complain about not having enough of is “time.”

You are now reading the first blog post on OOTB features in PCMCS covering one of the most used Reports for data analysis as well as troubleshooting profitability calculation results. At the end of this blog post, you should know what Balancing reports are, where to find them, how to use them, and also how to further expand them with minimal time and effort invested.

What are Rule Balancing Reports?

Rule Balancing reports provide quick insight into the validity of the application results. These reports are powerful OOTB artifacts that can be further configured to cater to any custom application requirements in order to support validation of calculation results as well as contribution analysis and traceability.

The PCMCS OOTB Rule Balancing report is initially based on a Default Model View with a standard selection of upper-level members for each dimension. Starting from this Default Model View, administrators or users of the PCMCS application can perform a deep-dive analysis on more granular intersections and configure detailed reports for a ruleset or a group of rulesets they choose to investigate.

The Default Rule Balancing report is available as soon as the application has been deployed, and it can be accessed via the Main Navigator menu found under the Manage section.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 1

I will be using the default BksML30 application to demonstrate the capabilities of the Rule Balancing reports. If you have loaded your sample application and cannot see any results in the Rule Balancing reports, check that you ran your end-to-end calculations for any given POV from the Manage Calculation menu. The POV I have chosen for this demonstration is FY16, January, Actual Scenario.

As you open the Rule Balancing menu, the Default Model View is the only view available when you initially set up your application and your allocation rules. Any other Rule Validation reports that you see within the Demo application besides the Default Model View have been built and configured outside of the out-of-the-box list of features.

What are PCMCS Model Views?

A Model View represents a predefined data slice within the PCMCS application; consider the model views as a set of selections of members for each dimension that displays only the relevant data points for a required intersection.

Rule Balancing Report Example

After running the entire set of allocation rules within the Demo BksML30 application, the Rule Balancing report should look like this:

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 2

The description of each rule selected will be displayed along with the rule number. The rules will be displayed in the order that they were launched following the user-defined sequencing, regardless of the actual Rule Number/Rule ID that has been assigned.

  • The “Input” column enables users to confirm that what was loaded into the application matches the expected values received from the source system.
  • The Allocation In and Allocation Out columns validate the allocations performed by the application from both a balance perspective (Allocation In should be equal and opposite to Allocation Out) and a numeric one.  The balance aspect is of particular interest when allocations are executed with custom calculation rules.  In these cases, two separate rules are typically required, one for the “credit out” and one for the “debit in.”  As such, there is a greater risk that the formulas for the outbound and inbound values will not produce amounts equal and opposite in total, thereby causing an undesired imbalance.  In these situations, the Allocation In and Allocation Out values are shown on two separate rows, and they quickly illustrate to the user the success of their calculations.

Rule Balancing and Smart View Ad Hoc Reports

Any highlighted data point/data value in the Balancing screen will allow you to further investigate the allocation step through a Smart View ad hoc report. These hyperlinks represent pre-built/pre-defined queries that point directly to the Essbase database, allowing you to further expand the analysis of a selected data point.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 3

When you click on the highlighted number, a Smart View link will be downloaded to your workstation.

As an example, you can see what the detail for Net Change looks like for the custom calculation rule R0001 – Utilities Expense Adjustment in a linked report in Smart View.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 4

The column headers for the Rule Balancing report will list the relevant Balance dimension members. If there are members that are not populated, these will be automatically filtered out of the view. You can choose to display them by selecting View -> Columns and tagging the members you would like to display on your report – whether they have data or not.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 5

For further information on what each of these Balance dimension members represent, check out my blog post on Demystifying the Balance dimension in PCMCS.

You can view and edit the model view definition in the collapsed area between the POV and the Balancing report.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 6

The Input data on this customized Model View is pertinent only to Operating Expenses rather than the entire pool of data. This is the reason that the total USD value may be different from data displayed on the Default Model View report.

You can perform ad hoc edits to the Model View as you are using it, but none of the newly made selections will be stored. If you want to apply permanent changes to a specific Model View selection, you will have to edit the Model View in the corresponding menu.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 7

Your Model Views can be defined in the same order of operations as your allocations, or you can choose to create Model Views that are more detailed and dive deeper into a custom grouping of rules, regardless of the ruleset to which they might belong. The only dimensions displayed in your Model View selection are the Business dimensions. POV, Balance, Rule, and Attribute dimensions are not represented and therefore are not open for selection. The data points you define in the Model view will apply to all relevant rules IDs that generated the new cells.

Enhancing and Customizing Your Rule Balancing Reports

In the Demo BikesML30 application, there are several standard Rule Balancing reports that are split by Ruleset while others are named “Trace.” The Trace Model views are built in order to support point troubleshooting of allocation areas that are either complex or open to high variation during each run.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 8

If you want to use the Rule Balancing report values outside of the ad hoc capacity, you can export the report to XLS, but remember that such an export will not be a Smart View report – it will simply be a listing of the information presented on the Rule Balancing screen, as some members displayed here do not have a direct equivalent in the application (Running Remainder, Running Balance). This export option can be found in the Actions menu under Export to Excel, or by selecting the button in the screen capture below.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 10

A new workbook called RuleBalance is downloaded, and the entire set of data displayed on your screen will be available in XLS.

Alex Mlynarzek - PCM Rule Balancing - 2-7-19 - Image 9

PCMCS Rule Balancing Drawbacks

Rule Balancing does not allow filtering based on Attributes, UDAs, or Names.

Rule Balancing hyperlinks open SmartView tabs called Linked View, and any new selections of links within the Rule Balancing report will overwrite the contents of the existing tab. If you start developing a report by using Rule Balancing, remember to always rename the tab in case you want to kick off another report for a secondary data point within the same workbook.

Common Issues When Using Rule Balancing Reports

“Rule Balancing Report Links Don’t Work”

Your workstation must have Smart View installed before using the hyperlink feature within PCMCS. The latest Smart View version is available for download through the Navigator main menu under the Installations section.  For more guidance on generic EPM product patching, read the blog post Patch Today! Don’t Delay!

When selecting a hyperlink in the Rule Balancing report, you should be able to see that a download has started. As you click on the downloaded content, a new Excel tab will open, and you will be prompted to enter your Cloud credentials in order to have access to the requested data point intersections. If you do not have Excel open at the time you are accessing the downloaded content, the prompt to enter your Cloud credentials may not appear on the screen.

“I Can’t See Any Data in the PCMCS Rule Balancing Report.”

If data is not displayed on the screen, you are looking at one of the following situations:

  1. There is no data loaded and/or calculated for the POV at the intersections you have defined in the Rule Balancing report. Check your job console to see if such tasks have been triggered and completed successfully.
  2. Your security setup is restricting you from seeing any data values. Reach out to your administrator to adjust data grants or application access.
  3. (This used to happen occasionally during on-premise implementations.) If your Business dimensions are tagged as Label Only, check that the first child contains values. You may be able to see data at base-level intersections within your application, yet the Rule Balancing report shows no values due to the Dimension Type, Member Storage, or Aggregation operators you have defined in the metadata.

“I Can’t Create a PCMCS Model View.”

This restriction is based on provisioning. Reach out to your PCMCS Administrator for assistance with your profile or settings.

Rule Balancing Wrap Up

Rule Balancing reports are easy to set up and use.  They retrieve data quickly, are accessible to all application users through the same menu, and they should be the first stop during a model run to quickly identify if there were any issues with data allocations.

Because Rule Balancing is a fast reporting tool with a predefined OOTB template, it is one of the most commonly used troubleshooting reports for PCMCS and can be leveraged for quick balance checks. It is also a mechanism for quick report building at a detailed Rule level, a faster alternative to reading the Rule definition and manually replicating the intersections in a Smart View report.  Because these reports are system generated and their hyperlinks are based on the application and rule setup, there is no room for manual error when building validations.

Save precious time by leveraging the PCMCS OOTB functionality. The next post in this series covers Intelligence screens – Analysis Views and Scatter Analysis.  If you have further questions on the usage of Balancing Reports within PCMCS, please reach out to our team of PCMCS experts at infosolutions@alithya.com.

Implementing Zero Based Budgeting: Setting Up Your Environment

The previous post – Implementing Zero-Based Budgeting: The Requirements – outlined two key components of a successful zero-based budgeting program:  a culture change and a centralized system. We recommended creating a centralized system with Oracle Planning and Budgeting Cloud Service (PBCS)/Enterprise Planning and Budgeting Cloud Service (EPBCS) because of the many advantages it provides, such as an environment with data depth.

Even with a zero-based budgeting blueprint, many companies are still hesitant to go “all in” thinking that a zero-based budgeting program implementation requires too much time and resources. The introduction of Cloud services such as Oracle PBCS/EPBCS makes the implementation of a centralized financial system easier than ever, greatly reducing the barrier to entry.

This final post in this series shares the power of a PBCS/EPBCS environment to achieve the greatest success with a newly implemented zero-based budgeting program.

How Can PBCS/EPBCS Environments Enhance the ZBB Experience?

There are four key ways to gain the most from a PBCS or EPBCS environment, including the setup of targets and accountability metrics that offer more meaningful data and greater transparency when making budgeting decisions.

Clients are often given target-setting goals in management meetings or over the phone, but we demonstrate for them how to integrate these into their budgeting systems. On numerous occasions, Alithya has been contracted to implement target setting where leadership sets growth targets and the system flows the revenue down by service, product line, etc. In turn, analysts match the underlying details.

Not surprisingly, this is a common request because target setting has long been a tradition in the budget process. Setting up the target-setting process in PBCS/EPBCS moves an offline process online and folds it into the overall budgeting system. Combining that with the zero-based budgeting mantra allows targets to be set and gives analysts the baseline they need. Moreover, analysis can be done on departments that take the typical "reduce expenses by 10%" approach to achieve the target number instead of taking the more insightful zero-based budgeting journey. Yes, target setting in a centralized system is easier, but the real benefit of a centralized system is the ability to see how teams react to the new target. Did they take the traditional route of reducing budget percentages to fit the numbers, or did they look at their budget as a whole, analyze each line item, and question the numbers organically?
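As a rough illustration of the flow-down described above, here is a minimal Python sketch that spreads a leadership growth target down to product lines in proportion to prior-year revenue; the product names, amounts, and proportional spread method are illustrative assumptions, not a prescribed PBCS/EPBCS design.

```python
# Hypothetical sketch: leadership sets a total revenue growth target and the
# system spreads it down to product lines in proportion to prior-year revenue.
# Analysts would then build their zero-based detail up to these targets.

prior_year = {
    "Product A": 4_000_000.00,
    "Product B": 2_500_000.00,
    "Product C": 1_500_000.00,
}

growth_target = 0.08  # 8% top-line growth set by leadership

total_prior = sum(prior_year.values())
total_target = total_prior * (1 + growth_target)

# Proportional spread of the total target to each product line
targets = {p: amt / total_prior * total_target for p, amt in prior_year.items()}

for product, target in targets.items():
    print(f"{product}: prior {prior_year[product]:,.0f} -> target {target:,.0f}")
```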

After targets are set and the budget is approved, we look at whether the promised cost savings come to fruition. A centralized system allows capital projects and initiatives to be tracked, helping to systematically measure the expenditures of the cost-saving activities identified during zero-based budget discovery. This provides a clear picture of what each department is doing and holds it more accountable for project decisions. It is an achievement to complete a zero-based budget "diet," but holding teams accountable brings them to the next level: the zero-based budget "lifestyle."

In essence, this new budgeting environment provides better insight into data – insight that ultimately allows savings to be found more effectively. For example, if you want to see the cost of direct materials, the centralized system can be set up to capture those costs so you can analyze and track the different KPIs that drive overall costs up or down.

Another example is segmenting employee costs such as travel. Instead of budgeting travel as a run rate of 10% of direct labor, determine which jobs or tasks required that travel and use this KPI to negotiate travel expenses and further drive down costs. Essentially, use PBCS/EPBCS as a tool to capture KPIs (e.g. travel cost by job), determine the best use of travel dollars, and – more importantly – negotiate with vendors on key travel.
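Here is a minimal Python sketch of that travel KPI, assuming hypothetical jobs and trip costs; in practice the data would come from the planning application rather than a hard-coded list.

```python
# Hypothetical sketch: instead of budgeting travel as a flat % of direct labor,
# capture travel cost by job and compute a cost-per-trip KPI to target negotiations.

travel_records = [
    {"job": "Site Install",    "cost": 1800.00},
    {"job": "Site Install",    "cost": 2200.00},
    {"job": "Client Workshop", "cost": 950.00},
    {"job": "Client Workshop", "cost": 1100.00},
    {"job": "Audit Support",   "cost": 400.00},
]

kpi = {}
for rec in travel_records:
    totals = kpi.setdefault(rec["job"], {"trips": 0, "cost": 0.0})
    totals["trips"] += 1
    totals["cost"] += rec["cost"]

for job, t in kpi.items():
    per_trip = t["cost"] / t["trips"]
    print(f"{job}: {t['trips']} trips, {t['cost']:,.2f} total, {per_trip:,.2f} per trip")
```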

Lastly, a budgeting environment provides clarity that helps teams make better-informed decisions about future initiatives. With all of the underlying data points visible in a single location, it is possible to identify past sales and marketing campaigns and expenditures that led to profitable customers. Zero-based budgeting teams that took the initiative to perform sales and marketing cost-benefit analysis from the ground up are therefore able to dedicate more resources (e.g. dollars, people, etc.) to winning strategies. This is in contrast to the traditional budgeting approach of "10% rate of marketing spend year-over-year" that often masks the winning – and, more importantly, the losing – marketing initiatives. Moreover, such planning and the availability of different data points help draw key inferences that allow sales and marketing teams to be more successful.

Summary 

Utilizing a Cloud service such as Oracle PBCS/EPBCS makes it easier for companies to implement a centralized system and achieve success with a zero-based budgeting program. PBCS/EPBCS environments can and should be set up in a way that enhances the zero-based budgeting experience. This is achieved by integrating target setting goals and establishing accountability metrics that allow a deeper dive into budget data while providing greater transparency to make better informed decisions.

To learn more about zero-based budgeting best practices and to get professional help with your Oracle PBCS/EPBCS environments, feel free to contact our team of experts.

Oracle’s ARCS Patch 1812 and Patch 1811 Review: Gazing into the Crystal Ball

Peruse the Account Reconciliation Cloud Service (ARCS) forums on Oracle’s Cloud Customer Connect and you’ll notice a theme: Transaction Matching. Questions, comments, and critiques have been flooding in from across companies and industries, clients and consultants alike. Combine this with Oracle’s game-changing announcement of the EPM Cloud price simplification plan teased for 2019 – that is, the strategic move to strictly sell bundled EPM Cloud products in the near future (more on this another time – it’s a doozy) – and the changes released for ARCS in Patches 1811 and 1812 could not have come at a more opportune time. Furthermore, these changes provide a sneak peek into Oracle’s crystal ball of what’s to come.

The WHAT has come: Changes for ARCS at the end of 2018

The most important change to ARCS from the 2018 season finale, Patch 1812, is the shift from having separate reconciliations between Transaction Matching and Reconciliation Compliance to one standardized use of Profiles. This is configured through the new reconciliation methods provided in Formats (Balance Comparison with Transaction Matching, Account Analysis with Transaction Matching, and Transaction Matching only).

[Image 1]

The implication is that Transaction Matching reconciliations receive all the benefits that previously only Reconciliation Compliance enjoyed, including but not limited to: bulk uploads/updates to Profiles and reconciliations, access to new Workflow options such as Reviewers and Teams, and detailed filtering options including the more hidden statistical metrics (such as attributes related to count, etc.). It is important to note, though, that these new features will almost exclusively relate to new reconciliations using one of the two ‘*with Transaction Matching’ format options, as seen below. Still, the opportunity for clever design is there.

[Image 2]

[Image 3]

Furthermore, to support this change, Period will now be shared between the two feature sets. Additionally, reconciliations that are performed in Transaction Matching will now utilize their period-end Balances loaded to Reconciliation Compliance. While historically there have been business processes put in place to ensure that the balance loaded to Transaction Matching equaled the balance loaded for the month-end reconciliation in Reconciliation Compliance, patch 1812 ensures that a system process governs the data’s integrity – certainly a more reassuring thought.
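For context, here is a minimal Python sketch of the tie-out that previously had to be enforced by business process: confirming that the balance used in Transaction Matching equals the period-end balance loaded to Reconciliation Compliance. Account names, amounts, and the tolerance are hypothetical.

```python
# Hypothetical sketch of the tie-out that used to be a manual business process:
# the balance used for Transaction Matching should equal the period-end balance
# loaded to Reconciliation Compliance for the same account and period.

tm_balances = {"1010 - Cash": 152_340.25, "1200 - AR": 98_410.00}
rc_balances = {"1010 - Cash": 152_340.25, "1200 - AR": 98_950.00}

tolerance = 0.01
for account in sorted(set(tm_balances) | set(rc_balances)):
    tm = tm_balances.get(account, 0.0)
    rc = rc_balances.get(account, 0.0)
    diff = tm - rc
    status = "OK" if abs(diff) <= tolerance else "INVESTIGATE"
    print(f"{account}: TM {tm:,.2f} vs RC {rc:,.2f} (diff {diff:,.2f}) -> {status}")
```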

Two additional under-the-radar features introduced in Patch 1812 are (1) the ability to have Workflow that includes multiple members without requiring an order of precedence to the work and (2) the option for end-users to approve their own re-assignments, reducing the administrative bottleneck. These changes provide value-add functionality that demonstrates Oracle’s willingness to listen to customer feedback, even during these more “stuffed” patches.

The last item to mention was actually included in Patch 1811. In Transaction Matching, a text file can now be generated with the transactions or adjustments from the tool which can then be uploaded to the ERP source systems as a journal adjustment. This has been an ongoing request, and I am happy to see it finally actualized.
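As a rough illustration of the concept (not the actual ARCS export format), here is a minimal Python sketch that writes adjustments to a delimited text file an ERP could import as a journal; the column layout and delimiter are hypothetical and would need to follow your ERP’s journal import specification.

```python
import csv

# Hypothetical sketch: write matching adjustments to a delimited text file for
# upload to an ERP as a journal. The column layout is illustrative only; the
# real file spec comes from the target ERP's journal import requirements.

adjustments = [
    {"account": "1010 - Cash", "cc": "100", "debit": 0.00,   "credit": 540.00, "memo": "Bank fee adjustment"},
    {"account": "6550 - Fees", "cc": "100", "debit": 540.00, "credit": 0.00,   "memo": "Bank fee adjustment"},
]

with open("tm_journal_adjustments.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="|")
    writer.writerow(["ACCOUNT", "COST_CENTER", "DEBIT", "CREDIT", "MEMO"])
    for adj in adjustments:
        writer.writerow([adj["account"], adj["cc"], f"{adj['debit']:.2f}", f"{adj['credit']:.2f}", adj["memo"]])

print("Wrote tm_journal_adjustments.txt")
```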

The WHAT does it mean: Implications and Expectations for ARCS in 2019

Transaction Matching’s relative strength compared to its competitors is becoming increasingly apparent as Oracle continues to shore up areas in need of support while also providing updates that show a sensitivity to market demand. The move to unify Transaction Matching and Reconciliation Compliance is not a new idea, as Patch 1805 made apparent with the uniting of the two UIs (and much more – see Oracle Product Management’s webinar update here), but it is nonetheless a bold one that I anticipate will pay dividends. The automatic conversion of Transaction Matching reconciliations to Profiles is a nice touch too, making the transition an easier pill to swallow for skeptical clients who, I am sure, were not eager to pay expensive consulting fees for this. Even smaller changes, such as providing a space for strictly manual matching (i.e. without Auto Match rules; a Patch 1811 change), demonstrate ARCS’ commitment to being an approachable and modular product that grows with your company – a benefit I have consistently touted in the past and expect to continue touting in the future. More details about the benefits of ARCS are shared in the posts A Safe Step into the Cloud: The Argument for Account Reconciliation Cloud Service (ARCS) and Modularity in Account Reconciliation Cloud Service (ARCS): No Mistakes from “Day 1” to “Day 100”.

Changes continue to come to ARCS that trickle down to Account Reconciliation Manager (ARM) only slowly, if at all. This was true for the Variance Analysis reconciliation method, which arrived in May 2017 for ARCS but not until December 2017 for ARM, and it is a fair guess that this will also be true for the aforementioned “All Preparers” and “All Reviewers” workflow options and the end-user re-assignment configuration setting. Combine this with more and more dollars being invested in Transaction Matching compared to Reconciliation Compliance (from where I’m looking, anyway), and the message is clear on who the favorite is in the Oracle product family. While ARM contains strong functionality as an on-premise option, expect the functionality gap with its Cloud counterpart to widen.

Lastly, the inclusion of a journal adjustment export out of Transaction Matching is a combo solution: a “we can do that too” answer to competitor Blackline’s existing functionality as well as a demonstration of Oracle’s willingness to think outside of the product. This highlights ARCS’ flexibility as a tool capable of being used within other processes. In fact, the Oracle EPM Cloud ecosystem is one of ARCS’ biggest strengths over its competitors. I would love to see this journaling ability out of Reconciliation Compliance as well, which would extend the functionality to most ARCS clients. Regardless, this is a step in the right direction.

This post has been cross-posted on the #DataRestless blog site – read it here and other Oracle-related posts as well.

Implementing Zero-Based Budgeting: The Requirements

A Culture Change and a Centralized System

The first post in this 3-post series – Implementing Zero-Based Budgeting: Benefits, Myths, and Goals – covers the benefits of zero-based budgeting. To summarize, it enables you to achieve long-term savings that result in sustainable growth, and it holds your financial analysts accountable for the cost figures they approve and how they manage the overall budget. This allows more effective recognition of any unwanted costs and shows how that money can be shifted into other growth areas within the company.

However, to reap the benefits of a zero-based budgeting program, a culture change is needed first at certain levels within the company. The goal is to eventually have the entire company complete this culture shift, but it is best to start small. Along with a change in culture, a centralized reporting system needs to be created as well to provide teams the ability to share real-time numbers with each other to achieve the goals of this new budgeting program.

Better Than a Quick Fix

What exactly is meant by a culture change? It means starting small – with Finance – and then fostering the change in other departments. To be successful with this new program, other departments will eventually have to jump on board with the new budgeting approach. They will need to step up, analyzing their own costs and finding ways to save more without diminishing their capabilities.

For example, while financial analysts talk to the shop floor to see where costs can be reduced, the HR department should work with Finance to determine how it can become leaner. Moreover, the IT department should take the lead on negotiating with its vendors to find any areas where money can be saved. These are just a few examples of how different departments can step up to the plate; implementing a successful zero-based budgeting program requires a team effort.

Changing the culture doesn’t happen overnight. Senior leaders should take the lead in fostering this change. To ensure that everyone is on the same page, managers need to advocate the new approach within their respective departments.

Incentives also help teams to buy into this new budgeting approach.  Although incentives for growth metrics may already exist, additional incentives can effectively encourage staff to find ways to reduce costs for the metrics they manage.

Some examples of incentive metrics are the realized ROI on a requested capital expenditure and the total cost-saving dollars resulting from the zero-based budgeting program. The former can mean moving to the Cloud to save money or reducing redundant tasks by introducing centralized software; the latter can be exemplified by achieving a 10% cost reduction per phone.
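As a quick, purely illustrative calculation of the first metric (the numbers below are made up), here is a short Python sketch:

```python
# Illustrative numbers only: realized ROI on a requested capital expenditure
# (e.g., a Cloud migration) measured against the savings it actually delivers.

capex_requested = 250_000.00          # approved capital expenditure
annual_savings_realized = 90_000.00   # savings measured after go-live

realized_roi = (annual_savings_realized - capex_requested) / capex_requested
print(f"Year-1 realized ROI: {realized_roi:.1%}")    # negative in year one...

payback_years = capex_requested / annual_savings_realized
print(f"Simple payback: {payback_years:.1f} years")  # ...but recovered in under three years
```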

Best Practice to Achieve Success

A crucial component of the success of a zero-based budgeting program is an officer who governs the entire process from start to finish. This individual (or team) should have deep knowledge of the budgeting process. Naturally, s/he will not know the ins and outs of each department, which is why s/he needs to act as an ambassador to department leaders. The officer also provides oversight to ensure that past bad budgeting habits do not return to plague the new program. And lastly, s/he must be dedicated to the craft of continuous improvement, which means seeking outside counsel when needed.

As mentioned earlier in the post, a culture change needs to be accompanied by a centralized reporting system. Alithya has helped clients implement Oracle Planning and Budgeting Cloud Service (PBCS) and Enterprise Planning and Budgeting Cloud Service (EPBCS) and overcome the deficiencies of Excel-based models. These models lose sight of what the true cost numbers are because past budgets are simple anchors of history rather than detailed breakdowns of cost. Moreover, these numbers become siloed within the vast library of Excel models. With Oracle PBCS or EPBCS, budgets can be highly surgical and help leaders in the company pinpoint reductions.

A centralized system allows the capture of all changes in a single location in real-time, and it provides insight into how effectively managers seek cost savings. This can be used as a key indicator to determine if their actions are in line with this new methodology.

Furthermore, centralization not only holds managers more accountable, but it also empowers them to create innovative cost-saving solutions. Driven by incentives, staff will burn with a clear purpose to find new ways to achieve sustainable growth for the company and be rewarded for hard work.

Recapping What It Takes to Achieve ZBB Success

The goal is to create a cost-savings culture that allows more capital to be invested in growing parts of the company. To be successful, follow the best practices outlined, starting with a culture change within the company and giving your teams a centralized PBCS or EPBCS system to more clearly see all data points. The hard work does not stop here, though! The next post delves into setting up a zero-based budgeting system.