EDMCS and Data Governance – Part 3

Welcome to Part 3 – the finale – of the blog series “EDMCS and Data Governance!”

Part 1 provides an introduction and primer for data governance workflows in Enterprise Data Management Cloud Service (EDMCS), which were introduced in the 19.02 release.

Part 2 discusses Workflow Stages in greater detail and dives into the brains of EDMCS workflows – the Approval Policy. Approval policies at different levels of the data chain are explained, and we conclude by building a sample workflow at the dimension level.

In Part 3, I’ll attempt to tie a bow around everything and offer some parting thoughts.

Recap

As I continue to explore and learn about collaborative workflows in EDMCS, these are the key points that come to mind:

  • Emphasize the Fundamentals – No matter what tool you are using, People and Process are extremely important in any data governance solution along with strong executive sponsorship and robust change management.
  • Build the Foundation – get the client comfortable with the tool and content before you introduce workflows. A strong foundation (your applications, dimensions, views, and viewpoints) is needed before you start the plumbing and wiring (workflows).
  • Brush up on Security – I haven’t discussed security extensively in this blog series, but the Oracle EDMCS User Guide does a nice job describing security requirements for assigning and approving workflow requests. Note that security enhancements have been introduced along with workflows. A new “Submitter” permission is now available to go along with Owner, Data Manager, and Browser. And permissions can be assigned at the Application, Dimension, Hierarchy Set, and Node Type levels.
  • Ponder the Approval Policy – this is the most interesting one to me. As we discussed in Part 2, approval policies can be defined at 4 points in the data chain (see Figure 1). With the inheritance and inter-dependencies of approval policies across the data chain along with the actions each policy can govern, it is critical to efficiently design your approval policies up front.

    • For example:
      • Suppose your client requires a final “audit” type of approval across the board for any type of request for any dimension. Or they always require an upfront “gatekeeper” type of approval to make sure the request is justified and complete before it continues down the approval chain. These would be good candidates for an approval policy at the Application level, and it would avoid having to define duplicative approval policies at lower levels in the data chain.
      • Will your application contain dimensions that do not need data governance workflows? If so, Application level approval policies should be avoided.
      • Say you want to limit and govern the actions of a specific group so it can only work with existing nodes (insert, remove, update). An approval policy at the Hierarchy Set level is probably best.

    • Overall, I believe approval policies at the dimension level are a good place to start. Then as the workflows evolve and requirements become more clear, you can determine if there are common factors across all dimension approval policies that can be consolidated at a higher level (Application level approval policy), or if there are specific subsets of actions that need to be broken out to a lower level (Node Type or Hierarchy Set level approval policy).

    • All of which brings up another interesting point: effective approval policy design directly ties into effective viewpoint design. Think about it – you can define the set of Allowed Actions (Add, Insert, Move, etc.) at a Viewpoint level. Which means what? Special-purpose maintenance views are likely required to support certain approval policies, especially those at the Node Type or Hierarchy Set levels.

Figure 1 – Approval Policies and Data Chain

EDMCS and Data Governance – Part 3 - Image 1

How do EDMCS Workflows Compare with DRM/DRG?

I was reluctant to include this section at first because in general, I don’t like comparing Data Relationship Management (DRM) and EDMCS. Yes, they are both master data management tools and yes, they do share some common concepts and terminology. But overall, the two products are so different in terms of philosophy, deployment design, and underlying architecture that I think comparing the products is often less than helpful.

However, with data governance and collaborative workflows, I feel there is enough commonality that it is worth highlighting a few items. So here goes:

Workflow Design

  • DRM/DRG: based on workflow models and workflow tasks, with tasks linked to specific actions (Add Leaf, Add Limb, Insert, Move, etc.)
  • EDMCS: based on approval policies; the approval policy level (Application, Dimension, Node Type, Hierarchy Set) determines the context and scope of the actions governed

Workflow Stages

  • DRM/DRG: uses a Submit stage, a Commit stage, and optionally one or more Enrich and/or Approve stages
  • EDMCS: uses a Submit stage and an (implied) Commit stage; approval policies determine the approval stages (serial vs. parallel, number of approvers); requests can be re-assigned for collaboration prior to Submit

User Interface (UI)

  • DRM/DRG: form-based design
  • EDMCS: no forms; requesters and approvers interact directly with the viewpoints

Approval Options

  • Both: support Approve, Reject, and Push Back, along with comments, narrative, and attachments

Escalations

  • Both: requests can be escalated based on defined intervals

Separation of Duties

  • Both: workflows can be configured to prevent a submitter from approving their own request

Email Notifications

  • Both: generate email notifications

Other

  • DRM/DRG: supports conditional workflows and splitting of requests based on pre-defined criteria
  • EDMCS: not yet supported

I’m curious if Oracle will introduce a form-based UI for workflows. Part of me would very much like to see that so that you can present a clean user interface to the approvers, hide unnecessary details, and display special instructions and messages, but part of me does not. One of my favorite features of EDMCS is the visual highlighting of pending request changes and the “shopping cart” of request items that are displayed prior to submitting a request. I would hate to lose that by going with a forms-based workflow UI, but perhaps there is a solution that combines the best of both worlds. 

Conclusion

Well that’s it, an initial look at workflows and approval policies in EDMCS. I’m excited to see how this functionality evolves and expands over time. Talk to you next time!

And don’t forget to follow me on Twitter (@kblackEPM) and check out these links for more information:

EDMCS and Data Governance – Part 2

Welcome to Part 2 of the blog series “EDMCS and Data Governance!”

Part 1 provides an introduction and primer for data governance workflows in Oracle Enterprise Data Management Cloud Service (EDMCS), which were introduced in the 19.02 release. This exciting feature addresses a major gap in EDMCS as the product continues to rapidly evolve and mature.

In Part 2, we dive into the details of how to configure workflows. This process revolves around the concept of an “approval policy.” Interestingly, approval policies can be configured at different points of the EDMCS data chain and cascade or inherit to affect downstream points of the data chain.

Workflow Stages

Before we dive into approval policies, let’s discuss EDMCS workflow stages a bit more. They are similar in concept to Data Relationship Governance (DRG) workflow stages. See Figure 1 for an overview:

Figure 1 – EDMCS Workflow Stages

EDMCS and Data Governance – Part 2 - Image 1
  1. Submit (or Assign) Request – A request is initially created as you do today. But wait…there’s more! You can Submit the request to immediately move the request into the Approve stage OR you can Assign the request to colleagues to collaborate on the request together. When the request is ready, it is submitted to move to the Approve stage.
  2. Approve Request – The approver(s) have 3 choices:
    • Approve – the request is approved and moves forward (thanks Captain Obvious!).
    • Push Back – like DRG, the request is pushed back to the submitter for clarification or changes, who then updates and resubmits the request.
    • Reject – like DRG, the request is denied and closed. Think of “reject” as the RAID of the data governance world – it kills requests dead.
  3. Commit Request – once fully approved, the request is auto-committed and closed. EDMCS has now been updated.

Approval Policies

Now for approval policies. Approval policies can be configured at 4 levels:

  1. Application
  2. Dimension
  3. Node Type
  4. Hierarchy Set

It is important to note that each data chain object can contain one, and only one, approval policy. However, approval policies have a cascading impact so that multiple approval policies can work in concert to govern and control exactly what you want. Yes, you heard that right:  Approval Policy Inheritance – it’s not just for properties anymore!

The types of actions governed by an approval policy depend on the data chain object it is configured with – see Figure 2 below:

Figure 2 – Approval Policies and Data Chain

EDMCS and Data Governance – Part 2 - Image 2

As you can see, policies defined at the Application or Dimension level govern all actions (add, delete, insert, remove, move, etc.) while policies defined at the Node Type or Hierarchy Set level govern a subset of actions. Why is this important? Because it means you need to carefully design what types of actions you want to govern and who will perform them. If I define an approval policy at the Hierarchy Set level and then submit a request that Adds 3 accounts, how many approvers are required for the request? A big ZERO! Since I requested “add” actions and only have an approval policy at the Hierarchy Set level, no applicable approval policy exists to govern the request.
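To make that scoping behavior concrete, here is a tiny sketch (plain Python, not anything EDMCS executes) of how you can reason about which approval policies govern a request. The exact action subsets shown for the Node Type and Hierarchy Set levels are my assumptions based on Figure 2, so verify them against the documentation:

```python
# Illustrative sketch only -- not EDMCS code. Application and Dimension
# policies govern all actions; the subsets shown for Node Type and
# Hierarchy Set are assumptions based on Figure 2.
ALL_ACTIONS = {"add", "delete", "insert", "remove", "move", "update"}

GOVERNED_ACTIONS = {
    "Application":   ALL_ACTIONS,
    "Dimension":     ALL_ACTIONS,
    "Node Type":     {"add", "delete", "update"},    # assumed subset
    "Hierarchy Set": {"insert", "remove", "move"},   # assumed subset
}

def applicable_policies(request_actions, defined_policy_levels):
    """Return the defined policy levels whose governed actions overlap
    the actions in the request."""
    return [level for level in defined_policy_levels
            if GOVERNED_ACTIONS[level] & set(request_actions)]

# The example from the text: a request that only adds accounts, with an
# approval policy defined only at the Hierarchy Set level -> no applicable
# policy, so zero approvers are required.
print(applicable_policies({"add"}, ["Hierarchy Set"]))  # []
print(applicable_policies({"add"}, ["Dimension"]))      # ['Dimension']
```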

Putting It All Together

Let’s walk through an example.

  1. Define Approval Policy

First, I will define an approval policy for the Account dimension. To do this, Inspect either the application or default viewpoint and access the Account dimension from the Definition tab. From there, click the Policies tab.

Here you will see the Approval policy for the Account dimension. Click on the Approval link to inspect the approval policy.

EDMCS and Data Governance – Part 2 - Image 3

The General tab will display basic information about the approval policy. You can edit the approval policy name and description if necessary.

EDMCS and Data Governance – Part 2 - Image 4

The Definition tab is where the magic happens. Select Edit to update the following parameters:

  • Enabled – click this check box to enable the approval policy.
  • Approval Method – select Serial or Parallel.
  • One Approval Per Group – if using Serial approvals, this will automatically be set to “True.” If using Parallel approvals, you can select one approval per group or define a Total Required # of approvers.
  • Include Submitter – enable this to allow the submitter to also be an approver (the submitter’s approval will be automatically granted). If “separation of duties” is required for your company, do not enable this.
  • Reminder Notification – the # of days that will elapse before reminder emails are sent.
  • Approval Escalation – the # of times a reminder occurs before an escalation email will be sent.
  • Approval Groups – select user(s) and/or group(s) to be included in the approval process. When using Parallel approvals, the order of approval groups does not matter. When using Serial approvals, the order of approval groups does matter – you need to list the approval groups in the order that approvals should be executed.

With my example approval policy, I am using serial approvals, 2 approval groups (a Planning group and GL group), a reminder interval of 5 days, and an escalation interval of 2 reminders.

EDMCS and Data Governance – Part 2 - Image 5
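To summarize the sample policy in one picture, here is the configuration expressed as a simple structure. This is purely illustrative – it is not an EDMCS API payload or export format, just the values chosen above:

```python
# Summary of the example policy -- illustrative only, not an EDMCS payload.
account_approval_policy = {
    "enabled": True,
    "approval_method": "Serial",         # Serial forces one approval per group
    "one_approval_per_group": True,
    "include_submitter": False,          # preserves separation of duties
    "reminder_notification_days": 5,     # days before a reminder email is sent
    "approval_escalation_reminders": 2,  # reminders before escalation occurs
    # Order matters for Serial approvals: Planning approves before GL.
    "approval_groups": ["Planning Approvers", "GL Approvers"],
}
```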

  2. Submit Request

Now we’re cooking with gas. It’s time to submit a request. I will submit a request to my default Account viewpoint that includes 1 add, 1 property update, and 1 move. Here is the request in Draft status:

EDMCS and Data Governance – Part 2 - Image 6

Did you notice something new? Look at the Actions button next to Submit. This is where you can assign the request to another user and collaborate with them to finish up the request.

EDMCS and Data Governance – Part 2 - Image 7

EDMCS and Data Governance – Part 2 - Image 8

  3. Approve the Request

After the request is submitted, it is considered “in flight” because it has been submitted, but not yet approved/committed. And look! EDMCS now offers a nice Activity page on the home screen displaying the status of various workflow requests:

EDMCS and Data Governance – Part 2 - Image 9

First, the users in the Planning Approvers group will receive an email notifying them that they have been “invited to approve a request” (it’s very polite):

EDMCS and Data Governance – Part 2 - Image 10

As mentioned earlier, an approver has 3 choices: Approve, Reject, or Push Back. Reject and Push Back are available under the Actions dropdown. Here are the dialog windows that will be displayed for those actions (note the comment field is required):

EDMCS and Data Governance – Part 2 - Image 11

Otherwise, the approver will click the Approve button and see this:

EDMCS and Data Governance – Part 2 - Image 12

And then the same process will continue with the GL Approvers group since I am using Serial approvals. Once again, an approver can reject, push back, or approve. Once approved, the request is committed and closed.

Congratulations! You have now completed your very first data governance workflow request in EDMCS!

Conclusion

Hopefully this blog post has provided more details and clarity on workflows, workflow stages, and approval policies. In the third and final post of this series, I’ll offer a recap and some closing thoughts. Talk to you then.

And don’t forget to follow me on Twitter (@kblackEPM) and check out these links for more information:

EDMCS and Data Governance – Part 1

Ahh… February. An interesting month with a variety of happenings. From the significant – Black History Month and President’s Day, to the exciting – the Super Bowl…well, sometimes. From the romantic – Valentine’s Day, to the silly – that tenacious groundhog trying to find his shadow…AGAIN. Not to mention that Spring is just around the corner and brings us the glorious event known as “March Madness!”

Why am I babbling about February? <segue> Because it is also the month that introduced Data Governance and Collaborative Workflows with the release of Enterprise Data Management Cloud Service (EDMCS) v19.02. </segue>

As we continue this journey to Enterprise Performance Management (EPM) Cloud, the addition of Data Governance to EDMCS is a major step forward, especially for those of us who have worked with the classic on-premise solutions (Data Relationship Management (DRM) and Data Relationship Governance (DRG)) and who have been awaiting a similar offering in EDMCS to support our Cloud clients. From what I’ve seen so far, a major gap between DRM/DRG and EDMCS has been addressed with this release.

In this blog series, I’d like to further explore Data Governance in EDMCS. At a high level, this is how I see this series unfolding:

  • Part 1 will provide the foundation, background, and basic concepts for EDMCS and Data Governance
  • Part 2 will get more into the “techy” stuff and dive deeper into Approval Policies and Security
  • Part 3 will provide a recap and closing thoughts/lessons learned

So, with that said, onto Part 1…

Prerequisites

Before diving head first into configuring Data Governance and collaborative workflows in EDMCS, there are a few things to consider.

  • Don’t forget people and process. I’m a big believer that people and process are just as (and usually much more) important as the tool. Please refer to this blog post for a quick read on this: The Data Governance Triple Crown.

I believe the same tenets apply to EDMCS and that it’s important to start thinking about a formal data governance program that includes a charter, executive sponsorship, roles & responsibilities, metrics, and much more. Data Governance can be a challenging cultural shift for many organizations which requires strong change management to handle the inevitable resistance. This is where a formal data governance framework can help.

  • Establish the foundation. As with building a house, it’s important to lay a solid foundation before you install the wiring and plumbing. Build your EDMCS application(s) and dimensions, and populate your primary and alternate hierarchies first. Get the client comfortable with the tool and the content. Then you can start to layer in the workflows.
  • Start to identify the “who” (e.g. the people involved and the roles they will play): Who will be submitting requests? Who will be approving? Who will do both?
  • Start to think about the “what.” What applications/dimensions/hierarchies will be governed? What are the use cases and typical scenarios that require data governance? Start to collaboratively mock up and storyboard some typical workflows with the client to visualize how the workflows will function. And don’t try to build a workflow for every possible scenario. Start with the big hitters and low hanging fruit first. You can always add more workflows later.

What’s Included in EDMCS Workflows?

Are you wondering what EDMCS includes as far as data governance functionality? In summary, EDMCS supports:

  • Two types of roles – submitters and approvers
  • Separation of duties – workflows can be configured to prevent submitters from approving their own requests
  • The “four eyes” principle: EDMCS data governance adheres to the principle that requests must be approved by at least two people
  • Default application views and maintenance views: workflows can work with both types of views
  • Subscriptions: workflows can be triggered by Subscription requests
  • Email-based notifications
  • Serial and Parallel approvals:
    • Serial approval means a sequential order of approvals is required. For example, Approver #2 can’t approve until Approver #1 approves, Approver #3 can’t approve until Approver #2 approves, and so on.
    • Parallel approval means the approvals can occur in any order and at the same time.
    • With either method, all approvals must occur before the request is committed (a short sketch after the screenshot below illustrates the difference).
  • Configuration of Reminder and Escalation intervals
  • Multiple Workflow Stages:
    • Submit – initiate the request and add/edit/delete line items in the request. Note that with the 19.02 release, you can also attach documents and insert comments at the line item level. These enhancements are helpful to attach policies, supporting details, and other documentation related to the workflow request.
    • Approve – similar to DRG, an approver can approve, push back, or reject a request. Pushing back will send the request back to the submitter for additional changes. Rejecting will close the request and end the workflow.
    • Commit (implied) – once the request is fully approved, it is committed, hierarchies are updated, and the request history can be viewed like any other request.
  • Approval Policies – this is really the brains of how workflows are configured in EDMCS, and the next blog post covers this in greater detail. But here is a screenshot of the Approval Policy screen showing the available options:

Kevin Black - EDMCS and Data Governance - Part 1 - 3-8-19 Image 1
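And here is the promised sketch contrasting the two approval methods (plain Python, not EDMCS code) – it simply shows which approval groups are eligible to act next under each method:

```python
# Illustrative sketch only -- not EDMCS code.
def eligible_groups(approval_groups, approved, method):
    """Return which approval groups may act now.

    approval_groups: ordered list of group names
    approved: set of group names that have already approved
    method: "Serial" or "Parallel"
    """
    pending = [g for g in approval_groups if g not in approved]
    if not pending:
        return []            # fully approved -> the request commits
    if method == "Serial":
        return [pending[0]]  # only the next group in the defined order
    return pending           # Parallel: any pending group, any time

groups = ["Planning Approvers", "GL Approvers"]
print(eligible_groups(groups, set(), "Serial"))     # ['Planning Approvers']
print(eligible_groups(groups, set(), "Parallel"))   # both groups
print(eligible_groups(groups, set(groups), "Serial"))  # [] -> commit
```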

Conclusion

I hope you found this blog post helpful as an introduction to EDMCS and data governance, and that you will keep reading as the rest of the series is posted. Please contact me with any questions and comments!

And don’t forget to follow me on Twitter (@kblackEPM) and check out/subscribe to my blog (along with the blogs authored by my very talented colleagues at Alithya).

https://ranzal.blog/author/kblackranzal/

https://ranzal.blog/

Interested in better understanding EDMCS, the RESTful API, and Cloud Data Management? Be sure to check these excellent blog posts by Tony Scalese, aka FDM Guru: https://ranzal.blog/author/ascalese/

Looking for an outstanding resource for all things master data-related and more? Look no further!  https://datarestless.com/

Key Features of EDMCS 18.10

Usually, when October rolls around each year, there are three things I am very excited about:

  1. The start of the NHL season
  2. Watching one of my favorite scary movies (Halloween ’78)
  3. Eating leftover Halloween candy for the next six months because we bought 200 pounds of it at Costco and only had 5 trick-or-treaters show up at our door

But this year, add #4 to the list: The October 18.10 release of Enterprise Data Management Cloud Service (EDMCS)! Trust me, this is a monster of a release (pun intended). This is the biggest release since 18.07 and here are some of the highlights:

  • New packaged adapter for Oracle Financials Cloud GL
  • Property Inheritance
  • Add Related Nodes Across Viewpoints
  • Download Viewpoint in Request Context
  • Request Load File Summary Statistics
  • Level Property
  • Metadata Deletion
  • Object Details Popup

Here are more details and insights on a few of these features:

Oracle Financials Cloud GL Adapter

This is a major adapter in terms of functionality. And like the packaged adapters for Planning and Budgeting Cloud Service (PBCS) and Enterprise Planning and Budgeting Cloud Service (EPBCS), you can create an EDMCS application for Oracle Financials Cloud GL using a standard wizard interface, identify a connected application or use a file interface, and reap the benefits of several built-in validations.

The registration process includes steps to identify basic settings (e.g. Active Languages, Active Trees, Multiple Active Tree Versions, and Max Depth), register multiple Segments and Trees, and add/modify/remove Financial Categories. From reviewing the Oracle EDMCS Administrator Guide, I identified at least 16 built-in validations for EDMCS applications based on the Oracle Financials Cloud GL Adapter. This is certainly a packaged adapter with a lot of “meat on its bones.”

Property Inheritance

For those who love Oracle Data Relationship Management (DRM), you likely used the inheritance features of that product extensively – whether it was the global, inheriting property definition or using functions like ParentPropValue(). Well now you have property inheritance in EDMCS.

Property inheritance is set during the Application Registration process by modifying custom properties within your dimension node types. The choices are “None” or “Positional” which work as you would expect. Inherited property values can be overwritten in a request for exception cases. The other feature I like is that inheritance supports Shared members for relationship-level properties, allowing a shared member to have a different inherited property value than its primary member.

NOTE: property inheritance is only available for EDMCS Custom applications. It is not available for EDMCS applications based on the PBCS, EPBCS, or Financials Cloud GL adapters.
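Conceptually, positional inheritance resolves a property by walking up a node’s ancestors in a given hierarchy until an explicitly set (or overridden) value is found. Here is a rough sketch of that idea – my own illustration, not EDMCS internals – which also shows why a shared member can inherit a different value than its primary member:

```python
# Conceptual sketch of positional inheritance -- not EDMCS internals.
def resolve_property(node, parent_of, explicit_values):
    """Walk up the hierarchy from node until an explicit value is found.

    parent_of: child -> parent map for ONE hierarchy (the "position"),
               so a shared member can resolve differently per hierarchy.
    explicit_values: node -> explicitly set (or overridden) value.
    """
    current = node
    while current is not None:
        if current in explicit_values:
            return explicit_values[current]
        current = parent_of.get(current)
    return None  # no value anywhere up the chain

# Hypothetical members: Plant100 sits under EastRegion in the primary
# hierarchy and under AltParent in an alternate hierarchy.
primary = {"Plant100": "EastRegion", "EastRegion": "TotalEntity", "TotalEntity": None}
alternate = {"Plant100": "AltParent", "AltParent": "TotalEntity", "TotalEntity": None}
values = {"EastRegion": "Corporate", "AltParent": "Shared Services"}

print(resolve_property("Plant100", primary, values))    # Corporate
print(resolve_property("Plant100", alternate, values))  # Shared Services
```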

Request Load File Summary Statistics

This nice little feature gives you immediate statistics and feedback as you load a Request File into EDMCS. The number of rows processed, loaded, and skipped are identified, and you can open the request file attached to the in-flight request to see details on why rows were skipped. This all occurs before you submit the request and provides helpful, proactive input as to the changes that will be processed in your request.

Key Features of EDMCS Image 1

Metadata Deletion

With 18.10, you can now delete custom properties and custom applications. This makes me happy, as I’m not ashamed to admit that as I learned EDMCS, I made plenty of mistakes in creating applications or properties that I could not un-do. Oh the shame! Instead, I had to resort to archiving the application to partially hide my guilt. But now you can delete those unwanted custom properties and applications. Hopefully, future releases will allow intelligent deletion of other “mistakes” involving data chain objects like node sets, hierarchy sets, and node types that are archived and no longer used/needed.

Object Details Popup

Another minor, but helpful feature. Hover your mouse over a viewpoint name/label, and a popup window will display the application and dimension being used in the viewpoint. Helpful to keep your bearings when you start to use maintenance views that span multiple EDMCS applications and dimensions.

Key Features of EDMCS Image 2

Conclusion

As you can see, 18.10 is a significant release and arguably the single largest release in terms of new features since EDMCS was introduced in January 2018.

The biggest feature? Definitely the new adapter for Oracle Financials Cloud GL applications. I recommend reviewing the Oracle EDMCS Administration Guide to understand the full power that comes with this adapter. It is quite interesting.

The feature I will use right away? Property inheritance. With my current project involving multiple Custom EDMCS applications, property inheritance is a welcomed feature that will significantly ease the maintenance of key properties throughout my maintenance views.

I’d love to hear any insights and feedback from you as we continue this crazy journey called EDMCS. And stay tuned for future blogs discussing new EDMCS functionality and lessons learned from current projects!

Missed our earlier blogs on EDMCS, Cloud Data Management, and REST API? Be sure to check them out:

Using Subscriptions with EDMCS

As an earlier blog mentioned, the 18.07 release of Enterprise Data Management Cloud Service (EDMCS) delivered one eagerly anticipated piece of functionality: Subscriptions! And do not fear – these subscriptions are useful and do not involve a 1-year subscription to the Fruit of the Month Club (not that there’s anything wrong with that).

This blog post dives deeper into this new functionality, describes how it works, and highlights some lessons learned from utilizing Subscriptions with a current project involving multiple EDMCS Custom applications supporting multiple Profitability and Cost Management Cloud Service (PCMCS) applications.

Why Are Subscriptions Important?

Subscriptions are a huge step towards true “mastering” of enterprise data assets within a single master data Cloud platform. With EDMCS, it is important to build deployment-specific applications configured to the dimensionality requirements of the target applications to most effectively use the packaged adapters, validations, and integration capabilities. But in many cases, you also need to share common hierarchies across applications and avoid duplicative (that’s my big word for today) maintenance. After all, why have a master data management tool if you still must perform maintenance in multiple places? That’s just silly.

The answer to this dilemma is Subscriptions. By implementing Subscriptions, requests submitted to a primary viewpoint will automatically generate parallel subscription requests to subscribing viewpoints to automatically synchronize your hierarchy changes across EDMCS applications.

Note

This comment is important: “automatically generate parallel subscription requests.” EDMCS will not update a target, or subscribing, viewpoint behind the scenes with no visibility or audit trail to what has occurred. A parallel Subscription request will be generated along with the Interactive request that will be visible in the Requests window, along with the full audit trail and details that you find in an Interactive request. Even better, the Subscription request will generate an email and attach a Request File of the changes.

Nerdy Details

Views and Viewpoints

The first thing to really think about is the View and Viewpoint design of your EDMCS applications. Subscriptions are defined at the Viewpoint level, so you need to identify the source and target viewpoint for your business situation. With my current project, I have multiple EDMCS applications supporting multiple PCMCS applications. While the dimensionality is similar across the applications, the hierarchies vary, especially with the alternate hierarchies. So, it has been important to isolate the “common” or shared structures that should be synchronized across applications into their own viewpoint so that a subscription mechanism can be created.

Node Type Converters

You will likely need to create a node type converter. If the source and target viewpoints do not share a common node type, you must create a node type converter for subscriptions to work. In my situation, I had already created node type converters since I wanted to compare common structures across EDMCS applications, so the foundation was there to readily implement subscriptions.

Permissions

To create a Subscription, the creator must have (at a minimum) View Browser permission to the source view, View Owner permission to the target view, and Data Manager permission to the target application.

The Subscription assignee (this is the user who will “submit” the subscription request) must have (at a minimum) View Browser permission to the source view and Data Manager permission to the target application.

Creating a Subscription

Once the foundation is in place in terms of viewpoints, node type converters, and permissions, the actual creation of a subscription is easy.

Inspect the target viewpoint (the viewpoint that is to receive the changes from a source viewpoint via subscription), navigate to the Subscriptions tab, and click Edit. From there you can select the source viewpoint, the request assignee, and enable Auto Submit if needed. Save the subscription and you are all set.

  • Currently, there is no capability to edit an existing subscription. You must delete and add a new subscription to effect a change.
  • Any validation errors for your subscription will appear on this dialog as well. These are documented nicely in the Oracle EDMCS administration guide.

Using Subscriptions with EDMCS Image 1

Auto-Submit and Email Notifications

Emails will be generated and sent to the Request Assignee, whether Auto-Submit is enabled or not. The email will include details such as the original request #, the subscription request #, and how many request items were processed or skipped.

Using Subscriptions with EDMCS Image 2

Note

  • Remember, the subscription request will have a Request File attached to it. View the request file attachment to see details on why specific request items were skipped.
  • The request file is not attached to the email itself, only to the request in EDMCS.

Lessons Learned

Like I mentioned earlier, the foundation is important to making subscriptions work. And it all boils down to design and ensuring the building blocks of that foundation are in place:

Design, Design, Design!

  • The importance of dimension, view, and viewpoint design cannot be overstated. For each dimension, evaluate the primary and alternate hierarchy content and identify what will be shared across dimensions or applications and what will be unique to each dimension and application.
  • Based on that analysis, carefully design your viewpoints to enable subscriptions across EDMCS applications for hierarchies that truly need “mastering.”
  • As early as possible, identify the EDMCS user population along with permission levels for applications and views. This is important to identify the appropriate “Request Assignee” for your Subscriptions. I recommend creating a security matrix identifying each user and the permissions each will have (a small example follows this list).
  • Without a clear and well thought out design, you will find yourself constantly re-doing your views and viewpoints which, in turn, will cause constant rework of your subscriptions. The “measure twice, cut once” adage certainly applies here!
  • I am a big proponent of standard, consistent naming conventions to improve the usability and end user experience. The same holds true for Subscriptions. Consider using a standard naming convention for your viewpoints so it is clear which viewpoints have a subscription. It’s not obvious – unless you Inspect the viewpoint – that a subscription exists.
    • One approach I’ve been using is to name my source and target viewpoints identically with a special tag or symbol at the end of the target viewpoint name to indicate a subscription is present. I’m sure there are other and probably better ideas, but I find the visual cue to be helpful.
    • Perhaps in the future, Oracle will display subscription details when you hover over a viewpoint name (hint hint).
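Here is the promised example of a simple security matrix, built around the minimum subscription permissions described earlier (View Browser on the source view, View Owner on the target view, Data Manager on the target application). The names and layout are just a starting point, not a prescribed format:

```
User    | Subscription Role    | Source View  | Target View | Target Application
kblack  | Subscription creator | View Browser | View Owner  | Data Manager
jsmith  | Request assignee     | View Browser | (none)      | Data Manager
```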

Node Type Converters

  • Ensure you have node type converters in place.
  • Make sure your node type converters are mapping all required properties.
    • I ran into an issue where updates to one property in my source viewpoint were not being applied to my target viewpoint via subscription requests, but all other property updates worked fine. The reason? I had recently modified my App Registration and added this property to a dimension’s node type. But my node type converter had already been created and wasn’t mapping or recognizing the new property. Once I updated my node type converter, the problem was solved.

Troubleshooting

  • The request files attached to subscription requests are a valuable troubleshooting tool. Status codes and error messages are included in these Excel files that are extremely helpful to determine why your request was not auto-submitted.
  • Inspect the Subscriptions on your viewpoints. Any validation issues will be displayed and are easily addressed. Typical Subscription validation errors include:
    • The request assignee no longer has the correct permission levels
    • The viewpoint is no longer active
    • A node type converter is missing

Conclusion

I have been looking forward to the subscription functionality in EDMCS and am pleased with it so far. Subscriptions are easy to configure, can be configured to auto-submit if desired, and generate emails to remind the requester a request has occurred and to act if the request was not submitted or request items were skipped. EDMCS Subscriptions are a big step forward to enabling true mastering of your enterprise data management assets!

Catching up with EDMCS

Last time, in the Wonderful World of Enterprise Data Management Cloud Service (EDMCS), we discussed initial impressions of this exciting new Oracle Cloud product and highlighted some early functionality enhancements.

But do you realize how much functionality has been added to EDMCS since its initial release in January 2018? The short list is impressive:

  • Enhanced node alignment/location in side-by-side viewpoint compares
  • Exposed REST API operations including dimension imports/exports and request creation/submission
  • Enhanced searching across members (name and descriptions) and data objects
  • Lifecycle management of data objects
  • Incremental imports
  • Viewpoint download from selected node

Furthermore, in the areas of REST API and metadata integrations, Tony Scalese, Vice President at Edgewater Ranzal and Oracle ACE, has written several blog posts from the perspective of hands-on, real-world experience working with one of three customers accepted into the Oracle EDMCS Early Adopter program.

In this post, I’d like to highlight another feature that was recently added to EDMCS: enhanced request load files.

Enhanced Request Load Files

In the initial release, EDMCS provided a mechanism to perform bulk updates to EDMCS hierarchies – the Excel request load file. While the feature immediately had some advantages over its distant cousin in Data Relationship Management (DRM) – action scripts – there were limitations. Primarily, EDMCS would only recognize the first tab or worksheet in an Excel file.

Well that has been fixed! Request load files can now contain multiple worksheets, and EDMCS will recognize all of them (provided the worksheet names match your viewpoint names of course). Additionally, EDMCS will automatically select all valid worksheets to load into EDMCS when loading a request file. This makes it very easy to download viewpoints to Excel and build a request file containing updates for multiple viewpoints to bulk upload at one time.

This also means you need to be careful! Since EDMCS auto-selects any matching worksheet name, if you are not paying attention, you could accidentally load outdated requests from a worksheet. But you can still delete any unwanted request items prior to submitting the request, if you catch them first.

Catching Up on EDMCS 1

While you could always load multiple request files in a single request since the initial release of EDMCS, this feature is a nice usability and productivity enhancement. It works great for situations such as adding a node to a primary hierarchy/viewpoint and inserting it into an alternate hierarchy/viewpoint, all from the same request.

Conclusion (and a teaser)

While EDMCS is the new kid on the block in the Oracle EPM cloud space, it’s exciting to see how it’s quickly closing the gap with new functionality being added regularly! REST API operations, enhanced request files, and the other enhancements mentioned above show how far EDMCS has come in just 6 months.

But wait, there’s more!

The 18.07 release of EDMCS looks to be a HUGE release chock full of new features, including one I am especially excited for: subscriptions!

Look for more blog posts coming soon to discuss the subscription functionality and utilizing EDMCS for a Profitability and Cost Management Cloud Service (PCMCS) implementation.

An Exploration of the EDMCS REST API

Recently my team and I had the opportunity to implement Oracle’s newest offering – Enterprise Data Management Cloud Service (EDMCS). EDMCS, for those of you who are not familiar, provides a cloud-based solution for managing master data (also referred to as metadata) across the organization. Some like to refer to EDMCS as Data Relationship Management (DRM) in the Cloud, but the truth is, EDMCS is not DRM in the Cloud.

EDMCS is a completely new vision of what master data management can and should be. The architect of this new cloud offering is the same person who founded Razza Solutions, the company that developed the product now known as DRM. That is important to know because it ensures that the best of what DRM has to offer is brought forward. But, more importantly, it ensures that the learnings and wish list of capabilities that DRM should have are in the forefront of the developers’ minds.

Ok, now let’s get back to the fun stuff. In the 18.05 patch for EDMCS, the REST API (v1) was exposed for public usage. The documentation for the REST API can be found here:

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/rest-endpoints.html

As I highlighted in the previous post Troubleshooting Cloud Data Management Metadata Load Errors, I had developed an automation routine to upload EDMCS extracts to both PBCS and FCCS using FDMEE and Cloud Data Management.  We had been eagerly awaiting the REST API for EDMCS to finalize this automation routine and provide a true end-to-end process that can be scheduled or initialized via a single action.

Let’s take a quick look back at the automation routine developed for this customer. After the metadata has been exported to a flat file from EDMCS, the automation would upload a copy to the PBCS and FCCS pods, launch Cloud Data Management data load rules which would process the EDMCS metadata extracts, run a restructure of the database after all dimensions had been loaded, and then send a status email alerting the administrator of the result.  While elegant, I considered this to be incomplete.

Automation, in my view, is a process that can be executed without user interaction. While an automation routine certainly has parameters that must be generally maintained, once those parameters are set/updated, the automation cycle should not be dependent on user input or action.  In the aforementioned solution, we were beholden to the fact that EDMCS exports had to be run interactively; however, with the introduction of the publicly exposed REST API in the 18.05 EDMCS patch, we are now able to automate the extract of metadata from EDMCS.  That means we can finally complete our fully automated, end-to-end solution for loading metadata.  Let’s review the EDMCS REST API and how we did it.

The REST API for EDMCS is structured similar to other Oracle EPM REST APIs. By this, I mean that multiple REST commands may need to be executed to achieve a functional result.  For example, when executing a Cloud Data Management data load rule via the Data Management REST API, the actual execution of the data load rule is handled by a POST call to the jobs function with the required payload (e.g. DLR name, start period, etc.).  This call is just one portion of a functional requirement.  To achieve an actual data load, a file may need to be uploaded to the cloud, the data load rule initialized, and then the status of the data load rule be retrieved.  To achieve this functional result, three unique REST API executions would need to occur.
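To illustrate that multi-call pattern, here is a hedged sketch of executing a Cloud Data Management data load rule with the Data Management REST API’s jobs resource and then polling for completion. The endpoint and payload fields follow the documented jobs function, but treat the specific URL, rule name, mode values, and status codes as assumptions to validate against your pod and the documentation:

```python
# Hedged sketch: execute a CDM data load rule via the Data Management
# REST API jobs resource, then poll for completion. Verify the URL,
# credentials, payload values, and status codes against your environment.
import time
import requests

BASE = "https://<your-pod>.oraclecloud.com/aif/rest/V1"  # assumption
AUTH = ("user@example.com", "password")                  # use a secure store

payload = {
    "jobType": "DATARULE",
    "jobName": "DLR_EDMCS_Account",  # hypothetical data load rule name
    "startPeriod": "Jan-19",
    "endPeriod": "Jan-19",
    "importMode": "REPLACE",
    "exportMode": "STORE_DATA",
}
job = requests.post(f"{BASE}/jobs", json=payload, auth=AUTH).json()
job_id = job["jobId"]  # the process ID that EPM Automate does not return

# Poll until the rule finishes (status -1 = in progress, per the DM docs).
while True:
    status = requests.get(f"{BASE}/jobs/{job_id}", auth=AUTH).json()
    if status.get("status") != -1:
        break
    time.sleep(15)
print(job_id, status.get("jobStatus"))
```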

To export metadata from EDMCS to a flat file using the REST API, the following needs to be executed:

  1. Get the dimension information for the EDMCS application from which metadata will be exported
  2. Execute an export of the dimension(s)
  3. Determine the status of export
  4. Download the export to a flat file

Let’s explore each of these in a little more detail. First, we need to get the dimension IDs for the application from which we will be downloading metadata.  This is accessed from the applications function.

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-applications-get.html

When executing this function, the JSON object returned includes all applications that exist in EDMCS (including those archived), so the JSON needs to be iterated to find the record that relates to the application from which metadata needs to be exported. In this case, the name of the application is unique and can be used to locate the appropriate record. Next, we need to query the JSON object to get the actual dimension ID (circled in red). The dimension ID is used in subsequent calls to actually export the dimension.
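A minimal sketch of this first step might look like the following. The endpoint comes from the documentation linked above, but the JSON field names (items, name, dimensions, id) and the application name are my assumptions – inspect an actual response to confirm them:

```python
# Hedged sketch: find the dimension IDs for one EDMCS application.
# The response field names here are assumptions -- inspect a real payload.
import requests

BASE = "https://<your-edmcs-pod>.oraclecloud.com/epm/rest/v1"  # assumption
AUTH = ("user@example.com", "password")

apps = requests.get(f"{BASE}/applications", auth=AUTH).json()

# The list includes archived applications, so match on the unique name.
target = next(app for app in apps.get("items", [])
              if app.get("name") == "MyEDMCSApp")  # hypothetical app name

dimension_ids = {d["name"]: d["id"] for d in target.get("dimensions", [])}
print(dimension_ids)
```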

Great, now we have the dimension ID. Next, we need to execute the REST API call to export the dimension.

Automated Metadata 1

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-dimensions-dimensionid-export-download-post.html

You will notice that when you access this POST method, the dimension ID from the previous step is required:

/epm/rest/v1/dimensions/{dimensionId}/export/download

The JSON object returned from this execution contains minimal information. It simply provides the URL to the next required REST API execution which will provide the status of the execution.

Automated Metadata 2

With this information, we can check the status of the export using the jobRuns function:

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-jobruns-jobrunid-get.html

The JSON object returned here provides us the status of the export invoked in the prior step (in yellow) as well as a URL to the actual file to download which is our last step in the process.

Automated Metadata 3

Once the export job is complete, the files can be streamed using the URL provided by the REST execution in the prior step.

https://docs.oracle.com/en/cloud/saas/enterprise-data-management-cloud/edmra/op-v1-files-temp-fileid-get.html
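Putting steps 2 through 4 together, here is a hedged sketch of the export flow. The endpoint paths match the documentation linked above, while the response field names (the status URL, the job status value, and the file URL) are assumptions to verify against a real response:

```python
# Hedged sketch of steps 2-4: export a dimension, poll the job run, and
# download the file. Response field names are assumptions to verify.
import time
import requests

BASE = "https://<your-edmcs-pod>.oraclecloud.com/epm/rest/v1"  # assumption
AUTH = ("user@example.com", "password")
dim_id = dimension_ids["Account"]  # from the lookup sketch earlier

# Step 2: kick off the export; the response points at a job run to poll.
export = requests.post(f"{BASE}/dimensions/{dim_id}/export/download",
                       auth=AUTH).json()
status_url = export["links"][0]["href"]  # assumed response shape

# Step 3: poll the job run until the export completes.
while True:
    run = requests.get(status_url, auth=AUTH).json()
    if run.get("status") != "In Progress":  # assumed status value
        break
    time.sleep(10)

# Step 4: stream the temporary file referenced by the completed job run.
file_url = run["fileUrl"]  # assumed field name
with requests.get(file_url, auth=AUTH, stream=True) as resp:
    with open("Account_export.csv", "wb") as out:
        for chunk in resp.iter_content(chunk_size=8192):
            out.write(chunk)
```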

And there you have it, a fully automated solution to download metadata to flat files from EDMCS. Those files are then provided to the existing automation routine and our end-to-end process is truly complete.

And for my next trick…let’s explore some of the different REST API tools that are available to help you in your journey with the EPM REST APIs.


Troubleshooting Cloud Data Management Metadata Load Errors

In my last post, I highlighted a solution that was recently deployed for a customer that leveraged Enterprise Data Management Cloud Service (EDMCS), Financial Data Quality Management Enterprise Edition (FDMEE), and Cloud Data Management (CDM) to create an automated metadata integration process for both Planning and Budgeting Cloud Service (PBCS) and Financial Close and Consolidation Cloud Service (FCCS). In this post, I want to take a bit of a deeper dive into the technical build and share some important learnings.

Cloud Data Management introduced the ability to load metadata from a flat file to the Oracle EPM Cloud Services in the 17.11 patch. This functionality provides customers the ability to leverage a common platform for loading both data and metadata within the Cloud.  Equally important, CDM allows metadata to be transformed using its familiar mapping functionality.

As noted, this customer deployed both PBCS and FCCS. Within the PBCS application, four plan types are active while FCCS has the default two plan types.  A design decision was made for EDMCS to create a single custom application type that would store the metadata for both cloud applications.  This decision was not reached without significant thought as well as counsel with Oracle development.  While the pros and cons of the decision are outside the scope of this post, the choice to use a custom application registration in EDMCS ensured that metadata was input a single time but still fed to both cloud applications.  As a result of the EDMCS design decision, a single metadata file (per dimension) was supplied with properties necessary to support each plan type.

CDM leverages its 23 “dimensions” to store metadata information for processing. Exactly like data, metadata is imported using an import format into the CDM relational repository.  Each field from a metadata file is aligned to a CDM dimension field.  The CDM Account dimension always represents the target application member name and the CDM Entity dimension represents the parent of the member.  All other fields can be aligned to any of the remaining 21 dimensions.  CDM Attribute dimensions can be utilized in the import and mapping process but are not available for exporting to the cloud application.  This becomes an important constraint especially in a multi-plan type deployment.  These 21 fields can be used to store any of the properties required to successfully load metadata to the target plan type.

Let’s consider this case study for a moment. The PBCS application has four plan types.  If a process were to be built to load all plan types from a single CDM data load rule, then we would not be able to have more than five plan type specific attributes or properties because we would not have enough CDM fields/dimensions to store the relevant information.  This leads to an important design approach.  Instead of a single CDM data load rule to load all plan types, a data load rule was created for each plan type.  This greatly increased the number of metadata properties and attributes that could be loaded by CDM and ensured that future growth could be accommodated without a redesign of the integration process.

It is important to understand that CDM utilizes the Planning Outline Load Utility (OLU) to actually perform the metadata load to the cloud application. The OLU loads metadata using merge (yes Planning experts, I realize that I am not discovering fire) which is important to understand especially when processing multiple metadata loads for a single application.  When loading metadata, there are certain properties that are Application level.  I like to think of these as being global.  Additionally, there are plan type specific attributes that can align (or not align) to the application level value/setting.  I like to think of these as local.

Why is this important? Well, when loading a metadata file, if certain global properties are excluded from the metadata load file, the local properties (if specified) are utilized to default the global properties. Since metadata is loaded using merge, this only becomes problematic when a new member is being added to the outline.

In this particular example, an alternate hierarchy with shared members was specified in one of the plan types. The storage property of the alternates was obviously set as Shared; however, when attempting the metadata load, the following error was encountered:

A Base Member cannot be changed to a Shared Member.

After much investigation (details to follow), I discovered that the global property should also be included in the metadata load.

While CDM utilizes the OLU to load metadata, it does not provide as much verbosity in the error information as the PBCS web interface (which also uses OLU) when loading metadata. Below is an example of the error in the CDM process log.  As a tangent, I’d love to check the logs without needing to open a Service Request.  Maybe Oracle will build an enhancement that allows that in the future (hint, hint, wink, wink to my friends at Oracle).

Baha Mar - Error Handling 1

So where do I go from here? Well, what do we know about CDM loading metadata to the cloud application?  We know that CDM uses the OLU to load a flat file generated by CDM.  Bingo!  The metadata file output by CDM is a good starting point.  That file is in the Outbox of the CDM application and can be downloaded in several different ways – CDM Import process (get creative folks), CDM process details, or EPM Automate.  Now we have the metadata file and can test to determine if the error is caused by CDM or the metadata itself.  It’s all about ruling out variables.  So, we take the metadata file and import it manually within the PBCS web interface and are able to replicate the error.  But now we have an important new data point – the line number from the metadata file that is causing the error.

Baha Mar - Error Handling 2

Now that we have actionable information, we can review each property and start isolating and eliminating different variables. We determined that this error was only occurring for new alternate hierarchy parents being added to the outline.  As a test, we added the global storage property and voila, the metadata load completed successfully.  Face palm!
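To illustrate the fix, the load file rows for the new alternate hierarchy members needed the global storage column populated alongside the plan type specific one. The column labels below are only representative of a Planning outline load file, and the member names are made up:

```
Account, Parent, Data Storage, Data Storage (Plan1)
Alt Operating Expenses, Alternate Accounts, Never Share, Never Share
Account_100, Alt Operating Expenses, Shared, Shared
```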

Maybe this would have been obvious to folks with a lot of Planning experience, but there are plenty of folks learning the intricacies of Planning and Essbase (including our friends converting from HFM to FCCS), so I wanted to share a lesson learned in my journey of metadata integration using CDM.

CDM functionality for metadata represents two of the three primary operations of ETL. In my next post, we’ll dive deeper into how the extract component of ETL was accomplished to provide a seamless end-to-end ETL solution for metadata.

New…and Cool Features in EDMCS

Previously on The Wonderful World of Enterprise Data Management Cloud Service (EDMCS), we highlighted some of the features offered in this new product (released on Jan 25, 2018), including packaged application adapters for PBCS/EPBCS and the visual cues provided in the user interface as you modify master data. With the 18.03 release, subtle but helpful features have been added, and this post shares details of those along with useful tips resulting from actual project work done with one of three clients selected for the EDMCS Early Adopter Program. This client has fully embraced Oracle EPM Cloud by utilizing EDMCS, Financial Consolidation and Close Cloud Service (FCCS), Planning and Budgeting Cloud Service (PBCS), Account Reconciliation Cloud Service (ARCS), and Customer Data Management Cloud (CDM), along with on-premise Financial Data Quality Management Enterprise Edition (FDMEE).

Incremental Dimension Import

You now have two options available for import: Full (reset dimension) or Incremental. Full operates as expected and will completely erase and replace your dimension. All history will be lost. Certainly, a helpful feature when first seeding EDMCS, but a feature to be used carefully.

Incremental allows you to incrementally update property assignments during an import. Why is that important or useful? Because sometimes another system is point-of-entry and is feeding a new extract to EDMCS. Those extracts are typically complete data extracts (not incremental), so the incremental feature allows you to update new or changed property assignments without completely erasing your dimension.

*One important clarification – even when performing an incremental import, the complete set of node relationships must be imported as all parent-child relationships will be replaced. The “incremental” piece of the import applies to updating the changed property values of those node relationships.

New Features in EDMCS 1

Metadata Object Search

The search feature for objects (views, node sets, hierarchy sets, applications, etc.) has been expanded to include name and description. Enter your search text, and any object that contains the matching text in either the name or description is returned in the search results.

Node Search

Like the metadata object search, the node search in a viewpoint has been enhanced to include name and description. You do not need to toggle between name and description like in Data Relationship Management (DRM); instead, enter your search text and any node containing that text in the name or description is returned.

Request Load Files

EDMCS provides the capability to directly load Excel files, and a great way to start creating your request file is to use the download feature in EDMCS. Download a viewpoint (either the entire viewpoint or from the selected node and its descendants) to Excel, and you now have the basis for your request file: the complete “hierarchy” in Excel including parent, child, and all properties with the correct property labels as column headers in Excel. You can then modify the Excel file by adding an Action Code column and including the relevant node relationships and properties you wish to include in your request file.

*One note: If you need to modify top nodes in a request load file, you must include the Parent Node Type column. The column can be blank for the top nodes, but the column itself must be present. You can always modify top nodes interactively in a request.
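For illustration, a request load worksheet might be laid out as below. The action codes and property columns shown are representative only – in practice, the property labels come straight from your downloaded viewpoint:

```
Action Code | Node  | Parent     | Parent Node Type | Description
Add         | A5000 | A5999      | Account          | New expense account
Insert      | A5000 | ALT_ROLLUP | Account          |
Update      | A4100 | A4999      | Account          | Updated description
```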

Summary

While features like incremental import and improved search may seem minor, it’s often the little things that end up making a big difference when you put them all together. And it’s exciting to see this type of functionality being added so soon after the initial release.

Check back regularly for new updates and insights as EDMCS continues to evolve and mature – we’ll keep sharing! We’d love to hear your questions and observations related to EDMCS and how it fits into the ecosystem of Oracle EPM Cloud, so please comment below or contact us to share your experiences.


Cloud Data Management (CDM) and Financial Data Quality Management Enterprise Edition (FDMEE): A Case Study in Working Together

Why buy Financial Data Quality Management Enterprise Edition (FDMEE) when Cloud Data Management (CDM) is free?  As outlined in my recent white paper – FDMEE vs. Cloud Data Management – there are myriad factors that can drive the decision.  This blog post highlights how one customer gained a highly flexible and automated solution for data and master data management with an on-premise deployment of FDMEE in conjunction with Cloud Data Management.

This customer adopted a pure Cloud strategy as it relates to Enterprise Performance Management (EPM) procuring subscriptions to Planning and Budgeting Cloud Service (PBCS), Financial Close & Consolidation Cloud Service (FCCS), and Account Reconciliation Cloud Service (ARCS).  A diverse business, the customer has many unique operational systems with varying formats and charts of accounts.  So far, no reason why Cloud Data Management (CDM) can’t handle this requirement, right?  This is what CDM does – uses import formats and maps to consume and transform data – right?  Sure, but with caveats.  Notice that I used the word consume and not extract.  CDM does not provide the ability to link with on-premise systems to extract data.  Additionally, flat file data extracts that lack a consistent structure often cannot be natively consumed by CDM.

In this case, data needs to be loaded each day from numerous sources to support daily operational reporting.  The systems are a blend of on-premise, hosted, and Cloud applications.  The customer requirement dictated that any on-premise system should be connected directly to eliminate the need for a flat file extract to be generated daily.  Additionally, the hosted and Cloud applications are very industry specific and, in some cases, provided by very niche vendors.  The ability to modify extract formats was cost prohibitive or simply not supported.  As a result, several of these data feeds were not consumable by CDM without preprocessing/modification.

In light of the above requirements, the customer procured and deployed FDMEE on-premise.  The power of FDMEE allows a solution to be deployed that provides a direct connection to multiple on-premise systems as well as consume the flat file extracts from hosted and Cloud applications including Excel files (not in the required FDMEE/CDM format) and XML.  Because FDMEE on-premise supports scripting, we were able to greatly enrich the data integration cycle with full end-to-end automation including FTP downloading of hosted data, enhancement of the data integration cycle to detect data mapped to members not yet in PBCS or FCCS, dynamically setting substitution variables based on the processing day, running calculations in PBCS, and sending email status alerts to outline the success or failure of a data load cycle.

Although I am a huge FDMEE advocate, I recognize the value of Cloud Data Management and the benefits it provides in a case like this one.  This customer was one of just three participants in the Oracle Enterprise Data Management Cloud Service (EDMCS) program.  This means that they were able to use the software before it was publicly available – otherwise known as GA.  To participate in this program, one must recognize the absence of certain features and functions with the software.  The program allows the customer (and partner) to offer Oracle development and product management valuable input about the software and in some ways drive what features are prioritized within the product roadmap.

EDMCS currently lacks native connections to FCCS, but this will change over time.  So how does CDM help with loading metadata to FCCS?  In a recent update to CDM, Oracle included the ability to import a flat file into CDM and load metadata to a registered target application such as PBCS or FCCS.  John Goodwin gives a detailed overview of the technical setup.

FDMEE and CDM have come together in this case to provide a fully automated data integration process and an automated master data integration process.  Within EDMCS, a Custom application type was created.  The required properties for FCCS were built and attached to the multiple dimensions being mastered, and flat file exports were generated for FCCS.  We knew we were going to use CDM to manage the master data load process, but we had a decision to make – do we leverage EPM Automate or FDMEE as our automation hub?

We chose FDMEE.  Why?  Simply because a lot of automation assets had already been developed in FDMEE that could readily be reused for this process including execution of EPM Automate commands, a framework for leveraging the REST API (for PBCS and FCCS), and email alerting.  Additionally, we found the capabilities of EPM Automate to be somewhat limited.

For example, when you execute a CDM data load rule from EPM Automate, the process ID associated with the execution is not returned.  Why is that important?  Because in the event of a failure, I’d want to download the process log and attach it to the email so the user has information to address the issue.  Could I use the ListFiles command of EPM Automate to get the process log? Possibly, but it doesn’t account for potential concurrency, and I am not doing my job as a consultant if I build a process that can’t handle concurrent operations.  For reasons such as these, we leveraged EPM Automate when possible and the REST API as needed, and we wrapped it all together with an FDMEE process that could be executed on a scheduled basis or on demand simply by using the Script Execution functionality.

Let’s review the end-to-end solution.  In EDMCS, metadata is maintained for PBCS and FCCS.  The metadata is extracted to a flat file (.csv) after maintenance is completed and saved to a network folder.  From FDMEE, the master data integration process is initiated to upload the metadata files to FCCS and PBCS.  Cloud Data Management data load rules are initialized to process the metadata extracts.  In the event of an error, the CDM process log is downloaded.  Finally, an email is generated to alert the administrator of the data integration process status.

There you have it – EDMCS, FDMEE, and CDM working in concert to provide a seamless and elegant solution to data and master data integration for a customer that adopted a Cloud EPM strategy.  If you want to learn how you can enhance your Oracle EPM integration processes, contact us and we’ll be happy to discuss your options.