Out-of-the-Box Features: Profitability and Cost Management Cloud Service (PCMCS) – Intelligence and Dashboarding: Profit Curves

Welcome back to this series of blog posts covering the out-of-the-box (OOTB) features of Profitability and Cost Management Cloud Service (PCMCS). There is a need within the Oracle Cloud client community to discover what can be achieved with the tools provided when subscribing to one or more Oracle Cloud Services. A lack of awareness of the features included with your subscription is an unmeasured cost and a missed opportunity to gain much-needed insight without further spend.

PCMCS applications – whether built for Fully Allocated P&L Solutions, Transfer Pricing, Shared Services Allocations, or Customer/Product Profitability – have OOTB reporting capabilities available via the Intelligence menu that offer insight into allocation models with reduced effort. Here, we’ll explore how to set up, configure, and use these features to fully leverage the functionality included in the Oracle Cloud subscription cost.

The order in which I am covering the OOTB features is directly related to the Intelligence menu options available in PCMCS.  The 6 menu options are:

1.  Analysis Views (learn how to create, customize, and use them here)

2.  Scatter Analysis (discover how to set up and configure them here)

3.  Profit Curves (this blog post focuses on Profit Curves)

4.  Traceability

5.  Queries

6.  Key Performance Indicators

The content of this blog is based on the standard Bikes (BksML30) demo application, so you can follow the step-by-step information without having to go through an app setup from scratch. You can load and deploy this application directly from your PCMCS instance with a couple of clicks via the Application menu, using the + / Create button.

 

Profit Curves – What Are They?

If you are looking for a graphical representation of the concentration of your profit by Customers, Products, Channels, or Funds, look no further than the Profit Curves section in PCMCS. Profit Curves, also referred to as Whale Curves, are used to identify which cluster of Customers, Channels, or Products generates the most profit. Profit Curves display a graphical representation of the relationship between economic profit and the quantity of output sold.
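To make the mechanics concrete, here is a minimal sketch of the math behind a whale curve – illustrative Python with made-up numbers, not PCMCS code: customers are sorted by profit in descending order and a running total is accumulated.

```python
# Minimal whale (profit) curve sketch -- illustrative only, not PCMCS code.
# Customers are sorted by profit descending; the running total typically
# peaks above 100% of total profit before unprofitable customers erode it.
profits = {"Cust A": 500, "Cust B": 300, "Cust C": 150, "Cust D": -50, "Cust E": -200}

total = sum(profits.values())
cumulative = 0.0
for customer, profit in sorted(profits.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += profit
    print(f"{customer}: cumulative profit {cumulative:>6.0f} ({cumulative / total:6.1%} of total)")
```

The peak above 100% is the classic whale shape: a subset of customers generates more than the company’s entire profit, and the unprofitable tail gives part of it back.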

The details of the profit or net income split by unit/service/customer displayed in a Profit Curve identify issues with:

  • expansion of a production line
  • breadth of services that may have a negative impact on profit
  • onerous clients consuming numerous resources without justifying the cost for the profit gained from their engagement
  • potential costing issues of “over” or “under” costing products (for example, overburdening a product or product line inappropriately);  a cost study should be performed to determine the appropriate allocation
  • pricing

Information illustrated with a Profit Curve can be enlightening and help to put the focus on specific customers, products, or channels where the greatest profit attention is needed, indicating situations where a few products, services, or clients create enough profit to maintain the rest of the company’s offering. Profit Curves are key to strategic decision making, especially when dealing with competing projects and limited resources.

During one of my recent PCMCS implementations, a Profit Curve proved valuable when the client’s staple product, advocated as being its best and most profitable, was discovered to be the least profitable after the implementation of an accurate cost allocation methodology in PCM!

The easy-to-follow Profit Curve provides the foundational insight needed to rapidly shift gears across product lines, ensuring alignment of management decisions backed up by real information.

 

Building a Profit Curve

There are several Profit Curves available in the Demo application BksML30. In order to build a Profit Curve, there must be a corresponding Analysis View that can be leveraged as the basis for data selection. See a step-by-step guide on how to build an Analysis View here.

Analysis Views can contain multiple references to Measures and/or Accounts; however, a Profit Curve built on such a View analyzes and displays only one measure at a time. Users can define names for the X and Y axes to add clarity for the consumers of the Profit Curve information.

[Profit Curves – Image 1]

Here is an example of a Profit Curve:

[Profit Curves – Image 2]

The curve displays a listing of Net Income generated by Customer.

From a Quarter-to-Date perspective (the Period selected at the top of the View), this Profit Curve indicates that all customers are profitable. That may raise questions about whether the overhead is allocated appropriately or whether an even spread is used, thus skewing the results.

Note: Data in the BksML30 model at the time this Profit Curve was generated was calculated only for January, confirming the Profit Curve display, as the profit by customer distribution was evened out at Quarter-to-Date level.

The details of each customer/product/channel/segment and how much net income each is generating can be reviewed in the Category Analysis section. From a cost management and process improvement point of view, the right side of the curve is the most important. This side generally represents customers/products/channels with a negative profit – those that cost the company money. While these customers/products/channels can’t always be eliminated, they can be watched and reviewed for pricing changes.

Using a PCMCS Profit Curve

There are options to filter data by the POV dimension, Period, or by metrics tied to Customers. For example, we can exclude from the analysis any Customers with Operating Expenses that are considered marginal. After defining the required filters, we can refresh the Profit Curve and review the newly generated pie charts.  Filters can be added to all available metrics and can be stacked up to generate any custom report.

Below is an example of the same “All Customers” Profit Curve, limited to January and with a selection of all Customers who had a Net Income smaller than 1 positive unit (USD, or the currency defined in the PCM model), thereby highlighting Customers creating losses.

[Profit Curves – Image 3]

In the Details section of the Profit Curve, there is a count of 886 customers with a Net Income smaller than 1.

[Profit Curves – Image 4]

100% of the customers analyzed based on the specified criteria are unprofitable. The “Actual Profit” in this Details section can be translated into “Actual Loss” as the total accumulated value across the 886 customers is US$ -1,148,670.
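Conceptually, the filter and the resulting count are nothing more exotic than the following pandas sketch; the data and column names are hypothetical, not the PCMCS export layout.

```python
import pandas as pd

# Hypothetical customer-level extract; column names are illustrative only.
df = pd.DataFrame({
    "Customer":  ["C001", "C002", "C003"],
    "NetIncome": [2500.0, -130.0, 0.4],
})

# Keep customers whose Net Income is below 1 currency unit -- i.e. loss-making
# or barely break-even customers.
unprofitable = df[df["NetIncome"] < 1]
print(len(unprofitable), "customers below the threshold")
print("Accumulated result:", unprofitable["NetIncome"].sum())
```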

If there are doubts regarding the data intersection for the remaining dimensions in the PCM model such as Product or Entity, we can analyze related information through the configuration icon located next to the “Add Filter” menu. These selections are predefined in the Analysis View that was used during the creation of the Profit Curve, and you will not be able to modify them unless you modify the underlying View.

[Profit Curves – Image 5]

If questions are raised during the analysis on the Profit Curve screen and a list of details by Customer is requested, we have the option to launch a report from the “Analysis Links” menu under the Category section.

[Profit Curves – Image 6]

A report in the following format will be generated to display the Customer detail records along with all the other settings defined in the Analysis View.

[Profit Curves – Image 7]

This report can be exported in .xls format (“Export to Excel” option), and it represents a base level data dump report, in column format, containing multiple generations and references to attribute dimensions.

[Profit Curves – Image 8]

Note: When launching this report, users must check that the parameters have transitioned correctly from the previous screen. The Period parameter, which is saved as Quarter-to-Date on the original Analysis View used in the Profit Curve diagram, will override any other selection made at run time. If there is a need to revert to a specific month before launching the Export to Excel, users will have to make this update in the Filter/POV area and perform a data Refresh.

We can make changes to the Analysis View to add further details (for example, Cost of Goods).

[Profit Curves – Image 9]

For the 886 customers that are not profitable, we can dive deeper into their Cost of Goods data and Operating Expenses, or analyze whether the products sold are so heavily discounted that they no longer generate a margin.

 

Pie Charts Related to PCM Profit Curves

 

We can further analyze the resulting Profit Curve data by using the predefined categories tied to the Attribute dimensions available in the PCMCS application and in the underlying Analysis View, with the results displayed in the adjacent Pie Chart.

[Profit Curves – Image 10]

The categories available for displaying the Pie Chart data for the chosen Profit Curve are the following:

[Profit Curves – Image 11]

When selecting the Region category/attribute, we learn that the Southeast area contains 26.07% of all the unprofitable customers.

[Profit Curves – Image 12]

If we change the Focus of the Category from All Customers/Number of Customers to the Top 10% most unprofitable customers by Amount, the following information is displayed:

[Profit Curves – Image 13]
[Profit Curves – Image 14]

The Pie Chart reveals that the Southeast region has the highest number of unprofitable customers, both by Number of Customers and by Total Amount/Loss.

When adding a filter based on Customer Generation 3, which distinguishes between Department Stores and Specialty Retailers, it looks like 87.64% of the Top 10% most unprofitable customers are Department Stores.

[Profit Curves – Image 15]

A look at the 4th generation of the Customer dimension, where we can analyze the split of the losses at the Customer level, indicates that one store is responsible for 65.17% of all losses within the Top 10% most unprofitable Customers.
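For readers who want to reproduce this kind of category share outside the tool, the arithmetic behind the Pie Chart toggles is a simple grouped percentage – sketched below with made-up data and hypothetical column names.

```python
import pandas as pd

# Hypothetical loss-making customers with their Region attribute.
losses = pd.DataFrame({
    "Customer":  ["C1", "C2", "C3", "C4"],
    "Region":    ["Southeast", "Southeast", "Northeast", "West"],
    "NetIncome": [-400.0, -250.0, -200.0, -150.0],
})

# Share of unprofitable customers per region, by count and by total loss --
# the same two views the Pie Chart's Focus setting toggles between.
by_count  = losses.groupby("Region")["Customer"].count() / len(losses)
by_amount = losses.groupby("Region")["NetIncome"].sum() / losses["NetIncome"].sum()
print(by_count.round(4), by_amount.round(4), sep="\n")
```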

[Profit Curves – Image 16]

The Pie Chart is the only artifact that is refreshed based on the selections of the Category Analysis menu, while the Profit Curve remains constant, based on the selections in the POV and filter criteria.

While all users of PCMCS can generate/launch Profit Curve reports and export their associated Analysis Views, creating and setting up a Profit Curve report requires the PCMCS administrator to update the requesting user’s permissions. As with all Intelligence screens within PCMCS, the Viewer role allows the use of these artifacts, not their creation or setup.

Concluding Thoughts About OOTB Features: Profit Curves

If you have been following the posts in this blog series, you’ve become aware of the dashboarding opportunities at your disposal with a PCM subscription. The listing of PCMCS OOTB features is a good starting point for comparing any other profitability and cost management tools on the market, regardless of vendor and technology employed.

Creating insightful dashboards is now at end users’ fingertips, no longer involving complex requirements-gathering processes and iterations between different display options. PCMCS users have the ability to build and customize their own dashboards. As a result, IT staff is no longer burdened with reporting requests or artifact migration between environments.

Subscribe to our mailing list for updates on the next blog post covering Traceability, Queries, and KPIs. Don’t think the PCMCS OOTB features blog series will stop at the Intelligence menu options! There is more to come on Model Validation, System Reports used for maintenance and troubleshooting, Integration with Cloud Data Management, and the Application Backup and Restore functionality. All this and more will be covered in future blog posts, so watch this space for updates.  If there is a PCMCS-related topic that you would like to see covered in more depth, email us at infosolutions@alithya.com.

Out-of-the-Box Features: Profitability and Cost Management Cloud Service (PCMCS) – Intelligence and Dashboarding: Analysis Views and Scatter Analysis

PCMCS Out-of-the-Box (OOTB) Features:  2. Intelligence and Dashboarding – Analysis Views and Scatter Analysis

Imagine two teams of consultants with similar amounts of experience and prestige, both guaranteeing an application implementation of the highest quality: one at a higher cost but in a shorter timeframe; the other at a lower cost but in a longer timeframe. All other considerations being equal, should I save money, or should I save time?

A few days ago, I released my first blog post on PCMCS, covering Rule Balancing reports usage and customization. This post builds on that first post to cover intelligence capabilities, some of which are only available in the Cloud version of the PCM software.

There are 6 menu options when accessing the Intelligence menu within PCMCS.

1.  Analysis Views

2.  Scatter Analysis

3.  Profit Curves

4.  Traceability

5.  Queries

6.  Key Performance Indicators

This post covers the first two menu options to explain how to set up Analysis Views and how to use Scatter Analysis.

Analysis Views

Analysis Views are the first set of reports available to end users within the PCMCS user interface.

[Analysis Views and Scatter Analysis – Image 7]

These views represent a way to predefine and save intersections of members for future review.  The selections within Analysis Views are open to all dimensions within the PCMCS application at various levels within the hierarchies. This is the first step you need to take towards building or defining a dashboard for your PCMCS application.

If you cannot create or edit an Analysis View, reach out to your PCMCS administrator to review and adjust your security settings.

The example Analysis Views for this post are based on the “Demo Bikes” application (BksML30) that can be deployed with a few clicks in your PCMCS instance.

[Analysis Views and Scatter Analysis – Image 8]

A data slice is a combination of rows and columns along with the page selection, which, in this case, is the Period dimension.

Any dimension that is not specified in any of the 3 areas (row, column, page) will be read at top level and will be displayed in the settings menu.

[Analysis Views and Scatter Analysis – Image 9]

The Add Filter section allows you to filter the columns based on specific numerical values. In this case, the columns are represented by the Product dimension selections.

To create an Analysis View, click on the plus (+) sign on the main menu. The three tabs displayed allow you to define a name and description as well as the setup for row and column dimensions. You cannot select more than one dimension for either rows or columns.

Within the Row dimension selection, you can leverage different formulas applicable to the hierarchies within PCM such as Children of member, Member and children, Level 0 descendants, etc.
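Functionally, a selection such as “Level 0 descendants” just walks the hierarchy down to its leaves. Here is a toy sketch of that idea – not the PCMCS engine, and the member names are invented.

```python
# Toy hierarchy walker illustrating "Level 0 descendants" -- not PCMCS code.
hierarchy = {
    "Products": ["Bikes", "Accessories"],
    "Bikes": ["Road Bike", "Mountain Bike"],
    "Accessories": ["Helmet"],
}

def level0_descendants(member):
    """Return the leaf (level 0) members beneath a given member."""
    children = hierarchy.get(member, [])
    if not children:          # no children => the member itself is level 0
        return [member]
    leaves = []
    for child in children:
        leaves.extend(level0_descendants(child))
    return leaves

print(level0_descendants("Products"))  # ['Road Bike', 'Mountain Bike', 'Helmet']
```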

[Analysis Views and Scatter Analysis – Image 10]

Columns do not have options for member formulas beyond the usage of User preferences.

The row dimension also allows you to display further information such as generation or level details. For example, for the Product dimension, we can display the generation 3 and 4 information alongside the level 0 members, allowing us to expand our analysis to different product categories or types.

[Analysis Views and Scatter Analysis – Image 11]

Selecting new members within an Analysis View will not impact the original data definition. If you choose to display data for any month other than the one that was set up and saved in the Analysis View, you can do so because the Page parameter is open to end-user modifications. If, however, you want to update and store a selection change within the Analysis View, you must perform the update via the Edit menu instead of simply selecting a new parameter on the screen in view mode.

You may need to utilize the concept of period ranges when using Analysis Views in order to dynamically reference specific members of your Period dimension.

Defining a current period for the application is mandatory in order to be able to create formulas dependent on time. This action is available via the Application menu by selecting the Edit application option and navigating to the tab called Dimension settings. Here is where you can define the current Period and the Current Year for your PCMCS application.

[Analysis Views and Scatter Analysis – Image 13]

These settings will be applied when using the “Single…” or “Current” selection options within Analysis Views. A Single (-1) Level 0 selection represents, in this case, the month of May, since the current Period selection for the PCMCS application is June. A Single (-1) Level 1 selection returns Q1, since June is in Q2.
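The relative selection behaves roughly like the sketch below – a deliberate simplification (the real engine works off the Period hierarchy and the application’s dimension settings).

```python
# Rough sketch of relative period resolution -- a simplification, not PCMCS code.
MONTHS   = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
            "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
QUARTERS = ["Q1", "Q2", "Q3", "Q4"]

current_period = "Jun"  # set via Application > Edit application > Dimension settings

def single_offset_level0(offset):
    """Single(offset) at Level 0: shift the current month."""
    return MONTHS[MONTHS.index(current_period) + offset]

def single_offset_level1(offset):
    """Single(offset) at Level 1: shift the quarter containing the current month."""
    return QUARTERS[MONTHS.index(current_period) // 3 + offset]

print(single_offset_level0(-1))  # May
print(single_offset_level1(-1))  # Q1 (June sits in Q2)
```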

Scatter Analysis

Scatter Analysis graphs will compare one member’s values against another member’s values. The two members selected must be within the same dimension. Your PCMCS Demo application may not have any sample Scatter Analysis graphs. However, you can create one by leveraging the Analysis Views at your disposal.

You can launch Analysis Views from within Scatter graphs.

Note that a saved Scatter Analysis cannot be reused or referenced in dashboards. You should use this section to create graphs for ad hoc use outside of the dashboarding capability.

[Analysis Views and Scatter Analysis – Image 14]

If you need to include Scatter Analysis within your dashboards, a corresponding menu item is available within the list of items when creating dashboards.

You can select an existing Analysis view, but you must reselect your X-axis and Y-axis dimension references.

[Analysis Views and Scatter Analysis – Image 15]

Conclusion:  PCMCS Intelligence – Analysis Views and Scatter Analysis

While there are many alternative reporting solutions to use in conjunction with PCMCS applications, assuming that both time and money are of the essence in any project implementation, it is safe to conclude that using the PCMCS OOTB reporting features is both cost effective and efficient. The Intelligence screens shared in this post are included in the PCMCS subscription cost, and any end user of a PCMCS application with the right level of access can take charge of building the desired reports, saving them in a location accessible to their peers while spending no time on iterations of reporting requirements and data validations.

The PCMCS OOTB reporting features support not only troubleshooting, but also detailed analysis and reporting within one screen.  Such capabilities should not be ignored as they will surely add meaningful insight into finance teams’ day-to-day use of PCMCS.

If you need advice and guidance on how to leverage the PCMCS reporting capabilities for existing or future applications, reach out to our team of PCMCS experts at infosolutions@alithya.com.

The remaining intelligence menus will be covered in subsequent posts over the next few weeks. If you are interested in receiving notifications of such posts, subscribe to notifications.

Out-of-the-Box Features: Profitability and Cost Management Cloud Service (PCMCS) – Rule Balancing Reports

PCMCS Out-of-the-Box (OOTB) Features:  1. Rule Balancing Reports

The other day, I was thinking about the times I used to study Finance, and specifically about a course regarding Interest and how it represents the value of Time. What is the cost, or value, of one’s time? – is it high, resulting in a higher interest rate per period, or is it low, resulting in a low interest rate per period? How much time am I willing to spend working in order to get that new car? How much time do I have before that competitor will outrun me and snatch that market share from me?

This was how I started thinking about various out-of-the-box (OOTB) features. Such features are often key in deciding whether to acquire a software/service/product because the one resource that we constantly complain about not having enough of is “time.”

You are now reading the first blog post on OOTB features in PCMCS, covering one of the most-used reports for data analysis as well as for troubleshooting profitability calculation results. At the end of this blog post, you should know what Rule Balancing reports are, where to find them, how to use them, and how to further expand them with minimal time and effort invested.

What are Rule Balancing Reports?

Rule Balancing reports provide quick insight into the validity of the application results. These reports are powerful OOTB artifacts that can be further configured to cater to any custom application requirements in order to support validation of calculation results as well as contribution analysis and traceability.

The PCMCS OOTB Rule Balancing report is initially based on a Default Model View with a standard selection of upper-level members for each dimension. Starting from this Default Model View, administrators or users of PCMCS applications can perform a deep-dive analysis on more granular intersections and configure detailed reports for a ruleset or a group of rulesets they choose to investigate.

The Default Rule Balancing report is available as soon as the application has been deployed, and it can be accessed via the Main Navigator menu found under the Manage section.

[Rule Balancing – Image 1]

I will be using the default BksML30 application to demonstrate the capabilities of the Rule Balancing reports. If you have loaded your sample application and cannot see any results in the Rule Balancing reports, check that you ran your end-to-end calculations for any given POV from the Manage Calculation menu. The POV I have chosen for this demonstration is FY16, January, Actual Scenario.

As you open the Rule Balancing menu, the Default Model View is the only view available when you initially set up your application and your allocation rules. Any other Rule Validation reports that you see within the Demo application besides the Default Model View have been built and configured outside of the out-of-the-box list of features.

What are PCMCS Model Views?

A Model View represents a predefined data slice within the PCMCS application; consider the model views as a set of selections of members for each dimension that displays only the relevant data points for a required intersection.

Rule Balancing Report Example

After running the entire set of allocation rules within the Demo BksML30 application, the Rule Balancing report should look like this:

[Rule Balancing – Image 2]

The description of each rule selected is displayed along with the rule number. The rules are displayed in the order in which they were launched, following the user-defined sequencing, regardless of the actual Rule Number/Rule ID that has been assigned.

  • The “Input” column enables users to confirm that what was loaded into the application matches the expected values received from the source system.
  • The Allocation In and Allocation Out columns validate the allocations performed by the application from both a balance perspective (Allocation In should be equal and opposite to Allocation Out) and a numeric one.  The balance aspect is of particular interest when allocations are executed with custom calculation rules.  In these cases, two separate rules are typically required, one for the “credit out” and one for the “debit in.”  As such, there is a greater risk that the formulas for the outbound and inbound values will not produce amounts equal and opposite in total, thereby causing an undesired imbalance.  In these situations, the Allocation In and Allocation Out values are shown on two separate rows, and they quickly illustrate to the user the success of their calculations (see the sketch after this list).
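The check itself is trivial to express; here is an illustrative sketch with invented rule data showing the kind of imbalance the report surfaces.

```python
# Sketch of the balance check the report makes visible -- illustrative data only.
rules = [
    {"rule": "R0010", "allocation_in": 125_000.00, "allocation_out": -125_000.00},
    {"rule": "R0020", "allocation_in":  80_250.00, "allocation_out":  -80_000.00},
]

for r in rules:
    imbalance = r["allocation_in"] + r["allocation_out"]
    status = "balanced" if abs(imbalance) < 0.01 else f"IMBALANCE of {imbalance:,.2f}"
    print(f"{r['rule']}: {status}")
```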

Rule Balancing and Smart View Ad Hoc Reports

Any highlighted data point/data value in the Balancing screen will allow you to further investigate the allocation step through a Smart View ad hoc report. These hyperlinks represent pre-built/pre-defined queries that point directly to the Essbase database, allowing you to further expand the analysis of a selected data point.

[Rule Balancing – Image 3]

When you click on the highlighted number, a Smart View link will be downloaded to your workstation.

As an example, here is what the detail for Net Change looks like for the custom calculation rule R0001 – Utilities Expense Adjustment in a Linked report in Smart View.

[Rule Balancing – Image 4]

The column headers for the Rule Balancing report will list the relevant Balance dimension members. If there are members that are not populated, these will be automatically filtered out of the view. You can choose to display them by selecting View -> Columns and tagging the members you would like to display on your report – whether they have data or not.

[Rule Balancing – Image 5]

For further information on what each of these Balance dimension members represent, check out my blog post on Demystifying the Balance dimension in PCMCS.

You can view and edit the model view definition in the collapsed area between the POV and the Balancing report.

[Rule Balancing – Image 6]

The Input data on this customized Model View pertains only to Operating Expenses rather than the entire pool of data. This is why the total USD value may differ from the data displayed on the Default Model View report.

You can perform ad hoc edits to the Model View as you are using it, but none of the newly made selections will be stored. If you want to apply permanent changes to a specific Model View selection, you will have to edit the Model View in the corresponding menu.

[Rule Balancing – Image 7]

Your Model Views can be defined in the same order of operations as your allocations, or you can choose to create Model Views that are more detailed and dive deeper into a custom grouping of rules, regardless of the ruleset to which they might belong. The only dimensions displayed in your Model View selection are the Business dimensions; the POV, Balance, Rule, and Attribute dimensions are not represented and therefore are not open for selection. The data points you define in the Model View will apply to all relevant rule IDs that generated the new cells.

Enhancing and Customizing Your Rule Balancing Reports

In the Demo BksML30 application, there are several standard Rule Balancing reports, some split by Ruleset and others named “Trace.” The Trace Model Views are built to support pinpoint troubleshooting of allocation areas that are either complex or open to high variation during each run.

[Rule Balancing – Image 8]

If you want to use the Rule Balancing report values outside of the ad hoc capacity, you can export the report to XLS, but remember that such an export will not be a Smart View report – it is simply a listing of the information presented on the Rule Balancing screen, as some members displayed here do not have a direct equivalent in the application (Running Remainder, Running Balance). This export option can be found in the Actions menu under Export to Excel, or by selecting the button shown in the screen capture below.

[Rule Balancing – Image 10]

A new workbook called RuleBalance is downloaded, and the entire set of data displayed on your screen will be available in XLS.
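Once downloaded, the workbook is plain tabular data, so it can feed further checks. The sketch below assumes hypothetical sheet and column names – verify them against your own export before relying on it.

```python
import pandas as pd

# Read the exported RuleBalance workbook; the file and column names here are
# assumptions -- check them against your actual export.
df = pd.read_excel("RuleBalance.xls")

# Example downstream check: flag rows where Allocation In and Allocation Out
# do not net to zero.
df["Imbalance"] = df["Allocation In"] + df["Allocation Out"]
print(df.loc[df["Imbalance"].abs() > 0.01, ["Rule Name", "Imbalance"]])
```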

[Rule Balancing – Image 9]

PCMCS Rule Balancing Drawbacks

Rule Balancing does not allow filtering based on Attributes, UDAs, or Names.

Rule Balancing hyperlinks open Smart View tabs called Linked View, and any new selection of links within the Rule Balancing report will overwrite the contents of the existing tab. If you start developing a report by using Rule Balancing, remember to always rename the tab in case you want to kick off another report for a secondary data point within the same workbook.

Common Issues When Using Rule Balancing Reports

“Rule Balancing Report Links Don’t Work”

Your workstation must have Smart View installed before using the hyperlink feature within PCMCS. The latest Smart View version is available for download through the Navigator main menu under the Installations section.  For more guidance on generic EPM product patching, read the blog post Patch Today! Don’t Delay!

When selecting a hyperlink in the Rule Balancing report, you should be able to see that a download has started. As you click on the downloaded content, a new Excel tab will open, and you will be prompted to enter your Cloud credentials in order to have access to the requested data point intersections. If you do not have Excel open at the time you are accessing the downloaded content, the prompt to enter your Cloud credentials may not appear on the screen.

“I Can’t See Any Data in the PCMCS Rule Balancing Report.”

If data is not displayed on the screen, you are looking at one of the following situations:

  1. There is no data loaded and/or calculated for the POV at the intersections you have defined in the Rule Balancing report. Check your job console to see if such tasks have been triggered and completed successfully.
  2. Your security setup is restricting you from seeing any data values. Reach out to your administrator to adjust data grants or application access.
  3. (This used to happen occasionally during on-premise implementations.) If your Business dimensions are tagged as Label Only, check that the first child contains values. You may be able to see data at base-level intersections within your application, yet the Rule Balancing report shows no values due to the Dimension Type, Member Storage, or Aggregation operators you have defined in the metadata.

“I Can’t Create a PCMCS Model View.”

This restriction is based on provisioning. Reach out to your PCMCS Administrator for assistance with your profile or settings.

Rule Balancing Wrap Up

Rule Balancing reports are easy to set up and use.  They retrieve data quickly, are accessible to all application users through the same menu, and they should be the first stop during a model run to quickly identify if there were any issues with data allocations.

Because Rule Balancing is a fast reporting tool with a predefined OOTB template, it is one of the most commonly used troubleshooting reports for PCMCS and can be leveraged for quick balance checks. It is also a mechanism for quick report building at a detailed Rule level – a faster alternative to reading the Rule definition and manually replicating the intersections in a Smart View report. Because these reports are system generated and their hyperlinks are based on application and rule setup, there is no room for manual error when building validations.

Save precious time by leveraging the PCMCS OOTB functionality. The next post in this series covers Intelligence screens – Analysis Views and Scatter Analysis.  If you have further questions on the usage of Balancing Reports within PCMCS, please reach out to our team of PCMCS experts at infosolutions@alithya.com.

Implementing Zero Based Budgeting: Setting Up Your Environment

The previous post – Implementing Zero-Based Budgeting: The Requirements – outlined two key components of a successful zero-based budgeting program: a culture change and a centralized system. We recommended creating a centralized system with Oracle Planning and Budgeting Cloud Service (PBCS)/Enterprise Planning and Budgeting Cloud Service (EPBCS) because of the many advantages it provides, such as an environment with data depth.

Even with a zero-based budgeting blueprint, many companies are still hesitant to go “all in” thinking that a zero-based budgeting program implementation requires too much time and resources. The introduction of Cloud services such as Oracle PBCS/EPBCS makes the implementation of a centralized financial system easier than ever, greatly reducing the barrier to entry.

This final post in this series shares the power of a PBCS/EPBCS environment to achieve the greatest success with a newly implemented zero-based budgeting program.

How Can PBCS/EPBCS Environments Enhance the ZBB Experience?

There are four key ways to gain the most from a PBCS or EPBCS environment, including the setup of targets and accountability metrics that offer more meaningful data and greater transparency when making budgeting decisions.

Clients are often given target-setting goals in management meetings or over the phone, but we demonstrate for them how to integrate this into their budgeting systems. On numerous occasions, Alithya has been contracted to implement target setting where leadership sets growth targets and the system flows revenue down by service, product line, etc. In turn, analysts match the underlying details.

Not surprisingly, this is a common request because target setting has been a long-time tradition during the budget process. By setting up the target-setting process in PBCS/EPBCS, an offline process moves online and is molded into the overall budgeting system process. Combining that with the zero-based budgeting mantra allows targets to be set and provides analysts with their needed baseline. Moreover, analysis can be done on departments that take the typical “reduce expenses by 10%” approach to achieve the target number instead of the more insightful zero-based budget journey. Yes, target setting in a centralized system is easier, but the benefit of a centralized system is the ability to see how teams react to the new target. Did they take the traditional approach of reducing budget percentages to fit the numbers, or did they look at their budget as a whole, analyze each line item, and question the numbers organically?

After targets are set and the budget is approved, we watch the promised cost savings come to fruition. A centralized system allows capital projects or initiatives to be tracked to help systematically measure the expenditures of cost-savings activities found during the zero-based budget discovery. This provides a clear picture of what each department is doing and holds them more accountable for project decisions. It is an achievement to complete a zero-based budget “diet,” but holding teams accountable brings them to the next level: the zero-based budget “lifestyle.”

In essence, this new budgeting environment provides better insight into data – insight that ultimately allows savings to be found more effectively. For example, if you want to see the cost of direct materials, this centralized system can be set up to capture the costs in order to analyze and keep track of the different KPIs that reduce or increase overall costs.

Another example of how this works is by segmenting employee costs such as travel. Instead of assuming a run rate of 10% of direct labor for travel costs, determine what job or task required that travel and use this KPI to negotiate travel expenses and further drive down costs. Essentially, use PBCS/EPBCS as a tool to capture KPIs (e.g. travel costs by job), determine the best use of travel dollars, and – more importantly – negotiate with vendors on key travel.
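As a toy illustration of the KPI itself (made-up numbers and a hypothetical tagging scheme), travel cost by job is one grouped sum – trivial to produce once the data is captured centrally.

```python
import pandas as pd

# Hypothetical expense detail: travel dollars tagged to the job that required them.
travel = pd.DataFrame({
    "Job":    ["Install", "Install", "Audit", "Sales Visit"],
    "Travel": [1800.0, 2200.0, 950.0, 3100.0],
})

# KPI: travel cost by job -- a negotiating lever that a flat
# "10% of direct labor" run rate would never surface.
print(travel.groupby("Job")["Travel"].sum().sort_values(ascending=False))
```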

Lastly, a budgeting environment provides clarity to help teams make better-informed decisions about future initiatives. With the ability to see all of the underlying data points in a single location, it is possible to identify past sales and marketing campaigns and expenditures that led to profitable customers. Therefore, zero-based budgeting teams that took the initiative to determine the best sales and marketing cost-to-benefit analysis from the ground up are able to dedicate more resources (e.g. dollars, people, etc.) to winning strategies. This is in contrast to the traditional budgeting approach of a “10% rate of marketing spend year-over-year” that often masks the winning and, more importantly, losing marketing initiatives. Moreover, such planning and the availability of different data points help draw key inferences that allow sales and marketing teams to be more successful.

Summary 

Utilizing a Cloud service such as Oracle PBCS/EPBCS makes it easier for companies to implement a centralized system and achieve success with a zero-based budgeting program. PBCS/EPBCS environments can and should be set up in a way that enhances the zero-based budgeting experience. This is achieved by integrating target setting goals and establishing accountability metrics that allow a deeper dive into budget data while providing greater transparency to make better informed decisions.

To learn more about zero-based budgeting best practices and to get professional help with your Oracle PBCS/EPBCS environments, feel free to contact our team of experts.

Implementing Zero-Based Budgeting: The Requirements

A Culture Change and a Centralized System

The first post in this 3-post series – Implementing Zero-Based Budgeting: Benefits, Myths, and Goals – covers the benefits of zero-based budgeting. To summarize, it enables you to achieve long-term savings that result in sustainable growth, and it holds your financial analysts accountable for the cost figures they approve and how they manage the overall budget. This allows more effective recognition of any unwanted costs and of how that money can be shifted into other growth areas within the company.

However, to reap the benefits of a zero-based budgeting program, a culture change is needed first at certain levels within the company. The goal is to eventually have the entire company complete this culture shift, but it is best to start small. Along with a change in culture, a centralized reporting system needs to be created as well to provide teams the ability to share real-time numbers with each other to achieve the goals of this new budgeting program.

Better Than a Quick Fix

What exactly is meant by a culture change? It means starting small – beginning with Finance – and fostering the change in other departments from there. To be successful with this new program, other departments will eventually have to jump on board with the new budgeting approach. These departments will need to step up in analyzing their own costs and determining how they can save more without diminishing their capabilities.

For example, while financial analysts talk to the shop floor to see where costs can be reduced, the HR department should work with Finance to determine how it can become leaner. Moreover, the IT department should take the lead on negotiating with its vendors to find any areas where money can be saved. These are just a few examples of how different departments can step up to the plate; implementing a successful zero-based budgeting program requires a team effort.

Changing the culture doesn’t happen overnight. Senior leaders should take the lead in fostering this change. To ensure that everyone is on the same page, managers need to advocate the new approach within their respective departments.

Incentives also help teams to buy into this new budgeting approach.  Although incentives for growth metrics may already exist, additional incentives can effectively encourage staff to find ways to reduce costs for the metrics they manage.

Some examples of incentive metrics are the realized ROI based on the requested capital expenditure and the total cost saving dollars resulting from a zero-based budgeting program. For the former, this can mean moving to the Cloud to save money or reducing redundant tasks by introducing centralized software. For the latter, it can be exemplified by achieving a 10% cost reduction per phone.

Best Practice to Achieve Success

A crucial component of the success of a zero-based budgeting program is an officer who governs the entire process from start to finish. This individual (or team) should have deep knowledge of the budgeting process. Naturally, s/he will not know the ins and outs of each department, which is why s/he needs to be an ambassador to department leaders. The officer will also provide oversight to ensure that past bad budgeting habits do not return to plague the new program. And lastly, s/he must be dedicated to the craft of continuous improvement, which means seeking outside counsel when needed.

As mentioned earlier in the post, a culture change needs to be accompanied by a centralized reporting system. Alithya has helped clients implement Oracle Planning and Budgeting Cloud Service (PBCS) and Enterprise Planning and Budgeting Cloud Service (EPBCS) and overcome the deficiencies of Excel-based models. These models lose sight of what the true cost numbers are because past budgets are simple anchors of history rather than detailed breakdowns of cost. Moreover, these numbers become siloed within the vast library of Excel models. With Oracle PBCS or EPBCS, budgets can be highly surgical and help leaders in the company pinpoint reductions.

A centralized system allows the capture of all changes in a single location in real-time, and it provides insight into how effectively managers seek cost savings. This can be used as a key indicator to determine if their actions are in line with this new methodology.

Furthermore, centralization not only holds managers more accountable, but it also empowers them to create innovative cost-saving solutions. Driven by incentives, staff will burn with a clear purpose to find new ways to achieve sustainable growth for the company and be rewarded for hard work.

Recapping What It Takes to Achieve ZBB Success

The goal is to create a cost savings culture that allows more capital to be invested into growing parts of the company. To be successful, follow the best practices outlined, starting with a culture change within the company and giving your teams a centralized PBCS and EPBCS system to more clearly see all data points. The hard work does not stop here, though! The next post delves into setting up a zero-based budgeting system.

Implementing Zero-Based Budgeting: Benefits, Myths, and Goals

If you are in the finance world, then you probably have heard of zero-based budgeting. Investopedia defines zero-based budgeting as “a method of budgeting in which all expenses must be justified for each new period. The process…starts from a “zero base,” and every function within an organization is analyzed for its needs and costs.”

There are many reasons that financial professionals decide to use zero-based budgeting. For one thing, it goes hand-in-hand with a centralized system where information can be shared – something at which Excel spreadsheets are terrible. Furthermore, developing a centralized system enables you to scale to your needs as your company grows. Lastly, it enables financial analysts to spend more of their work week analyzing data instead of curating a financial system and worrying if the numbers match.

At Alithya, we have found with our past clients that a successful zero-based budgeting implementation resolves numerous problems. The two main things clients hope to achieve are growth across multiple business units and sustained cost reduction. With zero-based budgeting, you can earn long-term savings that translate directly into sustainable growth.

Earning Long-Term Cost Savings

Zero-based budgeting becomes a daily exercise in cost savings for your financial teams. One method of achieving cost savings is renegotiating costs. For example, instead of taking the run rate of 3% from last year’s numbers, perhaps you can contact your vendors to bargain for a better deal or switch to a different vendor with a more competitive price. Or how about having your analysts ask the IT department why it costs $38.03 per phone? What makes up that entire $38.03? Don’t assume that there aren’t any negotiable components of a cost.

The reason zero-based budgeting is so effective at long-term savings is that it is not a one-off fix. Many teams tend to implement one-off fixes, and then find that those fixes do not provide sustainable cost savings. A common example is offshoring your call center which might get you an immediate win in the cost column. However, this strategy typically reduces customer service quality while also limiting your ability to evolve with your business as it grows.

When enacting this type of program, you will analyze the costs of your business at every level. This may seem tedious, but what you will find is a clearer understanding of where your money is going. This can mean acquiring a greater understanding of contract labor costs as well as improving purchasing and procurement procedures, just to name a few. Moreover, “when properly implemented, zero-based budgeting can reduce SG&A costs by 10 to 25 percent, often within as little as six months,” according to McKinsey & Company.

Debunking Myths Surrounding Zero-Based Budgeting

There are many myths surrounding zero-based budgeting that have sadly created an artificial barrier that CFOs and their teams do not want to cross. Many financial professionals think that it means cutting the budget down to the bare bones; rather, a zero-based budgeting program analyzes costs from the top down. Moreover, it is the CFO’s duty to outline cost-cut targets so that the team’s efforts are focused.

Another misconception is that zero-based budgeting only helps with cutting SG&A costs. Actually, it can do much more, such as breaking down the Cost of Goods Sold (COGS) and helping teams make investment choices on the capital expenditures with the greatest ROI.

Just because your business is not in decline or stagnating doesn’t mean that you can’t adopt a zero-based budgeting program. If you are already achieving growth, you can use this type of budgeting method to keep the overall business leaner so that you can provide more runway for growing business units.

Do you really start from zero? This is a common question that we are asked, and many people think because of its name that you do always start from zero. Technically, this is true, but this is the core component that drives the cost management culture change that will be introduced in the next post in this series.

However, not all things have to start from zero. At Alithya, we have been through many implementations where parts of the P&L are driver-based or zero-based. This can be achieved with a detailed, structured, and interactive system (like Oracle PBCS/EPBCS) that gives you real-time feedback.

How Do Oracle PBCS and EPBCS Help Achieve ZBB Goals?

The main feature you acquire when you implement an Oracle PBCS or EPBCS system with your zero-based budgeting program is deeper analytics. This data enables you to dig into the “why and how” of your P&L.

For example, you could pose the questions: What driver did they use? Did they simply take last year’s actuals and add 3%? Did they take a cost-per-head and budget it manually, or did they take the easy way out? All are important questions that force finance teams to be more accountable when it comes to everyday decisions.

Recapping the Benefits of ZBB

By implementing a zero-based budgeting program with a centralized system, you can hold your analysts more accountable for cost figures while making them own how the costs are managed. It allows you to recognize any unwanted costs that can be diverted into growth areas, as well as breed a culture of cost reduction and visibility. The latter requires that you start a culture change within your team. It is an essential part of having success with a zero-based budgeting program, which is why we will cover it in greater detail in the next post.

Automation in Account Reconciliation Cloud Service (ARCS): At Its Finest

In the previous post, Redesign in Account Reconciliation Cloud Service (ARCS): From the Ground Up, I showed you how to rebuild ARCS down to the Profile Segments to speed things up. This time we’re slowing everything down…

So grab a glass of wine and throw on your Marvin Gaye vinyl because we’re getting it on with automation. Oh yeahhh…

The sexiest topic of account reconciliations (didn’t think you’d ever see that sentence, did ya?) consistently revolves around automation. Yes, ARCS provides a central repository. Yes, ARCS is auditable. YES, ARCS shows a traceable workflow throughout the reconciliation cycle. All of these features are highly useful and absolutely a prerequisite to an enterprise-worthy solution, but if you want to really grab people’s attention in a design session, start talking about the things they won’t have to do. ARCS provides both out-of-the-box functionality and customizable tools that help preparers focus on high-importance reconciliations rather than spending time on low-value-add or monotonous items.

Automation occurs in two areas: outside of ARCS (e.g. data feeds) and within ARCS (e.g. auto reconciliations and rules). Setting up the former enhances the latter. Either Cloud Data Management (CDM) or Financial Data Quality Management Enterprise Edition (FDMEE) can be used to load data to ARCS, albeit in different manners, but how this is accomplished is beyond the scope of this post. This data can be sourced from a variety of general ledgers and sub-ledgers/subsystems including Financials Cloud, E-Business Suite (EBS), PeopleSoft, JD Edwards, and even *gasp* Excel (…if we have to…). By automating these data feeds directly from the source, management can be confident in the validity of the data (e.g. accuracy, no manual intervention or “massaging,” live, etc.) and, with scheduling, administrators have one fewer task (or several) to worry about. The latest application data is up-to-date by the time the office doors open. Additionally, data refreshes can occur multiple times throughout the reconciliation cycle without concern for loss of work. ARCS will only update reconciliations with differences from the last data load and will change the workflow status if data has been modified and needs to be looked at again.
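While the load mechanics are out of scope here, the orchestration pattern is worth a glance. Below is a purely illustrative nightly-load skeleton; every command string is a placeholder, so consult the EPM Automate or Data Management documentation for the real syntax in your environment.

```python
# Purely illustrative nightly-load skeleton -- every command string below is
# a PLACEHOLDER; consult the EPM Automate / Data Management documentation for
# the actual commands and arguments in your environment.
steps = [
    "epmautomate login <user> <password_file> <service_url>",
    "epmautomate uploadfile gl_balances.csv",
    "epmautomate <data-load-command-for-your-integration-tool>",
    "epmautomate logout",
]

for step in steps:
    # In a real script, execute each step (e.g. via subprocess.run) and stop
    # the chain on any failure so a bad load never reaches ARCS.
    print("Would run:", step)
```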

Within ARCS, the “bread and butter” for gaining efficiencies in the reconciliation cycle is utilizing the out-of-the-box auto reconciliation method property on Profiles. This property sets the conditions under which a reconciliation will automatically change its workflow status to “closed,” allowing preparers to focus on the remaining “open” reconciliations that require attention. Which conditions are available for selection depends on the Format type. Furthermore, this field can be easily updated after the fact: using the Actions pane, the property can be updated across a mass of Profiles based on custom filtering.
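The decision the tool makes can be pictured with a small sketch; the method and field names below only loosely mirror the out-of-the-box options and are invented for illustration, not ARCS internals.

```python
# Conceptual sketch of auto-reconciliation logic -- not ARCS internals.
# Method and field names are illustrative assumptions.
def auto_reconcile_status(profile):
    method = profile.get("auto_method")
    if method == "balance_is_zero" and profile["balance"] == 0:
        return "closed"   # nothing to reconcile this period
    if method == "no_activity" and profile["period_activity"] == 0:
        return "closed"   # no movement since last period
    return "open"         # falls to the preparer's queue

print(auto_reconcile_status({"auto_method": "balance_is_zero", "balance": 0}))        # closed
print(auto_reconcile_status({"auto_method": "no_activity", "period_activity": 500}))  # open
```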


[Screenshot 10a: The “Set Attribute…” functionality from the Actions pane is a powerful tool that can be used to make mass updates from the user interface.]

 


[Screenshot 10b: In this example, the “Set Attribute…” functionality can be used to make updates to the Auto Reconciliation Method property for all Profiles, selected Profiles, or Profiles that fit customized criteria.]

The “Set Attribute” functionality is a powerful tool for making changes across multiple Profiles within the ARCS user interface. In many instances, this is a preferable alternative to extracting the Profiles to a text file to modify offline. Screenshots 10a – 10b show how it can be used to update the Auto Reconciliation Method attribute specifically, but there are a plethora of other attributes that can be updated in this manner.

The last puzzle piece to the trinity of automation is customized rules. Similar to custom attributes, rules can be added in a variety of places within your reconciliations to further enhance and streamline the process for both end-users and application administrators. Attributes, formats, profiles, and even specific transaction types (ex. on Subsystem Adjustments, but not on Source System Adjustments) can contain separate sets of rules.


[Screenshot 11a: Rules can be added at a Format level.]

 


[Screenshot 11b: Different Rule resolutions will be available depending on where the rule is created. This screenshot, for example, shows the options for rules created at a Format level.]

 


[Screenshot 11c: Rules can be added at a specific transaction type. In this screenshot, any rules created here would only affect Subsystem Adjustments and would not affect System Adjustments.]

 


[Screenshot 11d: Different Rule resolutions will be available depending on where the rule is created. This screenshot, for example, shows the options for rules created at a specific transaction type level.]

Thus, rules can be used for anything from sweeping, application-wide changes down to differences at a transaction-by-transaction basis, as seen in Screenshots 11a – 11d. If ARCS is a suit, then rules are custom tailoring; they are made to fit your company’s specific needs.

The most common rule I see relates to auto-submission (as opposed to auto reconciliation). The out-of-the-box auto reconciliation methods previously discussed are set on Profiles and can be used to “close” a reconciliation for the period if the criteria are met. However, sometimes a reconciliation still needs reviewing, such as when it is considered higher risk or only during certain periods in the fiscal year. Customized rules can dynamically determine which reconciliations can skip the preparer and be assigned directly to the reviewer, and which are clear to be automatically “closed” for the month (e.g. without approval by a preparer or reviewer). Tailoring rules in this manner still helps preparers reduce their workload while giving management the confidence that the higher-priority reconciliations are being reviewed – the best of both worlds!
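The routing logic reads like a short decision table; here is an illustrative sketch (field names and conditions are assumptions, not the ARCS rule engine).

```python
# Sketch of the routing idea behind an auto-submission rule -- illustrative,
# not the ARCS rule engine. Field names are assumptions.
def route(rec):
    if rec["risk"] == "high":
        return "assign to reviewer"   # skip the preparer, but still reviewed
    if rec["risk"] == "low" and rec["balance_matches_source"]:
        return "auto-close"           # no preparer or reviewer touch needed
    return "assign to preparer"

print(route({"risk": "high", "balance_matches_source": True}))  # assign to reviewer
print(route({"risk": "low",  "balance_matches_source": True}))  # auto-close
```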

No Mistakes with Modularity from “Day 1” to “Day 100”

So, there you have it: the four main manifestations of ARCS’ modularity. While nothing will replace proper planning, ARCS does not permanently punish any application decisions you (or your partner) have made in the past. The tool is able to grow with your company and accommodate your needs as they arise. There’s no reason to pick “today” or “tomorrow” – have them both.

Am I right? Am I off my rocker? You tell me! Answer in the comments below: has ARCS (or ARM! We haven’t forgotten you…) been able to accommodate the changes that come with your company’s growth?

If you like what you’ve read, please consider sharing this article through social media. And let me know in the comments what topic(s) you would like to see covered in future posts.

*Screenshots taken from the patch 1806 release.

Laser Tag for Cloud Analytics

A friendly game of laser tag between out-of-shape technology consultants became a small gold mine of analytics simply by combining the power of Essbase and the built-in data visualization features of Oracle Analytics Cloud (OAC)! As a “team building activity,” a group of Edgewater Ranzal consultants recently decided to play a thrilling children’s game of laser tag one evening.  At the finale of the four-game match, we were each handed a score card with individual match results and other details such as who we hit, who hit us, where we got hit, and hit percentage based on shots taken.  Winners gained immediate bragging rights, but for the losers, it served as proof that age really isn’t just a number (my lungs, my poor collapsing lungs).  BUT…we quickly decided that it would be fun to import this data into OAC to gain further insight about what just happened.

Analyzing Results in Essbase

Using Smart View, a comprehensive tool for accessing and integrating EPM and BI content within Microsoft Office products, we sent the data from Excel straight to Essbase (included in the OAC platform), where we could then apply the power of Essbase to slice the data by dimension and add calculated metrics. The dimensions selected were:

  • Metrics (e.g. score, hit %)
  • Game (e.g. Game 1, Game 2, Total)
  • Player
  • Player Hit
  • Target (e.g. front, back, shoulder)
  • Bonus (e.g. double points, rapid fire)

With Essbase’s rollup capability, dimensions can be sliced by any one item or at a “Total” level. For example, the Player dimension’s structure looks like this:

  • Players
    • Red Team
      • Red Team Player 1
      • Red Team Player 2
    • Blue Team
      • Blue Team Player 1
      • Blue Team Player 2

This provides instant score results by player, by “Total” team, or by everybody. Combined with another dimension like Player Hit, it’s easy to examine details like the number of times an individual player hit another player or another team in total. You can drill in to see how many times Red Team Player 1 shot the Blue Team overall, or how many times Red Team Player 1 shot Blue Team Player 1 specifically. A simple Smart View retrieval along the Player dimension shows scores by player and team, but the data is a little raw. On a simple data set such as this, it’s easy to pick out details, but with OAC, there is another way!

Laser Tag 1
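As a rough illustration of the rollup and cross-dimension slicing described above, here is a minimal pandas sketch using made-up players and scores (Essbase does all of this natively in the cube; nothing below is part of the actual game data):

```python
# Rough pandas analogue of the Essbase rollups described above.
# All hit-level data and player names are invented for illustration.
import pandas as pd

hits = pd.DataFrame({
    "player":      ["Red Player 1", "Red Player 1", "Blue Player 1", "Red Player 2"],
    "player_team": ["Red",          "Red",          "Blue",          "Red"],
    "hit":         ["Blue Player 1","Red Player 2", "Red Player 1",  "Blue Player 2"],
    "hit_team":    ["Blue",         "Red",          "Red",           "Blue"],
    "points":      [100,            -50,            100,             100],
})

# "Total" level and leaf level: score by team, then by player
print(hits.groupby("player_team")["points"].sum())
print(hits.groupby(["player_team", "player"])["points"].sum())

# Player x Player Hit: who shot whom, and how many times
print(pd.crosstab(hits["player"], hits["hit"]))

# Friendly-fire flag, similar in spirit to the "Bad Hits" calculated
# metric discussed in the next section
hits["friendly_fire"] = hits["player_team"] == hits["hit_team"]
print(hits.groupby("player")["friendly_fire"].sum())
```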

Even More Insight with Oracle Analytics Cloud (OAC)

Using the data visualization features of OAC, it’s easy to build queries against the OAC Essbase cube to gain interesting insight into this friendly folly and, more importantly, answer the questions everybody had: what was the rate of friendly fire, and who shot whom? We built an initial pivot chart by simply dragging and dropping Essbase dimensions onto the canvas – game number, player, and score – and coloring by our Essbase metric “Bad Hits” (a calculated metric built in Essbase to count the times a player hit a teammate). We quickly discovered who had poor aim…

Laser Tag 2

Dan from the Blue team immediately stands out, as do Kevin and Wayne from the Red team!  This points us in the right direction, but we can easily toggle to another visualization that might offer even more insight into what went on. Using a couple of sunburst-type data visualizations, we can quickly tie together who was shooting and who was getting hit – filtered to hits within the same team, weighted by score, and color coded by team.

Laser Tag 3

It appears that Wayne and Kevin from the Red Team are pretty good at hitting teammates, but it is also now easy to conclude that Wayne really has it out for Kevin, while Kevin is an equal-opportunity shoot-you-in-the-back kind of teammate!

Reimagining the data as a scatter plot gives us a better look at the value of a player in relation to friendly fire. By dragging the “Score” Essbase metric into the size field of the chart, we can spot correlations between friendly fire and hits on the other team.  While Wayne might have had the highest number of friendly fire incidents, he also had the second highest score on the Red team.  The data shows visually that Kevin had quite a few friendly fire incidents but didn’t score as much (it also suggests that Seema was probably hiding in a corner throughout the entire game, but that’s a different blog post).

Laser Tag 4

What Can You Imagine with the Data Driving Your Business?

By combining the power of Essbase with the drag-and-drop analytic capabilities of Oracle Analytics Cloud, discovering trends and gaining insight becomes easy and intuitive. Even in a simple and fun game of laser tag, we found results and trends that aren’t immediately obvious in Excel alone.  Imagine what it can do with the data that is driving your business!

With Oracle giving credits for a 30-day trial, getting started today with OAC is easy. Contact us for help!

A Comparison of Oracle Business Intelligence, Data Visualization, and Visual Analyzer

We recently authored The Role of Oracle Data Visualizer in the Modern Enterprise, in which we referred to both Data Visualization (DV) and Visual Analyzer (VA) as Data Visualizer.  This post addresses readers’ inquiries about the differences between DV and VA as well as how both compare to Oracle Business Intelligence (OBI).  The following sections provide details of the OBI and DV/VA products along with a matrix comparing each solution’s capabilities.  Finally, some use cases for DV/VA projects versus OBI are outlined.

For the purposes of this post, OBI will be considered the parent solution for both the on premise Oracle Business Intelligence solutions (including Enterprise Edition (OBIEE), Foundation Services (BIFS), and Standard Edition (OBSE)) and Business Intelligence Cloud Service (BICS). OBI is the platform thousands of Oracle customers have become familiar with for robust visualization and dashboard solutions built on nearly any data source.  While the on premise solutions are currently the most mature products, BICS is expected to eventually become Oracle’s flagship product, at which point all features should be available in the cloud.

Likewise, DV/VA will be used to refer collectively to Visual Analyzer packaged with BICS (VA BICS), Visual Analyzer packaged with OBI 12c (VA 12c), Data Visualization Desktop (DVD), and Data Visualization Cloud Service (DVCS). VA was initially introduced as part of the BICS package but has since become available as part of OBIEE 12c (the latest on premise version).  DVD was released early in 2016 as a stand-alone product that can be downloaded and installed on a local machine.  Recently, DVCS was released as the cloud-based version of DVD.  All of these products offer data visualization capabilities similar to OBI’s but feature significant enhancements to the manner in which users interact with their data.  Compared to OBI, the interface is even more simplified and intuitive to use – an accomplishment for Oracle considering how easy OBI already is.  Reusable and business process-centric dashboards are available in DV/VA but are referred to as DV or VA Projects.  Perhaps the most powerful feature is the ability for users to mash up data from different sources (including Excel) to quickly gain insight they might otherwise have spent days or weeks manually assembling in Excel or Access.  These mashups can be used to create reusable DV/VA Projects that can be refreshed through new data loads in the source system and by uploading updated Excel spreadsheets into DV/VA.

While the six products mentioned can be grouped nicely into two categories, the matrix below outlines the differences between each product, and the sections that follow provide commentary on selected features.


Table 1:  Product Capability Matrix

Advanced Analytics provides integrated statistical capabilities based on the R programming language and includes the following four functions (a rough code sketch of all four follows Figure 4):

  • Trendline – This function provides a linear or exponential plot through noisy data to indicate a general pattern or direction for time series data. For instance, while there is a noisy fluctuation of revenue over these three years, a slowly increasing general trend can be detected by the Trendline plot:

Figure 1:  Trendline Analysis

 

  • Clusters – This function attempts to classify scattered data into related groups. Users are able to determine the number of clusters and other grouping attributes. For instance, these clusters were generated using Revenue versus Billed Quantity by Month:

Figure 2:  Cluster Analysis

 

  • Outliers – This function detects exceptions in the sample data. For instance, given the previous scatter plot, four outliers can be detected:

Figure 3:  Outlier Analysis

 

  • Regression – This function is similar to the Trendline function but models the relationship between two measures and does not require a time series. It is often used to help create or validate forecasts. Using the previous Revenue versus Billed Quantity, the following Regression series can be detected:

Figure 4:  Regression Analysis
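For readers who want a feel for what each of these four functions computes, here is a compact Python sketch against synthetic data. DV/VA delegates this work to R internally; the libraries, sample data, and parameters below are our own illustrative stand-ins, not anything shipped with the product:

```python
# Rough Python analogues of the four Advanced Analytics functions,
# run against synthetic revenue/quantity data for illustration only.
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
months = np.arange(36)                              # three years of months
revenue = 100 + 2 * months + rng.normal(0, 15, 36)  # noisy upward series
quantity = revenue / 5 + rng.normal(0, 4, 36)

# Trendline: fit a line through the noisy time series
slope, intercept = np.polyfit(months, revenue, 1)
print(f"trend: revenue grows ~{slope:.1f} per month")

# Clusters: group (quantity, revenue) points; the user picks the cluster count
points = np.column_stack([quantity, revenue])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(points)
print("cluster sizes:", np.bincount(labels))

# Outliers: flag points far from the mean (here, beyond ~2.5 standard deviations)
z = np.abs(stats.zscore(revenue))
print("outlier months:", months[z > 2.5])

# Regression: correlate two measures; no time series required
result = stats.linregress(quantity, revenue)
print(f"revenue ~ {result.slope:.2f} * quantity + {result.intercept:.2f}")
```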

 

Insights provide users the ability to embed commentary within DV/VA projects (except for VA 12c). Users take a “snapshot” of their data at a certain intersection and make an Insight comment.  These Insights can then be associated with each other to tell a story about the data and then shared with others or assembled into a presentation.  For those readers familiar with the Hyperion Planning capabilities, Insights are analogous to Cell Comments.  OBI 12c (as well as 11g) offers the ability to write comments back to a relational table; however, this capability is not as flexible or robust as Insights and requires intervention by the BI support team to implement.


Figure 5:  Insights Assembled into a Story

 

Direct connections to a Relational Database Management System (RDBMS) such as an enterprise data warehouse are now possible using some of the DV/VA products. (For the purposes of this post, inserting a semantic or logical layer between the database and user is not considered a direct connection.)  For the cloud-based versions (VA BICS and DVCS), only connections to other cloud databases are available, while DVD allows users to connect to an on premise or cloud database.  This capability will typically be created and configured either by the IT support team or by analysts familiar with the data model of the target data source as well as SQL concepts such as creating joins between relational tables.  (Direct connections using OBI are technically possible; however, they require users to manually write the SQL to extract the data for their analysis.)  Once these connections are created and the correct joins are configured between tables, users can further augment their data with data mashups.  VA 12c currently requires a Subject Area connected to an RDBMS to create projects.
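To illustrate the kind of setup work a direct connection involves, the sketch below builds a toy in-memory database and defines the join an IT team or analyst might configure behind the scenes; the tables, columns, and values are entirely hypothetical:

```python
# Hypothetical example of the join configuration behind a direct connection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER, region TEXT);
    CREATE TABLE fact_revenue (customer_id INTEGER, revenue REAL);
    INSERT INTO dim_customer VALUES (1, 'West'), (2, 'Southwest');
    INSERT INTO fact_revenue VALUES (1, 500.0), (2, 300.0), (1, 250.0);
""")

# Once the join between fact and dimension is defined, users see a single
# flattened data set they can visualize and mash up with their own data.
rows = conn.execute("""
    SELECT d.region, SUM(f.revenue) AS revenue
    FROM fact_revenue f
    JOIN dim_customer d ON d.customer_id = f.customer_id
    GROUP BY d.region
""").fetchall()
print(rows)  # e.g. [('Southwest', 300.0), ('West', 750.0)]
```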

Leveraging OLAP data sources such as Essbase is currently only available in OBI 12c (as well as 11g) and VA 12c. These data sources require that the OLAP cube be exposed as a Subject Area in the Presentation layer (in other words, no direct connection to OLAP data sources).  OBI is considered very mature and offers robust mechanisms for interacting with the cube, including the ability to use drillable hierarchical columns in an Analysis.  VA 12c currently exposes a flattened list of hierarchical columns without a drillable hierarchical column.  As with direct connections, users are able to mash up their data with the cubes to create custom data models.

While the capabilities of the DV/VA product set are impressive, the solution currently lacks some key capabilities of OBI Analysis and Dashboards. A few of the most noticeable gaps between the capabilities of DV/VA and OBI Dashboards are the inability to:

  • Create the functional equivalent of Action Links, which allow users to drill down or across from an Analysis
  • Schedule and/or deliver reports
  • Customize graphs, charts, and other data visualizations to the extent offered by OBI
  • Create Alerts which can perform conditionally-based actions such as pushing information to users
  • Use drillable hierarchical columns

At this time, OBI should continue to be used as the centerpiece for enterprise-wide analytical solutions that require complex dashboards and other advanced capabilities. DV/VA is better suited for analysts who need to unify discrete data sources in a repeatable and presentation-friendly format using DV/VA Projects.  As mentioned, DV/VA is even easier to use than OBI, which makes it ideal for users who want an analytics tool that lets them rapidly pull together ad hoc analyses.  As discussed in The Role of Oracle Data Visualizer in the Modern Enterprise, enterprises that are reaching for new game-changing analytic capabilities should give the DV/VA product set a thorough evaluation.  Oracle releases regular upgrades to the entire DV/VA product set, and we anticipate many of the noted gaps will be closed at some point in the future.

Oracle Business Intelligence – Synchronizing Hierarchical Structures to Enable Federation

More and more Oracle customers are finding value in federating their EPM cubes with existing relational data stores such as data marts and data warehouses (for brevity, data warehouse will refer to all relational data stores). This post explains the concept of federation, explores the consequences of allowing hierarchical structures to get out of synchronization, and shares options to enable this synchronization.

In OBI, federation is the integration of distinct data sources to allow end users to perform analytical tasks without having to consider where the data is coming from. There are two types of federation to consider when using EPM and data warehouse sources:  vertical and horizontal.  Vertical federation allows users to drill down a hierarchy and switch data sources when moving from an aggregate data source to a more detailed one.  Most often, this occurs in the Time dimension whereby the EPM cube stores data for year, quarter, and month, and the relational data sources have details on daily transactions.  Horizontal federation allows users to combine different measures from the distinct data sources naturally in an OBI analysis, rather than extracting the data and building a unified report in another tool.
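A toy sketch may help make vertical federation concrete: conceptually, OBI routes each query to a data source based on the level of detail requested. The level and source names below are simplified stand-ins, not actual OBI configuration:

```python
# Simplified illustration of vertical federation routing: the aggregate cube
# serves high-level queries, and drilling to daily detail switches sources.
EPM_LEVELS = {"Year", "Quarter", "Month"}                # aggregate cube
WAREHOUSE_LEVELS = {"Year", "Quarter", "Month", "Day"}   # transaction detail

def route_query(level: str) -> str:
    """Return which source a query at this level of detail would hit."""
    if level in EPM_LEVELS:
        return "EPM cube"
    if level in WAREHOUSE_LEVELS:
        return "data warehouse"
    raise ValueError(f"unknown level: {level}")

print(route_query("Month"))  # EPM cube
print(route_query("Day"))    # data warehouse -- the drill-down switch
```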

Federation makes it imperative that the common hierarchical structures are kept in sync. To demonstrate issues that can occur during vertical federation when the data sources are not synchronized, take the following hierarchies in an EPM application and a data warehouse:

Figure 1: Unsynchronized Hierarchies


Notice that Colorado falls under the Western region in the EPM application, but under the Southwestern region in the data warehouse. Also notice that the data warehouse contains an additional level (or granularity) in the form of cities for each region.  Assume that both data sources contain revenue data.  An OBI analysis such as this would route the query to the EPM cube and return these results:

Figure 2: EPM Analysis – Vertical Federation


However, if the user were to expand the state of Washington to see the results for each city, OBI would route the query to the data warehouse. When the results return, the user would be confronted with different revenue figures for the Southwest and West regions:

Figure 3: Data Warehouse – Vertical Federation


When the hierarchical structures are not aligned between the two data sources, irreconcilable differences can occur when switching between the sources. Many times, end users are not aware that they are switching between EPM and a data warehouse, and will simply experience a confusing reorganization in their analysis.

To demonstrate issues that occur in horizontal federation, assume the same hierarchies as in Figure 1 above, but the EPM application contains data on budget revenue while the data warehouse contains details on actual revenue. An analysis such as this could be created to query each source simultaneously and combine the budget and actual data along the common dimension:

Figure 4: Horizontal Federation


However, drilling into the West and Southwest regions will result in Colorado becoming an erroneously “shared” member:

Figure 5: Colorado as a “Shared” Member


In actuality, the mocked-up analysis above would more than likely result in an error, since OBI would not be able to match the hierarchical structures during query generation.

There are a number of options to enable the synchronization of hierarchical structures across EPM applications and data warehouses. Many organizations manually maintain their hierarchical structures in spreadsheets and text files, often located on an individual’s desktop.  It is possible to continue this manual maintenance; however, these dispersed files should be centralized, a governance process defined, and the EPM metadata management and data warehouse ETL processes redesigned to pick up the centralized files.  This method is still subject to errors and is inherently difficult to properly govern and audit.  For organizations that are already using Enterprise Performance Management Architect (EPMA), a scripting process can be implemented that extracts the hierarchical structures to flat files.  A follow-on ETL process to move these hierarchies into the data warehouse will also have to be implemented.
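As an illustration of what a scripted consistency check in such a process might look like, the sketch below compares parent-child extracts from the two systems and flags any member (like Colorado above) that rolls up differently; the extract format and member lists are hypothetical:

```python
# Hypothetical consistency check: flag members whose parent differs between
# the EPM application's hierarchy and the data warehouse's hierarchy.
epm = {        # child -> parent, e.g. parsed from an EPM metadata extract
    "Washington": "West", "Oregon": "West", "Colorado": "West",
    "Arizona": "Southwest", "New Mexico": "Southwest",
}
warehouse = {  # child -> parent, e.g. queried from the warehouse dimension table
    "Washington": "West", "Oregon": "West", "Colorado": "Southwest",
    "Arizona": "Southwest", "New Mexico": "Southwest",
}

for member in sorted(epm.keys() & warehouse.keys()):
    if epm[member] != warehouse[member]:
        print(f"{member}: EPM parent = {epm[member]}, "
              f"warehouse parent = {warehouse[member]}")
# -> Colorado: EPM parent = West, warehouse parent = Southwest
```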

The best practices solution is to use Hyperion Data Relationship Management (DRM) to manage these hierarchical structures. DRM boasts robust metadata management capabilities coupled with a system-agnostic approach to exporting this metadata.  DRM’s most valuable export method allows pushing directly to a relational database.  If a data warehouse is built in tandem with an EPM application, DRM can push directly to a dimensional table that can then be accessed by OBI.  If there is a data warehouse already in place, existing ETL processes may have to be modified, or a dimensional table devoted to the dimension hierarchy created.  Ranzal has a DRM accelerator package to enable the synchronization of hierarchical structures between EPM and data warehouses that is designed to work with our existing EPM application DRM implementation accelerators.  Using these accelerators, Ranzal can perform an implementation in as little as six weeks that provides metadata management for the EPM application, establishes a process for maintaining hierarchical structure synchronization between EPM and the data warehouse, and enables federation of the data sources.

While the federation of EPM and data warehouse sources has been the primary focus, it is worth noting that two EPM cubes or two data warehouses could also be federated in OBI. For many of the reasons discussed previously, data synchronization processes will have to be in place to enable this federation, and the solutions described above for maintaining metadata synchronization can likely be adapted to that purpose.

The federation of EPM and data warehouse sources allows an enterprise to create a more tightly integrated analytical solution. This tight integration allows users to traverse the organization’s data, gain insight, and answer business-essential questions at the speed of thought.  As demonstrated, mismanaged hierarchical structures can produce unexpected results that harm user confidence.  Enterprise solutions often need enterprise approaches to governance; therefore, it is imperative to understand and address shortcomings in hierarchical structure management.  Ranzal has deep knowledge of EPM, DRM, and OBIEE and of how these systems can be implemented to work tightly together to address an organization’s analytical and reporting needs.