OPA! The Future of Cloud Integration – Important Updates Are Coming

Much to the chagrin of Product Management, I often abbreviate Cloud Data Management to CDM.  Why do they not like that I do this?  Well, there is a master data management tool for Customer data that, as you can guess, also uses the same acronym.  While I understand the potential confusion, since I’m telling you up front, there should be none when I use CDM throughout this post.

I recently had the opportunity to meet with Oracle Product Management and Development for FDMEE/CDM to get a preview of what’s coming to the product and offer feedback for additional functionality that would benefit the user community.  We generally get together about once a year; however, it’s been a bit longer than that since our last meeting, so I was excited to hear what interesting things Oracle’s been working on and what we may see in the product in the future.

Now any good Oracle roadmap update would not be complete without a safe harbor reminder.  What you read here is based on functionality that does not yet exist.  The planned features described may or may not ever be available in the application – at the sole discretion of Oracle. No buying decisions should be made based on the information contained in this post.

Ok, now that we have that out of the way, let’s get into the fun stuff.  There are a number of enhancements coming and planned, but today I am going to focus on two significant ones:  performance and ground to cloud integration.

Performance Enhancements

We’re all friends here, so we can be honest with each other.  CDM (and FDMEE) isn’t an ETL tool in the truest sense of the term. It is not designed to handle the massive data volumes that more traditional ETL tools can and do.  You might think to yourself, “Thanks for the info there, Tony, but we all know that,” and you wouldn’t be wrong, but I like to set the stage a bit.

If you know the history of FDMEE, you know that it was originally designed to integrate with Hyperion Enterprise and then HFM.  Essbase and Planning became targets later.  Integrating G/L data is far different from integrating the more operational data that is often needed by targets like EPBCS and PCMCS.  While CDM (and FDMEE) can technically handle the volume of this more granular data, the performance of those integrations is sometimes less than optimal.  This dynamic has plagued users of CDM for years, and it is only exacerbated when integrations are built without a deep understanding of how to tune CDM (and FDMEE) processes to achieve the highest level of performance within the constructs of the application. As CDM has grown in popularity (owing to the growth of Oracle EPM Cloud), the performance problem has become more visible.

To address performance concerns, Oracle is planning to support 3 workflow methods:

  • Full – No change from legacy process
  • Full, No Archive – Same workflow as today, but data is deleted from the data table (tDataSeg) after a successful export.  This means the data table will contain fewer rows, which should allow new rows to be added faster (inserts during the workflow process).  The downside of this method is that drill through is not available.
  • Simple – Same workflow as today, but data is never moved from the staging/processing table (tDataSeg_T) to the data storage table (tDataSeg).  This is the most expensive (in terms of time) action in the workflow process, so eliminating it will certainly improve performance. The downside is that data can never be viewed in the Workbench and drill through is not available.

Oracle has begun testing and has seen performance improvements in the range of 50% on data sets as large as 2 million rows.  Achieving that metric required the full complement of the new Data Integrations features (i.e., Expressions) to be utilized. That said, this opens up a world of possibility for how CDM can potentially be used.

If you have integrations that are currently less than optimal in terms of performance, continue monitoring for this enhancement.  If you need assistance, feel free to reach out to us to connect with our team of data integration experts.

On-Premise Agent

Ground to cloud integration is one of the most important capabilities to consider when implementing Oracle EPM Cloud.  As Oracle EPM Cloud has evolved, so too has the complexity of the solutions deployed within it, which has steadily increased the complexity of the integrations needed to support those solutions.  While integration with on-premises systems has always been supported through EPM Automate, this requires a flat file to be generated by the system from which data will be sourced. The file is then loaded to the cloud and processed by CDM.  This is very much a push approach to data integration.
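The push pattern above can be sketched as an EPM Automate command sequence. This is a minimal illustration, not a production script: the user, URL, rule, and file names are hypothetical placeholders, and you should check the EPM Automate documentation for the exact rundatarule argument order for your release.

```python
# Sketch of the "push" pattern: the source system writes a flat file,
# then EPM Automate uploads it and triggers the CDM data load rule.
# All names here are hypothetical placeholders.
def build_push_commands(user, password, url, data_file, rule_name, period):
    """Return the EPM Automate command sequence for a file-based 'push' load."""
    return [
        ["epmautomate", "login", user, password, url],
        ["epmautomate", "uploadfile", data_file],
        # rule name, start period, end period, import mode, export mode, file
        ["epmautomate", "rundatarule", rule_name, period, period,
         "REPLACE", "STORE_DATA", data_file],
        ["epmautomate", "logout"],
    ]
```

Each command could then be executed in order with something like subprocess.run(cmd, check=True) so a failed step stops the load.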

The ability of the cloud to pull data from on-premises systems simply did not exist. For integrations with this requirement, FDMEE (or some other application) was needed. Well, as the old saying goes, the only thing constant is change.

Opa! is a common Greek emotional expression, frequently used during celebrations.  Well, it’s time to celebrate, because Oracle will soon (CY19) be introducing an on-premises agent (OPA) for CDM!

This agent will allow a workflow initiated from CDM to communicate back to the on-premises system, initialize an extract, and then upload that extract to the cloud, where it will be natively imported by CDM.  This approach is similar to how the FDMEE SAP adaptor currently works.  From the end user’s perspective, they click Import on the Data Load Workbench and, after some time, data appears in the application. What’s happening in the background is that the adaptor is initializing an extract from SAP and writing the results to a flat file, which is then imported by the application. OPA will function in an almost identical way.
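The pull pattern can be sketched conceptually. Oracle has not published the agent’s actual API, so the three callables below are hypothetical stand-ins for the flow described above: CDM raises an extract request, the agent runs the extract on-premises, and the resulting file is uploaded back to the cloud.

```python
# Conceptual sketch of the "pull" pattern OPA enables. The callables are
# hypothetical -- Oracle has not published the agent's real protocol.
import time

def agent_poll_once(poll_for_request, run_extract, upload_extract,
                    idle_seconds=30):
    """Handle one polling cycle; return the uploaded file path, or None."""
    request = poll_for_request()            # ask the cloud if CDM needs data
    if request is None:
        time.sleep(idle_seconds)            # nothing to do; wait before retry
        return None
    extract_path = run_extract(request)     # execute the query on-premises
    upload_extract(request, extract_path)   # send the flat file to the cloud
    return extract_path
```

A real agent would run this cycle in a loop as a service; the key point is that the firewall only ever sees outbound HTTPS calls from the agent, never inbound connections from the cloud.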

OPA is a lightweight Java utility installed on local systems; it requires no additional software other than Java. It will support both Microsoft Windows and Linux operating systems. Like all Oracle on-premises utilities (e.g., EPM Automate), password encryption will be supported. The only ports required to be open are 80 (HTTP) or 443 (HTTPS).  If a customer wants to run the agent on a different port without opening that port on the enterprise firewall, an externally facing web server can redirect requests to the internal port on which the agent listens.  If the agent runs on port 80 or 443 and that port is already open, no firewall action is required.

The on-premises agent will have native support for Oracle E-Business Suite (EBS) and PeopleSoft GL – meaning the queries are prebuilt by Oracle.  Additionally, OPA will support connecting to on-premises relational data sources.  Currently, Oracle, SQL Server, and MySQL drivers are bundled natively, but additional drivers can be deployed as needed, meaning systems such as Teradata can also be leveraged as data sources.

OPA will also provide the ability to execute scripts (currently planned for Java, though discussions for Groovy and Jython are in flight) before and after the on-premises extract process.  This is similar to how the BefImport and AftImport event scripts are currently used in FDMEE.  It will allow the agent to perform pre- and post-processing, such as running a stored procedure to populate a data view from which CDM will source data.
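To make the pre-processing idea concrete, here is an illustration of what a BefImport-style step could do: rebuild a staging table before CDM pulls from it. The scripting language for OPA is not final, so this is plain Python, with sqlite3 standing in for the real source database, a CREATE TABLE ... AS SELECT standing in for a stored procedure call, and hypothetical table names.

```python
# Illustration of a pre-extract ("BefImport"-style) step: repopulate the
# staging table the extract query will read. sqlite3 and the table names
# are stand-ins; a real source would be Oracle, SQL Server, etc.
import sqlite3

def bef_import(conn):
    """Rebuild the staging table and return its row count."""
    cur = conn.cursor()
    cur.execute("DROP TABLE IF EXISTS gl_extract_stage")
    cur.execute(
        "CREATE TABLE gl_extract_stage AS "
        "SELECT account, SUM(amount) AS amount "
        "FROM gl_detail GROUP BY account"
    )
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM gl_extract_stage").fetchone()[0]
```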

The pre and post events of OPA really open up a world of opportunity and lay the foundation for CDM to support scripting.  How, you might ask?  In v1.0, OPA is intended to provide a mechanism to load on-premises data to the cloud.  But in theory, CDM could make a call to OPA at the normal workflow events (of FDMEE) and, instead of waiting for a data file, simply wait for an on-premises script to return an execution code.  This construct would eliminate the security concerns that have prevented scripting from being deployed in CDM, as the scripts would execute locally instead of on the cloud.

The OPA framework is really a game changer and will greatly enhance the capability of CDM to provide Oracle EPM Cloud customers a true “all cloud” deployment.  I am thrilled and can’t wait to get my hands on OPA for beta testing.  I’ll share my updates once I get through testing over the next couple of months.  I’ll also be updating the white paper I authored back in December of 2017 once OPA is released to the general public.  Stay tuned folks and feel free to let out a little exclamation about these exciting coming enhancements…OPA!

Oracle Announces Removal of Support for Transport Layer Security Protocol 1.0 and 1.1; How Does that Affect Me?

Oracle has announced that as of May 3, 2019, the use of Transport Layer Security Protocols 1.0 and 1.1 will no longer be supported.  Communications to Cloud products will only be supported with TLS1.2.

The announcement was made in the following February What’s New communications from Oracle:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 1

The WHAT has come; now WHO is affected?

There are many ways to connect to the Cloud, so let’s break down the more popular ones and the common technology all of these tools use for their connection: HTTPS.

  1. EPMAutomate
  2. Web Browsers
  3. cURL / PowerShell
  4. Financial Data Quality Management, Enterprise Edition (FDMEE)

EPMAutomate is pretty much a done deal.  If there are issues or fixes needed, Oracle will release an update to go along with the Cloud deployment.  Keep an eye on the What’s New pages as well as the notifications shown when running EPMAutomate itself.

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 2

Recent versions of Internet Explorer and Firefox both support TLS1.2 out-of-the-box.  It might not be enabled based on IT policies, but the functionality is present and easy to check.

Internet Explorer > Tools > Internet Options > Advanced

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 3

Firefox > about:config > security.tls.version

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 4a

A value of 1 = TLS 1.0, and a value of 4 = TLS 1.3.

Now, if you have ventured out into custom scripting (EPMAutomate doesn’t count in this situation) and have fully embraced REST, then cURL and PowerShell might need some tweaks as well.  This is the real reason Oracle has started to outline and share this information with the end-user community.

As a result, these solutions will need to be updated and retested.  For this purpose, Oracle has stated that you can request early access, via Oracle Support, to a TLS1.2-only POD for testing.  I highly recommend this, as it has provided some great insight for Alithya.  We were also able to pass along our findings to Oracle early to help streamline the patching process for FDMEE; more on this later.

cURL scripts will need to be updated to include the "--tlsv1.2" option when being invoked.

For PowerShell, you will need to add the following line in your scripts:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

The thing that really got me excited: FDMEE!

The last topic that Oracle mentions is if you use FDMEE on-premise.  If you are like me, an FDMEE fanatic, then you’ll know that this caused all the triggers in my brain to start firing.  All the things that I do in FDMEE will need to be tested to make sure they comply and work.  The things I use in my daily activities are:

  1. JSON-based RESTful API calls in Jython scripts
  2. Target Application Registrations to Cloud Applications
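For custom REST scripts, the underlying fix is ensuring the client negotiates TLS 1.2. FDMEE’s embedded Jython delegates TLS to the underlying JVM (so the on-premises remedy is patching and configuring Java, as described later), but the idea of pinning a minimum protocol version can be sketched in plain Python 3:

```python
# Sketch: build an HTTPS context that refuses anything below TLS 1.2.
# (FDMEE's embedded Jython delegates TLS to the JVM, where the fix is
# patching Java; this CPython 3 snippet just shows the concept.)
import ssl

def tls12_context():
    """Return an SSL context that only negotiates TLS 1.2 or newer."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # reject TLS 1.0 / 1.1
    return ctx
```

A call such as urllib.request.urlopen(url, context=tls12_context()) would then fail fast against an endpoint that only offers TLS 1.0/1.1, which is a handy way to verify your scripts before the cutover date.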

I quickly shot off an Oracle Support ticket to get myself a TLS1.2 POD.  Oracle responded in relatively short order and stated that my POD was ready; I had it for roughly two weeks of testing.  Without any changes to my virtual lab, I attempted to connect to see what would happen.  Sure enough, I received an error:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 5

I also imported an LCM of a previous Cloud application to get around this error and see what a set of custom Jython scripts with JSON/RESTful API calls would produce; I received similar errors:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 6

…as well as the out-of-the-box Refresh Metadata & Refresh Members options:

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 7

I confirmed with my colleagues in Development that this is the expected result when TLS is not at the right level and the appropriate patches have not yet been applied and configured.  Knowing this, I also tested with the TLS option disabled in my browser and received the same result.  Now that I knew I had a good starting point, I was off to the races to figure out how to continue.

Unfortunately, the links that Oracle provided in the What’s New announcement appear to be broken and not public.  As a result, I had to create an SR to gain access to the information.  After I received the documents and did some light reading, I was able to formulate a patch strategy, apply the necessary patches, apply the registry updates, and test again.

This time I was able to run successful tests of both FDMEE scripts and Oracle adaptor connections to the Cloud.

Wayne Paffhausen - Transport Layer Security Protocol Support Removal - 2-28-19 Image 8

Great… Now what do we do?

Patching the environments was not always an easy task.  It took quite a bit of time to complete because multiple products needed updates, and most of them weren’t the standard EPM products (HFM, Planning, etc.):  WebLogic, JRockit, the JDK, OHS, and so on all needed to be updated.  Because these are the building blocks on which the EPM suite runs, they created update dependencies in the EPM products we used.

Oracle has stated that this goes into effect on May 3rd, which is right around the corner.  Alithya, an Oracle Platinum Partner, is here to help you assess your current EPM installation and build that patch plan.

Even if you don’t use the Cloud today but are thinking about moving to the Cloud at some point, it is important to make sure your environment is ready and that you have the necessary support.

For more information, contact us at infosolutions@alithya.com.