Tag Archive for: EPM

Serverless Backup & Restore operations in EPM Cloud

 

Have you ever wanted to automate migrations from your production environment to your non-prod environment, and do it without using a standalone server to run EPM Automate?

You can write Groovy scripts to run select EPM Automate commands directly in Oracle Enterprise Performance Management Cloud, without installing the EPM Automate client on a client machine. Refer to Running Commands without Installing EPM Automate and Supported Commands in Working with EPM Automate for Oracle Enterprise Performance Management Cloud for information on which EPM Automate commands can be run via Groovy, along with example scripts.
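As a rough illustration (a minimal sketch only: the command name below is a placeholder, and the EpmAutomate object and its execute method are as described in the Java API Reference for Groovy Rules, so verify the exact signatures and supported commands against your environment), a Groovy rule can obtain an EpmAutomate handle and run a supported command directly:

// Hedged sketch: run a supported EPM Automate command from within a Groovy rule.
// 'listbackups' is used only as an illustrative command name; check the
// "Supported Commands" topic for the commands available in your environment.
EpmAutomate automate = getEpmAutomate()
EpmAutomateStatus status = automate.execute('listbackups')
println "EPM Automate returned: $status"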

Oracle supports two types of Groovy rules:

  • Rules that can dynamically generate calc scripts at runtime, based on context beyond the runtime prompts, and return the calc script, which is then executed against Oracle Essbase (a minimal sketch follows this list).
  • Pure Groovy rules that can, for example, perform data validations and cancel the operation if the data entered violates company policies.
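
To make the first type concrete, here is a minimal sketch of a dynamic rule that assembles a calc script as a string and returns it for execution against Essbase; the member names used are purely illustrative:

// Minimal sketch of a dynamic Groovy rule: build and return a calc script string.
// The returned string is what gets executed against Essbase.
// Member names ("Actual", "Working", "FY24") are illustrative placeholders.
StringBuilder calcScript = new StringBuilder()
calcScript << '''
FIX("Actual", "Working", "FY24")
    CALC DIM("Account");
ENDFIX
'''
return calcScript.toString()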

Use Case

Run a business rule from production to clone a backup snapshot to a non-prod environment.


High Level Steps
  • Set the service admin user ID for the target (Test) instance within the rule. Encrypt the password using EPM Automate and copy the encrypted value into the password variable in the rule. Redeploy the rule after any such change; once deployed, users can launch the rule.
  • The user enters a date in yyyy-mm-dd format to select the snapshot for that date to restore, then clicks Launch.
  • If Check is set to true, the rule only lists the backup snapshot names available in the object storage path and exits; otherwise it continues to the next step.
  • Deletes the selected file from the migration path if it is already present.
  • Restores the selected snapshot to the migration path, with the date stamp suffixed to the snapshot file name, and proceeds to the next step.
  • Clones the target (Test) instance with the restored backup snapshot. The cloning process takes roughly 30-50 minutes to complete, depending on the data and artifacts residing in the snapshot.
  • The user can check the cloning status in production under the Clone Environment menu or via the REST API.

 

Runtime prompts

 

RTP Name | Type | Default Value | Description
RTP_Date | String | Use Last Entered | Enter the date string in yyyy-mm-dd format only.
RTP_Check | Boolean | false | If set to true, the rule only lists the backup snapshot names available in the object storage path and exits without any cloning activity.

This option is provided for the user's reference, to confirm whether the entered date falls within the valid 60-day range.

 

Code Sample

/* RTPS: {RTP_Date} {RTP_Check} */

// User, target instance, and credential variables
String getUserName = operation.getUser().getName()
String TrgURL = "" // target instance URL to clone to
String adm_id = "" // service administrator user ID for the target instance
String adm_passwd = "" // password encrypted with EPM Automate

// Timestamp format used for log messages
DateFormat tFormat = new SimpleDateFormat("yyyy-MMM-dd,EEE hh:mm:ss a zzz")
def tstamp = tFormat.format(new Date())

// Runtime prompt values and status flags
String dt = rtps.RTP_Date
def archive_date = dt
int d_limit = 0
int sts_code = rtps.RTP_Check.toString() as int
String sts = ""
boolean REST_Status = false
boolean List_Check = sts_code != 0 // true when the user only wants to list available snapshots

println "[$tstamp] : Backup file date $archive_date selected by $getUserName."

try {
    d_limit = new Date() - new Date().parse('yyyy-MM-dd', dt)
} catch (Exception e) {
    sts = "Please check the entered date $dt is in yyyy-mm-dd format."
    tstamp = tFormat.format(new Date())
    println "[$tstamp] : Error $e - $sts"
    throwVetoException("$sts")
}

if (!(d_limit > 0 && d_limit < 60)) {
    sts = "Please select date range within last 60 days from today."
    tstamp = tFormat.format(new Date())
    println "[$tstamp] : Error - $sts"
    throwVetoException("$sts")
}

//***************List all existing backup files****************
String ConnectionName = "REST-EPM-MIGRATION-PROD"
String api_version = '/v2'
String api_resource_path = '/backups/list'
String jRSrc = api_version + api_resource_path

HttpResponse<String> jsonGetBackupListResponse = operation.application.getConnection(ConnectionName)
    .get(jRSrc)
    .asString()

sts = JsonPath.parse(jsonGetBackupListResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonGetBackupListResponse.body).read('$.status')
def fileList = JsonPath.parse(jsonGetBackupListResponse.body).read('$.items') as String[]
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonGetBackupListResponse.statusText - List Files - $api_resource_path - $sts. ${fileList.size()} snapshot files available. Below files available :"

String dte = archive_date.toString().trim()
String BackupFilePath = ""
String TargetFile = "Artifact_Snapshot_" + ConnectionName.substring(ConnectionName.length()-4) + "_$archive_date"

fileList.eachWithIndex { String i, int j ->
    println i
    if (dte.equals(i.substring(0, 10)) && !REST_Status) {
        BackupFilePath = i
        sts = "$sts $BackupFilePath file found."
    }
}

if (BackupFilePath == "") {
    sts = "No file selected for date $archive_date. Please select date range within last 60 days from today."
    tstamp = tFormat.format(new Date())
    println "[$tstamp] : Status $jsonGetBackupListResponse.statusText - $sts"
    throwVetoException("$sts")
} else {
    tstamp = tFormat.format(new Date())
    println "[$tstamp] : Status $jsonGetBackupListResponse.statusText - $sts"
}

if (List_Check) {
    sts = "Exiting program as just file check option selected. ${fileList.size()} backup snapshots available in object storage."
    tstamp = tFormat.format(new Date())
    println "[$tstamp] : Status OK - $sts"
    throwVetoException("$sts")
}

//***************Delete existing backup file from Prod migration path if any****************
api_version = '/11.1.2.3.600'
api_resource_path = '/applicationsnapshots/' + TargetFile
jRSrc = api_version + api_resource_path
delFile(ConnectionName, jRSrc, tFormat)

//***************Restore existing backup files****************
api_version = '/v2'
api_resource_path = '/backups/restore'
jRSrc = api_version + api_resource_path

HttpResponse<String> jsonRestoreBackupResponse = operation.application.getConnection(ConnectionName)
    .post(jRSrc)
    .header("Content-Type", "application/json")
    .body(json(["backupName":"$BackupFilePath", "parameters":["targetName":"$TargetFile"]]))
    .asString()

String op = "Restore Backup"
sts = JsonPath.parse(jsonRestoreBackupResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonRestoreBackupResponse.body).read('$.status')
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonRestoreBackupResponse.statusText - $op - $api_resource_path - $sts"

//***************Restore backup files status check****************
statusCheck(jsonRestoreBackupResponse, ConnectionName, api_version, op, tFormat)

//***************Clone snapshot to target instance****************
api_version = '/v1'
api_resource_path = '/services/clone'
jRSrc = api_version + api_resource_path

HttpResponse<String> jsonCloneBackupResponse = operation.application.getConnection(ConnectionName)
    .post(jRSrc)
    .header("Content-Type", "application/json")
    .body(json(["targetURL":"$TrgURL", "targetUserName":"$adm_id", "targetEncryptPassword":"$adm_passwd", "parameters":["snapshotName":"$TargetFile", "migrateUsers":"false", "maintenanceStartTime":"true", "dataManagement":"true", "jobConsole":"false", "applicationAudit":"false", "storedSnapshotsAndFiles":"false"]]))
    .asString()

op = "Clone Backup"
sts = JsonPath.parse(jsonCloneBackupResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonCloneBackupResponse.body).read('$.status')
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCloneBackupResponse.statusText - $op - $api_resource_path - $sts"

//***************Clone backup status check****************
//statusCheck(jsonCloneBackupResponse, ConnectionName, api_version, op, tFormat)
op = "Clone Status Check"
api_version = '/v1'
api_resource_path = '/services/clone/status'
jRSrc = api_version + api_resource_path

HttpResponse<String> jsonCloneStatusResponse = operation.application.getConnection(ConnectionName)
    .get(jRSrc)
    .asString()

sts = JsonPath.parse(jsonCloneStatusResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonCloneStatusResponse.body).read('$.status')
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCloneStatusResponse.statusText - $op - $api_resource_path - $sts"
//sts = getImportJobStatus(ConnectionName, jRSrc)
REST_Status = awaitCompletion(jsonCloneStatusResponse, "$ConnectionName", "$op", "$jRSrc")
sts = getImportJobStatus(ConnectionName, jRSrc)
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCloneStatusResponse.statusText - $op \n$sts"

//************EPM Helper Functions****************
// Delete existing file
def delFile(String ConnectionName, String jRSrc, DateFormat tFormat) {
    HttpResponse<String> jsonFileDeleteResponse = operation.application.getConnection(ConnectionName)
        .delete(jRSrc)
        .header("Content-Type", "application/json")
        .asString()

    int sts_code = JsonPath.parse(jsonFileDeleteResponse.body).read('$.status')
    String sts = JsonPath.parse(jsonFileDeleteResponse.body).read('$.details').toString() + "."
    //sts = sts + JsonPath.parse(jsonFileDeleteResponse.body).read('$.links[0].href').toString()
    def tstamp = tFormat.format(new Date())
    println "[$tstamp] : Status $jsonFileDeleteResponse.statusText - Delete Snapshot - $jRSrc - $ConnectionName $sts"
}
// v2 REST status check
def statusCheck(HttpResponse<String> jsonResponse, String ConnectionName, String api_version, String opr, DateFormat tFormat) {
    // The clone response exposes the status link at a different index than the restore response
    String StatusURL = opr == "Clone Backup" ? JsonPath.parse(jsonResponse.body).read('$.links[0].href').toString() : JsonPath.parse(jsonResponse.body).read('$.links[1].href').toString()
    String api_resource_path = StatusURL.substring(StatusURL.indexOf(api_version) + api_version.length(), StatusURL.length())
    String jRSrc = api_version + api_resource_path

    HttpResponse<String> jsonCheckResponse = operation.application.getConnection(ConnectionName)
        .get(jRSrc)
        .asString()

    int sts_code = JsonPath.parse(jsonCheckResponse.body).read('$.status')
    String sts = JsonPath.parse(jsonCheckResponse.body).read('$.details').toString() + "."
    def tstamp = tFormat.format(new Date())
    println "[$tstamp] : Status $jsonCheckResponse.statusText - $opr - $api_resource_path - $ConnectionName $sts"
    boolean REST_Status = awaitCompletion(jsonCheckResponse, "$ConnectionName", "$opr", "$jRSrc")
}
// Await till REST completes
def awaitCompletion(HttpResponse<String> jsonResponse, String connectionName, String opr, String jrSrc) {
    DateFormat tFormat = new SimpleDateFormat('yyyy-MMM-dd,EEE hh:mm:ss a zzz')
    final int IN_PROGRESS = -1
    int status = JsonPath.parse(jsonResponse.body).read('$.status')
    def tstamp = tFormat.format(new Date())
    if (!(200..299).contains(jsonResponse.status))
        throwVetoException("Error : $status occurred while executing $opr. $jsonResponse.statusText.")

    // Parse the JSON response to get the status of the operation. Keep polling the REST call until the operation completes.
    String j_Id = jrSrc.substring(jrSrc.lastIndexOf('/') + 1, jrSrc.length())
    for (long delay = 500; status == IN_PROGRESS; delay = Math.min(2000, delay * 2)) {
        sleep(delay)
        status = getJobStatus(connectionName, jrSrc, status)
    }
    String Resp_Details = JsonPath.parse(jsonResponse.body).read('$.details').toString()
    def itms = (List)JsonPath.parse(jsonResponse.body).read('$.items')
    /*if (opr == 'Clone Backup') {
        if (itms.size() > 0) {
            itms.eachWithIndex { r, i ->
                String StatusURL = r['links']['href'].toString().replace(']', '').replace('[', '')
                String api_resource_path = StatusURL.substring(StatusURL.indexOf('/v2'))
                StatusURL = getImportJobStatus(connectionName, api_resource_path)
                if (StatusURL.length() > 0) {
                    String info = r['destination'].toString() + ' - ' + StatusURL //getImportJobStatus(connectionName, jRSrc)
                    tstamp = tFormat.format(new Date())
                    println("[$tstamp] : $i - $info")
                }
            }
        }
        def info = (List)JsonPath.parse(jsonResponse.body).read('$.intermittentStatus')
        tstamp = tFormat.format(new Date())
        println("[$tstamp] : Status $jsonResponse.statusText - $opr - $info")
    }*/
    tstamp = tFormat.format(new Date())
    println("[$tstamp] : Status $jsonResponse.statusText - $opr - $jrSrc - ${status == 0 ? "successful for Job Id $j_Id" : "failed for Job Id $j_Id"}. \n$Resp_Details")
    return status
}
// Poll the REST call to get the job status
int getJobStatus(String connectionName, String jobId, int get_sts) {
    HttpResponse<String> pingResponse = operation.application.getConnection(connectionName)
        .get(jobId)
        .asString()
    get_sts = JsonPath.parse(pingResponse.body).read('$.status')
    if (get_sts == -1) {
        sleep(1000) // brief pause before re-polling so the API is not hammered
        return getJobStatus(connectionName, jobId, get_sts)
    } else {
        return get_sts
    }
}
// Poll the REST call to get the import job details
String getImportJobStatus(String connectionName, String jobId) {
    String get_sts = ''
    HttpResponse<String> pingResponse = operation.application.getConnection(connectionName)
        .get(jobId)
        .asString()
    int sts_code = JsonPath.parse(pingResponse.body).read('$.status')
    if (sts_code == -1) {
        sleep(1000)
        return getImportJobStatus(connectionName, jobId)
    } else {
        //get_sts = JsonPath.parse(pingResponse.body).read('$.items[*].msgText').toString()
        get_sts = JsonPath.parse(pingResponse.body).read('$.intermittentStatus').toString()
        return get_sts
    }
}

 


Anand Gaonkar

 


Where Can I Learn More About Groovy Rules?

 

Your Goal | Learn More
Watch videos and tutorials that teach best practices when implementing and using Groovy rules | 
Create Groovy business rules using Calculation Manager | See Designing with Calculation Manager for Oracle Enterprise Performance Management Cloud
Connect to the Java APIs used for creating Groovy rules | See Java API Reference for Groovy Rules
Edit the script for a Groovy business rule or template using Calculation Manager | See Designing with Calculation Manager for Oracle Enterprise Performance Management Cloud

See how GenAI will work in your ERP and EPM

Oracle is delivering AI as part of the Oracle Fusion Cloud Applications Suite quarterly updates, and that means exciting changes are in store for Oracle Cloud EPM and ERP.

GenAI in Narrative Reporting in Oracle EPM

This demo shows how AI can help financial managers respond to changing circumstances by updating roll-forward reports used for financial planning.

 

Why it matters: This vision demo shows where Oracle is headed with embedded generative AI in Oracle Cloud EPM. Note that AI-generated output isn’t just reviewable by end users; it is also editable. This preserves oversight by skilled financial professionals. The demo also shows how easy it is to incorporate AI-generated insights into reports: GenAI drafts the content based on simple prompts, live data, and forecasts, and it is capable of creating charts and updating summaries based on new information provided.

 

 

GenAI in Project Planning in Oracle ERP

This demo shows how AI in Oracle Cloud ERP Project Management helps build a dynamic project plan that optimizes resources to meet both financial objectives and customer needs.

 

Why it matters: This vision demo shows how project managers get a fast and efficient way to develop plans with a high degree of flexibility, all with the benefit of human oversight and control. Operations and financial managers will appreciate how AI delivers early visibility into each project’s financial viability and risk profile. And everyone involved will value the ease with which resources and schedules can be adjusted to meet project needs.

 

Oracle Fusion Cloud ERP Generative ERP Feature: Project Planning

 

Summing up

Oracle are using AI to reshape Fusion Apps, creating valuable use cases across ERP, EPM, and other functional areas.  Don’t miss the recent announcement covering new GenAI capabilities, and be ready for more to come.

Contact us for additional information on how you can use AI as part of an Oracle Fusion Cloud ERP subscription.

EPM Data Integration – Pipeline

 

Case Study

A client expressed the need for an interface that would allow non-technical professionals to run daily batches. These batches could include tasks such as pulling Actuals from the source system, updating the Chart of Accounts (CoA), refreshing the Cost Centre structure, and running business rules, among others.

While seeking a solution, we explored numerous alternatives within Data Integration. However, each involved several intricate steps and required a certain level of technical understanding of the Data Integration tool.

Solution

Exciting developments ensued when Oracle introduced a new feature known as “Pipeline.”

 


Pipeline in Data Integration

This innovative addition empowers users to seamlessly orchestrate a sequence of jobs as a unified process. Moreover, the Pipeline feature facilitates the orchestration of Oracle Enterprise Performance Management Cloud jobs across instances, all from a single centralized location.

By leveraging the power of the Pipeline, you can gain enhanced control and visibility throughout the entire data integration process, encompassing preprocessing, data loading, and post-processing tasks.

Yet, this merely scratches the surface. The Pipeline introduces a multitude of potent benefits and functionalities. We’re delving into an in-depth exploration of this novel feature to uncover its potential in revolutionizing your data integration process.

Enterprise data within each application is grouped into multiple dimensions, and each dimension has its own data chain. Registering a new application results in the creation of various objects and associated dimensions.

Note the following Pipeline considerations:

  • Only administrators can create and run a Pipeline definition.
  • Pipeline replaces the batch functionality in Data Management; existing batches can be migrated automatically to the Pipeline feature in Data Integration.
  • For file-based integrations to a remote server in the Pipeline, when a file name is specified in the pipeline job parameters, the system automatically copies the files from the local host to the remote server under the same directory.

This function applies to the following Oracle solutions:

  • Financial Consolidation and Close
  • Enterprise Profitability and Cost Management
  • Planning
  • Planning Modules
  • Tax Reporting


Proof of Concept

  The EPM batches to run sequentially are:

Stage 1 – Load Metadata
  1. Load Account Dimension
  2. Load Entity Dimension
  3. Load Custom Dimension
  4. Clear current month Actuals (to remove any nonsense numbers if any)
Stage 2 – Load Data
  1. Load Trial balance from Source
Stage 3 – Run Business Rule
  1. Run Business rule to perform Aggregate & Calculations.


The workflow for creating and running a Pipeline process is as follows:

  1. Define the Pipeline:

     1. Pipeline Name, Pipeline Code, and the maximum number of parallel jobs.
     2. A Variables page sets the out-of-the-box (global) values for the Pipeline, from which you can set parameters at runtime. Variables can be pre-defined types such as "Period", "Import Mode", etc.

 

  2. Use Stages in the Pipeline editor to cluster similar or interdependent jobs from various applications together within a single unified interface. Administrators can efficiently establish a comprehensive end-to-end automation routine, ready to be executed on demand as part of the closing process.

Pipeline stages act as containers for multiple jobs, as shown below:

Stages & Jobs example

 

New stages can be added simply by using the Plus card located at the end of the current card sequence.

 

  3. On the Run Pipeline page, complete the variable runtime prompts and then launch the Pipeline, as shown below:

 

 

Variable Prompts

 

When the Pipeline is running, you can click the status icon to download the log. Customers can also see the status of the Pipeline in Process Details. Each individual job in the Pipeline is submitted separately and creates a separate job log in Process Details.

Users can also schedule the Pipeline with the help of Job Scheduler.
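
If you prefer to trigger the Pipeline programmatically instead of (or in addition to) the Job Scheduler, the Data Integration REST API exposes a jobs resource that accepts a pipeline job type. The sketch below assumes a named connection pointing at the Data Integration REST resource; the connection name, relative path, pipeline code, and variable names are illustrative, so verify them against the EPM REST API documentation for your release:

// Hedged sketch: submit a Pipeline run via the Data Integration REST API from a Groovy rule.
// "REST-DM-PROD", "/jobs", "DAILY_LOAD" and the variable names are placeholders.
HttpResponse<String> runPipelineResponse = operation.application.getConnection("REST-DM-PROD")
    .post("/jobs")
    .header("Content-Type", "application/json")
    .body(json(["jobType"  : "pipeline",
                "jobName"  : "DAILY_LOAD",
                "variables": ["STARTPERIOD": "Jan-24",
                              "ENDPERIOD"  : "Jan-24",
                              "IMPORTMODE" : "Replace",
                              "EXPORTMODE" : "Merge"]]))
    .asString()
println "Pipeline submission status: ${runPipelineResponse.statusText}"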

Variable Prompt

Review

 


Amir Kalawant

Oracle Fusion Cloud EPM – 23.08 Update

EPM Cloud August Update

Test Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 4, 2023.


Production Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 18, 2023.


RELEASE HIGHLIGHTS


HELPFUL INFORMATION

The Oracle Help Center provides access to updated documentation. The updates will be available in the Help Center on Friday, August 4, 2023.

NOTE: Some of the links to new feature documentation included in this readiness document will not work until after the Oracle Help Center update is complete.

Updated documentation is published on the Oracle Help Center on the first Friday of each month, coinciding with the monthly updates to Test environments. Because there is a one-week lag between the publishing of the readiness documents (What’s New and New Feature Summary) and Oracle Help Center updates, some links included in the readiness documents will not work until the Oracle Help Center update is complete.

https://docs.oracle.com/en/cloud/saas/epm-cloud/index.html


FIXED ISSUES AND CONSIDERATIONS

Software issues addressed each month and considerations are posted to a knowledge article on My Oracle Support. Click here to review. You must have a My Oracle Support login to access the article.

NOTE: Fixed issues for EPM Cloud Common components (Smart View for Office, EPM Automate, REST API, Migration, Access Control, Data Management/Data Integration, Reports, Financial Reporting, and Calculation Manager) are available in a separate document on the My Oracle Support “Release Highlights” page.

The full Oracle advisory note can be found here.

Introduction to Oracle Tax Reporting Cloud

 

 

Tax reporting (from Oracle) is an incredible application to increase the efficiency of the tax function.

With the rise of digital economies, governments across the world are finding ways to tax the digital income generated in their countries. The OECD is working on the details of a digital tax so that companies and governments can work more efficiently. Meanwhile, France and the UK are ready to tax digital companies with their own digital taxes.

In this world of uncertainty, governments are looking to increase their tax revenue and are changing tax laws accordingly. Companies are expected to calculate their tax obligations correctly under the latest tax codes in each jurisdiction.

However, tax functions in many companies still use Microsoft spreadsheets to prepare tax calculations. These calculations might be prepared at an entity level in a spreadsheet and then sent to a regional or global tax function by email. Tax experts review the tax calculations at group level, and if there are issues with an entity's tax calculation or formula errors, the spreadsheet is sent back to the local team for correction or updating. There is so much to-and-fro of spreadsheets that tax and finance teams can easily lose track of versions and corrections.

Sometimes tax calculation models refer to many linked spreadsheets, which creates further complexity when the spreadsheets need to be updated for a new account or for changes in legislation or accounting standards. Formula errors can exist in the spreadsheets that are difficult to identify and correct. Another issue is that the tax models may be maintained and updated by a single team member; if that person leaves the company, there is a significant risk to the tax process and financial close. These are a few of the pain points, bottlenecks and risks of using spreadsheets to prepare tax calculations.

Tax Reporting is a web-based cloud solution with built-in functionality for:

  • Configurable tax calculation rules
  • Automatic calculation of tax expense and DTA/DTL
  • Approval process
  • Roll-forward of tax accounts
  • Loading trial balance data
  • Loading fixed asset data
  • Currency translation
  • Consolidation
  • Effective tax rate calculation
  • Reporting on local/regional/state/national tax data
  • Production of tax accounting journal entries
  • Country-by-country reporting
  • Capture of supplemental data for tax calculations and additional disclosure
  • Maintenance by tax and finance users

 

 

The tax reporting solution provides tax departments with the ability to meet global tax reporting requirements on an ongoing basis and ensure compliance with changing tax regulations.

We can help with implementing Oracle’s tax reporting solution for your organization and provide guidance on how to get the maximum value out of it.