Setup & Configuration of EPM Integration Agent as a Windows service

The EPM Integration Agent can be installed as a Windows service, which enables the agent to run in its own Windows session. This is especially beneficial because the EPM agent service can be started automatically when the Windows server is restarted; the service can also be paused and restarted manually as needed. On Linux computers, the EPM Integration Agent is started as a background process.

Please note the following:

  • The agent name must be specified in the params.ini file. It cannot be passed as a parameter in the Install command.
  • During execution, the service log is written to the agent's EPM_APP_DATA_HOME\logs folder with the name <serviceName>_<agent_name>_Service_<date>.log. This log contains all the console output.
  • To display help, use the option EPMAgentService.exe -help or double-click EPMAgentService.exe in Windows Explorer.
  • Once installed, you can start and stop the agent from the Windows Services console like any other Windows service.
  • Multiple agent services, each with a different service name, agent name, and port, can be created and run simultaneously.
  • Always check the log file after starting the service.

 


To set up the Windows service:

  1. Open a Command Prompt as an Administrator, type the command below, and press Enter.
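As a sketch, the install command takes the Windows service name and the location of the agent startup parameter file (params.ini) as arguments; the agent name itself is read from params.ini, as noted above. Confirm the exact syntax for your agent version with EPMAgentService.exe -help:

  REM Both arguments below are placeholders; run from the folder containing EPMAgentService.exe.
  EPMAgentService.exe -install <windows_service_name> <path_to_agent_startup_parameters>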

 

  2. You will see the service created in services.msc, as shown below.

 

 

 

  3. The status of the agent is written to the service log at the path and file name noted above (EPM_APP_DATA_HOME\logs\<serviceName>_<agent_name>_Service_<date>.log).

 If successful, the agent should be available as a Windows service.
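To verify the installation, you can also query the service from the same elevated Command Prompt using the standard Windows sc and net utilities (<windows_service_name> is the name chosen during installation):

  REM Check that the service exists and view its current state
  sc query <windows_service_name>
  REM Optionally start it from the command line instead of the Services console
  net start <windows_service_name>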

The display name is generated from the service name. The startup type is set to Manual and the service runs under the Local System account, so you may want to update the service configuration to fit your requirements.

 

 

  4. Once the agent has started, there will be an “EPMAgentService.exe” process, which will have the agent's Java process running as a child process.

  5. It is possible to run multiple agents as Windows services on the same machine.

 

Multiple services will now be available.
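As an illustrative sketch (the service names and parameter paths below are hypothetical), each service is installed against its own params.ini, which defines a unique agent name and port:

  REM Each parameter location holds its own params.ini with a unique agent name and port
  EPMAgentService.exe -install EPMAgentService1 <path_to_agent1_startup_parameters>
  EPMAgentService.exe -install EPMAgentService2 <path_to_agent2_startup_parameters>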

 

  6. There will be a separate agent log for each Windows service.


To change the display name of the service:

  1. If you want to update the display name for the Windows service, this can be achieved easily through the command line.
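For example, using the standard Windows sc utility from an elevated Command Prompt (the display name shown is illustrative; note that the space after displayname= is required):

  sc config <windows_service_name> displayname= "EPM Integration Agent - Finance"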

 

The display name will be updated.


To remove the service:

  1. To remove an agent as a Windows service, the following syntax can be used: EPMAgentService -uninstall <windows_service_name>

 

A log will have been generated with further details.

 


 

Pratik Alde

 

 

 


Serverless Backup & Restore operations in EPM Cloud

 

Have you ever wanted to automate migrations from your production to your non-prod environment? And do it without using a standalone server to run EPM Automate?

You can write Groovy scripts to run select EPM Automate commands directly in Oracle Enterprise Performance Management Cloud, without installing the EPM Automate client on a client machine. Refer to Running Commands without Installing EPM Automate and Supported Commands in Working with EPM Automate for Oracle Enterprise Performance Management Cloud for information on which EPM Automate commands can be run via Groovy, along with example scripts.
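As a minimal sketch of the pattern (the command name and arguments are placeholders, and the getEpmAutomate() entry point and EpmAutomateStatus accessor are as described in the Java API Reference for Groovy Rules; check that reference for the exact signatures on your release):

// Minimal sketch: run a supported EPM Automate command from within a Groovy rule.
// The command name and arguments below are placeholders - see "Supported Commands".
EpmAutomate automate = getEpmAutomate()
EpmAutomateStatus status = automate.execute('<supported_command>', '<arg1>', '<arg2>')
// A non-zero status is assumed here to indicate failure.
if (status.getStatus() != 0) {
throwVetoException("EPM Automate command failed with status ${status.getStatus()}")
}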

Oracle supports two types of Groovy rules:

  • Rules that can dynamically generate calc scripts at runtime based on context other than the runtime prompts and return the calc script which is then executed against Oracle Essbase.
  • Pure Groovy rules that can, for example, perform data validations and cancel the operation if the data entered violates company policies.

Use Case

Run a business rule from production to clone a backup to a non-prod environment.


High Level Steps
  • Set the service admin user ID for the target (Test) instance within the rule. Encrypt the password explicitly using EPM Automate (see the sketch after this list) and copy it to the password variable in the rule. The rule must be redeployed after any such changes; once deployed, users can launch the rule.
  • The user enters the required date in yyyy-mm-dd format to select the date-specific snapshot to restore, and clicks Launch.
  • If Check is set to true, the rule only lists all the backup snapshot names available in the object storage path and exits; otherwise, by default, it continues to the next step.
  • The rule deletes the selected file from the migration path if it is already present.
  • The rule restores the selected snapshot to the migration path, with the date stamp suffixed to the snapshot file name, and proceeds to the next step.
  • The rule clones the target (Test) instance with the restored backup snapshot. The cloning process takes roughly 30-50 minutes to complete, depending on the data and artifacts residing in the snapshot.
  • The user can check the status of cloning in production under the Clone Environment menu or by using the REST API status check.
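As referenced in the first step above, the password can be encrypted with the EPM Automate encrypt command on any machine where the client is installed, and the resulting encrypted value is then copied into the rule's password variable; the values below are placeholders:

  epmautomate encrypt <password> <encryption_key> <path_to_output_folder>/password.epw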

 

Run time prompts

 

RTP Name | Type | Default Value | Description
RTP_Date | String | Use Last Entered | Enter date string in yyyy-mm-dd format only.
RTP_Check | Boolean | false | If selected true, then the rule will only list all the backup snapshot names available in the object storage path and exit without cloning activity.

This option is provided for user reference, to check whether the entered date falls within the valid 60-day range.

 

Code Sample

/* RTPS: {RTP_Date} {RTP_Check} */

// Timestamp definition
String getUserName = operation.getUser().getName()
String TrgURL = "" // target instance to clone
String adm_id = "" // service admin
String adm_passwd = "" // encrypted password

DateFormat tFormat = new SimpleDateFormat("yyyy-MM-dd")
tFormat = new SimpleDateFormat("yyyy-MMM-dd,EEE hh:mm:ss a zzz")
def tstamp = tFormat.format(new Date())
String dt = rtps.RTP_Date
def archive_date = dt
int d_limit = 0
int sts_code = rtps.RTP_Check.toString() as int
String sts = ""
boolean REST_Status = false
boolean List_Check = sts_code == 0 ? false : true

println "[$tstamp] : Backup file date $archive_date selected by $getUserName."

try {
d_limit = new Date() - new Date().parse('yyyy-MM-dd', dt)
} catch (Exception e) {
sts = "Please check the entered date $dt is in yyyy-mm-dd format."
tstamp = tFormat.format(new Date())
println "[$tstamp] : Error $e - $sts"
throwVetoException("$sts")
}

if(!(d_limit>0 && d_limit<60)) {
sts = "Please select date range within last 60 days from today."
tstamp = tFormat.format(new Date())
println "[$tstamp] : Error - $sts"
throwVetoException("$sts")
}

//***************List all existing backup files****************
String ConnectionName = "REST-EPM-MIGRATION-PROD"
String api_version = '/v2'
String api_resource_path = '/backups/list'
String jRSrc = api_version + api_resource_path

HttpResponse<String> jsonGetBackupListResponse = operation.application.getConnection(ConnectionName)
.get(jRSrc)
.asString()

sts = JsonPath.parse(jsonGetBackupListResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonGetBackupListResponse.body).read('$.status')
def fileList = JsonPath.parse(jsonGetBackupListResponse.body).read('$.items') as String[]
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonGetBackupListResponse.statusText - List Files - $api_resource_path - $sts. ${fileList.size()} snapshot files available. Below files available :"

String dte = archive_date.toString().trim()
String BackupFilePath = ""
String TargetFile = "Artifact_Snapshot_" + ConnectionName.substring(ConnectionName.length()-4) + "_$archive_date"

fileList.eachWithIndex { String i, int j ->
println i
if(dte.equals(i.substring(0,10)) && !REST_Status) {
BackupFilePath = i
sts = "$sts $BackupFilePath file found."
}
}

if(BackupFilePath == "") {
sts = "No file selected for date $archive_date. Please select date range within last 60 days from today."
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonGetBackupListResponse.statusText - $sts"
throwVetoException("$sts")
} else {
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonGetBackupListResponse.statusText - $sts"
}

if(List_Check) {
sts = "Exiting program as just file check option selected. ${fileList.size()} backup snapshots available in object storage."
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status OK - $sts"
throwVetoException("$sts")
}

//***************Delete existing backup file from Prod migration path if any****************
api_version = '/11.1.2.3.600'
api_resource_path = '/applicationsnapshots/' + TargetFile
jRSrc = api_version + api_resource_path
delFile(ConnectionName, jRSrc, tFormat)

//***************Restore existing backup files****************
api_version = '/v2'
api_resource_path = '/backups/restore'
jRSrc = api_version + api_resource_path

HttpResponse<String> jsonRestoreBackupResponse = operation.application.getConnection(ConnectionName)
.post(jRSrc)
.header("Content-Type", "application/json")
.body(json(["backupName":"$BackupFilePath", "parameters":["targetName":"$TargetFile"]]))
.asString()

String op = "Restore Backup"
sts = JsonPath.parse(jsonRestoreBackupResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonRestoreBackupResponse.body).read('$.status')
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonRestoreBackupResponse.statusText - $op - $api_resource_path - $sts"

//***************Restore backup files status check****************
statusCheck(jsonRestoreBackupResponse, ConnectionName, api_version, op, tFormat)

//***************Clone snapshot to source****************
api_version = '/v1'
api_resource_path = '/services/clone'
jRSrc = api_version + api_resource_path

HttpResponse<String> jsonCloneBackupResponse = operation.application.getConnection(ConnectionName)
.post(jRSrc)
.header("Content-Type", "application/json")
.body(json(["targetURL":"$TrgURL", "targetUserName":"$adm_id", "targetEncryptPassword":"$adm_passwd", "parameters":["snapshotName":"$TargetFile", "migrateUsers":"false", "maintenanceStartTime":"true", "dataManagement":"true", "jobConsole":"false", "applicationAudit":"false", "storedSnapshotsAndFiles":"false"]]))
.asString()

op = "Clone Backup"
sts = JsonPath.parse(jsonCloneBackupResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonCloneBackupResponse.body).read('$.status')
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCloneBackupResponse.statusText - $op - $api_resource_path - $sts"

//***************Clone backup status check****************
//statusCheck(jsonCloneBackupResponse, ConnectionName, api_version, op, tFormat)
op = "Clone Status Check"
api_version = '/v1'
api_resource_path = '/services/clone/status'
jRSrc = api_version + api_resource_path

HttpResponse<String> jsonCloneStatusResponse = operation.application.getConnection(ConnectionName)
.get(jRSrc)
.asString()

sts = JsonPath.parse(jsonCloneStatusResponse.body).read('$.details').toString() + "."
sts_code = JsonPath.parse(jsonCloneStatusResponse.body).read('$.status')
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCloneStatusResponse.statusText - $op - $api_resource_path - $sts"
//sts = getImportJobStatus(ConnectionName, jRSrc)
REST_Status = awaitCompletion(jsonCloneStatusResponse, "$ConnectionName", "$op", "$jRSrc")
sts = getImportJobStatus(ConnectionName, jRSrc)
tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCloneStatusResponse.statusText - $op \n$sts"

//************EPM Helper Functions****************
// Delete existing file
def delFile(String ConnectionName, String jRSrc, DateFormat tFormat) {
HttpResponse<String> jsonFileDeleteResponse = operation.application.getConnection(ConnectionName)
.delete(jRSrc)
.header("Content-Type", "application/json")
.asString()

int sts_code = JsonPath.parse(jsonFileDeleteResponse.body).read('$.status')
String sts = JsonPath.parse(jsonFileDeleteResponse.body).read('$.details').toString() + "."
//sts = sts + JsonPath.parse(jsonFileDeleteResponse.body).read('$.links[0].href').toString()
def tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonFileDeleteResponse.statusText - Delete Snapshot - $jRSrc - $ConnectionName $sts"
}
// v2 REST status check
def statusCheck(HttpResponse<String> jsonResponse, String ConnectionName, String api_version, String opr, DateFormat tFormat) {
String StatusURL = opr=="Clone Backup" ? JsonPath.parse(jsonResponse.body).read('$.links[0].href').toString() : JsonPath.parse(jsonResponse.body).read('$.links[1].href').toString()
String api_resource_path = StatusURL.substring(StatusURL.indexOf(api_version)+3, StatusURL.length())
String jRSrc = api_version + api_resource_path

HttpResponse<String> jsonCheckResponse = operation.application.getConnection(ConnectionName)
.get(jRSrc)
.asString()

int sts_code = JsonPath.parse(jsonCheckResponse.body).read('$.status')
String sts = JsonPath.parse(jsonCheckResponse.body).read('$.details').toString() + "."
def tstamp = tFormat.format(new Date())
println "[$tstamp] : Status $jsonCheckResponse.statusText - $opr - $api_resource_path - $ConnectionName $sts"
boolean REST_Status = awaitCompletion(jsonCheckResponse, "$ConnectionName", "$opr", "$jRSrc")
}
// Await till REST completes
def awaitCompletion(HttpResponse<String> jsonResponse, String connectionName, String opr, String jrSrc) {
DateFormat tFormat = new SimpleDateFormat('yyyy-MMM-dd,EEE hh:mm:ss a zzz')
final int IN_PROGRESS = -1
int status = JsonPath.parse(jsonResponse.body).read('$.status')
def tstamp = tFormat.format(new Date())
if (!(200..299).contains(jsonResponse.status))
throwVetoException("Error : $status occured to execute $opr. $jsonResponse.statusText.")

// Parse the JSON response to get the status of the operation. Keep polling the REST call until the operation completes.
String j_Id = jrSrc.substring(jrSrc.lastIndexOf('/') + 1, jrSrc.length())
for (long delay = 500; status == IN_PROGRESS; delay = Math.min(2000, delay * 2)) {
sleep(delay)
status = getJobStatus(connectionName, jrSrc, status)
}
String Resp_Details = JsonPath.parse(jsonResponse.body).read('$.details').toString()
def itms = (List)JsonPath.parse(jsonResponse.body).read('$.items')
/*if (opr == 'Clone Backup') {
if (itms.size() > 0) {
itms.eachWithIndex { r, i ->
String StatusURL = r['links']['href'].toString().replace(']', '').replace('[', '')
String api_resource_path = StatusURL.substring(StatusURL.indexOf('/v2'))
StatusURL = getImportJobStatus(connectionName, api_resource_path)
if (StatusURL.length() > 0) {
String info = r['destination'].toString() + ' - ' + StatusURL //getImportJobStatus(connectionName, jRSrc)
tstamp = tFormat.format(new Date())
println("[$tstamp] : $i - $info")
}
}
}
def info = (List)JsonPath.parse(jsonResponse.body).read('$.intermittentStatus')
tstamp = tFormat.format(new Date())
println("[$tstamp] : Status $jsonResponse.statusText - $opr - $info")
}*/
tstamp = tFormat.format(new Date())
println("[$tstamp] : Status $jsonResponse.statusText - $opr - $jrSrc - ${status == 0 ? "successful for Job Id $j_Id" : "failed for Job Id $j_Id"}. \n$Resp_Details")
return status
}
// Poll the REST call to get the job status
int getJobStatus(String connectionName, String jobId, int get_sts) {
HttpResponse<String> pingResponse = operation.application.getConnection(connectionName)
.get(jobId)
.asString()
get_sts = JsonPath.parse(pingResponse.body).read('$.status')
if (get_sts == -1) {
return getJobStatus(connectionName, jobId, get_sts)
} else {
return get_sts
}
}
// Poll the REST call to get the import job details
String getImportJobStatus(String connectionName, String jobId) {
String get_sts = ''
HttpResponse<String> pingResponse = operation.application.getConnection(connectionName)
.get(jobId)
.asString()
int sts_code = JsonPath.parse(pingResponse.body).read('$.status')
if(sts_code == -1) {
sleep(1000)
get_sts = getImportJobStatus(connectionName, jobId)
} else {
//get_sts = JsonPath.parse(pingResponse.body).read('$.items[*].msgText').toString()
get_sts = JsonPath.parse(pingResponse.body).read('$.intermittentStatus').toString()
return get_sts
}
}

 


Anand Gaonkar

 


Where Can I Learn More About Groovy Rules?

 

Your Goal | Learn More
Watch videos and tutorials that teach best practices when implementing and using Groovy rules
Create Groovy business rules using Calculation Manager | See Designing with Calculation Manager for Oracle Enterprise Performance Management Cloud
Connect to the Java APIs used for creating Groovy rules | See Java API Reference for Groovy Rules
Edit the script for a Groovy business rule or template using Calculation Manager | See Designing with Calculation Manager for Oracle Enterprise Performance Management Cloud

See how GenAI will work in your ERP and EPM

Oracle is delivering AI as part of the Oracle Fusion Cloud Applications Suite quarterly updates, and that means exciting changes are in store for Oracle Cloud EPM and ERP.

GenAI in Narrative Reporting in Oracle EPM

This demo shows how AI can help financial managers respond to changing circumstances by updating roll-forward reports used for financial planning.

 

Why it matters: This vision demo shows where Oracle is headed with embedded generative AI in Oracle Cloud EPM. Note that AI-generated output isn’t just reviewable by end users; it is also editable. This preserves oversight by skilled financial professionals. The demo also shows how easy it is to incorporate AI-generated insights into reports: GenAI drafts the content based on simple prompts, live data, and forecasts, and it is capable of creating charts and updating summaries based on new information provided.

 

 

GenAI in Project Planning in Oracle ERP

This demo shows how AI in Oracle Cloud ERP Project Management helps build a dynamic project plan that optimizes resources to meet both financial objectives and customer needs.

 

Why it matters: This vision demo shows how project managers get a fast and efficient way to develop plans with a high degree of flexibility, all with the benefit of human oversight and control. Operations and financial managers will appreciate how AI delivers early visibility into each project’s financial viability and risk profile. And everyone involved will value the ease with which resources and schedules can be adjusted to meet project needs.

 

Oracle Fusion Cloud ERP Generative ERP Feature: Project Planning

 

Summing up

Oracle is using AI to reshape Fusion Apps, creating valuable use cases across ERP, EPM, and other functional areas. Don’t miss the recent announcement covering new GenAI capabilities, and be ready for more to come.

Contact us for additional information on how you can use AI as part of an Oracle Fusion Cloud ERP subscription

Managing your risk – Why you need a Financial Consolidation system

It is continually surprising to observe the number of ASX-listed organizations lacking robust systems to generate consolidated financial statements and notes to the accounts. Directors of companies are obligated to exercise due care and diligence under the Corporations Act 2001, which includes financial reporting responsibilities. Financial statements and notes to accounts are required to adhere to accounting standards and accurately reflect the company’s consolidated financial position and performance.

Many organizations depend on Excel as the primary tool for preparing statutory consolidated financial statements. Although Excel is a useful tool, it is often not the most suitable system for producing consolidated financial reports. Excel is not a robust option, and numerous well-documented cases in Australia and abroad have shown that errors in Excel have led to significant financial reporting mistakes.

Dedicated applications designed for preparing consolidated financial accounts come with standard features and functionalities that enable financial accounting teams to prepare the accounts confidently. Without such an application, Financial Controllers or CFOs may encounter several challenges when producing consolidated financial accounts in Excel, including:

Reporting Challenges of Excel

  • Multiple General Ledger Systems – It is very common for an organisation to have multiple General Ledger systems, each containing its own unique Chart of Accounts (CoA). These Trial Balances need to be extracted and mapped to a common CoA. These ledger systems are constantly changing, for example with the addition of new accounts or legal entities. Financial consolidation applications have tools designed for accountants to manage this CoA data mapping process.
  • Currency Conversion – Many organisations have legal entities that report in other currencies and these need to be translated into AUD for consolidation purposes. Financial consolidation applications provide standard functionality to perform this currency conversion and report in multiple reporting currencies, for example AUD and USD.
  • Preparation of Notes to Accounts – The notes are used to make important disclosures that explain the numbers in the financial statements of a company. Common notes to the financial statements include accounting policies, depreciation of assets, inventory valuation, subsequent events, etc. These notes are typically complex to prepare and often require data from a variety of different data sources.
  • Intercompany Eliminations – Excel does not automatically post intercompany elimination and consolidation journals. In Excel, there is no standard ability to produce a mismatch report or maintain an audit trail of changes made to your data.
  • Data Validation – Most consolidation applications have multiple layers of data validation in order to minimise reconciliation issues. If Excel is used, it is difficult to achieve a similar level of the checks and balances required to have complete confidence in your financial reports.
  • Partial Ownership – Many companies have various subsidiaries/entities that are not fully consolidated into financial statements. Most consolidation applications have the ability to apply appropriate accounting equity rules to ensure that entities are correctly consolidated. Excel typically does not have this functionality.
  • Workflow – Financial consolidation reports are typically prepared by a team of financial accountants. Workflow is difficult to manage in Excel as Excel files need to be distributed to various stakeholders and you cannot control last minute adjustments and changes.  A dedicated consolidation application not only improves the efficiency and workflow of staff to assist with closing financial accounts, but can also ensure last minute or post close adjustments are not made without approval.
  • Security & Audit Features – When it comes to accountability, tracking user actions is crucial. This cannot be achieved in Excel.
  • Financial Statement Reporting – When preparing the financial statements there are typically many different views of the data that need to be represented. This may be by Product Groups, or Regions.  A dedicated consolidation application can make this data easy to present in a variety of reporting views.

Reporting Challenges of only using an ERP

Can you implement a financial consolidation system within ERP applications?  Some ERP applications contain some basic consolidation functionality, however there are a number of typical shortcomings with this approach:

  • Mergers, Acquisitions & Disposals – often many large companies are undertaking M&A activities. If you have just made a large acquisition, how will you produce your financial consolidated statements?  Your ultimate goal might be to move the organisation to your core ERP, however these projects can typically take many months or years.  Consolidation applications have tools that allow you to easily extract the Trial Balance from the General Ledger of a new organisation and map the CoA to your common consolidated financial reporting CoA.  Compared to an ERP migration, this can be done in a very short time frame.
  • Notes to the Accounts – Some ERP applications provide functionality to post elimination journal entries, but they do not provide a facility to create notes to the accounts. Again, we too often see organisations attempting to perform this task within Excel, with its associated shortcomings.
  • Flexibility – reporting requirements and accounting standards are continually changing and evolving. Typically, ERP applications are not very agile environments and change is often difficult, time consuming, and costly to implement.
  • Costs – We are often surprised by the large investments that organisations make to enable financial consolidation within an ERP application. An application designed specifically for financial consolidations can be implemented for a fraction of the cost.

 

Specific financial consolidation applications are not costly to implement or maintain. Such applications can significantly lower audit fees for an organization. When standard end-of-year journal entries are utilized within these applications, they usually require only a single audit and approval by external auditors. In contrast, when spreadsheets are employed, they often necessitate a full audit annually.

As a Financial Controller, CFO, or Director of a major ASX listed company, I would avoid risking my personal or the organization’s reputation by relying on Excel for producing consolidated financial reports. The stakes are simply too great.


DAMIAN TIMMS

 

EPM Data Integration – Pipeline

 

Case Study

A client expressed the need for an interface with functionality that would allow non-technical professionals to run daily batches. These batches could include tasks like pulling Actuals, updating the Chart of Accounts (CoA), or refreshing the Cost Centre structure and running the business rules, among others, from the source System.

While seeking a solution, we explored numerous alternatives within Data Integration. However, the challenge emerged as several intricate steps were involved, necessitating individuals to possess a certain level of technical understanding of the Data Integration tool.

Solution

Exciting developments ensued when Oracle introduced a new feature known as “Pipeline.”

 


Pipeline in Data Integration

This innovative addition empowers users to seamlessly orchestrate a sequence of jobs as a unified process. Moreover, the Pipeline feature facilitates the orchestration of Oracle Enterprise Performance Management Cloud jobs across instances, all from a single centralized location.

By leveraging the power of the Pipeline, you can gain enhanced control and visibility throughout the entire data integration process, encompassing preprocessing, data loading, and post-processing tasks.

Yet, this merely scratches the surface. The Pipeline introduces a multitude of potent benefits and functionalities. We’re delving into an in-depth exploration of this novel feature to uncover its potential in revolutionizing your data integration process.


Note the following Pipeline considerations

  • Only administrators can create and run a Pipeline definition.
  • The Pipeline replaces the batch functionality in Data Management; existing batches can be migrated automatically to the Pipeline feature in Data Integration.
  • For file-based integrations to a remote server in the Pipeline, when a file name is specified in the pipeline job parameters, the system automatically copies the files from the local host to the remote server under the same directory.

This function applies to the following Oracle solutions:

  • Financial Consolidation and Close
  • Enterprise Profitability and Cost Management
  • Planning
  • Planning Modules
  • Tax Reporting


Proof of Concept

  EPM batches to run sequentially are:

Stage 1 – Load Metadata
  1. Load Account Dimension
  2. Load Entity Dimension
  3. Load Custom Dimension
  4. Clear current month Actuals (to remove any nonsense numbers if any)
Stage 2 – Load Data
  1. Load Trial balance from Source
Stage 3 – Run Business Rule
  1. Run Business rule to perform Aggregate & Calculations.


The workflow for creating and running a Pipeline process is as follows:

  1. Define the Pipeline:

    • Specify the Pipeline Name, Pipeline Code, and maximum Parallel Jobs.
    • Use the Variables page to set the out-of-the-box (global) values for the Pipeline, from which you can set parameters at runtime. Variables can be of pre-defined types such as “Period”, “Import Mode”, etc.

 

  2. You can utilize Stages in the Pipeline editor to cluster similar or interdependent Jobs from various applications together within a single unified interface. Administrators can efficiently establish a comprehensive end-to-end automation routine, ready to be executed on demand as part of the closing process.

Pipeline Stages & Container for multiple jobs as shown below:

Stages & Jobs example

 

The new stages can be added by simply using the Plus card located at the end of the current card sequence.

 

  3. On the Run Pipeline page, complete the variable runtime prompts and then run the Pipeline, as shown below:

 

 

Variable Prompts

 

When the Pipeline is running, you can click the status icon to download the log. Customers can also see the status of the Pipeline in Process Details. Each individual job in the Pipeline is submitted separately and creates a separate job log in Process Details.

Users can also schedule the Pipeline with the help of Job Scheduler.

Variable Prompt

Review

 


Amir Kalawant

Oracle Fusion Cloud EPM – 23.08 Update

EPM Cloud August Update

Test Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 4, 2023.


Production Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 18, 2023.


RELEASE HIGHLIGHTS


HELPFUL INFORMATION

The Oracle Help Center provides access to updated documentation. The updates will be available in the Help Center on Friday, August 4, 2023.

NOTE: Some of the links to new feature documentation included in this readiness document will not work until after the Oracle Help Center update is complete.

Updated documentation is published on the Oracle Help Center on the first Friday of each month, coinciding with the monthly updates to Test environments. Because there is a one-week lag between the publishing of the readiness documents (What’s New and New Feature Summary) and Oracle Help Center updates, some links included in the readiness documents will not work until the Oracle Help Center update is complete.

https://docs.oracle.com/en/cloud/saas/epm-cloud/index.html


FIXED ISSUES AND CONSIDERATIONS

Software issues addressed each month and considerations are posted to a knowledge article on My Oracle Support. Click here to review. You must have a My Oracle Support login to access the article.

NOTE: Fixed issues for EPM Cloud Common components (Smart View for Office, EPM Automate, REST API, Migration, Access Control, Data Management/Data Integration, Reports, Financial Reporting, and Calculation Manager) are available in a separate document on the My Oracle Support “Release Highlights” page.

The full Oracle advisory note can be found here

Enterprise Data Management Series – Part 2

In the first part, we gave an overview of EDMCS. In this part, we will discuss more on “What EDMCS has to offer and How”. Unlike DRM, where we have versions, hierarchies, and nodes, Oracle has introduced Views, Viewpoints, and Data Chains. Let us go through the basic structure of EDMCS.

 

Figure 1 EDM Model


 

Enterprise data within each application is grouped into multiple dimensions. Each dimension has its own data chain. Registering a new application results in the creation of various objects and associated dimensions. An application consists of connected views, dimensions, and associated viewpoints:

  • The View is a collection of Viewpoints.
  • Viewpoints are where users view and work with application data.
  • Each dimension contains a series of related data objects called data chains, which consist of node types, hierarchy sets, node sets, and viewpoints.

 

The above objects are the building blocks of the EDMCS as shown and explained below.

 

Information Model

Figure 2 Information Model

 

Application

  • EDMCS models each connected system as an application. You can click Register to create a new application.

 


Figure 3 Application

 

Dimension

  • Enterprise data is grouped as dimensions such as Account, Entity, and Movement.

Figure 4 Dimension

 

Figure 5 Data Chain Flow

 

Node Type

  • A collection of nodes that share a common business purpose, such as Departments or Entities.
  • Defines the properties for associated nodes. For example, a Product node type can include properties like Name, Description, Cost, etc.

Figure 6 Node Type

 

Hierarchy Set

  • The hierarchy set defines parent-child relationships for nodes, for example Employees rolling up to Departments, or Vehicles rolling up to Automobiles.
  • You can define your own hierarchy sets using different relationships between node types.

Figure 7 Hierarchy Set

 

Node Set

  • Defines the group of nodes available in a viewpoint and consists of hierarchies or lists, for example a Cost Centre hierarchy or a list of country codes.
  • A node set includes only the portion of a hierarchy set that is required in a viewpoint. Consider the figure below, where only Marketing and Finance are included and the remainder of the hierarchy is excluded.

Figure 8 Node Set

 

Viewpoint

  • Viewpoints are used for managing data like comparing, sharing/mapping, and maintaining a dimension across applications such as viewing a list of accounts or managing a product hierarchy or exporting an entity structure.
  • Viewpoints are organized into one or more views. Each viewpoint uses a node-set and controls how users work with data in that node-set in a specific view.

Figure 9 Viewpoint

 

 View

  • A view is a group of viewpoints, used for purposes such as managing data for a dimension across applications or integrating data from and to an external system.
  • Users can define additional views of their own to view and manage data for specific business purposes.

 

Figure 10 View Dashboard

 

 

Integration Benefits

Oracle has taken a major leap in improving integration in EDMCS. In DRM, integration with other Hyperion modules was only possible through table, flat-file, or API integration, or through custom code development. EDMCS introduces various component adapters, such as PBCS, FCCS, and EBS, that connect directly to the respective component. Note: adapters for some components are yet to be released by Oracle; however, you can always integrate using the standard flat-file export.

 

Migration made simple

Existing on-premise Data Relationship Management applications can be migrated to EDMCS. The administrator needs to register the DRM application in EDMCS as a custom application and then import the dimensional structure. Note: Data Relationship Management 11.1.2.4.330 or higher is supported for on-premise to cloud migration.

 

Governance at a different level

Previously, on-premise DRM had a separate Data Relationship Governance (DRG) interface, but EDMCS includes governance as part of the application. In EDMCS, organizations use request workflows to exercise positive control over the processes and methods used by their data stewards and data custodians to create and maintain high-quality enterprise data assets. Workflow stages include Submit, Approve, and Commit. Finally, before committing changes, users can visualize the changes and their effect on the hierarchy.

 

Oracle Narrative Reporting – Part 2

Overview of the Report Package

  • EPRCS operates with the “Report Package” feature, which provides the ability to merge Microsoft Office data and documents. EPRCS can also be combined with on-premise software, cloud data sources, or other Oracle EPM applications.
  • Report packages provide a secure, collaborative, and process-driven approach for defining, authoring, reviewing and publishing financial and management reports.
  • With report packages, one can organize the content, delegate roles to authors and reviewers, manage their collaboration/workflow approvals, and sign-off process to create a structured document.

 

Figure 1 Report Package Features

 

How to create a Report Package

While creating a report package we need to provide the following details:

  • Enter Properties
    In the properties section, we need to provide the Name, Description, Report Type, Style Sample and Save To fields of a report package.

Figure 2 Enter Properties

 

Define Process

Apply the respective development phases and define the timeline for each phase.

 

  • Author Phase
    Once this phase is enabled, click on the Calendar icons to define the following dates: Start Author Phase On, Submit Doclets By and End Author Phase On.

 

Figure 3 Define Process: Author Phase

 

  • Review Phase
    Once the “Review Phase” has been enabled, click on the Calendar icons to define the dates: Start Review On, End Review Cycle 1 On and End Review On.

 

Figure 4 Define Process: Review Phase

 

  • Sign-Off Phase
    Once the “Sign-Off Phase” has been enabled, click on the Calendar icons to define the following dates: Start Sign Off On and End Sign Off On.

Figure 5 Define Process: Sign-Off Phase

 

  • Assign Users
    Next, we need to assign users and groups to the following report package responsibilities such as Owners, Reviewers, Signers, and Viewers.

 

Figure 6 Assign Users

 

  • Define Options
    The last step is to define the options for a report package such as Format Options, Shared Folder, and Doclet Versions.

 

Figure 7 Define Options

 

Finally, click on “Finish” to complete the report package setup. Next, we will discuss the workflow process.

Collaboration and Workflow of Report Packages

The three phases that a report package includes are the Author, Review and Sign-Off phases. For a report package, one or more of the phases can be selected.

  • Author Phase
    Within this phase content, comments and supporting details are updated to help collaborate with other users. It can be applied to an entire report package, a section, or individual doclets.

 

Figure 8 Author Phase

 

  • Review Phase
    The Review Phase is a review cycle in which reviewers can view the current status of the doclets, add comments on the drafts through the commentary feature if needed, and eventually mark their review as complete.

Figure 9 Review Phase

 

  • Sign-Off Phase
    In the sign-off phase, anyone designated as a signer for the report package formally reviews the fully completed report package one final time and either signs off on it or rejects it.

Figure 10 Sign-Off Phase

In the next blog, we will discuss the integration/extension of EPRCS with Microsoft Office products.

 

Oracle Data Integrator Cloud Service (ODICS) – PART 2

Oracle is one of the prominent leaders in providing comprehensive data integration solutions that includes Oracle Data Integrator Cloud Service (ODICS), Oracle Data Integration Platform Cloud, Oracle Golden Gate, Oracle Enterprise Data Quality, Oracle Enterprise Metadata Management, and Oracle Stream Analytics.  ODICS provides continuous access to timely, reliable and heterogeneous data from both on-site and cloud solutions to support analytical and operational business needs.

 

New key investment areas ensure Oracle Data Integrator Cloud Services continues to support clients during their business growth and transformation process. ODICS introduces new functionality in the following areas:

 

Oracle Object Storage and Oracle Object Storage Classic

  • Oracle Object Storage and Object Storage Classic provide fast, stable, and secure cloud storage, and ODICS now integrates seamlessly with them on Oracle Cloud Infrastructure (OCI).
  • ODICS comes with a collection of Knowledge Modules (KMs) that can be used to link to Oracle Object Storage and Object Storage Classic in Mappings and Packages to manage files within the local archive or the Hadoop Distributed File System (HDFS).

 


Figure 1 ODI Object Storage

 

Autonomous Databases

  • ODICS now comes with optimized Loading and Integration Knowledge Modules (KMs) that are certified with Oracle Autonomous databases such as:
    • Oracle Autonomous Data Warehouse Cloud (ADW)
    • Oracle Autonomous Transaction Processing (ATP)
  • By integrating with Autonomous Databases, ODICS works easily with ADW and ATP to achieve better performance in a fully managed environment that is configured for specific workloads.
  • Both ADW and ATP use the same set of Knowledge Modules and utilize the updated native integration of Oracle Object Storage and Oracle Object Storage Classic.
  • Additionally, Oracle Data Integrator users can also use native integration between Oracle Autonomous Data Warehouse and Oracle Object Storage to allow fast data transmission to ADW or ATP and simplify the entire loading process.

 


Figure 2 ODI Autonomous Data Warehouse

 

 

Oracle Enterprise Resource Planning (ERP) Cloud

  • The new release also provides a new Infrastructure and Software Platform for Oracle Enterprise Resource Planning (ERP) Cloud, a suite of cloud apps for accounting, project management, sourcing, risk management, and operations.
  • ODICS integrates seamlessly with the Oracle Enterprise Resource Planning (ERP) Cloud platform, which allows companies to incorporate their ERP data into their data warehouses and data marts. The native integration also lets ODICS customers load data into Oracle’s ERP Cloud.

 


Figure 3 ODICS ERP

 

 

In the next post, we will discuss more key features such as Oracle Sales Cloud, Oracle Service Cloud, GIT Offline Support, and SAP Delta Extraction.

 

 

Oracle Data Integrator Cloud Service (ODICS) – PART 1

Oracle is one of the prominent leaders in providing comprehensive data integration solutions that includes Oracle Data Integrator Cloud Service (ODICS), Oracle Data Integration Platform Cloud, Oracle Golden Gate, Oracle Enterprise Data Quality, Oracle Enterprise Metadata Management, and Oracle Stream Analytics.  ODICS provides continuous access to timely, reliable and heterogeneous data from both on-site and cloud solutions to support analytical and operational business needs.

ODICS Overview:

  • ODICS provides high-performance data transformation capabilities with its transparent E-LT architecture and extended support for cloud and big data applications.
  • ODICS supports all the features included in Oracle Data Integrator Enterprise Edition within its heterogeneous cloud service.
  • ODICS provides an easy-to-use interface to improve productivity, reduce development costs and decrease the total cost of ownership.
  • Oracle Data Integrator Cloud Platform is fully integrated with Oracle Platform as a Service (PaaS) offerings, such as Oracle Database Cloud Service, Oracle Database Exadata Cloud Service and/or Oracle Big Data Cloud Service, to deliver on data needs.
  • ODICS can work with third-party systems as well as Oracle solutions as shown in the below screenshot.


ODI On-Premises Integration with Cloud Services

 

Cloud E-LT Architecture for High Performance vs Traditional ETL Approach:

  •  Traditional ETL software is based on proprietary engines that execute row by row data transformations, thus limiting performance.
  • We can execute data transformations on the target server by implementing an E-LT architecture based on your existing RDBMS engines and SQL.
  • The E-LT architecture gathers data from different sources, loads it into the target, and performs transformations using the power of the database.
  • While utilizing existing data infrastructure, Oracle Data Integrator delivers flexibility by using the target server for data transformations, thereby minimizing network traffic.
  • The new E-LT architecture ensures the highest performance possible.


ODICS ELT vs ETL Architecture Differences

 

Oracle Data Integrator Architecture Components:

The Oracle Data Integrator (ODI) architecture components include the below feature sets.

 

  • ODI SDK – Java-based API for run-time and scheduling operations.
  • ODI Studio – The designer’s studio used to manage connections, interface designs, development, and automation, including scheduling.
  • ODI Standalone Agent – Can be configured in a standalone domain and managed by the WebLogic Management Framework.
  • ODI J2EE – The Java EE agent, based on the Java EE framework, that runs on a Managed Server configured in a WebLogic domain. This feature set only comes with the Enterprise installation.
  • ODI Standalone Agent Template – Domain files that are required when Oracle WebLogic Server is not managing your Oracle Data Integrator installation. This feature set is available only with the Standalone installation type.
  • ODI Console – A web-based console available to assigned users as an alternative to certain features of ODI Studio.
  • FMW Upgrade – The upgrade assistant used to upgrade the Oracle Data Integrator version from 11g to 12c.
  • Repository Creation Utility – The Repository Creation Utility (RCU) is used to create database schemas and is included with the Standalone installation type. The Enterprise installation does not include RCU; RCU is included with the Oracle Fusion Middleware Infrastructure distribution.

 


ODICS Architecture

 

New / Enhanced Big Data and Cloud Features within ODICS:

 ODICS continues to evolve with technological advancements for Big Data and Cloud Knowledge Modules for better transformations.

Big Data Features:

  • Spark Knowledge Modules (KM) Improvement: The emphasis was on producing high-performance, and easy-to-read code (Spark) instead of handwritten scripts. Spark KMs now leverage the latest features such as Dataframes from Apache Spark 2.x to speed up the ODI processes.
  • Spark KMs support in Knowledge Module Editor: The Spark KMs are now fully supported and can be customized as per specific needs.
  • Hadoop Complex Types Enhancements: ODI enhances its support capability to Apache HDFS and Kafka Architecture.
  • Big Data Configuration Wizard: The Big Data Configuration Wizard is now updated with new templates for the current Cloudera distribution.


Spark KMs In Knowledge Module Editor

 

Cloud Features:

  • RESTful Service Support: ODICS can invoke RESTful Service in Topology configurations that include RESTful Service connectivity, resource URI, methods, and parameters.
  • Business Intelligence Cloud Service (BICS) Knowledge Modules: BICS is now supported out of the box in ODICS.
  •  Connectivity with Salesforce: ODICS is fully certified with Salesforce.com and now includes a JDBC driver for this technology out of the box.


ODI Integration With Salesforce

 

In the next part, we will focus on more key feature highlights within ODICS.