Introduction to Oracle Tax Reporting Cloud

 

 

Tax Reporting from Oracle is a powerful application for increasing the efficiency of the tax function.

With the rise of digital economies, governments across the world are finding ways to tax the digital income generated in their countries. The OECD is working on the details of a digital tax framework so that companies and governments can operate more efficiently. Meanwhile, France and the UK are ready to tax digital companies with their own digital services taxes.

In this world of uncertainty, governments are looking to increase their tax revenue and are changing tax laws accordingly. Companies are expected to calculate their tax obligations correctly under the latest tax codes in each jurisdiction.

However, the tax functions in many companies still use Microsoft Excel spreadsheets to prepare tax calculations. These calculations might be prepared at the entity level in a spreadsheet and then emailed to a regional or global tax function. Tax experts review the calculations at the group level, and if there are issues with an entity's tax calculation or formula errors, the spreadsheet is sent back to the local team for correction. With so much back and forth, tax and finance teams can easily lose track of versions and corrections. Sometimes tax calculation models refer to many linked spreadsheets, which creates further complexity when the spreadsheets need to be updated for a new account or for changes in legislation or accounting standards. Formula errors can lurk in the spreadsheets and are difficult to identify and correct. Another issue is that the tax models may be maintained and updated by a single team member; if that person leaves the company, the tax process and financial close are at serious risk. These are a few of the pain points, bottlenecks, and risks of using spreadsheets to prepare tax calculations.

Tax Reporting is a web-based cloud solution with built-in functionality for:

  • Configurable tax calculation rules
  • Automatic calculation of tax expense and DTA/DTL
  • Approval process
  • Roll-forward of tax accounts
  • Loading trial balance data
  • Loading fixed asset data
  • Currency translation
  • Consolidation
  • Calculating the effective tax rate (ETR)
  • Reporting on local/regional/state/national tax data
  • Producing tax accounting journal entries
  • Country-by-country reporting
  • Capturing supplemental data for tax calculations and additional disclosure
  • Maintenance by tax and finance users
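As a rough illustration of the calculations the application automates (current tax, deferred tax, and effective tax rate), here is a minimal sketch. The figures, adjustments, and 25% rate are hypothetical, and this is illustrative logic only, not Tax Reporting's actual calculation engine.

```python
# Illustrative sketch of a basic tax provision calculation.
# Not Tax Reporting's engine; figures and names are hypothetical.

def tax_provision(pretax_income, permanent_diffs, temporary_diffs, tax_rate):
    """Return (current_tax, deferred_tax, effective_tax_rate)."""
    # Taxable income adjusts book income for permanent and temporary differences
    taxable_income = pretax_income + permanent_diffs + temporary_diffs
    current_tax = taxable_income * tax_rate
    # Temporary differences reverse in the future -> deferred tax (DTA/DTL)
    deferred_tax = -temporary_diffs * tax_rate
    total_tax = current_tax + deferred_tax
    etr = total_tax / pretax_income
    return current_tax, deferred_tax, etr

current, deferred, etr = tax_provision(
    pretax_income=1_000_000,
    permanent_diffs=50_000,    # e.g. non-deductible expenses
    temporary_diffs=-100_000,  # e.g. accelerated tax depreciation
    tax_rate=0.25,
)
```

In Tax Reporting these same inputs come from the loaded trial balance and configured rules, and the results roll forward automatically each period.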

 

 

The tax reporting solution provides tax departments with the ability to meet global tax reporting requirements on an ongoing basis and ensure compliance with changing tax regulations.

We can help with implementing Oracle’s tax reporting solution for your organization and provide guidance on how to get the maximum value out of it.

Oracle Narrative Reporting – Part 1

In this blog post, we will focus on EPRCS security access rights and the roles available.

EPRCS Overview

  • Oracle Enterprise Performance Reporting Cloud Service (EPRCS) is a Cloud solution for management and narrative reporting. It provides a secure and integrated solution that offers a collaborative and process-driven approach. Cloud maintenance, patching, and back-ups are managed by Oracle.
  • The workflow process provides collaboration, commentary, and delivery of management reporting through EPRCS objects which are stored in a Library. The library can be organized by folder and managed through security.
  • EPRCS allows users to easily combine both data and narrative content in report objects called Doclets. Doclets are grouped together in a report package, and a check-in/check-out process manages Doclet versions.
  • Extended Microsoft Office tools (Word, PowerPoint, etc.) can be used to provide the output of management reports. This includes an intelligent and intuitive simplified UI that can be accessed via desktop, mobile, and the web.
  • User Roles: Provides user types such as owners, authors, and approvers and role-based security and auditable access on desktop and mobile devices.

Figure 1 EPRCS Architecture

 

 

EPRCS Security

EPRCS enables secure collaboration between users. You can control which users can edit which Doclets, allowing users from various departments and areas to contribute to the same report package while safeguarding sensitive data. Within EPRCS, security is provided at three levels:

 

Figure 2 EPRCS Security Roles

System-Level Roles:

  • For EPRCS environments, two sets of roles are created: one for Production and another for Pre-Production. Pre-Production allows EPRCS customers to keep security separate for testing purposes.
  • Roles can also be combined into groups under My Services via the Custom Roles tab. It is considered best practice to assign security to users based on groups rather than individually. The five predefined roles are as follows:
  • Service Administrators: Create and maintain all aspects of the system, except for user management
  • Reports Administrators: Create and manage report packages, management reporting definitions, and Disclosure Management documents
  • Application Administrators: Create and maintain all application artifacts, such as applications, models, dimensions, and data grants
  • Library Administrators: Create and manage folders, including root-level folders
  • Users: The minimum role required to log in and participate in the reporting cycle, and to view artifacts to which the user has access

 

 

Artifact-level Security:

Figure 3 Artifact level Security

 

 

When you create an artifact (report package, folder, or application), you automatically have permission to edit, delete, and maintain it. You can grant security to the artifacts you create by user and by group. Users without access cannot see or access that artifact. Artifacts can be given the following forms of permissions:

 

Permission     Application    Report Packages    Third-party Artifacts or Folders
Administer     Y              Y                  Y
Use            Y              –                  –
View           –              Y                  Y
Write          –              –                  Y

 

  • Administer: Unrestricted view and change privilege to all artifacts.
  • Write: Enables users to add folder content.
  • View: Enables users to view only the artifact.
  • Use: Enables the user to see the application in the library.
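The permission matrix above can be modelled as a simple lookup. The sketch below is purely illustrative of the rules in the table, not how EPRCS implements them internally.

```python
# Illustrative model of the EPRCS artifact permission matrix.
# A sketch only; EPRCS manages this internally.

PERMISSIONS = {
    "application":    {"Administer", "Use"},
    "report_package": {"Administer", "View"},
    "folder":         {"Administer", "View", "Write"},
}

def is_allowed(artifact_type, permission):
    """Check whether a permission applies to an artifact type."""
    return permission in PERMISSIONS.get(artifact_type, set())
```

For example, `is_allowed("application", "Use")` is true, while Write applies only to folders (it lets users add folder content).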

 

Figure 4 Access in Report Package

Figure 5 Access to Application


Data Security:

 

Figure 6 Data level Security

Data security determines the level at which data access permissions are granted to users. It is set through dimension-based access, by granting either READ or NONE access.

Figure 7 Dimension-based access

 

Data grants give access to slices of data in a model, either at an individual dimension level or across dimension intersections.

Figure 8 Data Grant level access
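The idea behind dimension-based READ/NONE access can be sketched as a row filter. The dimension name, members, and data below are hypothetical; this is only an illustration of the concept.

```python
# Sketch of dimension-based data security: rows whose Entity member is not
# granted READ are filtered out. Names and data are hypothetical.

data_grant = {"Entity": {"US": "READ", "UK": "READ", "DE": "NONE"}}

rows = [
    {"Entity": "US", "Amount": 100},
    {"Entity": "DE", "Amount": 250},
    {"Entity": "UK", "Amount": 75},
]

def visible_rows(rows, grant, dimension="Entity"):
    """Keep only rows where the user has READ access on the dimension member."""
    allowed = grant.get(dimension, {})
    return [r for r in rows if allowed.get(r[dimension]) == "READ"]

print(visible_rows(rows, data_grant))  # the DE row is filtered out
```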

To Summarize:

  • EPRCS is a powerful reporting solution that is secure, collaborative, and intuitive, meant to complement and combine reporting from various technologies.
  • Simplify the report creation and distribution process.
  • Collaborate with content contributors and reviewers.
  • Access through mobile or desktop – when you want, how you want.
  • Publish book-quality financial and management reports.
  • EPRCS offers a wide variety of functions to meet unique reporting needs anywhere in an organization, and its cost-effective pricing makes it a great entry point for new cloud customers.

 

 

Part 2, “EPRCS Workflow Setup and Details”, will cover the workflow approvals and sign-off process related to publishing reports.

Enterprise Data Management Series – Part 1

Welcome to our first post in a series about the world of metadata management. Enterprise Data Management provides a new way to manage your data assets. You can manage application artefacts, including master data (members that represent data, including dimensions and hierarchies), reference data (such as page drop-downs for easier filtering in the front end), and mappings (master data member relationships). Using these pre-built functions, you can track master data changes with ease.

If your business is restructured to align entities such as accounts, products, cost centers, and sales across multiple organizational units, you can create model scenarios, rationalize multiple systems, compare structures within a system, and more. You can maintain alternate hierarchies for reporting structures that differ from the current ERP system structure. When migrating an application to the cloud, you can define target structures and data mappings to accelerate the process. EDMCS also provides the ability to sync applications from on-premises to the cloud or across cloud applications. The process involves the following four C's:

  • Collate: The first process involves collecting and combining data sources through the process of application registration, data import, and automation.
    • The registration process refers to establishing application connections with new data sources.
    • The Import process refers to the loading of data into the application view(s).
    • Automation is built using the REST API features.

Figure 1 Collate Process: Register

 

Figure 2 Collate Process: Import
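Automation for the Collate step is built on the REST API. The sketch below only assembles such a request without sending it; the endpoint path and payload fields are assumptions for illustration and should be verified against the current EDM REST API documentation.

```python
import base64
import json

# Hypothetical sketch of assembling an EDMCS REST import request.
# The endpoint path and payload fields are illustrative assumptions;
# consult the EDM REST API documentation for the real contract.

def build_import_request(base_url, user, password, dimension, file_name):
    """Assemble (url, headers, body) for a dimension import request."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    }
    payload = {"dimension": dimension, "fileName": file_name}
    url = f"{base_url}/epm/rest/v1/imports"  # hypothetical path
    return url, headers, json.dumps(payload)

url, headers, body = build_import_request(
    "https://example.oraclecloud.com", "svc_user", "secret",
    "Account", "accounts.csv",
)
```

In practice you would hand these to an HTTP client inside a scheduled job; no request is sent here.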

 

  • Curate: Curate helps organize the source data through a Request mechanism to help load or change existing data. Request mechanism involves 4 steps: Record, Visualize, Validate and Submit.
    • Record: All appropriate actions are recorded together in the Request pane, with the primary action listed first.
    • Visualize: This allows you to visualize all the changes prior to committing them. This enables us to view model changes as per request, then study the impact and make any final modifications to the request. Changes made to a view are visible in unique colours and icons so that you can see which parts of the hierarchy or list were changed and what areas may be affected by the change.
    • Validate: Maintain the integrity of data during data entry with real-time validation that checks for duplicates, shared nodes, data type or lookup violation. You can run validation through the Record session as well.
    • Submit: Once the changes are validated, you can commit them.

Figure 3 Request Process
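One flavour of the real-time checks performed during Validate, duplicate node detection, can be illustrated with a small sketch (illustrative only, not EDMCS code):

```python
# Illustrative duplicate-node check, similar in spirit to the Validate step.

def find_duplicates(nodes):
    """Return node names that appear more than once, in first-seen order."""
    seen, dupes = set(), []
    for name in nodes:
        if name in seen and name not in dupes:
            dupes.append(name)
        seen.add(name)
    return dupes

# Hypothetical account nodes staged in a request
request_nodes = ["1000", "1100", "1200", "1100", "1300", "1000"]
print(find_duplicates(request_nodes))  # ['1100', '1000']
```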

 

  • Conform: Conform is the process to help standardize rules and policies through Validation, Compare and Map process tasks.
    • You can run validations between and across viewpoints. Share application data within and across the application to create consistency and build alignment.
    • Compare viewpoints side-by-side and synchronize using simple drag and drop between and across applications.
    • Map nodes across viewpoints to build data maps. Construct an alternate hierarchy across viewpoints using the copy dimension feature. Now the question arises: what is a viewpoint? A viewpoint is a subset of nodes for you to work with.

 

Figure 4 Viewpoint and Various Actions

 

  • Consume: Consume defines how changes are moved to a target application. You can achieve this through download, export, and automation.
    • Download a viewpoint to make changes offline or make bulk updates.
    • Export moves changes to other applications or syncs updated data to external target applications.
    • Using EPM Automate, you can load and extract dimensions to and from EPM cloud applications.

 

Figure 5 EPMAutomate
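Scripted consumption is usually driven by EPM Automate. The sketch below only assembles hypothetical command lines without executing anything; the verbs shown are illustrative assumptions, so check the EPM Automate documentation for the exact commands and syntax.

```python
# Sketch of scripting EPM Automate for the Consume step. The verbs below
# mirror typical EPM Automate usage but are illustrative assumptions:
# nothing is executed here, we only assemble the argument lists.

def epmautomate(*args):
    """Build an argument list suitable for subprocess.run()."""
    return ["epmautomate", *args]

commands = [
    epmautomate("login", "svc_user", "password_file.epw",
                "https://example.oraclecloud.com"),
    epmautomate("exportdimension", "Account", "account_export.csv"),
    epmautomate("downloadfile", "account_export.csv"),
    epmautomate("logout"),
]
```

Each list could then be passed to `subprocess.run()` from a scheduled job.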

 

 

Complete and Connected EPM cloud applications

 

Oracle has raised the bar with its latest complete and integrated cloud enterprise performance management applications. At the latest OpenWorld, Larry Ellison (founder of Oracle) restated Oracle's mission statement:

“help people see data in new ways, discover insights, unlock endless possibilities”.

This shows the importance that the visionary technology leader places on data and its usage. Oracle is changing from an on-premises software provider into a cloud-oriented company. It is the only company in the business software market with offerings in Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Oracle has rewritten the code of its cloud EPM applications to be optimised for the cloud. Controlling the infrastructure beneath the software gives an advantage in fine-tuning the applications' performance and security, so Oracle certainly has a competitive advantage over its competitors.

 

Oracle EPM Cloud applications help users with the following business processes:

  • Connected Planning: Planning application, Profitability and Cost Management application
  • Comprehensive Financial Close: Account Reconciliation, Financial Consolidations and Close, Tax Reporting
  • Reporting: Narrative reporting application
  • Data Management: Enterprise Data Management

 

 

Let us evaluate Oracle EPM applications against cloud partner selection criteria:

 

Cloud application availability and scalability

Since Oracle has a presence and customers across the globe, these applications are available in most countries. Scalability is built into the EPM applications and has been tested against extreme scenarios; Oracle has used simulated SaaS environments to test the scalability of these applications.

Cloud partner completeness

Oracle has comprehensive cloud applications with breadth and depth across the EPM processes described above. Additionally, these connected cloud applications are built on a common platform. This really helps the customer: coordination happens with a single partner, and the latest innovations reach every application. For example, if users like the chatbot functionality in the Planning application, Oracle can quickly roll out the same functionality in Narrative Reporting or Enterprise Data Management.

The strategic focus of the cloud partner:

As part of its strategy, Oracle has rewritten and optimised the code for its cloud applications, so customers get a best-in-class user experience and functionality. Oracle continuously invests in research and development to bring the latest machine learning, AI, and analytics innovations to its cloud applications.

Cloud partner ecosystem

Being one of the oldest players in the business software market, Oracle has established an ecosystem of implementation and support consultants. These consultants have been helping clients solve their problems using on-premises Oracle technology; now they are focusing on cloud applications.

Customer focus

As part of continuous development, Oracle collaborates with its customers and on-the-ground consultants to add new features to these cloud applications.

 

So, Oracle provides complete and connected EPM cloud applications, which can be configured to fit customer business processes. Moving to cloud applications and choosing a cloud partner should be a strategic decision, and it is best to entrust your business to the market leader. We can help with devising your cloud strategy and then implementing it using best-in-class EPM cloud applications.

Little Known EPM tools: Using JHAT

The JHAT tool is another way to automate HFM tasks using a batch file and the HFM API. It is the former HAT, updated to be compliant with the latest HFM release. JHAT makes it possible to use any scheduler to launch HFM tasks and provides greater flexibility than Task Flows.

The JHAT utility is located here; the batch file embeds all libraries, paths, and other references needed to execute HFM tasks:

 Drive:\Oracle\Middleware\EPMSystem11R1\products\FinancialManagement\Server\jhat.bat

In this example, we are going to use the power of PowerShell scripting and the functionality provided by JHat to perform an HFM metadata scan.

Running an HFM Metadata Scan using JHat

To begin with, let’s create an external variables file called  PRD_External_Vars.ps1 and define all our required variables here.

You could also create a single file and declare the variables at the beginning, but I like to keep them separate, as that is easier to manage and generally a cleaner approach.

 #HFM Variables
 $HFM_user_jh = '"hfmadmin"' #Username to log in
 $HFM_Password_jh = '"p@ssw0rd"' #Password for the user
 $HFM_Server_jh = '"HFMPrd"' #HFM cluster name
 $HFM_Application_jh = '"Demo"' #Application name
 $Delimiter = '";"' #App file delimiter
 $HFMScanMode = '"Replace"' #Use 'Scan' or 'Replace' mode

Now that we have all our variables declared, let’s just get it going with the JHat script…

Let’s begin by importing all the variables that we declared above into our JHat script.

#Create External Variable File Name (as per the environment)
$dir_ext_var = "mydrive:\mydir\PRD_External_Vars.ps1"

#Retrieve Variables from External Variable File
. $dir_ext_var

While we are at it, let’s also declare a few additional variables for the log and properties files.

#Log file to log all the steps being executed by JHat
$HFMBatchLog = "mydrive:\\mydir\\HFMMetadata\\HFM_Metadata_Update_Load.log"

#Location for the properties file that will be created by PowerShell on the fly. This will be used by JHat
$OutPath="mydrive:\mydir\HFMMetadata"

#Location of the jhat file installed on the Financial Management server
$JHatLocation = "D:\Oracle\Middleware\EPMSystem11R1\products\FinancialManagement\Server"

#Location of the properties file. This would be passed as a parameter to the jhat batch
$InputFileLocation = "mydrive:\\mydir\\HFMMetadata\\hfm_md_load.properties"

#Temporary location of the jhat log file
$LogFileLocation_jh = "mydrive:\\mydir\\HFMMetadata\\hfmJH.log"

#Temporary location of the powershell log file
$LogFileLocation_ps = "mydrive:\\mydir\\HFMMetadata\\hfmPS.log"

#Log file which can be reviewed later after the job execution is completed.
$LogPath_jh="mydrive:\\mydir\\HFMMetadata\\PROD_HFMMDScanJH.log"

#Location of the powershell if there are any errors with powershell execution.
$LogPath_ps="mydrive:\\mydir\\HFMMetadata\\PROD_HFMMDScanPS.log"

#App file that will be used to perform the scan and/or load of HFM metadata (the file can be XML too)
$DimensionFile="mydrive:\\mydir\\HFMMetadata\\HFM_MetadataFile.app"

The next interesting bit is to create the HFM metadata load properties file on the fly. This file will be used by the JHat utility to perform the metadata scan.

What we are doing below is to create a properties file that would be used by JHat to;

  1. Login to the application,
  2. Open a session for the application,
  3. Perform a metadata scan,
  4. Store the output in a log file,
  5. Close the session and
  6. Logout of the application.

#Clear contents of existing .properties file on the fly
Clear-Content $OutPath\hfm_md_load.properties

#Available Functions
#Function Name: Logon - Login to the application
#Function Name: OpenApplication - Open a session to the specified application
#Function Name: LoadMetadata – Scan and/or load HFM metadata into the specified application
#Function Name: CloseApplication – Close the session opened
#Function Name: Logout – Log out of the application

#Create .properties file on the fly
Add-Content -Path $OutPath\hfm_md_load.properties -Value "Logon(""False"","""",$HFM_user_jh,$HFM_Password_jh);"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "OpenApplication($HFM_Server_jh,$HFM_Application_jh);"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "LoadMetadata($DimensionFile,$LogPath_jh,$Delimiter,$HFMScanMode,""True"",""True"",""True"",""True"",""True"",""True"",""True"",""True"",""True"",""True"",""False"",""False"",""False"",""True"");"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "CloseApplication();"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "Logout();"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "End"

#Call Jhat api
#The jhat batch requires the log file location and the inputfile location as the parameter
Start-Process -FilePath "$JHatLocation\jhat.bat" -ArgumentList "-O$LogFileLocation_jh -I$InputFileLocation"

Finally, let’s run the PowerShell script.

Once the execution is complete, we can check Consolidation Administration and see that the metadata load started and completed without any errors.

There are various other functions available with JHat.

Running an Intercompany Report using JHat


#Function Name: GenerateReport – To Generate ICP report
#Arg0 = Path (Path of the document in document manager)
#Arg1 = docName (Name of the document)
#Arg2 = reportType (valid options - intercompany, journal, EPU, ICTransactions, IC Match By Account, IC Match by ID)
#Arg3 = reportFormat (HFM_FORMAT)
#Arg4 = reportFile (location of the file where report must be stored)
#Arg5 = overriddenPOV (specify the POV to override it with)

GenerateReport("\\\","Monitoring_REP_Plug_Acct_Matching", "intercompany","HFM_FORMAT","D:\Oracle\Temp\Workspace\Intercompany\InterCompany.html","S#Scenario.Y#2019.P#Jun.W#YTD.V#<Entity Curr Total>.E#{Example.[Base]}");

 

 

How Account Reconciliation Cloud Service (ARCS) from Oracle Can Improve the Efficiency of Finance Users

 

In this world of cloud computing and mobile applications, we have software applications to increase the efficiency of our personal and professional tasks. Software applications are used for grocery lists, tracking heart rate, counting energy burned, and so on. It is interesting, then, that more than 60% of organizations still perform manual reconciliations between systems and subsystems. The approval process is managed using emails, and there is limited visibility into accountability. Lack of governance in the account reconciliation process causes problems in audits. Accountants are not leveraging technology to reduce the burden of the monthly close process, and with new IFRS standards and new regulatory requirements, the reconciliation burden is only going to increase. Account reconciliation is one of the major bottlenecks in the close process.

The good news is that Account Reconciliation Cloud Service (ARCS) from Oracle can help finance users perform automatic account reconciliations, manage the process efficiently, and avoid security risks. Using ARCS, finance users can close their books earlier and spend more time on analytical activities. Since this is a cloud-based solution, no capital expenditure is required. Once ARCS is implemented, it runs on the Oracle cloud and the IT department does not need to maintain it.

 

This best-in-class reconciliation solution provides the Reconciliation Compliance and Transaction Matching modules in a single instance.

  1. Reconciliation Compliance: This module helps manage reconciliation processes and ensures that compliance requirements are met.
  2. Transaction Matching: This module increases efficiency and saves cost by automatically matching high-volume, labour-intensive transactions.

The reconciliation process is executed in 5 steps in the Reconciliation Compliance module:

  1. Load Balances: Balances from source systems and subsystems can be loaded directly into ARCS, and loading can be scheduled per client requirements. Once data is loaded into ARCS, it can be drilled back to the source system. ARCS has prebuilt integrations with EBS, JDE, PeopleSoft, and other market-leading ERP systems.
  2. Automation Executed: Once data from source systems and subsystems is loaded, ARCS performs auto-reconciliations based on account analysis, balance comparison, and custom reconciliation criteria. Custom attributes are assigned based on the rules defined, amortization is calculated automatically, and email notifications are sent to preparers and reviewers.
  3. Prepare Reconciliations: Preparers work on their assigned reconciliations by providing account explanations (prepaids, accruals) and doing balance-to-balance comparisons (AP/AR/inventory/bank reconciliations/depreciation). ARCS includes 20 prebuilt templates for common reconciliations, and the administrator can create new templates per the organization's requirements.
  4. Review Reconciliations: Users can add comments and attachments to each reconciliation and must sign off on reviewed reconciliations. ARCS supports multi-level approval and rejection workflows.
  5. Monitor the compliance process: ARCS gives instant visibility into the progress and status of the reconciliation life cycle. It shows status charts for reconciliations pending with the preparer, pending with the reviewer, and closed. The ageing analysis dashboard shows the ageing of all transactions by ageing bucket. Each reconciliation (including attachments and invoice-level details) can be extracted into a report binder, and these binders can be shared with auditors.
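The balance-comparison auto-reconciliation in step 2 can be illustrated with a trivial sketch: an account auto-reconciles when source and subsystem balances agree within a tolerance. The accounts, balances, and tolerance below are hypothetical, and this is not ARCS's actual logic.

```python
# Illustrative balance-comparison auto-reconciliation (not ARCS's engine).

def auto_reconcile(source_balances, sub_balances, tolerance=0.0):
    """Return a status per account based on balance comparison."""
    status = {}
    for account, src in source_balances.items():
        sub = sub_balances.get(account)
        if sub is None:
            status[account] = "missing in subsystem"
        elif abs(src - sub) <= tolerance:
            status[account] = "auto-reconciled"
        else:
            status[account] = f"difference {src - sub:.2f}"
    return status

# Hypothetical general ledger vs. subsystem balances
gl = {"1000-Cash": 5000.00, "1200-AR": 12500.50, "2000-AP": -800.00}
bank = {"1000-Cash": 5000.00, "1200-AR": 12400.50, "2000-AP": -800.00}
print(auto_reconcile(gl, bank))
```

Accounts that fail the comparison would then be routed to a preparer, as described in step 3.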

In this way, ARCS efficiently manages the reconciliation process with the visibility and automation needed for improved accuracy and security and reduced risk.

If your organization has a manual account reconciliation process and wants real-time insight into the status of the reconciliation and close process, ARCS can help streamline it.

Transaction Matching: This component helps in the automatic matching of high-volume, effort-intensive transactions across one or more data sources. Examples: POS to bank, POS to cash and credit card, and bulk matching of intercompany (AP/AR) transactions.

With flexible, customizable rules, the majority of transactions are auto-matched. The ARCS intelligence engine then suggests matches for the remaining unmatched transactions.

Transaction matching has 4 high-level steps:

  1. Load Data: Transaction data from source systems (such as POS and bank) can be uploaded manually or scheduled using Data Management. ARCS can also accept pre-mapped transaction data loaded directly.
  2. Run Auto Match: User-defined custom rules run and automatically match transactions between the two systems.
  3. Confirm Suggested Matches: For unmatched transactions, the ARCS intelligence engine suggests matches. Users can accept those suggestions or modify them to match transactions.
  4. Create Manual Matches: Users can manually match transactions by providing a reason code or adding adjustments. These steps are repeated for every data load.
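The rule-based auto-match in step 2 can be sketched as a one-to-one match on reference and amount. The data and matching rule below are hypothetical illustrations, not ARCS's engine.

```python
# Illustrative one-to-one auto-match between two sources on (ref, amount).
# A sketch of rule-based matching, not ARCS's actual engine.

def auto_match(pos, bank):
    """Match transactions by (ref, amount); return matches and leftovers."""
    unmatched_bank = {(t["ref"], t["amount"]): t for t in bank}
    matches, unmatched_pos = [], []
    for t in pos:
        key = (t["ref"], t["amount"])
        if key in unmatched_bank:
            matches.append((t, unmatched_bank.pop(key)))
        else:
            unmatched_pos.append(t)
    return matches, unmatched_pos, list(unmatched_bank.values())

pos = [{"ref": "T1", "amount": 99.95}, {"ref": "T2", "amount": 20.00}]
bank = [{"ref": "T1", "amount": 99.95}, {"ref": "T3", "amount": 45.10}]
matches, open_pos, open_bank = auto_match(pos, bank)
```

The leftovers on either side are what the suggested-match and manual-match steps (3 and 4) would then pick up.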

 

ARCS provides a reconciliation balancing report for operational controls and has prebuilt reports on open/closed adjustments and supported transactions.

With the capabilities above, ARCS can significantly reduce the burden on finance users and increase the efficiency of the close process.

We can help with implementing this best in class solution as per the business requirements of your company.

Important Software EOL (End of Life) Dates (Microsoft & Oracle)

Microsoft and Oracle have some critical end of life dates quickly approaching.  Any customers running on the platforms listed below should begin to prepare for these upcoming support changes.  Contact us today to discuss how these support policy changes might affect your Financial Close, Budgeting and Reporting systems.

 

Important Software EOL (End of Life) Dates

The following list represents products reaching end of support in the next year. For a comprehensive list of Microsoft products and their lifecycle policy timelines, please search the Microsoft Lifecycle Product Database.

  • Microsoft Windows 2008 Server – January 2020
  • Microsoft SQL Server 2008 – July 2019
  • Microsoft Office 2010 – October 2020

Lifetime Support dates for EPM System release 11.1.2.x have been extended.  The new, extended dates are publicly available in the newly published Lifetime Support Policy: Oracle Fusion Middleware Products.

  • 11.1.2.4.x – December 2020, extended support to December 2021

 

11.2.x Release Date & Rumours

  • 11.2.x – To be released September 2019, end of life date of December 2030
  • Oracle has moved the Essbase development work out of EPM and into the Oracle Database development team.
  • When EPM 11.2 comes out, Essbase will initially remain 11.1.2.4 technology under the covers: we won’t be getting the new Sandbox features introduced with Essbase 12 in on-premises OBI12 / cloud OAC.
  • OAC will no longer have Essbase bundled with it, effective immediately for all new customer-managed OAC implementations. Essbase “12c” will have to be installed separately as a standalone instance, and then Essbase cubes would need to be migrated from the old Essbase instance into a new standalone Essbase 12c instance.
  • A new Essbase, Essbase 19c, is under development for 11.2. It is expected to come out sometime next year.  Essbase 19c will be for on-premises only.
  • 11.1.2.4 on-premises Essbase patch development has apparently stopped and will not continue.
  • It is rumoured that 11.2 will first come out for Windows only, perhaps as early as Sept 2019 (SAFE HARBOR APPLIES), and a Linux version will not come out until Q1 or Q2 2020 (SAFE HARBOR APPLIES).

 

11.2.x Discontinued Features

  • Hyperion Financial Management
    • Financial Management Analytics
    • Essbase Analytics Link for HFM
    • Quantitative Management and Reporting for Solvency
  • Hyperion Planning
    • Hyperion Strategic Finance (HSF)
    • Simplified User Interface (SUI)
    • Workforce Planning
    • Capital Expense Planning
    • Project Financial Planning
    • Offline Planning
  • Hyperion BI +
    • Interactive Reporting (IR)
    • Production Reporting (SQR)
    • Web Analysis (WA)
  • Other
    • Disclosure Management
    • EPM Mobile

 

This means that staying on-premises is still a choice that will continue to be fully supported until 2030, with some conditions. If you make this decision, you will miss out on the constant product enhancements available in the EPM cloud solution. Staying on-premises can also be as expensive as the cloud solution, because you will incur further costs such as hardware and software upgrades and patches for both the EPM applications and the associated third-party components (e.g., Oracle and Microsoft SQL Server databases).

To determine the best path forward, owners should weigh the costs from both a short-term and a long-term perspective, as well as compare the difference in functionality between the two offerings.

Need help defining your EPM solutions roadmap?   Contact our team at [email protected] for more information

 

Disclaimers:

  1. Safe Harbour applies. Some of Oracle’s comments are directional in nature and may change in the future.

IFRS-16

IFRS 16 is a new accounting standard for leases that goes into effect in January 2019 and requires lessees to recognise most leases on their balance sheets. This involves much more work in collecting, calculating, reporting, and disclosing data than you might think.

But with the right solutions, you can fulfil all these requirements while maintaining the high quality of your data.

At J&M, we are dedicated to making your journey towards IFRS 16 an easy one. In addition to knowing everything you need to do to prepare your systems for the reporting standard, we have been entrusted to provide end-to-end services for Oracle’s EPM Solution.