DM Migration Status
Phase 1: Planning
Define the objectives and scope of the data migration
Identify the source and target systems for the migration and draft an inventory of your data ecosystem
Determine the data migration timeline, including deadlines
Analyze the data to be migrated, including its size, complexity, and quality
Identify any data dependencies or relationships that need to
be maintained
Determine the migration approach (e.g., big bang, where the entirety of your data is migrated in a single operation; phased; or parallel)
Identify the resources needed for the data migration, including personnel, hardware, and software
Develop a risk management plan for the data migration, including contingency plans for unexpected issues
Determine the data mapping and transformation requirements
Develop a testing plan to ensure the accuracy and completeness of the migrated data
Identify any legal or regulatory requirements that must be considered during the migration
Create a communication plan to keep stakeholders informed throughout the migration process
Determine the data retention requirements after the migration is complete
Develop a plan for training users on the new system and processes
Identify the key performance indicators (KPIs) that will be
used to measure the success of the data migration
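As an illustration, KPIs of this kind can be computed directly from load results. The sketch below is purely illustrative: the function name and the figures are hypothetical, not taken from this project.

```python
# Illustrative KPI calculation for one migration batch (all figures hypothetical).
def migration_kpis(total_records: int, loaded: int, failed: int) -> dict:
    """Return basic success/error rates for a migration batch."""
    return {
        "success_rate": loaded / total_records,          # fraction loaded successfully
        "error_rate": failed / total_records,            # fraction that errored
        "unaccounted": total_records - loaded - failed,  # rows neither loaded nor failed
    }

kpis = migration_kpis(total_records=10_000, loaded=9_850, failed=150)
print(kpis)  # success_rate 0.985, error_rate 0.015, unaccounted 0
```

An "unaccounted" count above zero flags batches where the load tool reported fewer outcomes than rows submitted, which is itself worth investigating.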
Identify the chain of custody requirements for the data
migration project, including any legal or regulatory
requirements that must be met
Phase 2: Analysis
Develop a plan for data analysis and reporting, both during
and after the data migration process
Define the data analysis tools and techniques to be used, including data profiling, data quality analysis, and data lineage analysis
Create a data validation plan to ensure the accuracy and completeness of the migrated data
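A validation plan typically automates completeness checks per column. A minimal sketch, assuming extracts are represented as lists of dicts; the column names are hypothetical:

```python
# Hedged sketch: per-column completeness profile for an extract,
# the kind of check a data validation plan might automate.
def column_profile(rows):
    """Return, per column, the count of non-empty values."""
    profile = {}
    for row in rows:
        for col, val in row.items():
            filled = val not in (None, "")
            profile[col] = profile.get(col, 0) + (1 if filled else 0)
    return profile

rows = [
    {"name": "Acme", "iban": "GB001234", "currency": "GBP"},   # illustrative data
    {"name": "Globex", "iban": "", "currency": "EUR"},
]
print(column_profile(rows))  # {'name': 2, 'iban': 1, 'currency': 2}
```

Comparing the profile of the source extract against the loaded target quickly surfaces columns that were dropped or blanked in transit.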
Develop a plan for data reconciliation and error resolution
Identify any gaps or inconsistencies and develop a plan to address them
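The reconciliation step above amounts to comparing business keys between the source extract and the target load. A minimal sketch; the key name "party_id" is hypothetical:

```python
# Hedged sketch of a reconciliation step: diff business keys between the
# source extract and the target load to drive error resolution.
def reconcile(source_rows, target_rows, key="party_id"):
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src - tgt),     # rows that failed to load
        "unexpected_in_target": sorted(tgt - src),  # rows with no source record
    }

src = [{"party_id": 1}, {"party_id": 2}, {"party_id": 3}]
tgt = [{"party_id": 1}, {"party_id": 3}, {"party_id": 9}]
print(reconcile(src, tgt))  # {'missing_in_target': [2], 'unexpected_in_target': [9]}
```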
Determine the data governance requirements for the
migrated data, including data ownership, data privacy, and
data security
Develop a plan for data integration and consolidation, if
necessary
Identify any data cleansing or data enrichment requirements
Develop a plan for data archiving or data deletion, if necessary
Determine the reporting requirements for the migrated data
and develop a plan for creating reports
Identify any data visualization requirements and develop a
plan for creating visualizations
Create a plan for data governance and maintenance, including data quality monitoring and data stewardship
Develop a plan for data analytics training and support for end-users
Test and validate all analytics components and processes
Phase 3: Pilot migration
Select a representative subset of data to migrate during the pilot phase
Develop a pilot migration plan, including timelines and
milestones
Establish a test environment for the pilot migration
Test the migration tools and processes on the pilot data set
Verify the migrated data in the target system
Test the migrated data for accuracy, completeness, and consistency
Validate the data governance and maintenance processes
Identify any issues or gaps in the pilot migration and develop a plan to address them
Monitor the performance of the migrated data in the target
system
Collect feedback from end-users on the migrated data and
processes
Assess the success of the pilot migration against the defined
KPIs
Analyze the results of the pilot migration to identify any areas
for improvement
Refine the migration processes and tools based on the results of the pilot migration
Create a plan to address any issues or gaps identified during the pilot migration
Prepare for the full-scale migration based on the lessons
learned during the pilot migration
Phase 4: Full migration
Follow the full migration plan, including timelines and milestones
Identify the defined data to be migrated and verify that it is complete and accurate
Verify the readiness of the target system for the migration
Confirm the migration approach defined in the migration plan
(e.g., big bang, parallel, or phased)
Verify the defined data mapping plan to ensure the data is
mapped correctly to the target system
Verify the defined data transformation plan and steps to ensure the data is transformed correctly
Verify the defined data validation plan for the target system to ensure the migrated data is complete and accurate
Verify the defined data reconciliation plan to ensure any data inconsistencies are resolved
Verify the migrated data in the target system
Test and validate the migrated data for accuracy,
completeness, and consistency per the data validation
requirements
Follow any data governance and maintenance processes
Monitor the performance of the migrated data in the target system
Collect feedback from end-users on the migrated data and
processes
Assess the success of the migration against the defined KPIs
Analyze the results of the migration to identify any areas of improvement
Refine the migration processes and tools based on the results
of the migration
Create a plan to address any issues or gaps identified during
the migration
Implement chain of custody procedures for the full migration
process
Prepare for post-migration activities, such as data governance
and maintenance, data quality monitoring, and user training
and support
Phase 5: Post-migration
Verify that all data has been successfully migrated to the target system
Validate that the migrated data is complete, accurate, and
consistent
Conduct data quality monitoring to ensure the migrated data continues to meet the required quality standards
Validate that the data governance and maintenance processes are working as expected
Monitor the performance of the migrated data in the target system
Collect feedback from end-users on the migrated data and
processes
Analyze the feedback received and identify areas for
improvement
Conduct user training and support sessions to help end-users
become proficient in using the new system and processes
Provide ongoing support to end-users to help them
troubleshoot any issues they encounter
Conduct periodic data reviews to ensure that the migrated
data continues to meet the required quality standards
Conduct periodic data audits to ensure that the migrated data is being used and maintained correctly
Establish a process for handling any data-related issues that may arise
Continuously monitor and improve the data governance and
maintenance processes
Update the data governance policies and procedures to
reflect any changes resulting from the migration
Conduct a post-migration review to assess the success of the migration against the defined KPIs
Verify the completion of chain of custody for the migrated data
Component | Owner | Completed? | Risks | Notes
Location
Grade
Department
Position
Shared HR – Employee
Customer
Header
Customer
Accounts
Customer Sites
Customers
Customer Reference Accounts
Customer
Contacts &
Relationships
Customer
Profiles
Customer Bank
Accounts
Suppliers
Supplier
Addresses
Suppliers
Supplier Sites
Suppliers
Supplier Site Assignments
Supplier Site
Contacts
Supplier
Products and
Services
Categories
Supplier Tax
profiles
GL Balances –
Opening
General Ledger
GL History - Period Balances
six (2) Years
COA
Fixed Assets
• Mass Additions
• Mass Addition Distributions
Customer Invoices
• Invoice Lines
• Invoice Distributions
Customer Receipts
Purchase Requisitions - Open
• Header
• Requisition Line
• Requisition Distribution
Purchase Orders - Open
• Purchase
Order Header
• Purchase
Order Lines
• Purchase
Order Schedules
• Purchase
Order
Distributions
Purchase Receipts – Open TBD
Buyer's List
User Information and User Roles
Project
Contracts - Subscription
Contracts and Project - PPM
Items
SOW Scope
Source
MSD
Only active External Bank and Branches will be
created in Oracle
MSD
MSD
NA
Yet To Finalize
Purchasing
Coupa
Contracts - Subscription
Contracts and Project - PPM
Items
Tool
Rapid Implementation
DMF / Rapid
Implementation
DMF / HDL
DMF / HDL
FBDI
DMF / FBDI
DMF / FBDI
DMF / FBDI
FBDI
DMF / FBDI
DMF / FBDI
DMF / FBDI
Data Migration Approach
• Any internal bank accounts will be created as part of configuration activity.
• Bank information requires the country, name, and number.
• Branch information requires name, number, BIC code, and alternate name.
• Account information requires name, number, currency, legal entity, type, and IBAN.
• Existing bank accounts cannot be modified through Rapid Implementation.
• Payment Documents and Formats will have to be manually configured in Cloud.
• Additional configuration such as Bank Statement Reconciliation, DFFs, Security, and Business Unit will have to be manually configured.
• Internal bank accounts will be extracted as part of the configuration workbook and set up during environment build.
Note: Eand will be providing the necessary extract information to Infolob for the purpose of data verification. This step is crucial to ensure that all data is accurate before we proceed with the loading process.
• All active work structure components will be migrated as of cut over date.
• The order of loading of the 6 basic work structures should be:
a) Location
b) Grade
c) Job Family
d) Job
e) Departments with classifications
f) Positions
• Other work structure components (like Department Tree, Position Hierarchy, etc.) should be loaded only after the above work structure components are loaded successfully.
• Most of the work structure objects are date tracked, so the way the EffectiveStartDate and EffectiveEndDate attributes are defined in SQLs should not be changed without proper understanding.
• Some of the inactive work structures may be assigned to employee assignments in Oracle EBS, as there is no check there. However, in Fusion, inactive work structures can't be assigned to active assignments. This will create a conflict while loading worker data in Fusion, and some of the worker records may fail.
• If the project team encounters issues due to inactive work structures, the corresponding work structures should be loaded as Active first (or made Active from the UI) and, post worker assignment load, the relevant work structures should be made inactive.
• It is advised to load only the below mandatory worker business objects in the first round of each iteration:
o Worker
o Person Name
o Person Legislative Data
o Work Relationship
o Work Terms
o Assignment
• The other objects (like email, address, phone, supervisor, etc.) should be loaded in a second round once the mandatory worker data is loaded successfully.
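The load order prescribed above can be enforced in tooling so that dependent objects are never loaded on top of a failed prerequisite. A minimal sketch; the loader callable is a placeholder, not a real HDL or FBDI API:

```python
# Hedged sketch: drive loads in the dependency order listed above and stop
# at the first failure. `loader` is a hypothetical callable that returns
# True on success for a given work structure name.
WORK_STRUCTURE_ORDER = [
    "Location", "Grade", "Job Family", "Job",
    "Departments with classifications", "Positions",
]

def load_all(loader, order=WORK_STRUCTURE_ORDER):
    """Return (structures loaded successfully, first failing structure or None)."""
    loaded = []
    for name in order:
        if not loader(name):       # stop so dependents are never attempted
            return loaded, name
        loaded.append(name)
    return loaded, None

done, failed = load_all(lambda name: name != "Job")  # simulate a failure at "Job"
print(done, failed)  # ['Location', 'Grade', 'Job Family'] Job
```

The same pattern extends to the two-round worker load: run the mandatory objects first, and only invoke the second round when the first returns no failure.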
• Most of the worker objects are date tracked, so the way the EffectiveStartDate and EffectiveEndDate attributes are defined in SQLs should not be changed without proper understanding. Any modification in the standard worker queries will need rigorous testing.
• All active customer accounts and sites are considered as of cut over date.
• New customers created in the source system after the extract for production migration will be created manually in Fusion.
• Customer bank accounts and receipt methods would be migrated only for Direct Debit customers.
• Customer contact details would be migrated.
• Customer Sharing - Multiple customer accounts are to be created for the same party if a customer is shared between sub-processes.
• It is recommended to migrate Customers prior to loading any other TCA entities like Suppliers, Employees, etc., because the Customer FBDI provides a placeholder for migrating the Party Number (REGISTRY_ID) to Fusion, while for other TCA entities it is generated as a sequence value by Fusion. This makes sure that the party numbers used by the Customer extract are not utilized while creating other TCA entities.
• All data transformation and mapping will be completed before it is loaded into the template (e.g., Customer Account type, Customer Site Purpose such as Bill-to, Ship-to).
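The TCA ordering rationale can also be checked defensively: before loading other TCA entities, confirm that none of their party numbers collide with the REGISTRY_IDs carried over by the Customer extract. A minimal sketch with illustrative data:

```python
# Hedged sketch: detect collisions between REGISTRY_IDs reserved by the
# Customer FBDI extract and party numbers intended for other TCA entities.
# All identifiers below are made up for illustration.
def check_registry_collisions(customer_registry_ids, other_party_numbers):
    """Return any party numbers already reserved by the customer extract."""
    return sorted(set(customer_registry_ids) & set(other_party_numbers))

customer_ids = {100001, 100002, 100003}
supplier_ids = {100003, 200001}
print(check_registry_collisions(customer_ids, supplier_ids))  # [100003]
```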
• Both PROSPECTIVE and SPEND AUTHORIZED suppliers will be migrated.
• If Paysafe uses a supplier qualification process, then they should review and try to close any unapproved suppliers before migration. If any PROSPECTIVE supplier at source is under an approval process during the data migration cut over, then it will be migrated as a PROSPECTIVE supplier only. Any incomplete approval process is to be re-initiated by Paysafe manually in the target system after data migration is over.
• All active Suppliers and Supplier Sites for the OU name from the Source System are to be considered.
• Employees will not be migrated as Suppliers.
• New suppliers created in the Source System after the extract for production migration will be created manually in Oracle Fusion Cloud.
• Supplier addresses are to have a one-to-one mapping with sites to facilitate storage of the tax code. Supplier addresses shall be the same as the site name to create a unique address in Fusion.
Note: Oracle will be providing the necessary extract information to Infolob for the purpose of data verification.
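The one-to-one address-to-site rule above is easy to verify mechanically before loading. A minimal sketch; the record layout (address name, site name pairs) is an assumption for illustration:

```python
# Hedged sketch: verify that each supplier address matches its site name
# and that no address name is reused, per the one-to-one rule.
def mapping_violations(records):
    """records: iterable of (address_name, site_name) pairs; returns violations."""
    seen = set()
    bad = []
    for address, site in records:
        if address != site:
            bad.append((address, site, "name mismatch"))
        elif address in seen:
            bad.append((address, site, "duplicate address"))
        seen.add(address)
    return bad

rows = [("LON-01", "LON-01"), ("PAR-01", "PARIS")]   # illustrative data
print(mapping_violations(rows))  # [('PAR-01', 'PARIS', 'name mismatch')]
```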
Volume: 5,000 (All Active Suppliers); 5,000
Risks:
- Duplicates
- Country-specific validation issues
Note: Based on country validations, Bank Code/Number & defaulting rules need to be agreed.
Risk: High
Notes
Loaded Loaded
Completed loaded
Extraction Yet to Send
Completed loaded
Extraction Yet to Send
Completed loaded
Extraction Yet to Send
Completed loaded
Extraction Yet to Send
No bank Accounts NA NA
Completed Loaded
Completed Loaded
Completed Loaded
Extraction Sent
Completed Loaded
Not Yet Started | Data Received up to Aug-24
Not Yet Started | Data Received up to Aug-24