HDL – Sample file to cancel Pending Worker Work Relationship

If you want to cancel the work relationship of a Pending Worker, the HDL file below can be used. Make sure to pass CancelWorkRelationshipFlag as Y.

Sample HDL:

METADATA|WorkRelationship|SourceSystemOwner|SourceSystemId|PersonId|CancelWorkRelationshipFlag
DELETE|WorkRelationship|FUSION|300000092554146|300000092554127|Y

You can extract the relevant IDs using the below SQL:

select hikm.source_system_owner, hikm.source_system_id, ppos.person_id
  from per_periods_of_service ppos
      ,hrc_integration_key_map hikm
 where ppos.period_of_service_id = hikm.surrogate_id
   and ppos.person_id in (select person_id
                            from per_all_people_f
                           where person_number = '1894')
BIP – Extract Absence Entry data in HDL Format

The below query extracts the minimum required attributes for an absence entry update. In this example, the absence status is updated to Withdrawn. You can modify it as per your needs:

SELECT DATA_ROW
FROM (
SELECT 'METADATA'
|| CHR (124) || 'PersonAbsenceEntry'
|| CHR (124) || 'PerAbsenceEntryId'
|| CHR (124) || 'AbsenceStatus'
|| CHR (124) || 'AbsenceTypeId'
|| CHR (124) || 'AssignmentId'
|| CHR (124) || 'EmployerId'
|| CHR (124) || 'PersonId'
|| CHR (124) || 'StartDate'
|| CHR (124) || 'EndDate'
|| CHR (124) || 'SourceSystemId' 
|| CHR (124) || 'SourceSystemOwner'  AS DATA_ROW
FROM DUAL
UNION all
select  'MERGE'
|| CHR (124) || 'PersonAbsenceEntry'
|| CHR (124) || apae.per_absence_entry_id
|| CHR (124) || 'ORA_WITHDRAWN'
|| CHR (124) || apae.absence_type_id
|| CHR (124) || apae.assignment_id
|| CHR (124) || apae.legal_entity_id
|| CHR (124) || apae.person_id
|| CHR (124) || to_char(apae.start_date,'RRRR/MM/DD')
|| CHR (124) || to_char(apae.end_date,'RRRR/MM/DD')
|| CHR (124) || hikm.source_system_id
|| CHR (124) || hikm.source_system_owner
FROM ANC_PER_ABS_ENTRIES apae,
	 HRC_INTEGRATION_KEY_MAP hikm
WHERE hikm.surrogate_id = apae.per_absence_entry_id
  --AND apae.absence_type_id = 3000012211
   AND apae.per_absence_entry_id = 300000079292534
)

Sample HDL File:

METADATA|PersonAbsenceEntry|PerAbsenceEntryId|AbsenceStatus|AbsenceTypeId|AssignmentId|EmployerId|PersonId|StartDate|EndDate|SourceSystemId|SourceSystemOwner
MERGE|PersonAbsenceEntry|300000079292534|ORA_WITHDRAWN|300000076789541||300000048608295|300000049253269|2014/06/11|2014/06/13|300000079292534|FUSION
HDL – Sample file to load Job Model profile data
METADATA|TalentProfile|ProfileId|ProfileCode|Description|PersonId|PersonNumber|Summary|ProfileTypeCode|ProfileTypeId|ProfileStatusCode|ProfileUsageCode|SourceSystemId|SourceSystemOwner
MERGE|TalentProfile||JOB_PROFILE_CODE1|Job Profile for CODE1||||JOB||A|M|TP_JOB_CODE1|HRC_SQLLOADER

METADATA|ProfileRelation|ProfileRelationId|BusinessUnitName|JobCode|JobFamilyName|JobSetCode|LocationCode|LocationSetCode|OrganizationName|PositionCode|ProfileCode|RelationCode|ProfileId|LearningItemNumber|ObjectEffectiveEndDate|ObjectEffectiveStartDate|JobFamilyId|JobId|LearningItemId|LocationId|OrganizationId|PositionId|SourceSystemId|SourceSystemOwner
MERGE|ProfileRelation|||CODE1||COMMON|||||JOB_PROFILE_CODE1|JOB||||2021/01/01|||||||PR_JOB_CODE1|HRC_SQLLOADER

METADATA|ProfileItem|ProfileCode|ContentItem|ContentItemId|ContentType|ContentTypeId|CountryCountryCode|CountryGeographyCode|DateFrom|Importance|InterestLevel|ItemDate1|ItemDate2|RatingModelCode1|RatingModelId1|SectionId|SectionName|DateTo|SourceSystemId|SourceSystemOwner
MERGE|ProfileItem|JOB_PROFILE_CODE1|Process Oriented||COMPETENCY||||2021/01/01|2||||PROFICIENCY|||||PI_JOB_CODE1|HRC_SQLLOADER
HDL – Sample HDL to load rehire Workers

Below is a sample file (tested on an older release) to load rehire worker records; the structure should remain the same. A few attributes may be deprecated now, so you may get an error. Remove those attributes and the file should load fine.

METADATA|Worker|PersonId|EffectiveStartDate|EffectiveEndDate|PersonNumber|StartDate|DateOfBirth|CountryOfBirth|TownOfBirth|CategoryCode|ActionCode|ReasonCode|GUID|SourceSystemOwner|SourceSystemId

MERGE|Worker||2002/10/07|4712/12/31|10090|2002/10/07|1976/01/02||||HIRE|||HRC_SQLLOADER|1190_PERSON



METADATA|PersonName|PersonNameId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|NameType|FirstName|MiddleNames|LastName|Honors|KnownAs|PreNameAdjunct|Suffix|MilitaryRank|PreviousLastName|Title|SourceSystemOwner|SourceSystemId

MERGE|PersonName||2002/10/07|4712/12/31|1190_PERSON||US|GLOBAL|Steven|Flord|Smith|||||||MR.|HRC_SQLLOADER|1190_PERSON_NAME


METADATA|WorkRelationship|PersonNumber|LegalEntityId|DateStart|PersonId(SourceSystemId)|PrimaryFlag|LegalEmployerName|ActionCode|WorkerType|ProjectedTerminationDate|ActualTerminationDate|SourceSystemOwner|SourceSystemId

MERGE|WorkRelationship|10090||2002/10/07|1190_PERSON|Y|ABC Limited|HIRE|E||2008/12/07|HRC_SQLLOADER|283_WORK_RELATIONSHIP

MERGE|WorkRelationship|10090||2015/02/02|1190_PERSON|Y|ABC Limited|REHIRE|E|||HRC_SQLLOADER|101062_WORK_RELATIONSHIP



METADATA|WorkTerms|AssignmentId|AssignmentNumber|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|LegalEmployerName|DateStart|AssignmentName|AssignmentStatusTypeCode|AssignmentType|BusinessUnitShortCode|PositionOverrideFlag|PrimaryWorkTermsFlag|ActionCode|WorkerType|PersonTypeCode|SystemPersonType|SourceSystemOwner|SourceSystemId

MERGE|WorkTerms||ET10090|2002/10/07|2008/12/07|1|Y|283_WORK_RELATIONSHIP|1190_PERSON||ABC Limited|2002/10/07||ACTIVE_PROCESS|ET|ABC Business Unit|||HIRE|E|Employee|EMP|HRC_SQLLOADER|283_WORK_TERM

MERGE|WorkTerms||ET10090|2008/12/08|4712/12/31|1|Y|283_WORK_RELATIONSHIP|1190_PERSON||ABC Limited|2002/10/07||INACTIVE_PROCESS|ET|ABC Business Unit|||TERMINATION||||HRC_SQLLOADER|283_WORK_TERM

MERGE|WorkTerms||ET10090-2|2015/02/02|2017/06/30|1|Y|101062_WORK_RELATIONSHIP|1190_PERSON||ABC Limited|2015/02/02||ACTIVE_PROCESS|ET|ABC Business Unit|||ASG_CHANGE|E|Employee|EMP|HRC_SQLLOADER|12608_WORK_TERM

MERGE|WorkTerms||ET10090-2|2017/07/01|4712/12/31|1|Y|101062_WORK_RELATIONSHIP|1190_PERSON||ABC Limited|2015/02/02||ACTIVE_PROCESS|ET|ABC Business Unit|||ASG_CHANGE|E|Employee|EMP|HRC_SQLLOADER|12608_WORK_TERM



METADATA|Assignment|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|WorkTermsAssignmentId(SourceSystemId)|WorkTermsNumber|AssignmentType|AssignmentName|AssignmentNumber|AssignmentStatusTypeCode|BusinessUnitShortCode|DateProbationEnd|WorkerCategory|AssignmentCategory|PermanentTemporary|FullPartTime|InternalFloor|GradeCode|JobCode|LocationCode|NormalHours|Frequency|DepartmentName|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PositionCode|PrimaryAssignmentFlag|PrimaryFlag|PersonTypeCode|SystemPersonType|EstablishmentId|NoticePeriod|NoticePeriodUOM|ProbationPeriod|ProbationUnit|HourlySalariedCode|LabourUnionMemberFlag|ManagerFlag|EndTime|StartTime|WorkAtHomeFlag|SourceSystemOwner|SourceSystemId

MERGE|Assignment|HIRE|2002/10/07|2008/12/07|1|Y|283_WORK_TERM||E||E10090|ACTIVE_PROCESS|ABC Business Unit||||||||||||ABC Limited|283_WORK_RELATIONSHIP|1190_PERSON||Y|Y|Employee|EMP||||||||||||HRC_SQLLOADER|283_ASSIGNMENT

MERGE|Assignment|TERMINATION|2008/12/08|4712/12/31|1|Y|283_WORK_TERM||E||E10090|INACTIVE_PROCESS|ABC Business Unit|||||||72|128|343|||ABC Maintenance|283_WORK_RELATIONSHIP|1190_PERSON||Y|Y||||||||||||||HRC_SQLLOADER|283_ASSIGNMENT

MERGE|Assignment|REHIRE|2015/02/02|2017/06/30|1|Y|12608_WORK_TERM||E||E10090-2|ACTIVE_PROCESS|ABC Business Unit|||||||67|173|142|||ABC Department|101062_WORK_RELATIONSHIP|1190_PERSON|165061|Y|Y|Employee|EMP|||||||N|N||||HRC_SQLLOADER|12608_ASSIGNMENT

MERGE|Assignment|ASG_CHANGE|2017/07/01|4712/12/31|1|Y|12608_WORK_TERM||E||E10090-2|ACTIVE_PROCESS|ABC Business Unit|||||||67|173|142||ABC Department|101062_WORK_RELATIONSHIP|1190_PERSON|627106|Y|Y|Employee|EMP|||||||N|N||||HRC_SQLLOADER|12608_ASSIGNMENT



METADATA|PersonLegislativeData|PersonLegislativeId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|LegislationCode|HighestEducationLevel|MaritalStatus|MaritalStatusDate|Sex|GUID|SourceSystemOwner|SourceSystemId

MERGE|PersonLegislativeData||2002/10/07|4712/12/31|1190_PERSON|US||M||M||HRC_SQLLOADER|1190_PERSON_LEGISLATIVE
HDL – Sample HDL file to update Organization DFF

Use either of the below formats of the Organization HDL to update a descriptive flexfield attribute:

Using Surrogate IDs:

METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate|FLEX:PER_ORGANIZATION_UNIT_DFF|testAttribute(PER_ORGANIZATION_UNIT_DFF=Global Data Elements)
MERGE|Organization|3000000001230139|1951/01/01|4712/12/31|Global Data Elements|Xyz	

METADATA|OrgUnitClassification|OrgUnitClassificationId|OrganizationId|EffectiveStartDate|EffectiveEndDate
MERGE|OrgUnitClassification|3000000001233056|3000000001230139|1951/01/01|4712/12/31

Using User Keys:

METADATA|Organization|Name|ClassificationName|EffectiveStartDate|EffectiveEndDate|FLEX:PER_ORGANIZATION_UNIT_DFF|testAttribute(PER_ORGANIZATION_UNIT_DFF=Global Data Elements)
MERGE|Organization|Test Organization|Department|1951/01/01|4712/12/31|Global Data Elements|Xyz	

METADATA|OrgUnitClassification|OrganizationName|ClassificationName|SetCode|EffectiveStartDate|EffectiveEndDate
MERGE|OrgUnitClassification|Test Organization|Department|COMMON|1951/01/01|4712/12/31

Version Validated: 21D

HDL – Sample file for Dependent Enrollments
METADATA|DependentEnrollment|PersonNumber|ParticipantLastName|ParticipantFirstName|BenefitRelationship|LifeEvent|LifeEventOccuredDate|CloseLifeEvent|CloseLifeEventDate|EffectiveDate|LegalEmployer

MERGE|DependentEnrollment|22211|Singh|Manu|DFLT|New Hire|2021/01/01|Y|2021/01/01|2021/01/01|XYZ Legal Employer



METADATA|DesignateDependent|PersonNumber|Plan|Program|Option|OriginalEnrollmentDate|DependentPersonNumber|LineNumber|DependentFirstName|DependentLastName|DependentDateOfBirth
MERGE|DesignateDependent|22211|Benefit Plan 1|Benefit Program|Employee Only|2020/01/01|122211|1|Singh|Test 1|1992/01/31
MERGE|DesignateDependent|22211|Benefit Plan 1|Benefit Program|Employee Only|2020/01/01|122212|2|Singh|Test 2|2015/01/01
MERGE|DesignateDependent|22211|Benefit Plan 1|Benefit Program|Employee Only|2020/01/01|122213|3|Singh|Test 3|2017/01/05
HDL – Sample file to delete element eligibility

HDL supports deletion of element eligibilities. You can make use of the PayrollElementDefinition business object; its child object ElementEligibility supports create, update, and delete operations.

You can create a simple BIP query to extract the requisite data and build an HDL file from it:

select pelf.element_link_id
      ,petf.element_type_id
      ,petf.base_element_name
      ,pelf.effective_start_date
  from pay_element_types_f petf
      ,pay_element_links_f pelf
 where petf.element_type_id = pelf.element_type_id
   and pelf.element_link_id = 300000175215375
   and TRUNC(sysdate) BETWEEN petf.effective_start_date and petf.effective_end_date

Sample HDL file:

METADATA|ElementEligibility|ElementLinkId|ElementId|EffectiveStartDate
DELETE|ElementEligibility|300000175215375|300000175200627|1951/01/01

Sometimes you may get an error saying you can’t delete the element eligibility because element entries exist for it. In that case, first identify the corresponding element entries, delete those, and then retry deleting the element eligibility.

Please check MOS note – When Attempting to Delete Element Eligibility Get Error ‘The element eligibility record can’t be deleted because it would invalidate existing element entries with effective start dates in the future. (PAY-1635756)’ (Doc ID 2686914.1) – for a sample query and a sample file for element entry deletion.
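
As a starting point, the blocking element entries can be extracted in HDL format with a query along these lines. This is a minimal sketch: it assumes PAY_ELEMENT_ENTRIES_F holds the entries for the element type in question, so validate the joins and the ElementEntry attribute list against your pod before use:

SELECT data
FROM (
SELECT 'METADATA|ElementEntry|ElementEntryId|EffectiveStartDate|SourceSystemId|SourceSystemOwner' data, 1 DATA_SEQ
  FROM DUAL
UNION ALL
SELECT 'DELETE|ElementEntry|'
       || peef.element_entry_id
       ||'|'|| to_char(peef.effective_start_date,'RRRR/MM/DD')
       ||'|'|| hikm.source_system_id
       ||'|'|| hikm.source_system_owner data, 2 DATA_SEQ
  FROM pay_element_entries_f peef
      ,hrc_integration_key_map hikm
 WHERE peef.element_entry_id = hikm.surrogate_id
   AND peef.element_type_id = 300000175200627 -- element tied to the eligibility above
)
ORDER BY DATA_SEQ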

Version Validated: 21C

HDL – Sample HDL to load Performance Rating

Historical performance ratings can be loaded into the worker Talent Profile using the TalentProfile.dat business object.

Below is a sample HDL file (enhanced profiles, v2) to load performance rating data:

METADATA|ProfileItem|ProfileItemId|ProfileId|ProfileCode|ContentItemId|ContentTypeId|ContentType|CountryId|DateFrom|DateTo|RatingModelId1|RatingModelCode1|RatingLevelId1|RatingLevelCode1|SectionId|SourceSystemOwner|SourceSystemId
MERGE|ProfileItem|||PER_0055|||PERFORMANCE_RATING||2018/01/01|2018/12/31||XXX_PERFORMANCE||3.75|9933|HRC_SQLLOADER|PR_0055
MERGE|ProfileItem|||PER_0031|||PERFORMANCE_RATING||2018/01/01|2018/12/31||XXX_PERFORMANCE||3.90|9933|HRC_SQLLOADER|PR_0031
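
Once loaded, the items can be verified from the backend. A sketch along these lines can be used (it assumes the standard talent profile tables HRT_PROFILES_B and HRT_PROFILE_ITEMS; verify the column names in your pod):

SELECT hpb.profile_code
      ,hpi.date_from
      ,hpi.date_to
      ,hpi.rating_level_id1
  FROM hrt_profiles_b hpb
      ,hrt_profile_items hpi
 WHERE hpb.profile_id = hpi.profile_id
   AND hpb.profile_code IN ('PER_0055','PER_0031')
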
BIP – Query to extract Actions.dat data

Many a time there is a requirement to update or change the existing actions and action reason usages. Doing this manually in the UI is a cumbersome process for multiple actions and may lead to human errors as well.

For this, HCM Data Loader can be used to apply the changes using the Actions.dat file.

Use the below SQL queries to extract Actions and Action Reason Usages data from your Pod in HDL format:

Actions:

Select 'MERGE|Actions|'
||
actb.action_code
||'|'||
actt.action_name
||'|'||
actb.action_type_code
||'|'||
to_char(actb.start_date, 'yyyy/mm/dd')
||'|'||
to_char(actb.end_date, 'yyyy/mm/dd')
||'|'||
map.source_system_id
||'|'||
map.source_system_owner ACTIONS_HDL
from
PER_ACTIONS_B actb,
PER_ACTIONS_TL actt,
hrc_integration_key_map map
where 1=1
and actb.action_id = actt.action_id
and actt.language = USERENV('LANG')
and actb.action_id = map.surrogate_id

Action Reason Usage:

Select 'MERGE|ActionReasonUsage|'
||
aru.action_code
||'|'||
aru.action_reason_code
||'|'||
to_char(aru.start_date, 'yyyy/mm/dd')
||'|'||
to_char(aru.end_date, 'yyyy/mm/dd')
||'|'||
km.source_system_id
||'|'||
km.source_system_owner ARC_HDL
from 
hrc_integration_key_map km,
PER_ACTION_REASON_USAGES aru
where 1=1
and aru.ACTION_REASON_USAGE_ID = km.surrogate_id

Copy and save the data as Actions.dat and make the required changes.

Sample File:

METADATA|Actions|ActionCode|ActionName|ActionTypeCode|StartDate|EndDate|SourceSystemId|SourceSystemOwner
MERGE|Actions|TEST_MANAGE_CONTRACT|Test Manage Contract|EMPL_CONTRACT_EXTN|1951/01/01|4712/12/31|300000072427734|FUSION

METADATA|ActionReasonUsage|ActionCode|ActionReasonCode|StartDate|EndDate|SourceSystemId|SourceSystemOwner
MERGE|ActionReasonUsage|TEST_MANAGE_CONTRACT|CHANGE_CONTRACT|1951/01/01|4712/12/31|300000072427735|FUSION

Link to refresh business objects to get latest attributes of Actions.dat:

https://fusionhcmconsulting.com/2021/01/hdl-refresh-business-objects/

HDL – Loading Classroom Resources in Bulk

HCM Data Loader object ClassroomResource can be used to bulk upload classroom resources in Oracle Learning Cloud. Existing locations created as part of Global HR can also be designated as classroom resources.

Below is a sample file to upload classroom resources using HDL:

METADATA|ClassroomResource|ClassroomResourceNumber|Title|Description|Capacity|SetCode|LocationCode|SourceSystemOwner|SourceSystemId|OwnedByPersonNumber
MERGE|ClassroomResource|CLASS1001|Room 1|Room 1|10|COMMON|JPLoc004 Kyushu|HRC_SQLLOADER|1001|101
MERGE|ClassroomResource|CLASS1002|Room 2|Room 2|15|COMMON|JPLoc004 Kyushu|HRC_SQLLOADER|1002|101
MERGE|ClassroomResource|CLASS1003|Room 3|Room 3|20|COMMON|JPLoc004 Kyushu|HRC_SQLLOADER|1003|101

Once the data is loaded successfully, you can verify the results from the UI:

Navigation: My Client Groups -> Learning -> Catalog Resources -> Classrooms

HDL – Sample File to DELETE documents of Record

Use the below query to extract the documents of record data that should be deleted. You can add additional filters, such as person numbers.

select 'DELETE' 					"METADATA"
      ,'DocumentsOfRecord'			"DocumentsOfRecord"
	  ,hdor.documents_of_record_id	"DocumentsOfRecordId"
	  ,hdor.document_type_id	    "DocumentTypeId"
	  ,hdor.document_code			"DocumentCode"
	  ,hdor.document_name			"DocumentName"
	  ,hdor.document_number			"DocumentNumber"
	  ,hikm.source_system_owner		"SourceSystemOwner"
	  ,hikm.source_system_id		"SourceSystemId"
  from HR_DOCUMENTS_OF_RECORD hdor
      ,HRC_INTEGRATION_KEY_MAP hikm
 where hdor.documents_of_record_id = hikm.surrogate_id
   and hdor.documents_of_record_id = 300000217125443

Query for Person Number and Document Type:

select DISTINCT 'DELETE' 					"METADATA"
      ,'DocumentsOfRecord'			"DocumentsOfRecord"
	  ,hdor.documents_of_record_id	"DocumentsOfRecordId"
	  ,hdor.document_type_id	    "DocumentTypeId"
	  ,hdor.document_code			"DocumentCode"
	  ,hdor.document_name			"DocumentName"
	  ,hdor.document_number			"DocumentNumber"
	  ,hikm.source_system_owner		"SourceSystemOwner"
	  ,hikm.source_system_id		"SourceSystemId"
  from HR_DOCUMENTS_OF_RECORD hdor
      ,HRC_INTEGRATION_KEY_MAP hikm
	  ,PER_ALL_PEOPLE_F papf
	  ,HR_DOCUMENT_TYPES_TL hdtt
 where hdor.documents_of_record_id = hikm.surrogate_id
   --and hdor.documents_of_record_id = 300000217125443
   and hdor.person_id = papf.person_id
   and TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   and hdor.document_type_id = hdtt.document_type_id
   and hdtt.language = 'US'
   and hdtt.DOCUMENT_TYPE = 'Test Doc'
   and papf.person_number IN ('12','23')

Sample HDL File:

METADATA|DocumentsOfRecord|DocumentsOfRecordId|DocumentTypeId|DocumentCode|DocumentName|DocumentNumber|SourceSystemOwner|SourceSystemId
DELETE|DocumentsOfRecord|300000217125443|300000217168555|TMAD_001|Multiple Attachments||HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001
HDL – Update Goal Weight

Goal weightage can be updated using the GoalMeasurement metadata in the Goal business object. First, we need to extract the uploaded measurements. Use the below query to extract the details:

SELECT HDL_DATA
FROM (
SELECT 'METADATA|GoalMeasurement|MeasurementId|MeasurementName|StartDate|EndDate|TargetValue|TargetPercentage|UomCode|MeasureTypeCode|TargetType|GoalId|SourceSystemId|SourceSystemOwner' HDL_DATA, 1 SEQ
  FROM DUAL
UNION ALL
SELECT 'MERGE'
     || CHR (124)
     || 'GoalMeasurement'
     || CHR (124)	 
	 || hgm.measurement_id
     || CHR (124)
	 || hgm.measurement_name
     || CHR (124)
	 || TO_CHAR(hgm.start_date,'RRRR/MM/DD')
     || CHR (124)
	 || TO_CHAR(hgm.end_date,'RRRR/MM/DD')
     || CHR (124)
	 || hgm.target_value
     || CHR (124)
	 || hgm.target_percentage
     || CHR (124)
	 || hgm.uom_code
     || CHR (124)
	 || hgm.measure_type_code
     || CHR (124)
	 || hgm.target_type
     || CHR (124)
	 || hg.goal_id
     || CHR (124)
	 || hikm.source_system_id
     || CHR (124)
	 || hikm.source_system_owner HDL_DATA, 2 SEQ
  FROM hrg_goals hg
      ,hrg_goal_measurements hgm
      ,hrc_integration_key_map hikm
 WHERE hg.goal_id = hgm.goal_id
   AND hgm.measurement_id = hikm.surrogate_id
   AND hgm.uom_code = 'PERCENT'
   --AND hg.goal_id = 300000215856607
)
ORDER BY SEQ

Save the data in a Goal.dat HDL file, update the target values as required, then save the .dat file and upload it.

Sample HDL:

METADATA|GoalMeasurement|MeasurementId|MeasurementName|StartDate|EndDate|TargetValue|TargetPercentage|UomCode|MeasureTypeCode|TargetType|GoalId|SourceSystemId|SourceSystemOwner
MERGE|GoalMeasurement|300000215856608|Customer Survey Results|2019/01/01|2019/12/31|80||PERCENT|QUANTITATIVE|MIN|300000215856607|300000215856608|FUSION
Extract – Loopback Inbound Extract

Often in Cloud HCM, we encounter situations where we need to update some information on a worker assignment after the worker is created, because the information was not available at the time of hire. One such example is the employee category. Let’s take a hypothetical requirement: the employee category should be auto-populated based on the worker’s job. As there is no direct link between employee category and job, it becomes a pain to manually look up and enter the correct employee category while hiring. So, in this case, the worker is hired with a job and no value for the employee category.

A DFF is opened at the job level to store the corresponding employee category. We then design a solution that will:

  • Read the worker’s job and then the corresponding employee category from the job.
  • Generate the data for the WorkTerms and Assignment METADATA in HCM Data Loader format.
  • Use an HCM Extract to consume the data and trigger the HDL Import and Load process.
  • Schedule the HCM Extract to run daily, or as per the requirement.

Once the HCM Extract is run, the employee category will be populated automatically.

Steps to design the integration:

  1. Extract the WorkTerms and Assignment data for all workers where the job is populated and the employee category is NULL.
  2. Create a BIP report to organize the data extracted in step 1 in HCM Data Loader format. Copy the Global Reports Data Model (from /Shared Folders/Human Capital Management/Payroll/Data Models/globalReportsDataModel) to a folder under /Shared Folders/Custom/HR. The folder can be named as per your nomenclature standards.
  3. Add a new data set in globalReportsDataModel and paste your query into the new data set.
  4. Check the below post for a sample query to extract the data in HCM Data Loader format (a simplified sketch also follows this list): https://fusionhcmconsulting.com/2021/01/hdl-updating-an-attribute-on-assignment/
  5. Create an eText template (a sample RTF template is available at the above link) and a report based on the above data model.
  6. The Global Data Model will look as follows (G_3 is the new data set).
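
For reference, below is the shape of such a query for the Assignment component only (the WorkTerms component follows the same pattern). It is a minimal sketch: PER_JOBS_F.ATTRIBUTE1 is a hypothetical DFF segment holding the employee category, and a real file would also need the remaining required WorkTerms/Assignment attributes, so adjust it to your configuration:

SELECT 'METADATA|Assignment|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|AssignmentNumber|WorkerCategory' hdl_data, 1 seq
  FROM DUAL
UNION ALL
SELECT 'MERGE'
       ||'|'||'Assignment'
       ||'|'||'ASG_CHANGE'
       ||'|'||to_char(paam.effective_start_date,'RRRR/MM/DD')
       ||'|'||to_char(paam.effective_end_date,'RRRR/MM/DD')
       ||'|'||'1'
       ||'|'||'Y'
       ||'|'||paam.assignment_number
       ||'|'||pjf.attribute1 hdl_data, 2 seq -- hypothetical DFF segment with the employee category
  FROM per_all_assignments_m paam
      ,per_jobs_f pjf
 WHERE paam.job_id = pjf.job_id
   AND paam.assignment_type = 'E'
   AND paam.employee_category IS NULL
   AND pjf.attribute1 IS NOT NULL
   AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN pjf.effective_start_date AND pjf.effective_end_date
ORDER BY seq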

Steps to create HCM Extract:

You can import the below sample extract in your environment:

  • Download the above file and change the extension to xml.
  • Open the xml file with Notepad or Notepad++ and remove the first two rows (these rows were added so that the file could be uploaded here).
  • Navigate to My Client Groups -> Data Exchange -> HCM Extracts -> Extract Definitions.
  • Click on Import to import the xml file.
  • Provide an extract name, uncheck the Changes Only checkbox, and click on Ok.
  • Once the extract import is complete, click on the pencil icon to edit.
  • Click on ‘Extract Delivery Option’ in the navigation tree on the left side. On the right side, under ‘Extract Delivery Options’, click on edit to update the path of the report created earlier. It should look like /Custom/HR/AssignmentUpdateRPT.xdo.
  • Make sure the default value for the parameter Auto Load is set to “Y”.
  • Save the details. Click on Extract Execution Tree next and click on Validate All Formulas.
  • Once the formulas are compiled, click on the Submit button.

The next step is to refine the extract in order to Submit the Import and Load process:

  • Navigate to My Client Groups -> Data Exchange -> HCM Extracts -> Refine Extracts. Search for the extract and click on edit.
  • Select and add the ‘Initiate HCM Data Loader’ process.
  • Click on Go to Task for “Initiate HCM Data Loader”, click Edit for “Data Loader Archive Action”, and add the relevant parameters:

Parameter Basis – Bind to Flow Task
Basis Value – XX Assignment Update Integration, Submit, Payroll Process

  • Click Edit for “Data Loader Configurations” and add the relevant parameters:

Parameter Basis – Constant Bind
Basis Value – ImportMaximumErrors=100,LoadMaximumErrors=100,LoadConcurrentThreads=8,LoadGroupSize=100

  • The task sequence should look as follows.
  • Go to Review and click on Submit.

Your extract is now ready for submission. You can submit the extract and test it.

HDL – Delete custom Action Reason Usages using HDL

Actions and action reasons are a very important part of any Cloud HCM implementation. Oracle provides a large number of actions and action reasons out of the box, but additional actions and action reasons can be created from the UI as well as using HDL if needed.

Each action is tied to an action type. Please note that action types are seeded and can’t be created. You can create a custom action and attach existing action reasons to it. The details are stored in the PER_ACTION_REASON_USAGES table.

During implementation, there is a common requirement to delete some of the unwanted action reason usages which were created initially and are no longer required. Finding each reason and deleting it from the action manually is quite a painful task.

This can be achieved easily using HCM Data Loader.

Run the below query in BIP and save the output as Actions.dat. Zip the file and kick off the Import and Load HCM Data Loader process. You can modify the extract criteria as per your requirement:

SELECT data
FROM (
SELECT 'METADATA|ActionReasonUsage|ActionCode|ActionReasonCode|StartDate|EndDate|SourceSystemId|SourceSystemOwner' data, 1 DATA_SEQ
  FROM DUAL
UNION ALL
Select 'DELETE|ActionReasonUsage|'
||
paru.action_code
||'|'||
paru.action_reason_code
||'|'||
to_char(paru.start_date, 'yyyy/mm/dd')
||'|'||
to_char(paru.end_date, 'yyyy/mm/dd')
||'|'||
hikm.source_system_id
||'|'||
hikm.source_system_owner data, 2 DATA_SEQ
from 
hrc_integration_key_map hikm,
PER_ACTION_REASON_USAGES paru
where 1=1
and paru.ACTION_REASON_USAGE_ID = hikm.surrogate_id
and paru.created_by <> 'SEED_DATA_FROM_APPLICATION'
)
ORDER BY DATA_SEQ

Sample File:

METADATA|ActionReasonUsage|ActionCode|ActionReasonCode|EndDate|StartDate|SourceSystemId|SourceSystemOwner
DELETE|ActionReasonUsage|TEST_NEW|CMP_CHG||2002/01/01|HRC_SQLLOADER_TEST_NEW|HRC_SQLLOADER

Query to check action reason codes for an action:

SELECT paru.action_code
      ,paru.action_reason_code
  FROM per_action_reason_usages paru
 WHERE paru.action_code like '%CHANGE_SALARY%'
HDL – Sample File to Load Succession Plan

You can make use of HCM Data Loader to upload Succession Plans (Incumbent, Job and Position Types).

Use the below sample file to load Succession Plans:

METADATA|SuccessionPlan|PlanName|PlanType|Status|AccessTypeCode|Description|IncumbentPersonNumber|JobCode|JobSetCode|DepartmentName|DepartmentSetCode|GradeSetCode|GradeCode|PositionCode|BusinessUnitName|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlan|Test_Person_Succession_Plan|INCUMBENT|ACTIVE|PUBLIC|Sample description for Incumbent Plan Type.|111222|||||||||HRC_SQLLOADER_Test_Person_Succession_Plan_INCUMBENT|HRC_SQLLOADER
MERGE|SuccessionPlan|Test_Job_Succession_Plan|JOB|ACTIVE|PUBLIC|Sample description for Job Plan Type.||XX052|COMMON|Test HR Department|COMMON|COMMON|GRD_90||XX BU|HRC_SQLLOADER_Test_Job_Succession_Plan_JOB|HRC_SQLLOADER
MERGE|SuccessionPlan|Test_Position_Succession_Plan|POSITION|ACTIVE|PUBLIC|Sample description for Position Plan Type.||||Test HR Department|COMMON|||POS_0301|COMMON|HRC_SQLLOADER_Test_Position_Succession_Plan_POSITION|HRC_SQLLOADER

METADATA|SuccessionPlanOwner|PlanName|PlanOwnerPersonNumber|OwnerTypeCode|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlanOwner|Test_Person_Succession_Plan|99997|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99997_Test_Person_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Job_Succession_Plan|99997|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99997_Test_Job_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Position_Succession_Plan|99997|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99997_Test_Position_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Position_Succession_Plan|99998|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99998_Test_Position_Succession_Plan|HRC_SQLLOADER

Once the data is loaded, you can run a query against the HRM_PLANS table to check the loaded plans.
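
A quick sanity-check sketch (the HRM_PLANS column names here are assumptions, so verify them in your environment):

SELECT hp.plan_name   -- assumed column names
      ,hp.plan_type_code
      ,hp.status_code
  FROM hrm_plans hp
 WHERE hp.plan_name LIKE 'Test%Succession_Plan'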

BIP – Query to get Configure HCM Data Loader parameters

Configure HCM Data Loader is a task used to define the HCM Data Loader parameters. The parameters are broadly divided into the following categories: Availability, Diagnostic, File Definition, Performance, and Scheduling Default. These are the default settings applied to any HCM Data Loader load.

You can override some of these parameters at the individual .dat file level using SET commands, and some can be overridden while submitting the Import and Load job.

Please note that the default values vary between HDL and HSDL (Spreadsheet Loader).
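
For example, a SET command placed at the top of a .dat file overrides a parameter for that file only (an illustrative snippet; confirm the parameter is override-enabled via the HDL_OVERRIDE_ENABLED flag returned by the query below):

SET PURGE_FUTURE_CHANGES Y
COMMENT The usual METADATA and MERGE lines for the object follow.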

You can use the below query to get the details from the backend using a BIP report:

select PARAM_CATEGORY
      ,PARAM_NAME
      ,DEFAULT_VALUE
      ,HSDL_DEFAULT_VALUE
      ,HDL_ENABLED
      ,HDL_OVERRIDE_ENABLED
      ,HSDL_ENABLED
      ,HSDL_OVERRIDE_ENABLED
      ,VALUE_LOOKUP_TYPE
      ,CREATED_BY
      ,CREATION_DATE
      ,LAST_UPDATED_BY
      ,LAST_UPDATE_DATE
  from hrc_dl_all_parameters
 ORDER BY 1,2

Query to get list of overridden values:

select hdap.PARAM_CATEGORY
      ,hdap.PARAM_NAME
      ,hdap.DEFAULT_VALUE
      ,hdap.HSDL_DEFAULT_VALUE
      ,hdap.HDL_ENABLED
      ,hdap.HDL_OVERRIDE_ENABLED
      ,hdap.HSDL_ENABLED
      ,hdap.HSDL_OVERRIDE_ENABLED
      ,hdap.VALUE_LOOKUP_TYPE
      ,hdap.CREATED_BY
      ,hdap.CREATION_DATE
      ,hdap.LAST_UPDATED_BY
      ,hdap.LAST_UPDATE_DATE
      ,hdpo.OVERRIDE_LEVEL "OVERRIDDEN_AT_LEVEL"
      ,hdpo.OVERRIDE_VALUE "OVERRIDDEN_VALUE"
  from hrc_dl_all_parameters hdap
      ,hrc_dl_parameter_overrides hdpo
 where hdap.parameter_id = hdpo.parameter_id
 ORDER BY 1,2
HDL – Updating Work Day Information at Department Level

Organization.dat can be used to upload work day information at the department level.

METADATA|OrgInformation|OrgInformationContext|SequenceNumber|FLEX:PER_ORGANIZATION_INFORMATION_EFF|EFF_CATEGORY_CODE|EffectiveStartDate|EffectiveEndDate|OrganizationName|ClassificationName|_STD_WORKING_HOURS(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_STD_WORKING_HOURS_FREQUENCY(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_STD_WORKING_HOURS_FREQUENCY_Display(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_WORK_END_TIME(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_WORK_START_TIME(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|OrganizationId
MERGE|OrgInformation|PER_WORK_DAY_INFO|1|PER_WORK_DAY_INFO|DEPARTMENT|1951/01/01||XYZ Markets|Department|40|W||18:00|09:00|5000000026311
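
The OrganizationId used above can be fetched with a simple lookup against the organization tables (the same tables are used in the worker union example later in this document):

select haou.organization_id
  from hr_all_organization_units haou
      ,hr_all_organization_units_tl haout
 where haou.organization_id = haout.organization_id
   and haout.language = 'US'
   and haout.name = 'XYZ Markets'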

Please verify the updated information from the UI.

Version Validated: 21A

HDL – Inactivating Worker Unions

HCM Data Loader can be used to bulk upload worker unions. In Cloud HCM, Worker Unions are just another type of Organization.

You can use the query in the below post to find the different organization types and their codes:

https://fusionhcmconsulting.com/2021/01/reports-bip-organization-classifications-in-fusion/

Below is the sample file to inactivate a worker union:

METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate|Name|ClassificationCode|ClassificationName|InternalAddressLine|ActionReasonCode|LocationSetId|LocationCode|LocationSetCode|EstablishmentId|EstablishmentName|GUID|SourceSystemOwner|SourceSystemId
MERGE|Organization||1951/01/01|2021/05/12|Test WU2|ORA_PER_UNION||||||||||FUSION|300000217254520
MERGE|Organization||2021/05/13|4712/12/31|Test WU2|ORA_PER_UNION||||||||||FUSION|300000217254520

METADATA|OrgUnitClassification|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|OrganizationId(SourceSystemId)|OrganizationName|ClassificationCode|ClassificationName|CategoryCode|SetCode|Status|LegislationCode|GUID|SourceSystemOwner|SourceSystemId
MERGE|OrgUnitClassification||1951/01/01|2021/05/12|300000217254520|Test WU2|ORA_PER_UNION||||A|IN||FUSION|300000217254521
MERGE|OrgUnitClassification||2021/05/13|4712/12/31|300000217254520|Test WU2|ORA_PER_UNION||||I|IN||FUSION|300000217254521

The file above makes use of source keys. You can also use surrogate keys.

The below sample uses surrogate keys:

METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate
MERGE|Organization|300000217254520|1951/01/01|2021/05/12
MERGE|Organization|300000217254520|2021/05/13|4712/12/31

METADATA|OrgUnitClassification|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|OrganizationId|Status
MERGE|OrgUnitClassification|300000217254521|1951/01/01|2021/05/12|300000217254520|A
MERGE|OrgUnitClassification|300000217254521|2021/05/13|4712/12/31|300000217254520|I

Sample Query to extract the data for Organization METADATA in HDL format:

SELECT 'METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate|Name' DATAROW, 1 sequence
FROM DUAL
UNION ALL
SELECT 'MERGE'
       ||CHR (124)
	   ||'Organization'
	   ||CHR (124)
	   ||haou.organization_id
	   ||CHR (124)
	   ||to_char(haou.effective_start_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||to_char(haou.effective_end_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||haout.name DATAROW, 2 sequence
FROM hr_all_organization_units haou,hr_all_organization_units_tl haout
where haou.organization_id = haout.organization_id
and haout.language ='US'
--AND haou.organization_id = 300000047274447
UNION ALL
SELECT 'METADATA|OrgUnitClassification|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|OrganizationId|Status' DATAROW, 3 sequence
FROM DUAL
UNION ALL
SELECT 'MERGE'
       ||CHR (124)
	   ||'OrgUnitClassification'
	   ||CHR (124)
	   ||houcf.org_unit_classification_id
	   ||CHR (124)
	   ||to_char(houcf.effective_start_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||to_char(houcf.effective_end_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||haou.organization_id
	   ||CHR (124)
	   ||houcf.status DATAROW, 4 sequence
FROM hr_all_organization_units haou,hr_org_unit_classifications_f houcf
where haou.organization_id = houcf.organization_id
--AND haou.organization_id = 300000047274447
ORDER BY 2
HDL – Loading CoverArt File at Course Level

While creating courses in Oracle Learning Cloud, there is a capability to load cover art for each course. If no cover art is loaded, the learner is presented with a default blue film-strip image.

HCM Data Loader provides an option to bulk upload the cover art files for multiple courses in one go. All the cover art files (jpg, jpeg) should be included in a BlobFiles folder, and the name of each jpg/jpeg file should be provided in the .dat file.

Below is the sample Course dat file:

METADATA|Course|CourseId|EffectiveStartDate|EffectiveEndDate|CourseNumber|Title|ShortDescription|Syllabus|PublishStartDate|PublishEndDate|MinimumExpectedEffort|MaximumExpectedEffort|CurrencyCode|MinimumPrice|MaximumPrice|CoverArtFile|CoverArtFileName|TrailerLiId|TrailerLiNumber|OwnedByPersonId|OwnedByPersonNumber|SourceType|SourceId|SourceInfo|SourceSystemOwner|SourceSystemId
MERGE|Course||2016/01/01||COURSEN_1866|Test Staff Orientation|||2016/01/01||||USD|||Capture.jpg|Capture.jpg||||1010|TALEO|1111|TALEO_1111|HRC_SQLLOADER|HRC_SQLLOADER_1111

Once the BlobFiles folder and the .dat file are ready, zip them together and trigger the HDL Import and Load process.
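
The zip structure would look like this (Capture.jpg from the sample above sits inside the BlobFiles folder):

Course.zip
  Course.dat
  BlobFiles/
    Capture.jpg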

Once the Import and Load process is successful, log in as a Learning administrator to verify:

My Client Groups -> Learning -> Learning Catalog -> Courses