HDL – Person EFF Information loaded using HDL is not visible in UI

Often, even after Person EFF information is loaded successfully using HCM Data Loader, the information is not visible in the UI.

In such cases, please make sure that the following fields are passed correctly:

InformationType – Name of the extra information type, e.g. 'XYZ Medical Information'

EFF_CATEGORY_CODE – EFF category code. For example, for a person EIT the value will be PER_EIT

CategoryCode – EFF category code. For example, for a person EIT the value will be PER_EIT

PeiInformationCategory – Name of the extra information type, e.g. 'XYZ Medical Information'

Sample HDL file to load Worker EFF information:

METADATA|WorkerExtraInfo|PersonNumber|PersonId|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|FLEX:PER_PERSON_EIT_EFF|PeiInformationCategory|CategoryCode|InformationType|medicalStatus(PER_PERSON_EIT_EFF=XYZ Medical Information)|EFF_CATEGORY_CODE

MERGE|WorkerExtraInfo|998812||2022/04/01|4712/12/31|FUSION_998812_1|HRC_SQLLOADER|XYZ Medical Information|XYZ Medical Information|PER_EIT|XYZ Medical Information|Normal|PER_EIT
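
Once the file loads successfully, the stored values can also be cross-checked with a BIP query. Below is a minimal verification sketch; it assumes the person EIT rows land in PER_PEOPLE_EXTRA_INFO_F and that the first segment column is PEI_INFORMATION1 (verify the table and segment column names in your environment):

SELECT papf.person_number,
       pei.information_type,
       pei.pei_information_category,
       pei.pei_information1                 -- first EFF segment column (assumed)
  FROM per_all_people_f papf,
       per_people_extra_info_f pei          -- assumed storage table for person EIT EFF rows
 WHERE papf.person_id = pei.person_id
   AND pei.information_type = 'XYZ Medical Information'
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN pei.effective_start_date AND pei.effective_end_date
 ORDER BY 1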
BIP – Extract Document Record File Link from Content Server

The below query can be used to extract the document record file link from the UCM content server.

SELECT papf.person_number		"Person Number",
       ppnf.first_name			"First Name",
       ppnf.last_name			"Last Name",
       fdt.file_name 			"Attached File Name",
       fdt.dm_version_number 	"Document Id",
       fdt.dm_document_id 		"UCM Content Id",
       (SELECT 'https://'||external_virtual_host
          FROM fusion.ask_deployed_domains
         WHERE deployed_domain_name = 'FADomain')
	   ||'/cs/idcplg?IdcService=GET_FILE' 
	   || chr(38) 
	   || 'dID='
       || fdt.dm_version_number
       || chr(38)
       || 'dDocName='
       || fdt.dm_document_id
       || chr(38)
       || 'allowInterrupt=1' 	"UCM File Link"
  FROM per_all_people_f papf,
       per_person_names_f ppnf,
       hr_documents_of_record hdor,
       fnd_attached_documents fad,
       fnd_documents_tl fdt
 WHERE 1=1
   AND hdor.person_id = papf.person_id
   AND papf.person_id = ppnf.person_id
   AND hdor.documents_of_record_id = fad.pk1_value
   AND fad.document_id = fdt.document_id
   AND fdt.language = 'US'
   AND fad.entity_name = 'HR_DOCUMENTS_OF_RECORD'
   AND ppnf.name_type = 'GLOBAL'
   AND trunc(sysdate) between papf.effective_start_date and papf.effective_end_date
   AND trunc(sysdate) between ppnf.effective_start_date and ppnf.effective_end_date
 ORDER BY 1
HDL – Sample file to create Pending Worker Record

Below is a sample file to create a Pending Worker record using HDL:

METADATA|Worker|EffectiveStartDate|EffectiveEndDate|PersonNumber|StartDate|DateOfBirth|CategoryCode|ActionCode|SourceSystemOwner|SourceSystemId
MERGE|Worker|2023/07/01|4712/12/31||2023/07/01|1990/05/12||ADD_PEN_WKR|HDL|TestEmp_123

METADATA|PersonName|EffectiveStartDate|EffectiveEndDate|PersonNumber|LegislationCode|NameType|FirstName|MiddleNames|LastName|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)
MERGE|PersonName|2023/07/01|4712/12/31||GB|GLOBAL|TestEmpFN|U|TestEmp|HDL|TestEmpName_123|TestEmp_123

METADATA|PersonLegislativeData|EffectiveStartDate|EffectiveEndDate|PersonNumber|LegislationCode|Sex|MaritalStatus|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)
MERGE|PersonLegislativeData|2023/07/01|4712/12/31||GB|M|M|HDL|TestEmpLegData_123|TestEmp_123

METADATA|WorkRelationship|PersonNumber|DateStart|WorkerType|ActionCode|LegalEmployerName|LegalEmployerSeniorityDate|EnterpriseSeniorityDate|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|ReadyToConvert
MERGE|WorkRelationship||2023/07/01|P|ADD_PEN_WKR|GB Legal Employer|||HDL|TestEmpWR_123|TestEmp_123|Y

METADATA|WorkTerms|AssignmentNumber|EffectiveStartDate|EffectiveEndDate|PersonNumber|EffectiveLatestChange|EffectiveSequence|LegalEmployerName|WorkerType|DateStart|AssignmentStatusTypeCode|BusinessUnitShortCode|ActionCode|PrimaryWorkTermsFlag|ProposedUserPersonType|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|PeriodOfServiceId(SourceSystemId)
MERGE|WorkTerms||2023/07/01|4712/12/31||Y|1|GB Legal Employer|P|2023/07/01|ACTIVE_PROCESS|GB Business Unit|ADD_PEN_WKR|Y|Member|HDL|TestEmpWT_123|TestEmp_123|TestEmpWR_123


METADATA|Assignment|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|WorkTermsNumber|AssignmentNumber|AssignmentStatusTypeCode|BusinessUnitShortCode|PersonNumber|WorkerType|DateStart|LegalEmployerName|PrimaryAssignmentFlag|ProposedUserPersonType|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|WorkTermsAssignmentId(SourceSystemId)|ProjectedStartDate
MERGE|Assignment|ADD_PEN_WKR|2023/07/01|4712/12/31|1|Y|||ACTIVE_PROCESS|GB Business Unit||P|2023/07/01|GB Legal Employer|Y|Member|HDL|TestEmpASG_123|TestEmp_123|TestEmpWT_123|2023/07/10

The ReadyToConvert flag on WorkRelationship is used to convert the Pending Worker record to an Employee record. If this flag is set to Y, the record will appear in the New Person dashboard with Automatic Conversion marked as Yes.

Running the 'Convert Pending Workers Automatically' process will then pick up the pending worker and convert it to an employee.
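
Pending workers that are still awaiting conversion can also be listed with a quick BIP query. Below is a minimal sketch, assuming pending worker assignments carry assignment_type = 'P' in PER_ALL_ASSIGNMENTS_M:

SELECT papf.person_number,
       ppnf.first_name,
       ppnf.last_name,
       paam.assignment_number,
       to_char(paam.effective_start_date,'RRRR/MM/DD') asg_start_date
  FROM per_all_people_f papf,
       per_person_names_f ppnf,
       per_all_assignments_m paam
 WHERE papf.person_id = ppnf.person_id
   AND papf.person_id = paam.person_id
   AND ppnf.name_type = 'GLOBAL'
   AND paam.assignment_type = 'P'           -- pending worker assignments (assumed type code)
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN ppnf.effective_start_date AND ppnf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
 ORDER BY 1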

REST API Error – The specified operation is not supported for the invoked HTTP method

Often, while invoking a REST API with the POST method, we encounter the error: "The specified operation is not supported for the invoked HTTP method. Please check the URL and the headers."

The cause of this error is incorrect or missing header parameters in the REST API call.

To fix the issue:

  1. Open Postman.
  2. Go to the Headers section.
  3. Set the below values:

     Key – Content-Type
     Value – application/vnd.oracle.adf.action+json

Once the header values are set, the error will go away.

HDL – Sample HDL file to create Role Provisioning Rules
METADATA|RoleMapping|RoleMappingId|MappingName|DateFrom|DateTo|LegalEmployerName|SystemPersonType|UserPersonType|AssignmentType|AssignmentStatus|SourceSystemId|SourceSystemOwner
MERGE|RoleMapping||Test HDL|1951/01/01|4712/12/31|Test Legal Employer|Employee|Employee|E|ACTIVE|RoleMapping_123|HRC_SQLLOADER


METADATA|Role|RoleMappingRoleId|RoleMappingId(SourceSystemId)|MappingName|RoleId|RequestableFlag|SelfRequestableFlag|UseForAutoProvisioningFlag|RoleCommonName|SourceSystemId|SourceSystemOwner
MERGE|Role||RoleMapping_123|Test HDL||N|N|Y|TEST_EMP_DATA|Role_123|HRC_SQLLOADER

Once the HDL load is successful, the rules can be verified from the UI.
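
They can also be cross-checked in the database. Below is a minimal verification sketch; it assumes the mappings are stored in PER_ROLE_MAPPINGS and PER_ROLE_MAPPING_ROLES, with role details in PER_ROLES_DN_VL (verify the table and column names in your environment):

SELECT prm.name              mapping_name,
       prm.date_from,
       prm.date_to,
       prdv.role_common_name
  FROM per_role_mappings prm,              -- assumed role mapping header table
       per_role_mapping_roles prmr,        -- assumed role mapping roles table
       per_roles_dn_vl prdv
 WHERE prm.role_mapping_id = prmr.role_mapping_id
   AND prmr.role_id = prdv.role_id
   AND prm.name = 'Test HDL'               -- mapping name from the sample above
 ORDER BY 1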

HDL – Sample File to upload location DFF values
METADATA|Location|SourceSystemOwner|SourceSystemId|LocationCode|LocationName|EffectiveStartDate|EffectiveEndDate|SetCode|ActiveStatus|AddressLine1|Country|Description|AddressLine2|AddressLine3|AddressLine4|Building|FloorNumber|PostalCode|Region1|Region2|Region3|TimezoneCode|TownOrCity|FLEX:PER_LOCATIONS_DF|dffType(PER_LOCATIONS_DF=Global Data Elements)
MERGE|Location|HRC_SQLLOADER|TEST_LOCATION_1|TEST_LOCATION|Test Location|1951/01/01|4712/12/31|COMMON|A|1 Churchill Place|GB|1 Churchill Place||||||||||Europe/London|London|Global Data Elements|Sample DFF value
HDL – Loading Delivery Preferences for a Worker

The PersonDeliveryMethod child business object of Worker can be used to upload delivery preferences for a worker.

Below is a sample HDL file:

METADATA|PersonDeliveryMethod|DeliveryMethodId|DateStart|DateEnd|PersonId|PersonNumber|PreferredOrder|CommDlvryAddress|CommDlvryMethod|CommDlvryFkId|AddressType|AddressLine1|PhoneType|PhoneNumber|EmailType|EmailAddress|SourceSystemOwner|SourceSystemId
MERGE|PersonDeliveryMethod||2023/01/01|4712/12/31||1234|1||NORMAL||HOME|Address Line 1|||||HRC_SQLLOADER|1234_HOME_Address Line 1
HDL – Loaded Salary Data Extract in HDL Format
Select 'METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom|DateTo|SalaryBasisId|SalaryId|SourceSystemId|SourceSystemOwner|ActionCode|ActionReasonCode' Header, 1 data_flow_order
from dual
UNION
SELECT 'MERGE|Salary'||'|'||
paam.assignment_number||'|'||
SALARY_AMOUNT||'|'||
TO_CHAR(cs.date_from,'RRRR/MM/DD', 'nls_date_language=American')||'|'||
TO_CHAR(cs.date_to,'RRRR/MM/DD', 'nls_date_language=American')||'|'||
cs.salary_basis_id||'|'||
cs.salary_id||'|'||
hikm.source_system_id||'|'|| 
source_system_owner||'|'|| 
pav.action_code||'|'||
'XX_ANNL_REVIEW'  data_row,
2 data_flow_order
FROM cmp_salary cs,
per_all_assignments_m paam,
hrc_integration_key_map hikm,
PER_ACTIONS_VL pav,
PER_ACTION_REASONS_VL parv
WHERE cs.assignment_id= paam.assignment_id
AND trunc(sysdate) between paam.effective_start_date AND paam.effective_end_date
AND paam.assignment_type not like '%T'
AND cs.salary_id = hikm.surrogate_id
and cs.action_id = pav.action_id
and cs.action_reason_id = parv.action_reason_id
and parv.action_reason_code = 'CMP_ANNV'
ORDER BY data_flow_order
HDL – Sample file to load Life events

HDL provides an option to upload potential life events for a person in Benefits. Please note that the 'Open Enrolment' life event can't be loaded using the potential life events file.

Provide the data in the below format and save it as PotentialLifeEvents.dat:

METADATA|PotentialLifeEvents|PersonNumber|LegalEmployer|BenefitRelationName|LifeEventName|LifeEventStatusCode|LifeEventOccuredDate|UnprocessedDate|NotificationDate|DetectedStatusDate|ManualStatusDate|ManualOverrideStatusDate|ProcessedDate|VoidedStatusDate|PtnlLerForPerSrcCd|SourceSystemId|SourceSystemOwner|LerId|PtnlLerForPerId|PersonId|LegalEntityId|BenefitRelationId|LifeEventTypeCode|ProdCd
MERGE|PotentialLifeEvents|123718|XX Test|DFLT|New Hire|UNPROCD|2023/02/01|2023/02/01||||||||123718_2023/02/01|HRC_SQLLOADER|||||||
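
Once loaded, the potential life events can be verified with a BIP query. Below is a minimal sketch, assuming the rows are stored in BEN_PTNL_LER_FOR_PER and the life event definitions in BEN_LER_F:

SELECT papf.person_number,
       blf.name                                   life_event_name,
       bplfp.ptnl_ler_for_per_stat_cd             status_code,
       to_char(bplfp.lf_evt_ocrd_dt,'RRRR/MM/DD') occurred_date
  FROM per_all_people_f papf,
       ben_ptnl_ler_for_per bplfp,                -- assumed potential life event table
       ben_ler_f blf                              -- assumed life event definition table
 WHERE papf.person_id = bplfp.person_id
   AND bplfp.ler_id = blf.ler_id
   AND papf.person_number = '123718'              -- person number from the sample above
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN blf.effective_start_date AND blf.effective_end_date
 ORDER BY 4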
HDL – Inactivate secondary Org classification

Oracle HCM allows an organization to have multiple classifications. However, sometimes there is a need to inactivate one classification while keeping the primary classification active.

Below is a sample HDL file which can be used for this purpose:

METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate|ClassificationCode
MERGE|Organization|300000085401190|2023/04/01||PA_EXPENDITURE_ORG

METADATA|OrgUnitClassification|OrganizationId|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|Status|ClassificationCode
MERGE|OrgUnitClassification|300000085401190|300000261056753|2023/04/01||I|PA_EXPENDITURE_ORG
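
The OrganizationId and OrgUnitClassificationId values used above can be fetched with a query along these lines (a sketch using HR_ALL_ORGANIZATION_UNITS_F and HR_ORG_UNIT_CLASSIFICATIONS_F; adjust the filter to your organization):

SELECT haouf.organization_id,
       haouf.name,
       houcf.org_unit_classification_id,
       houcf.classification_code,
       houcf.status
  FROM hr_all_organization_units_f haouf,
       hr_org_unit_classifications_f houcf
 WHERE haouf.organization_id = houcf.organization_id
   AND houcf.classification_code = 'PA_EXPENDITURE_ORG'   -- classification from the sample above
   AND TRUNC(SYSDATE) BETWEEN haouf.effective_start_date AND haouf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN houcf.effective_start_date AND houcf.effective_end_date
 ORDER BY 2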
BIP – Query to find Primary flags for a Worker

There are multiple primary flags for a worker in the assignment table, namely the Primary Flag, Primary Assignment Flag and Primary Work Relation Flag. The work relationship table (PER_PERIODS_OF_SERVICE) has its own primary flag as well.

The below query can be used to check these flags:

--> Primary Flags Query
SELECT papf.person_number
      ,ppnf.first_name
      ,ppnf.last_name
      ,to_char(ppos.date_start,'RRRR/MM/DD') start_date
	  ,paam.assignment_number
      ,to_char(paam.effective_start_date,'RRRR/MM/DD') asg_eff_start_date
      ,to_char(paam.effective_end_date,'RRRR/MM/DD') asg_eff_end_date
	  ,paam.action_code
	  ,pastt.user_status
	  ,paam.assignment_status_type
	  ,paam.primary_flag
	  ,paam.primary_assignment_flag
	  ,paam.primary_work_relation_flag
	  ,ppos.primary_flag "WR Table Primary Flag"
  FROM PER_ALL_PEOPLE_F papf
      ,PER_PERSON_NAMES_F ppnf
      ,PER_ALL_ASSIGNMENTS_M paam
	  ,PER_PERIODS_OF_SERVICE ppos
      ,PER_ASSIGNMENT_STATUS_TYPES_TL pastt
WHERE papf.person_id = ppnf.person_id
  AND ppnf.name_type = 'GLOBAL'
  AND papf.person_id = paam.person_id
  AND paam.period_of_service_id = ppos.period_of_service_id
  AND paam.assignment_status_type_id = pastt.assignment_status_type_id
  AND paam.effective_sequence = 1
  AND paam.assignment_type NOT LIKE '%T'
  AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date 
  AND TRUNC(SYSDATE) BETWEEN ppnf.effective_start_date AND ppnf.effective_end_date 
  AND pastt.language = 'US'
  AND papf.person_number IN ('500035','500036')
ORDER BY 1, 6 
HDL – Sample File to load Contact Records
METADATA|Contact|SourceSystemOwner|SourceSystemId|PersonNumber|StartDate|EffectiveStartDate|EffectiveEndDate|CorrespondenceLanguage|DateOfBirth|DateOfDeath|CountryOfBirth|RegionOfBirth|TownOfBirth|CategoryCode
MERGE|Contact|HRC_SQLLOADER|CONTACT_1292001|1292001|2019/06/08|2019/06/08|4712/12/31||1973/06/15|||||

METADATA|ContactName|SourceSystemOwner|SourceSystemId|PersonNumber|PersonId(SourceSystemId)|EffectiveStartDate|EffectiveEndDate|LastName|NameType|LegislationCode|FirstName|MiddleNames|Title|Honors|KnownAs|PreNameAdjunct|PreviousLastName|Suffix|CharSetContext
MERGE|ContactName|HRC_SQLLOADER|CONTACT_NAME_1292001|1292001|CONTACT_1292001|2019/06/08|4712/12/31|Last Name|GLOBAL|GB|FName||MR.||NameLP||||US

METADATA|ContactLegislativeData|SourceSystemOwner|SourceSystemId|PersonNumber|PersonId(SourceSystemId)|EffectiveStartDate|EffectiveEndDate|LegislationCode|HighestEducationLevel|MaritalStatus|MaritalStatusDate|Sex|FLEX:PER_PERSON_LEGISLATIVE_DFF|FLEX:PER_PERSON_LEGISLATIVE_DATA_LEG_DDF|nationality(PER_PERSON_LEGISLATIVE_DFF=Global Data Elements)
MERGE|ContactLegislativeData|HRC_SQLLOADER|PER_LEGSL_1292001_GB|1292001|CONTACT_1292001|2019/06/08|4712/12/31|GB||||M|||

METADATA|ContactRelationship|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|EffectiveStartDate|EffectiveEndDate|RelatedPersonNumber|ContactType|PersonNumber|BeneficiaryFlag|DependentFlag|EmergencyContactFlag|ExistingPerson|PrimaryContactFlag|SequenceNumber
MERGE|ContactRelationship|HRC_SQLLOADER|PER_CONTACT_RELTNSHP_1292001_S|CONTACT_1292001|2019/06/08|4712/12/31|1234123|S|1292001|N|N|Y|||
HDL – Loading Local Person Names

By default, the PersonName child object of Worker.dat loads the GLOBAL name type. The PersonName object can be used to load local names as well.

The two most important attributes of the PersonName object for a local name upload are:

NameType – the legislation code, e.g. MY

CharSetContext – the short code for the language, e.g. TH

Below is a sample HDL file to upload a local name in the Thai language:

METADATA|PersonName|EffectiveEndDate|EffectiveStartDate|LegislationCode|PersonId|NameType|FirstName|MiddleNames|LastName|Honors|KnownAs|PreNameAdjunct|PreviousLastName|Suffix|Title|MilitaryRank|CharSetContext

MERGE|PersonName|4712/12/31|2019/09/20|MY|300000189140970|MY|||XYZ||Test122First||||||TH

The language code can be found under the Manage Languages task.
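
Alternatively, the available language codes can be pulled with a small query; below is a sketch assuming FND_LANGUAGES_B holds the language code and installed flag:

SELECT flb.language_code,
       flb.installed_flag                 -- installed/base indicator (assumed values)
  FROM fnd_languages_b flb
 ORDER BY 1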

HDL – Load Worker Images

HCM Data Loader can be used to mass upload worker images. The actual image files should be placed in a BlobFiles folder and referenced in the Worker.dat file.

Worker.dat is then zipped together with the BlobFiles folder and uploaded into HCM using Import and Load Data.

Sample HDL file:

METADATA|PersonImage|Image|ImageName|PersonNumber|SourceSystemOwner|SourceSystemId
MERGE|PersonImage|XYZ_12364.jpg|XYZ_12364|12364|EBS-HR|12364

The XYZ_12364.jpg file should be present in the BlobFiles folder.
HDL – Script to DELETE positions data

In test environments, you may run into situations where you want to DELETE positions data. The below script can be used to generate a Position.dat file with DELETE instructions:

SELECT DATA_ROW
  FROM (
        SELECT 'METADATA|Position|PositionId|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner' AS DATA_ROW,
               1 AS DATA_FLOW_ORDER
          FROM DUAL
        UNION ALL
        SELECT 'DELETE'||'|'||
               'Position'||'|'||
               hapf.position_id||'|'||
               to_char(hapf.effective_start_date,'RRRR/MM/DD')||'|'||
               to_char(hapf.effective_end_date,'RRRR/MM/DD')||'|'||
               (SELECT hikm.source_system_id
                  FROM hrc_integration_key_map hikm
                 WHERE hapf.position_id = hikm.surrogate_id)||'|'||
               (SELECT hikm.source_system_owner
                  FROM hrc_integration_key_map hikm
                 WHERE hapf.position_id = hikm.surrogate_id) AS DATA_ROW,
               2 AS DATA_FLOW_ORDER
          FROM hr_all_positions_f hapf
       )
 ORDER BY DATA_FLOW_ORDER