HDL – Sample HDL to load Non Catalog Learning items
METADATA|NoncatalogLearningItem|LearningItemId|EffectiveStartDate|EffectiveEndDate|LearningItemNumber|Title|Description|Duration|Price|CurrencyCode|OwnedByPersonId|OwnedByPersonNumber|NoncatalogURL|SourceId|SourceInfo|SourceType|SourceSystemId|SourceSystemOwner

MERGE|NoncatalogLearningItem||2022/05/01||CLASS_EXT_128|External Excel Course||||||12345||1231|CLASS_EXT_128|CLASS_EXT|CLASS_EXT_128|HRC_SQLLOADER
HDL – Mask Candidate email addresses

After a P2T (Production-to-Test) refresh, there is usually a requirement to mask candidate email addresses in lower environments. The below SQL can be used to generate masked email data in HDL format:

SELECT 'METADATA|CandidateEmail|EmailAddressId|PersonId|CandidateNumber|EmailAddress|DateFrom|DateTo|SourceSystemOwner|SourceSystemId' "Record"
  FROM DUAL
UNION
SELECT DISTINCT 'MERGE'                                  || '|' ||
       'CandidateEmail'                                  || '|' ||
       ICEA.EMAIL_ADDRESS_ID                             || '|' ||
       ICEA.PERSON_ID                                    || '|' ||
       ICEA.CANDIDATE_NUMBER                             || '|' ||
       REPLACE(PEAD.EMAIL_ADDRESS,'.com','.invalidcomx') || '|' ||
       TO_CHAR(PEAD.DATE_FROM,'YYYY/MM/DD')              || '|' ||
       TO_CHAR(PEAD.DATE_TO,'YYYY/MM/DD')                || '|' ||
       HIKM.SOURCE_SYSTEM_OWNER                          || '|' ||
       HIKM.SOURCE_SYSTEM_ID "Record"
  FROM IRC_CAND_EMAIL_ADDRESS_V ICEA,
       PER_EMAIL_ADDRESSES PEAD,
       HRC_INTEGRATION_KEY_MAP HIKM
 WHERE ICEA.EMAIL_ADDRESS_ID = PEAD.EMAIL_ADDRESS_ID
   AND HIKM.SURROGATE_ID = PEAD.EMAIL_ADDRESS_ID

You can update the email address as per your requirement.
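For example, a hedged variation that keeps the address readable but undeliverable by appending a suffix to the whole address; the suffix value is illustrative and would replace the REPLACE(...) expression in the query above:

-- Illustrative alternative masking expression (suffix value is an assumption)
PEAD.EMAIL_ADDRESS || '.donotsend'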

HDL – Sample HDL file to upload DoR for a specific country
METADATA|DocumentsOfRecord|PersonNumber|Country|DocumentType|DocumentCode|DateFrom|DateTo

MERGE|DocumentsOfRecord|78652|United Kingdom|UK Vehicle Information|OVERTIME_AVERAGING_AGREEMENT_2020-10-01|2020/10/01|2022/10/01



METADATA|DocumentAttachment|PersonNumber|Country|DocumentType|DocumentCode|DataTypeCode|Title|URLorTextorFileName|File

MERGE|DocumentAttachment|78652|United Kingdom|UK Vehicle Information|OVERTIME_AVERAGING_AGREEMENT_2020-10-01|FILE|DD test document of record for HDL.pdf|DD test document of record for HDL.pdf|DD test document of record for HDL.pdf

Please note that for the Country attribute, you need to pass the full country name. Passing a country code such as GB will throw the below error:

The values GB aren’t valid for the attribute LegislationCode.
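If your source data only holds ISO country codes, a hedged lookup can convert them to the full names HDL expects; the FND_TERRITORIES_VL view and its column names are assumptions, so verify them in your environment:

-- Map a territory code such as GB to its full name (view/column names are assumptions)
SELECT territory_code, territory_short_name
  FROM fnd_territories_vl
 WHERE territory_code = 'GB'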

HDL – Sample HDL to delete schedule assignment data

The below HDL file can be used to DELETE schedule assignment data in a Fusion environment:

METADATA|ScheduleAssignment|ResourceType|ScheduleAssignmentId|SourceSystemId|SourceSystemOwner
DELETE|ScheduleAssignment|ASSIGN|300000117113937|SCHEDULE_E1000001_30-MAY-2017-31-DEC-2017_0-8-8-8-8-0-0|EBS
DELETE|ScheduleAssignment|ASSIGN|300000117309993|SCHEDULE_E1000002_04-JUN-2018-31-DEC-4712_0-8-8-8-8-0-0|EBS

This data can be extracted from Fusion HCM using a simple BIP query:

SELECT DATA_ROW
FROM (
SELECT 'METADATA|ScheduleAssignment|ResourceType|ScheduleAssignmentId|SourceSystemId|SourceSystemOwner' AS DATA_ROW
  FROM DUAL
UNION ALL
SELECT 'DELETE|ScheduleAssignment|'
       || psa.RESOURCE_TYPE          || '|'
       || psa.SCHEDULE_ASSIGNMENT_ID || '|'
       || hikm.source_system_id      || '|'
       || hikm.source_system_owner   AS DATA_ROW
  FROM fusion.per_schedule_assignments psa
      ,hrc_integration_key_map hikm
 WHERE psa.SCHEDULE_ASSIGNMENT_ID = hikm.surrogate_id
)
BIP – How to check Termination version in your environment?

You can use the per_empl_configurations table to verify the versions of Seniority Dates and Termination Dates.

Below is an example to check the version for Termination dates:

select version_code 
  from fusion.per_empl_configurations 
 where configuration_type = 'ORA_TERMINATION'

If the version code value is NULL, that means you are still on version V2.

The same table can be used to query the Seniority Dates version, as shown below.
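A hedged example, assuming the seniority configuration is stored under the ORA_SENIORITY configuration type (verify the exact seeded value in your environment):

select version_code 
  from fusion.per_empl_configurations 
 where configuration_type = 'ORA_SENIORITY' -- assumed value; confirm in your environment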

HDL – Assignment Working Hour Pattern

Oracle has provided a feature where working hours for each day can be stored against the assignment as a working hour pattern.

The data is stored in PER_WORKING_HOUR_PATTERNS_F table.

Use below query to extract the data:

SELECT papf.person_number, pwhpf.* 
  FROM PER_WORKING_HOUR_PATTERNS_F pwhpf
      ,PER_ALL_ASSIGNMENTS_M paam
	  ,PER_ALL_PEOPLE_F papf
 WHERE pwhpf.OBJECT_ID = paam.ASSIGNMENT_ID
   AND paam.PERSON_ID = papf.PERSON_ID
   AND papf.person_number = '10011'
   AND paam.assignment_type = 'E'
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
   AND paam.effective_start_date BETWEEN pwhpf.effective_start_date AND pwhpf.effective_end_date

Query to extract data in HDL format:

SELECT 'METADATA|WorkingHourPattern|WorkingHourPatternId|EffectiveStartDate|EffectiveEndDate|Object|ObjectId|AssignmentNumber|PersonId|ReplaceFirstEffectiveStartDate' AS DATA_ROW, 1 ORDERBY
  FROM DUAL
UNION
SELECT 'MERGE|WorkingHourPattern|'
|| pwhpf.WORKING_HOUR_PATTERN_ID
|| '|'
|| TO_CHAR(pwhpf.EFFECTIVE_START_DATE,'YYYY/MM/DD')
|| '|'
|| TO_CHAR(pwhpf.EFFECTIVE_END_DATE,'YYYY/MM/DD')
|| '|ASSIGNMENT|'
|| pwhpf.object_id
|| '|'
|| paam.assignment_number
|| '|'
|| paam.person_id
|| '|'
|| 'Y'
 as DATA_ROW, 2 ORDERBY
FROM PER_WORKING_HOUR_PATTERNS_F pwhpf
    ,PER_ALL_ASSIGNMENTS_M paam
WHERE paam.assignment_id = pwhpf.object_id
  AND paam.assignment_type = 'E'
  AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
  AND paam.effective_start_date BETWEEN pwhpf.effective_start_date AND pwhpf.effective_end_date;
Purge Worker/Person Records from Test Environment

Oracle provides a process, Purge Person Data in Test Environments, to remove Person/Contact/Worker records from a test environment of Oracle HCM. Please note that HDL doesn’t support DELETE for the core Worker object, so once a record is created there is no other way to remove Person/Contact/Worker information from Oracle HCM. In this case, Purge Person Data in Test Environments is a very useful utility that purges person-related data from a test environment.

This process can’t be run in Oracle HCM Production.

Before 21D, the process required an additional step to get a key to enable it in test environments, but from 21D onwards the process is enabled by default in all test environments.

You can find more details related to this process at the below link:

https://docs.oracle.com/en/cloud/saas/human-resources/22c/fahdl/enable-the-purge-person-data-in-test-environments-process.html#s20065832

Below is the list of tables which get purged once the process completes:

https://docs.oracle.com/en/cloud/saas/human-resources/22c/fahdl/tables-purged-by-the-purge-person-data-in-test-environments.html#s20065826

To run the process:

  1. Navigate to Tools -> Scheduled Processes
  2. Click Schedule New Process and search for Purge Person Data in Test Environments
  3. Supply the parameters. Make sure Save is set to Y to purge the records from the database.

Sample Person SQLs:

To remove only Contact Person records:

SELECT DISTINCT papf.person_id FROM per_all_people_f papf WHERE 1=1 AND NOT EXISTS (SELECT 1 FROM per_periods_of_service ppos where ppos.person_id = papf.person_id)

To remove all person records:

SELECT DISTINCT papf.person_id FROM per_all_people_f papf
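If you only want to purge a specific set of workers, a hedged variant of the same pattern can be used (the person numbers below are illustrative):

SELECT DISTINCT papf.person_id
  FROM per_all_people_f papf
 WHERE papf.person_number IN ('10011','10012') -- illustrative person numbers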

Use below SQL to monitor the progress of entries in each table:

SELECT a.*
FROM
(
SELECT 'ANC_PER_ABS_ENTRIES' table_name, count(*) rowcount FROM ANC_PER_ABS_ENTRIES
UNION
SELECT 'ANC_PER_ABS_ENTRY_DTLS' table_name, count(*) rowcount FROM ANC_PER_ABS_ENTRY_DTLS
UNION
SELECT 'ANC_PER_ABS_MATERNITY' table_name, count(*) rowcount FROM ANC_PER_ABS_MATERNITY
UNION
SELECT 'ANC_PER_PLAN_ENROLLMENT' table_name, count(*) rowcount FROM ANC_PER_PLAN_ENROLLMENT
UNION
SELECT 'ANC_PER_ABS_PLAN_ENTRIES' table_name, count(*) rowcount FROM ANC_PER_ABS_PLAN_ENTRIES
UNION
SELECT 'ANC_PER_ACCRUAL_ENTRIES' table_name, count(*) rowcount FROM ANC_PER_ACCRUAL_ENTRIES
UNION
SELECT 'ANC_PER_ACRL_ENTRY_DTLS' table_name, count(*) rowcount FROM ANC_PER_ACRL_ENTRY_DTLS
UNION
SELECT 'CMP_SALARY' table_name, count(*) rowcount FROM CMP_SALARY
UNION
SELECT 'PAY_ASSIGNED_PAYROLLS_DN' table_name, count(*) rowcount FROM PAY_ASSIGNED_PAYROLLS_DN
UNION
SELECT 'PAY_ASSIGNED_PAYROLLS_F' table_name, count(*) rowcount FROM PAY_ASSIGNED_PAYROLLS_F
UNION
SELECT 'PER_ADDRESSES_F' table_name, count(*) rowcount FROM PER_ADDRESSES_F
UNION
SELECT 'PER_ALL_ASSIGNMENTS_M' table_name, count(*) rowcount FROM PER_ALL_ASSIGNMENTS_M
UNION
SELECT 'PER_ALL_PEOPLE_F' table_name, count(*) rowcount FROM PER_ALL_PEOPLE_F
UNION
SELECT 'PER_ASSIGN_WORK_MEASURES_F' table_name, count(*) rowcount FROM PER_ASSIGN_WORK_MEASURES_F
UNION
SELECT 'PER_CITIZENSHIPS' table_name, count(*) rowcount FROM PER_CITIZENSHIPS
UNION
SELECT 'PER_DRIVERS_LICENSES' table_name, count(*) rowcount FROM PER_DRIVERS_LICENSES
UNION
SELECT 'PER_EMAIL_ADDRESSES' table_name, count(*) rowcount FROM PER_EMAIL_ADDRESSES
UNION
SELECT 'PER_ETHNICITIES' table_name, count(*) rowcount FROM PER_ETHNICITIES
UNION
SELECT 'PER_NATIONAL_IDENTIFIERS' table_name, count(*) rowcount FROM PER_NATIONAL_IDENTIFIERS
UNION
SELECT 'PER_PASSPORTS' table_name, count(*) rowcount FROM PER_PASSPORTS
UNION
SELECT 'PER_PEOPLE_LEGISLATIVE_F' table_name, count(*) rowcount FROM PER_PEOPLE_LEGISLATIVE_F
UNION
SELECT 'PER_PERIODS_OF_SERVICE' table_name, count(*) rowcount FROM PER_PERIODS_OF_SERVICE
UNION
SELECT 'PER_PERSON_ADDR_USAGES_F' table_name, count(*) rowcount FROM PER_PERSON_ADDR_USAGES_F
UNION
SELECT 'PER_PERSON_NAMES_F' table_name, count(*) rowcount FROM PER_PERSON_NAMES_F
UNION
SELECT 'PER_PERSON_TYPE_USAGES_M' table_name, count(*) rowcount FROM PER_PERSON_TYPE_USAGES_M
UNION
SELECT 'PER_PERSONS' table_name, count(*) rowcount FROM PER_PERSONS
UNION
SELECT 'PER_PHONES' table_name, count(*) rowcount FROM PER_PHONES
UNION
SELECT 'PER_RELIGIONS' table_name, count(*) rowcount FROM PER_RELIGIONS
UNION
SELECT 'PER_VISAS_PERMITS_F' table_name, count(*) rowcount FROM PER_VISAS_PERMITS_F
UNION
SELECT 'PAY_ELEMENT_ENTRIES_F' table_name, count(*) rowcount FROM PAY_ELEMENT_ENTRIES_F
UNION
SELECT 'PAY_ELEMENT_ENTRY_VALUES_F' table_name, count(*) rowcount FROM PAY_ELEMENT_ENTRY_VALUES_F
UNION
SELECT 'PER_WORKING_HOUR_PATTERNS_F' table_name, count(*) rowcount FROM PER_WORKING_HOUR_PATTERNS_F
UNION
SELECT 'PER_ASSIGNMENT_EXTRA_INFO_M' table_name, count(*) rowcount FROM PER_ASSIGNMENT_EXTRA_INFO_M
UNION
SELECT 'PER_ASSIGNMENT_SUPERVISORS_F' table_name, count(*) rowcount FROM PER_ASSIGNMENT_SUPERVISORS_F
UNION
SELECT 'PER_PEOPLE_EXTRA_INFO_F' table_name, count(*) rowcount FROM PER_PEOPLE_EXTRA_INFO_F
UNION
SELECT 'HR_DOCUMENTS_OF_RECORD' table_name, count(*) rowcount FROM HR_DOCUMENTS_OF_RECORD
UNION
SELECT 'PER_CONTACT_RELATIONSHIPS' table_name, count(*) rowcount FROM PER_CONTACT_RELATIONSHIPS
) a
where 
a.rowcount <> 0

You can add/remove more tables based on data in your environment.

Stats:

Normally the process takes around 5-6 hours for 16-17K employees. Performance also depends on environment sizing, among other factors.

HDL – Source values for Candidate HDL

When loading Candidate data using HDL, the values of the Source attributes (Source and Source Medium) are required. These values are configured in the UI using the task ‘Manage Candidate Dimension Source Names’.

Use below SQL to get the values:

SELECT DISTINCT source_medium_url_value,
       source_url_value
  FROM fusion.IRC_DIMENSION_DEF_VL
HDL – Sample HDL to load HCM Group Members

Oracle provides HCMGroup.dat, which can be used to assign members to an HCM Group. However, the business object has a major limitation (as of release 22A): it can be used only for the below two seeded Oracle HCM Groups:

  • Manually Maintained Excluded Members
  • Manually Maintained Audit Members

Below is the sample HDL file to add members to Audit HCM Group:

METADATA|HCMGroup|GrpMemberId|EffectiveStartDate|GrpId|MemberId|GroupCode|PersonNumber|SourceSystemOwner|SourceSystemId|GUID|EffectiveEndDate
MERGE|HCMGroup||2022/01/04|||MANUALLY MAINTAINED AUDIT MEMBERS|1014|HRC_SQLLOADER|MANUALLY MAINTAINED AUDIT MEMBERS_1014||

For any custom HCM Groups, this can’t be used. If you try to use a custom Group Code in the HDL file, you will receive the below error:

Valid values for attribute GROUP CODE: MANUALLY MAINTAINED AUDIT MEMBERS,MANUALLY MAINTAINED EXCLUDED MEMBERS.

Query to extract members of an HCM Group:

SELECT hgt.group_name
      ,papf.person_id
      ,papf.person_number
      ,ppnf.full_name
      ,ppnf.last_name
      ,ppnf.first_name
      ,hgmf.effective_start_date
      ,hgmf.effective_end_date
  FROM PER_ALL_PEOPLE_F papf,
       PER_PERSON_NAMES_F ppnf,
       HWM_GRP_MEMBERS_F hgmf,
       HWM_GRPS_B hgb,
       HWM_GRPS_TL hgt
 WHERE hgb.grp_id = hgmf.grp_id
   AND hgb.grp_id = hgt.grp_id 
   AND hgt.language = 'US'
   AND papf.person_id = hgmf.member_id
   AND papf.person_id = ppnf.person_id
   AND ppnf.name_type ='GLOBAL'
   AND trunc(sysdate) BETWEEN hgmf.effective_start_date and hgmf.effective_end_date
   AND trunc(sysdate) BETWEEN ppnf.effective_start_date and ppnf.effective_end_date
ORDER BY 1,3   
HDL – Upload Criticality at Position Model Profile

Criticality at the position model profile level can be loaded using the TalentProfile.dat file. Use the below HDL file:

METADATA|TalentProfile|ProfileCode|ProfileId|ProfileTypeId|ProfileUsageCode|SourceSystemOwner|SourceSystemId|ProfileStatusCode|Description|Summary
MERGE|TalentProfile|TEST_POS_PROFILE_3||3|M|HRC_SQLLOADER|TEST_POS_PROFILE_3|A|A Test Position Profile 3|A Test Position Profile 3 Summary

METADATA|ProfileItem|ProfileCode|ContentType|RatingLevelId1|ItemText3015|SourceSystemOwner|SourceSystemId|DateFrom|SectionId
MERGE|ProfileItem|TEST_POS_PROFILE_3|CRITICALITY||N|HRC_SQLLOADER|TEST_POS_PROFILE_3_CRIT|2022/01/01|8809

ItemText3015 – maps to the “Requires Succession Plan” radio button on the UI

Screen shot from UI of uploaded data:

Useful queries:

The uploaded data is stored in the HRT_PROFILE_ITEMS table.
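A quick hedged check of what was loaded, reusing the Criticality content type id (127) from the queries below:

select * 
  from HRT_PROFILE_ITEMS
 where content_type_id = 127 -- Criticality content type, per the queries below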

Content Type –

select * from HRT_CONTENT_TYPES_TL
where content_type_id=127

Section ID –

select *
from HRT_PROFILE_TYP_SECTIONS
WHERE CONTENT_TYPE_ID=127 
  AND SECTION_CONTEXT = 'POSITION_CRITICALITY'
HDL – Sample file to upload Talent Pool

Use the below sample HDL file to create a talent pool and add members to it:


METADATA|TalentPool|PoolName|PoolTypeCode|Description|Status|OwnerPersonNumber|ProfileCode|JobFamilyName|PositionCode|JobSetCode|JobCode|GradeSetCode|GradeCode|DepartmentName|JobFamilyCode|SourceSystemOwner|SourceSystemId
MERGE|TalentPool|Talent Pool|TALENT|Description of Talent Pool|A|800012||||||||||HRC_SQLLOADER|HRC_SQLLOADER_Talent Pool_800012

METADATA|PoolOwner|OwnerPersonNumber|PoolName|PoolStatus|SourceSystemOwner|SourceSystemId
MERGE|PoolOwner|800015|Talent Pool|A|HRC_SQLLOADER|HRC_SQLLOADER_800015_Talent Pool
MERGE|PoolOwner|800087|Talent Pool|A|HRC_SQLLOADER|HRC_SQLLOADER_800087_Talent Pool

METADATA|PoolMember|MemberPersonNumber|Status|PoolName|PoolStatus|SourceSystemOwner|SourceSystemId
MERGE|PoolMember|800034|A|Talent Pool|A|HRC_SQLLOADER|HRC_SQLLOADER_Talent Pool_800034
MERGE|PoolMember|800045|A|Talent Pool|A|HRC_SQLLOADER|HRC_SQLLOADER_Talent Pool_800045
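Once loaded, a hedged way to confirm what HDL created is to look up the integration keys generated for the rows above; the OBJECT_NAME column and the LIKE pattern are assumptions, so adjust them to the source system IDs you used:

-- Hedged check of the keys HDL generated for the pool, owners and members loaded above
SELECT object_name, surrogate_id, source_system_id
  FROM hrc_integration_key_map
 WHERE source_system_owner = 'HRC_SQLLOADER'
   AND source_system_id LIKE 'HRC_SQLLOADER%Talent Pool%'
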
HDL – Talent Profile HDL related useful queries

Loading profile data against a worker/job/position is a little tricky. One has to understand the templates thoroughly before uploading any data.

Below are some of the most commonly used queries to extract the information required for loading Talent Profile data.

Extract Person Profile Code:

SELECT distinct papf.person_id, papf.person_number,ProfilePEO.profile_code
  FROM HRT_PROFILES_VL ProfilePEO, PER_ALL_PEOPLE_F papf
 WHERE papf.person_id = ProfilePEO.person_id
    AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date and papf.effective_end_date

Extract Profile Type Id:

Select * from HRT_PROFILE_TYPES_TL
HDL – Sample HDL to load Succession Plans

Sample HDL for succession plan bulk upload:

METADATA|SuccessionPlan|PlanName|PlanType|Status|AccessTypeCode|Description|IncumbentPersonNumber|JobCode|JobSetCode|DepartmentName|DepartmentSetCode|GradeSetCode|GradeCode|PositionCode|BusinessUnitName|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlan|Test_Person_Succession_Plan|INCUMBENT|ACTIVE|PUBLIC|Sample description for Incument Plan Type.|100019|||||||||HRC_SQLLOADER_Test_Person_Succession_Plan_INCUMBENT|HRC_SQLLOADER
MERGE|SuccessionPlan|Test_Job_Succession_Plan|JOB|ACTIVE|PUBLIC|Sample description for Job Plan Type.||JOB_33052|COMMON|Test Department|COMMON|COMMON|82||COMMON|HRC_SQLLOADER_Test_Job_Succession_Plan_JOB|HRC_SQLLOADER
MERGE|SuccessionPlan|Test_Position_Succession_Plan|POSITION|ACTIVE|PUBLIC|Sample description for Position Plan Type.||||Test Department|COMMON|||POS_0030|COMMON|HRC_SQLLOADER_Test_Position_Succession_Plan_POSITION|HRC_SQLLOADER


METADATA|SuccessionPlanCandidate|PlanName|PlanCandidatePersonNumber|CandidateRanking|CandidateReadiness|EmergencySuccessor|Status|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlanCandidate|Test_Position_Succession_Plan|100019|2|READY_NOW|Y|ACTIVE|HRC_SQLLOADER_Y_Test_Position_Succession_Plan_100019|HRC_SQLLOADER

METADATA|SuccessionPlanOwner|PlanName|PlanOwnerPersonNumber|OwnerTypeCode|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlanOwner|Test_Person_Succession_Plan|800012|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_800012_Test_Person_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Job_Succession_Plan|800012|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_800012_Test_Job_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Position_Succession_Plan|800012|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_800012_Test_Position_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Position_Succession_Plan|897132|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_104990_Test_Position_Succession_Plan|HRC_SQLLOADER
HDL – DELETE Person External Identifier

Person External Identifier information can be deleted easily using HDL. Use the ExternalIdentifier child object of Worker.dat to delete this data.

Run the below query to extract required data in HDL format:

SELECT 'METADATA|ExternalIdentifier|ExternalIdentifierId|PersonId' HEADER, 1 DATA_ROW
  FROM DUAL
 UNION
SELECT 'DELETE|ExternalIdentifier|'
       ||peai.EXT_IDENTIFIER_ID
       ||'|'||peai.PERSON_ID DATAR, 2 DATA_ROW
  FROM PER_EXT_APP_IDENTIFIERS peai	 
      ,PER_ALL_PEOPLE_F papf
WHERE peai.person_id = papf.person_id
  AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date	
  AND papf.person_number IN ('11','12')

Sample HDL File:

METADATA|ExternalIdentifier|ExternalIdentifierId|PersonId
DELETE|ExternalIdentifier|300000247194865|300000066997395
BIP – Query to DELETE Salary data using HDL

There may be a requirement to delete salary data while doing initial HDL iterations. You can use the below query for this purpose:

Select 'METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom|DateTo|SalaryBasisId|SalaryId' Header, 1 data_flow_order
from dual
UNION
SELECT 'DELETE|Salary'||'|'||
paam.assignment_number||'|'||
SALARY_AMOUNT||'|'||
TO_CHAR(cs.date_from,'RRRR/MM/DD', 'nls_date_language=American')||'|'||
TO_CHAR(cs.date_to,'RRRR/MM/DD', 'nls_date_language=American')||'|'||
cs.salary_basis_id||'|'||
cs.salary_id data_row,
2 data_flow_order
FROM cmp_salary cs,
per_all_assignments_m paam
WHERE cs.assignment_id= paam.assignment_id
AND trunc(sysdate) between paam.effective_start_date AND paam.effective_end_date
AND paam.assignment_type in ('E', 'C', 'P')
AND trunc(cs.creation_date) < trunc(sysdate)-1
ORDER BY data_flow_order
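
For reference, a hedged sample of the HDL file this query produces (the IDs and values below are illustrative):

METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom|DateTo|SalaryBasisId|SalaryId
DELETE|Salary|E788232|55000|2021/01/01|4712/12/31|300000001234567|300000007654321
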
HDL – Query to DELETE Assigned Payroll data from Fusion

In case you need to delete assigned payroll data, for masking purposes in a lower environment or for some other reason such as data correction, use the below query to pull the data in HDL format:

SELECT 'METADATA|AssignedPayroll|AssignedPayrollId|EffectiveStartDate|AssignmentId|LegislativeDataGroupId|PayrollId|PayrollTermId|SourceSystemId|SourceSystemOwner' HEADER, 1 DATA_ORDER
  FROM DUAL
UNION ALL
SELECT 'DELETE|AssignedPayroll'
	   ||'|'||
       papd.ASSIGNED_PAYROLL_ID
	   ||'|'||
       to_char(papd.start_date,'RRRR/MM/DD')
	   ||'|'||	   
	   paam.assignment_id
	   ||'|'||	   
	   papd.LEGISLATIVE_DATA_GROUP_ID
	   ||'|'||	   
	   papd.Payroll_Id
	   ||'|'||	   
	   papd.Payroll_Term_Id
	   ||'|'||	   
       hikm.source_system_id
	   ||'|'||	   
       hikm.source_system_owner HEADER, 2 DATA_ORDER
  FROM pay_assigned_payrolls_dn papd
	  ,pay_payroll_terms ppt
	  ,pay_pay_relationships_dn prrd
	  ,pay_all_payrolls_f papf_pay
	  ,pay_rel_groups_dn prgd
	  ,pay_rel_groups_f prgf
	  ,per_all_people_f papf
	  ,per_person_names_f ppnf
	  ,per_all_assignments_m paam 
	  ,hrc_integration_key_map hikm
 WHERE papd.payroll_term_id = ppt.payroll_term_id
   AND ppt.payroll_relationship_id = prrd.payroll_relationship_id
   AND papd.payroll_id = papf_pay.payroll_id
   AND prrd.person_id = papf.person_id
   AND prrd.payroll_relationship_id = prgd.payroll_relationship_id
   AND prgd.relationship_group_id = prgf.relationship_group_id
   AND prgd.assignment_id = paam.assignment_id
   AND paam.effective_latest_change = 'Y'
   AND prgd.group_type = 'A'
   AND ppnf.person_id = papf.person_id
   AND ppnf.name_type = 'GLOBAL'
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN ppnf.effective_start_date AND ppnf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN prgf.effective_start_date AND prgf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
   AND hikm.surrogate_id = papd.ASSIGNED_PAYROLL_ID
   AND trunc(papd.creation_date) < trunc(sysdate)-1
   --AND papf.person_number = '11111'
 ORDER BY 2

Sample HDL File:

METADATA|AssignedPayroll|AssignedPayrollId|EffectiveStartDate|AssignmentId|LegislativeDataGroupId|PayrollId|PayrollTermId|SourceSystemId|SourceSystemOwner
DELETE|AssignedPayroll|300000012343876|2022/01/19|300000046430123|300000004188123|300000014605111|300000046430342|ASGN_PAYROLL_12345|EBS-HR
HDL – Load Position Model Profile Extra Info

For Position Model profiles, there are additional attributes like Description, Qualifications and Responsibilities. To load these, one should use the ModelProfileExtraInfo child business object of TalentProfile.dat.

Attribute values for Description, Qualifications and Responsibilities should be stored in .txt files. The .txt file(s) should then be placed under the ClobFiles folder and zipped together with the .dat file.

Sample HDL File to load position model profile extra info:

METADATA|TalentProfile|ProfileCode|ProfileId|ProfileTypeId|ProfileUsageCode|SourceSystemOwner|SourceSystemId|ProfileStatusCode|Description|Summary
MERGE|TalentProfile|TEST_POS_PROFILE_2||3|M|HRC_SQLLOADER|TEST_POS_PROFILE_2|A|A Test Position Profile2|A Test Position Profile 2
METADATA|ModelProfileExtraInfo|ProfileExtraInformationId|Description|ProfileId|Qualifications|Responsibilities|ProfileCode|SourceSystemId|SourceSystemOwner
MERGE|ModelProfileExtraInfo||Test_Desc.txt||Test_Qual.txt|Test_Resp.txt|TEST_POS_PROFILE_2|TEST_POS_PROFILE_2_MFEI|HRC_SQLLOADER

zip File Structure:
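A representative layout, based on the description above (the file names match the sample HDL file):

TalentProfile.zip
  TalentProfile.dat
  ClobFiles/
    Test_Desc.txt
    Test_Qual.txt
    Test_Resp.txt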

Once the data is loaded, it can be viewed in the UI:
HDL – Loading Competencies against Worker

TalentProfile.dat can be used to create worker talent profiles and to load content items against those profiles. While loading most content items is straightforward, loading competencies is a little tricky. This is because additional fields like QualifierId1 and QualifierId2 must be supplied to make the loaded data visible in the UI.

Sample HDL file to load competencies:

METADATA|ProfileItem|ProfileItemId|ProfileId|ProfileCode|ContentItemId|ContentTypeId|ContentType|CountryId|DateFrom|DateTo|RatingModelId1|RatingModelCode1|RatingLevelId1|RatingLevelCode1|RatingModelId2|RatingModelCode2|RatingLevelId2|RatingLevelCode2|SectionId|SourceSystemOwner|SourceSystemId|QualifierId1|QualifierId2|ItemDate6|ContentItem
MERGE|ProfileItem|||PERSON_112211||104|COMPETENCY||2022/02/07||300000236131123||||300000236131123|PERFORMANCE||3|300000236137110|HRC_SQLLOADER|PERSON_112211|300000236137121|100000086708123|2018/08/18|Leadership Skills
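The SectionId used above (300000236137110) is environment specific. A hedged lookup using the HRT_PROFILE_TYP_SECTIONS table shown earlier, filtering on the competency content type id (104) from the sample line:

select section_id, section_context
  from HRT_PROFILE_TYP_SECTIONS
 where content_type_id = 104 -- COMPETENCY content type per the sample MERGE line above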

Use the below query to extract qualifiers data:

select QUALIFIER_ID, DESCRIPTION
  from FUSION.HRT_QUALIFIERS_VL
 where QUALIFIER_SET_ID in (select QUALIFIER_SET_ID
                              from fusion.HRT_QUALIFIER_SETS_VL
                             where SECTION_ID = 300000236137110)

Sample data from above query:

By default in the UI, View By is set to “Official”:

Navigation: Person Management -> Talent Profile

So, if you pass a QualifierId1 different from the Id for “Official”, the loaded data will not be visible until you set “View By” to All.

Below is a high-level description of some of the main attributes of the ProfileItem business object:

ProfileId – profile Id of the person. If you don’t have it, pass it blank and pass the value under ProfileCode.

ProfileCode – Use this if you don’t know ProfileId (above).

DateFrom – From Date

RatingLevelCode2 – Pass 1, 2, 3, 4 or 5 based on the rating levels defined under your rating model.

SourceSystemId – Pass the same value as ProfileId or ProfileCode

ItemDate6 – Review Date

ContentItem – Competency Name

QualifierId1 – Qualifier_id for the appropriate evaluator type, taken from the query above.

QualifierId2 – Person id corresponding to the evaluator. For example, if using the qualifier_id for Self, pass the worker’s person_id in QualifierId2.

HCM Extracts – Inbound HCM Extract “Assertion Failure” Error

HCM Extracts with inbound delivery are one of the most powerful technical features of Fusion HCM Cloud. They enable technical consultants to build customizations and automations using tools like BIP, HCM Extracts and HCM Data Loader.

You can follow the below post for more details on the topic:

One has to take care of a number of things when designing an inbound HCM Extract or a loopback extract, otherwise you may get a variety of errors.

One of the most commonly encountered errors is:

An assertion failure was detected at location BatchProcArchivePA.getRepMapId:2. error while running

There can be multiple reasons for this error. To resolve it, check the below:

  1. Are you using a BIP report to generate the output? If yes, check that permissions on the report and the data model are granted to FUSION%APPS%HCM%ESS%APPID.
  2. Download the sample data from the data model and check if you can generate a sample output using the BI Publisher plugin.
  3. Check that the report path is correct, if you are using one.
  4. Check that you are able to deliver the output to UCM.
  5. Check that the extract parameters are correct and the related value sets/fast formulas exist in the environment.
BIP – Query to extract Rating Model Levels in HDL Format

This is the second post in a series on extracting Rating Model data from a Fusion environment in HDL format. The first post has a query to extract only the rating model data; this post describes the BIP query to extract rating model level data.

Query to extract Rating Model data:

Query to extract Rating Model levels data:

SELECT 'METADATA|RatingLevel|RatingModelCode|RatingLevelCode|CareerStrDev|RatingDescription|DateFrom|FromPoints|MaximumRatingDistribution|MinimumRatingDistribution|NumericRating|ReviewPoints|ReviewRatingDescr|RatingShortDescr|StarRating|DateTo|ToPoints|SourceSystemId|SourceSystemOwner' DATA_LINE, 1 ORDER_NUM
  FROM DUAL
UNION
select
'MERGE'
||'|'||
'RatingLevel'
||'|'||
hrmb.rating_model_code
||'|'||
hrlv.rating_level_code
||'|'||
hrlv.career_str_dev
||'|'||
hrlv.rating_description
||'|'||
TO_CHAR(hrlv.date_from,'YYYY/MM/DD')
||'|'||
hrlv.from_points
||'|'||
hrlv.max_rating_distribution
||'|'||
hrlv.min_rating_distribution
||'|'||
hrlv.numeric_rating
||'|'||
hrlv.review_points
||'|'||
hrlv.review_rating_descr
||'|'||
hrlv.rating_short_descr
||'|'||
hrlv.star_rating
||'|'||
TO_CHAR(hrlv.date_to,'YYYY/MM/DD')
||'|'||
hrlv.to_points
||'|'||
hikm.source_system_id
||'|'||
hikm.source_system_owner DATA_LINE, 2 ORDER_NUM
FROM hrt_rating_models_b hrmb
   , hrt_rating_levels_vl hrlv
   , hrc_integration_key_map hikm
WHERE hrmb.rating_model_id = hrlv.rating_model_id
  AND hikm.surrogate_id = hrlv.rating_level_id