HDL – Sample HDL files to load Content Items

To bulk upload item catalogs in HCM Profiles, you can use ContentItem.dat. Each of the templates has certain mandatory attributes, such as Context Name and Value Set Name or Value Set Id.

So, before you start preparing the file, you need to have the below information handy:

1. Context Name – This is a mandatory attribute. If you don’t pass the value in your HDL file, you will get the below error:

The values 3000122xxxxx aren’t valid for ContentItemValueSetId.

You can get the context name from the HRT_CONTENT_TYPES_B table (see the sample queries after this list).

2. Content Item Value Set Name/Id: This is again a mandatory attribute. You can get the Content Value Set Name/Id from the HRT_CONTENT_TP_VALUESETS_TL table.
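
If it helps, below are two sample queries to pull these values. This is a sketch only (the column names shown are assumptions on my part, so verify them against your instance):

-- Context name per content type (CONTEXT_NAME is assumed to hold the HDL ContextName value)
SELECT content_type_id
      ,context_name
  FROM hrt_content_types_b
 ORDER BY 2

-- Content item value set name/id (column names are assumptions)
SELECT content_value_set_id
      ,value_set_name
  FROM hrt_content_tp_valuesets_tl
 WHERE language = 'US'
 ORDER BY 2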

Once you have the details, you can prepare ContentItem.dat.

Below are the sample files for different item catalog templates:

For Establishments:

METADATA|ContentItem|Name|ContextName|ContentItemValueSetName|ContentItemValueSetId|ContentItemCode|DateFrom|DateTo|RatingModelCode|SourceSystemId|SourceSystemOwner 

MERGE|ContentItem|Indian Institute of Technology, Bombay|EDUCATIONAL_ESTABLISHMENT|Establishment||IIT_B|1951/01/01|||IIT_B|HRC_SQLLOADER 
MERGE|ContentItem|Indian Institute of Management, Ahmedabad|EDUCATIONAL_ESTABLISHMENT|Establishment||IIM_A|1951/01/01|||IIM_A|HRC_SQLLOADER 

For Licenses and Certifications:

METADATA|ContentItem|Name|ContextName|ContentItemValueSetName|ContentItemValueSetId|ContentItemCode|DateFrom|DateTo|RatingModelCode|SourceSystemId|SourceSystemOwner  

MERGE|ContentItem|Oracle Global Human Resources 2023|CERTIFICATION|Licenses and Certifications||O_GHR_2023|1951/01/01|||O_GHR_2023|HRC_SQLLOADER
MERGE|ContentItem|Oracle Benefits 2023|CERTIFICATION|Licenses and Certifications||O_BEN_2023|1951/01/01|||O_BEN_2023|HRC_SQLLOADER

For Degrees:

METADATA|ContentItem|Name|ContextName|ContentItemValueSetName|ContentItemValueSetId|ContentItemCode|DateFrom|DateTo|RatingModelCode|SourceSystemId|SourceSystemOwner 

MERGE|ContentItem|PhD|DEGREE|Degrees||XX_PhD|1951/01/01|||CI_XX_PhD|HRC_SQLLOADER 
MERGE|ContentItem|Higher National Certificate|DEGREE|Degrees||XX_Higher National Certificate|1951/01/01|||CI_XX_Higher National Certificate|HRC_SQLLOADER 

For Competencies:

METADATA|ContentItem|ContextName|ContentItemValueSetName|Name|ContentItemId|ContentItemCode|DateFrom|DateTo|ItemDescription|RatingModelId|RatingModelCode|CountryGeographyCode|CountryCountryCode|SourceSystemId|SourceSystemOwner

MERGE|ContentItem|COMPETENCY|Competencies|Accounting Standards and Principles||XX_ASAP|1951/01/01||To check knowledge on Accounting Standards and Principles.|5|PROFICIENCY|||XX_ASAP|HRC_SQLLOADER
MERGE|ContentItem|COMPETENCY|Competencies|Assessing Talent||XX_AT|1951/01/01||To check knowledge on Assessing Talent.|5|PROFICIENCY|||XX_AT|HRC_SQLLOADER
MERGE|ContentItem|COMPETENCY|Competencies|Assurance and Reporting||XX_AAR|1951/01/01||To check knowledge on Assurance and Reporting.|5|PROFICIENCY|||XX_AAR|HRC_SQLLOADER

For Languages:

METADATA|ContentItem|Name|ContextName|ContentItemValueSetName|ContentItemValueSetId|ContentItemCode|DateFrom|DateTo|RatingModelCode|SourceSystemId|SourceSystemOwner 

MERGE|ContentItem|Hindi|LANGUAGE|Languages||XX_Hindi|1951/01/01|||CI_Hindi|HRC_SQLLOADER
MERGE|ContentItem|Punjabi|LANGUAGE|Languages||XX_Punjabi|1951/01/01|||CI_Punjabi|HRC_SQLLOADER 

Please note that Rating Model Id is mandatory for loading competencies. You can find the rating model id and rating model code from the hrt_rating_models_b table.
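
For reference, the below sketch lists the rating models (only the two columns mentioned above are selected; add filters as needed):

-- List rating model ids and codes from the table referenced above
SELECT rating_model_id
      ,rating_model_code
  FROM hrt_rating_models_b
 ORDER BY 2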

Once the data load is successful, you can run a query against the content item tables to extract the loaded data.
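
For example, something along the lines of the below sketch can be used. The HRT_CONTENT_ITEMS_B/HRT_CONTENT_ITEMS_TL tables and the column names here are assumptions on my part, so validate them against your environment:

-- Extract loaded content items for a given context (column names are indicative)
SELECT hcit.name            item_name
      ,hcib.content_item_code
      ,hctb.context_name
      ,hcib.date_from
      ,hcib.date_to
  FROM hrt_content_items_b  hcib
      ,hrt_content_items_tl hcit
      ,hrt_content_types_b  hctb
 WHERE hcib.content_item_id = hcit.content_item_id
   AND hcib.content_type_id = hctb.content_type_id
   AND hcit.language = 'US'
   AND hctb.context_name = 'EDUCATIONAL_ESTABLISHMENT'
 ORDER BY 1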

HDL – SET PURGE_FUTURE_CHANGES Command

In this article, I will talk about the effect of using SET PURGE_FUTURE_CHANGES in an HCM Data Loader file. HCM uses date-tracked functionality for most of its objects, be it work structures, worker or payroll related objects.

I have seen many scenarios where I had to go back to a past date and make an update to the record effective on that date without touching the current record. There are numerous scenarios like these where HCM technical consultants are expected to make use of HDL to update past-dated records. Oracle has designed the PURGE_FUTURE_CHANGES SET command specifically for these scenarios.

But before adding this command to your HDL (.dat) file, it is very important to understand how it works; otherwise, it may unintentionally purge or change data beyond recovery.

Let us start with an example. Below is the existing employment data for a worker with Person Number – 120:

Later on, it was found that the Grade should have been changed to Grade 6 from 16-Jun-2022 onwards, but there is currently no row for this change in the database.

So, in this case, a new row updating the Grade to Grade 6 should be inserted starting 16-Jun-2022.

So, the technical consultant created an HDL file with the below format:

METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode
MERGE|WorkTerms|300000066966135|300000066966134|Y|1|2022/06/16|4712/12/31|300000066966135|FUSION|ASG_CHANGE

METADATA|Assignment|AssignmentId|WorkTermsAssignmentId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode|GradeCode
MERGE|Assignment|300000066966140|300000066966135|Y|1|2022/06/16|4712/12/31|300000066966140|FUSION|ASG_CHANGE|GRADE6

As you can see, EffectiveStartDate = 2022/06/16. So, the intention here is to insert a new row starting 2022/06/16.

But unfortunately, running the above HDL will replace all the existing future rows, i.e. the row with an effective start date of 2023/01/01 will be purged from the database. Below is how the data will look once the above HDL load is done:

This happens because the default system setting for the update mode is REPLACE. This can be verified using the “Configure HCM Data Loader” task in Setup and Maintenance:

So, in order to preserve the future-dated rows, you have to use the SET command:

SET PURGE_FUTURE_CHANGES N

METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode
MERGE|WorkTerms|300000066966135|300000066966134|Y|1|2022/06/16|4712/12/31|300000066966135|FUSION|ASG_CHANGE

METADATA|Assignment|AssignmentId|WorkTermsAssignmentId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode|GradeCode
MERGE|Assignment|300000066966140|300000066966135|Y|1|2022/06/16|4712/12/31|300000066966140|FUSION|ASG_CHANGE|GRADE6

Executing this will preserve the future-dated rows, but it will update the data in those future rows with the values from the current row.

Now, if the requirement is to keep the future-dated rows and their data intact, you should pass #RETAIN in the EffectiveEndDate attribute, as shown in the below example:

SET PURGE_FUTURE_CHANGES N

METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode
MERGE|WorkTerms|300000066966135|300000066966134|Y|1|2022/06/16|#RETAIN|300000066966135|FUSION|ASG_CHANGE

METADATA|Assignment|AssignmentId|WorkTermsAssignmentId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode|GradeCode
MERGE|Assignment|300000066966140|300000066966135|Y|1|2022/06/16|#RETAIN|300000066966140|FUSION|ASG_CHANGE|GRADE6

With #RETAIN, the value of Asg Attribute 1 on the 01-Jan-2023 row is retained.

HDL – Sample file to load absence reason

Absence reasons can be loaded in BULK using AbsenceReason.dat.

BaseName is a mandatory attribute in the file. If BaseName is not passed, the user gets the below error:

An error occurred. To review details of the error run the HCM Data Loader Error Analysis Report diagnostic test. Message details: {MESSAGE}.

An error occurred. To review details of the error run the HCM Data Loader Error Analysis Report diagnostic test. Message details: Please check the stack trace for more details.	

BaseName has to be provided in UPPER CASE, concatenated with an underscore (_) and the legislation code.

Example: TEST ABSENCE REASON_US

If the above format is not followed, HDL throws the below error:

		There was a problem uploading your data.

Clicking on the error message shows the below:

Below is the successfully loaded file:

METADATA|AbsenceReason|EffectiveStartDate|EffectiveEndDate|LegislationCode|Status|Name|BaseName|Description|SourceSystemOwner|SourceSystemId
MERGE|AbsenceReason|2024/01/15|2024/12/31|IN|A|Test Absence Reason 1|TEST ABSENCE REASON 1_US|Test Absence Reason 1|HRC_SQLLOADER|TEST ABSENCE REASON 1_US
MERGE|AbsenceReason|2024/01/16|2024/12/31|IN|A|Test Absence Reason 2|TEST ABSENCE REASON 2_US|Test Absence Reason 2|HRC_SQLLOADER|TEST ABSENCE REASON 2_US
BIP – Query to extract adjustment reasons attached at accrual plan level

You can attach multiple adjustment reasons at the accrual plan level. You can create new custom adjustment reasons and add them to the lookup ANC_ABS_PLAN_OTHER_REASONS.
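
To review the codes currently defined under this lookup, a simple query against FND_LOOKUP_VALUES can be used, for example:

-- List the lookup codes added under ANC_ABS_PLAN_OTHER_REASONS
SELECT flv.lookup_code
      ,flv.meaning
      ,flv.description
      ,flv.enabled_flag
  FROM fnd_lookup_values flv
 WHERE flv.lookup_type = 'ANC_ABS_PLAN_OTHER_REASONS'
   AND flv.language = USERENV('LANG')
 ORDER BY 2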

Once the reason is added in the lookup, you can navigate to Absence Plans.

You can choose the newly added lookup code under the “Entries and Balances” tab.

Once the reasons are added, you can run the below SQL query to extract the reasons added to the plan:

SELECT aapft.name
      ,aapf.legislation_code
      ,aapf.other_reasons
  FROM ANC_ABSENCE_PLANS_F aapf
      ,ANC_ABSENCE_PLANS_F_TL aapft
 WHERE aapf.absence_plan_id = aapft.absence_plan_id
   AND aapft.language = 'US'
   AND TRUNC(SYSDATE) BETWEEN aapf.effective_start_date AND aapf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN aapft.effective_start_date AND aapft.effective_end_date
   AND aapf.other_reasons IS NOT NULL
   AND aapf.other_reasons LIKE '%XX_MLA%'
ORDER BY 2,1  
HDL – Sample HDL to Load Agents information in ORC

HCM Data Loader supports bulk uploading of Agents for Oracle Recruiting Cloud. Currently, there is no HDL support for loading Agencies; an idea has already been submitted for the same:

https://community.oracle.com/customerconnect/discussion/666039/hcm-data-loader-support-for-loading-agencies-in-recruiting-cloud#latest

Coming to the Agents bulk upload, you can use the below HDL:

METADATA|Agent|EmailAddress|AgencyName|AgentNotes|FirstName|LastName|StatusCode|SourceSystemId|SourceSystemOwner
MERGE|Agent|[email protected]|ExecFinder|67 High Street, London|Agency|Name|Active|ExecFinder.Agent1|HRC_SQLLOADER

METADATA|AgentContext|EmailAddress|Username|SourceSystemId|SourceSystemOwner
MERGE|AgentContext|[email protected]|Agency.Name|ExecFinder.Agent1|HRC_SQLLOADER


METADATA|AgentContextLocation|EmailAddress|Username|LocationName|SourceSystemId|SourceSystemOwner
MERGE|AgentContextLocation|[email protected]|Agency.Name|Germany|ExecFinder.Agent1-Germany|HRC_SQLLOADER
BIP – Extract action type and action information from Work relationship
SELECT papf.person_number
      ,ppnf.full_name
      ,pao.action_type_code
      ,pav.action_name
      ,part.action_reason termination_reason 
  FROM per_all_people_f papf
      ,per_person_names_f ppnf
      ,per_periods_of_service ppos
      ,per_actions_vl pav
      ,per_action_reasons_tl part
      ,per_action_occurrences pao
 WHERE papf.person_id =ppnf.person_id 
   AND papf.person_id =ppos.person_id  
   AND UPPER(ppnf.name_type)='GLOBAL' 
  --AND TO_CHAR(ppos.actual_termination_date,'dd-mm-yyyy')<TO_CHAR(SYSDATE,'dd-mm-yyyy') 
   AND ppos.action_occurrence_id = pao.action_occurrence_id 
   AND pao.action_reason_id = part.action_reason_id (+)
   AND part.language (+) = USERENV('LANG')
   AND TRUNC(SYSDATE) BETWEEN TRUNC(papf.effective_start_date) AND TRUNC(papf.effective_end_date)
   AND TRUNC(SYSDATE) BETWEEN TRUNC(ppnf.effective_start_date) AND TRUNC(ppnf.effective_end_date)
   AND papf.person_number = '123351'
   AND pao.action_id = pav.action_id
order by papf.person_number
Configuration – Using UNION in a value set

For requirements to add a list of values to a DFF segment, one can create an independent value set or a table-based value set. For example, if you want the LOV to show your existing departments, you can use a table-based value set.

But sometimes you need to display one hard-coded value, “All”, along with the list of departments, in order to specify that all departments are applicable for a DFF segment value.

In this case, you need to use UNION.

Below is how the query will look:

SELECT substr(pd.name,1,125) dep_name
  FROM per_departments pd
UNION
SELECT 'All Departments' dep_name
  FROM DUAL

Value Set setup:

From Clause:

(SELECT substr(pd.name,1,125) dep_name FROM per_departments pd UNION SELECT 'All Departments' dep_name FROM DUAL) a

Value Column Name:

a.dep_name

ID Column Name:

a.dep_name

BIP – Query to find attachment details for related materials

A learning admin can add attachments at the offering level for different roles – Admin/Employee, etc.

In case you need to extract the attachment details for an offering, the below BIP query can be used:

SELECT wlf_learning_items_f.learning_item_id
      ,wlf_learning_items_f.learning_item_number    
	  ,wlf_li_relations_f.relation_id
	  ,fad.pk1_value 
	  ,fad.attached_document_id
	  ,fdt.document_id 
	  ,fdt.title 
	  ,fdt.file_name 
	  ,fad.creation_date
  FROM fnd_attached_documents fad 
      ,fnd_documents_tl fdt 
	  ,wlf_learning_items_f wlf_learning_items_f
	  ,wlf_li_relations_f wlf_li_relations_f
 WHERE fad.document_id = fdt.document_id 
   AND fdt.language = 'US'
   AND fad.entity_name = 'WLF_LI_RELATIONS_F'
   AND fad.pk1_value = wlf_li_relations_f.relation_id
   AND wlf_learning_items_f.learning_item_id = wlf_li_relations_f.source_id
   AND TRUNC(SYSDATE) BETWEEN wlf_learning_items_f.effective_start_date AND wlf_learning_items_f.effective_end_date 
   AND TRUNC(SYSDATE) BETWEEN wlf_li_relations_f.effective_start_date AND wlf_li_relations_f.effective_end_date 
 ORDER BY fad.creation_date DESC

Output:

HDL – DELETE future dated position record

There are scenarios where we want to delete a future-dated row from an object. Please note that for date-tracked objects like Positions, Locations, etc., one can make use of HDL with the SET command to delete the future-dated row.

Let us take an example where we have the below data on a Position:

Position Name – Test Position

Effective Start Date – 01- Jan-2023 – Record creation

Effective Start Date – 01-Oct-2023 – Record updated (let us say Standard working hours).

Now the requirement is to delete the row with effective start date – 01-Oct-2023.

In such cases, the below HDL can be used:

SET PURGE_FUTURE_CHANGES Y

METADATA|Position|BusinessUnitName|PositionCode|EffectiveStartDate|EffectiveEndDate

MERGE|Position|BU Name|Pos Code|2023/01/01|4712/12/31

Configuration – Employee Location on Location Details page

Starting 22A, Oracle introduced a new field called “Employee Location”.

https://www.oracle.com/webfolder/technetwork/tutorials/tutorial/cloud/r13/wn/ghr/releases/22A/22A-hr-wn.htm#F21144

Only locations for which this field is set to “Yes” will be available for use on the Manage Employment screen. This is a great feature to segregate locations which can be used on employee assignments from locations which are used as training centers, supplier locations, etc.

By default, this field is hidden on the Location Details quick action.

One needs to enable a sandbox and then make the field visible from Transaction Design Studio.

From Quick actions under My Client Groups, choose Location Details:


If you still don’t see the attribute, create a new sandbox and add the HCM Experience Design Studio tool to it:

Choose Location Details as Action:


Click on Add.

Under Page Attributes, you will see that Employee Location is hidden OOTB:


P.S. – Please note that this field can’t be used on the classic Locations page. Also, the default value is set to No.

Approvals – Using functions in BPM approvals

There are cases where we need to use functions like days between to get the difference between two dates. These functions cannot be used from the Transaction Console. One has to log in to the BPM Worklist to access the functions.

Follow the below steps:

Open the worklist app:

https://xxxx-saasfaprod1.fa.ocs.oraclecloud.com/integration/worklistapp/faces/home.jspx

Click on your user name -> Administration and then Task Configurations:

Search for Task:


Click on Edit:

Then under Assignees -> Click on Diamond Icon:


Click on – Go To Rule:

Click on create rule -> on Condition, choose fx:


Then from functions tab, choose the required function:


List of all available functions:

https://docs.oracle.com/en/middleware/fusion-middleware/bpm/12.2.1.3/rules-user/oracle-business-rules-built-classes-and-functions.html#GUID-E4EB667A-45C2-46BD-81C1-0FAD12C79851

BIP – Approval Task Name in Transaction Console vs Task name in BPM

In HCM, you can define approval rules for different HCM tasks from Tools -> Transaction Console -> Approval Rules.

For example, say you want to define an approval rule for the Promote action. You can search for Promo% in the Find box and it will give you all matching approval rules:

You can then click on ‘Configure Rules’ to define the rules.

However, the Transaction Console has many limitations. You can’t define complex approval rules, and you can’t make use of functions (e.g. days between) in the Transaction Console. In order to define more complex rules, you need to do it from BPM.

The first step is to search for the task under “Task Configuration”:

The issue here is that sometimes the approval rule names in the Transaction Console don’t match the task names in BPM.

In such cases, you can make use of the below query:

SELECT HAPT.NAME "Approval Rule Name"
      ,HAPT.DESCRIPTION
      ,HRPB.TASK_FILE_NAME TaskName
      ,HRPB.TXN_MODULE_IDENTIFIER
      ,HAPT.CATEGORY_NAME
      ,HRPB.CATEGORY_CODE
      ,HAPT.SUBCATEGORY_NAME
      ,HRPB.SUBCATEGORY_CODE
      ,HRPB.FAMILY
      ,DECODE(HAAO.APPROVAL_DISABLED,'true','Bypassed','false','Enabled') "Enabled Status"
 FROM FUSION.HRC_ARM_PROCESS_B HRPB
     ,FUSION.HRC_ARM_PROCESS_TL HAPT
     ,FUSION.HRC_ARM_APPROVAL_OPTIONS HAAO
WHERE HRPB.PROCESS_ID=HAAO.PROCESS_ID
  AND HRPB.PROCESS_ID=HAPT.PROCESS_ID
  AND HAPT.LANGUAGE = 'US'
  AND HRPB.FAMILY = 'HCM'
 ORDER BY 1  

Result:

BIP – Query to find size of documents attached in DOR

Oracle HCM provides functionality to store different kinds of worker documents in Document Records (DOR). Over a period of time, a large number of documents get accumulated for different workers. In such cases, there are requirements to know the size of the attachments in DOR. For this, I have developed a simple query which can be modified as needed:

select fdv.file_name
      ,fdv.title
      ,dm_document_id
      ,TRUNC(SUM(wcc_documents.dFileSize)/1024,2) 			as "Size in KB"
      ,TRUNC(SUM(wcc_documents.dFileSize)/1024/1024,2) 		as "Size in MB"
      ,TRUNC(SUM(wcc_documents.dFileSize)/1024/1024/1024,2) as "Size in GB"
 from hr_documents_of_record hdr, 
      fnd_attached_documents fad, 
      fnd_documents_vl fdv,
      fusion_ocserver11g.revisions wcc_revisions,  
      fusion_ocserver11g.documents wcc_documents
where to_char(hdr.documents_of_record_id) = fad.pk1_value(+)
  and fad.entity_name(+) = 'HR_DOCUMENTS_OF_RECORD'
  and fad.document_id = fdv.document_id(+)
  and fdv.dm_version_number = wcc_revisions.did
  and wcc_revisions.did = wcc_documents.did
  and wcc_documents.disprimary = 1
group by fdv.file_name
        ,fdv.title
	,dm_document_id	
order by 1 
BIP – Query to find workers with FTE < 1

The below SQL query can be used to find part-timers in HCM. Any worker whose FTE is less than 1 is normally considered a part-timer:

select paam.assignment_number, pawmf.value
  from per_legal_employers ple
      ,per_periods_of_service ppos
      ,per_all_assignments_m paam
      ,PER_ASSIGN_WORK_MEASURES_F pawmf
 where ppos.legal_entity_id = ple.organization_id
   and paam.period_of_Service_id = ppos.period_of_Service_id
   and TRUNC(SYSDATE) between paam.effective_start_date AND paam.effective_end_date
   and TRUNC(SYSDATE) between pawmf.effective_start_date AND pawmf.effective_end_date
   and paam.assignment_status_type like 'ACTIVE%'
   and ple.name like 'XX%United%Stat%'
   and paam.assignment_id = pawmf.assignment_id
   and pawmf.unit = 'FTE'
   and  pawmf.value <1
   and paam.assignment_type ='E'
   and ppos.actual_termination_date is null
 order by 1  
BIP – Extract Absence Entries in HDL Format

Recently, I faced a scenario for a customer where, after go-live, an issue was found with the absence plan configuration. The absence plan was incorrectly set up: the Balance Frequency Source wasn’t set to “Repeating Period”, so the accrual was calculated incorrectly. In order to fix this, a new absence plan was created and the absence entries were made against the new plan.

So, the approach taken was to take a backup of all absence entries from PROD, enroll the employees into the new plan and re-upload the absence entries.

A BIP report was developed to take a backup of the existing absence entries in HDL format. Below is the query for the same:

SELECT DATA_ROW
FROM (
SELECT 'METADATA'
|| CHR (124) || 'PersonAbsenceEntry'
|| CHR (124) || 'PerAbsenceEntryId'
|| CHR (124) || 'AbsenceType'
|| CHR (124) || 'AbsenceTypeId'
|| CHR (124) || 'AbsenceStatus'
|| CHR (124) || 'ApprovalStatus'
|| CHR (124) || 'AssignmentNumber'
|| CHR (124) || 'AssignmentId'
|| CHR (124) || 'EmployerId'
|| CHR (124) || 'PersonNumber'
|| CHR (124) || 'PersonId'
|| CHR (124) || 'StartDate'
|| CHR (124) || 'StartTime'
|| CHR (124) || 'StartDateDuration'
|| CHR (124) || 'EndDate'
|| CHR (124) || 'EndTime'
|| CHR (124) || 'EndDateDuration'
|| CHR (124) || 'AbsenceReasonId'
|| CHR (124) || 'SubmittedDate'
|| CHR (124) || 'PlannedEndDate'
|| CHR (124) || 'NotificationDate'
|| CHR (124) || 'ConfirmedDate'
|| CHR (124) || 'SourceSystemId' 
|| CHR (124) || 'SourceSystemOwner'  AS DATA_ROW
FROM DUAL
UNION all
select  'MERGE'
|| CHR (124) || 'PersonAbsenceEntry'
|| CHR (124) || apae.per_absence_entry_id
|| CHR (124) || aatft.name
|| CHR (124) || apae.absence_type_id
|| CHR (124) || apae.absence_status_cd
|| CHR (124) || apae.approval_status_cd
|| CHR (124) || paam.assignment_number
|| CHR (124) || paam.assignment_id
|| CHR (124) || apae.legal_entity_id
|| CHR (124) || papf.person_number
|| CHR (124) || apae.person_id
|| CHR (124) || to_char(apae.start_date,'RRRR/MM/DD')
|| CHR (124) || apae.start_time
|| CHR (124) || apae.start_date_duration
|| CHR (124) || to_char(apae.end_date,'RRRR/MM/DD')
|| CHR (124) || apae.end_time
|| CHR (124) || apae.end_date_duration
|| CHR (124) || apae.absence_type_reason_id
|| CHR (124) || to_char(apae.submitted_date,'RRRR/MM/DD')
|| CHR (124) || to_char(apae.planned_end_date,'RRRR/MM/DD')
|| CHR (124) || to_char(apae.notification_date,'RRRR/MM/DD')
|| CHR (124) || to_char(apae.confirmed_date,'RRRR/MM/DD')
|| CHR (124) || hikm.source_system_id
|| CHR (124) || hikm.source_system_owner
FROM ANC_PER_ABS_ENTRIES apae,
     ANC_ABSENCE_TYPES_F_TL aatft,
     PER_ALL_ASSIGNMENTS_M paam,
     PER_ALL_PEOPLE_F papf,
     HRC_INTEGRATION_KEY_MAP hikm
WHERE hikm.surrogate_id = apae.per_absence_entry_id
  AND apae.absence_type_id = aatft.absence_type_id
  AND apae.assignment_id = paam.assignment_id
  AND apae.person_id = papf.person_id
  AND paam.person_id = papf.person_id
  AND aatft.language = 'US'
  AND aatft.name LIKE 'Annual%'
  AND TRUNC(SYSDATE) BETWEEN aatft.effective_start_date AND aatft.effective_end_date
  AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
  AND apae.start_date BETWEEN paam.effective_start_date AND paam.effective_end_date
 )

Once the data is extracted, you need to make sure that Source System Owner is updated from FUSION to HRC_SQLLOADER. Source System Owner is set to FUSION when an entry is created from the UI.
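
If you want to check how the integration keys are currently stamped before editing the file, a query like the below against HRC_INTEGRATION_KEY_MAP can help (the OBJECT_NAME filter here is an assumption, so confirm the exact value in your instance):

-- Count absence entry keys by source system owner (object name pattern is assumed)
SELECT hikm.object_name
      ,hikm.source_system_owner
      ,COUNT(*) key_count
  FROM hrc_integration_key_map hikm
 WHERE hikm.object_name LIKE '%AbsenceEntry%'
 GROUP BY hikm.object_name
         ,hikm.source_system_owner
 ORDER BY 1,2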

Once the output of the BIP report is ready, change “MERGE” to “DELETE” to delete all the existing absence entries. Then enroll the workers in the new plan, make the required changes in the BIP extract, and upload the data back into Fusion HCM.

HDL – Sample file to close benefit certifications

Starting with 23C, a new HDL object has been added to close any pending certification actions for benefits. The certification type is validated against a lookup called BEN_ENRT_CTFN_TYP.
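
To see which certification type codes are valid (POC is used in the sample below), you can query the lookup. A quick sketch:

-- List valid certification types under BEN_ENRT_CTFN_TYP
SELECT flv.lookup_code
      ,flv.meaning
  FROM fnd_lookup_values flv
 WHERE flv.lookup_type = 'BEN_ENRT_CTFN_TYP'
   AND flv.language = USERENV('LANG')
 ORDER BY 1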

Below is a sample file to close any pending certification actions against “POC”:

METADATA|CloseEnrollmentCertification|PersonNumber|ParticipantFirstName|ParticipantLastName|CertificationType|ProgramName|PlanName|OptionName|ReceivedDate|DeniedDate|SourceSystemId|SourceSystemOwner
MERGE|CloseEnrollmentCertification|12|First Name 1|Last Name 1|POC|Test Benefits Program|Test Optical|Employee + 1|2023/02/01||12_First Name 1_Last Name 1_POC|HRC_SQLLOADER
MERGE|CloseEnrollmentCertification|13|First Name 2|Last Name 2|POC|Test Benefits Program|Test Medical|Employee + 1||2023/02/01|13_First Name 2_Last Name 2_POC|HRC_SQLLOADER

 
HDL – Sample file to load Person Habits

Attributes like student status and tobacco consumption, which impact worker benefits, can be loaded using PersonHabits.dat.

Sample File:

METADATA|PersonHabits|SourceSystemOwner|SourceSystemId|LegalEmployer|PersonNumber|EffectiveStartDate|EffectiveEndDate|StudentStatus|CvrdInAnthrPl
MERGE|PersonHabits|HRC_SQLLOADER|PERSON_HABIT_12|Test LE|12|2021/12/03|4712/12/31|PART_TIME|
MERGE|PersonHabits|HRC_SQLLOADER|PERSON_HABIT_13|Test LE|13|2021/09/24|4712/12/31|FULL_TIME|