HDL – Sample file to delete element eligibility

HDL supports deletion of element eligibilities through the PayrollElementDefinition business object. Its child object, ElementEligibility, supports create, update, and delete operations.

You can create a simple BIP query to extract the required data and build an HDL file from it:

select pelf.element_link_id
      ,petf.element_type_id
      ,petf.base_element_name
      ,pelf.effective_start_date 
  from pay_element_types_f petf
     , pay_element_links_f pelf
 where petf.element_type_id = pelf.element_type_id
   and pelf.element_link_id = 300000175215375
   and TRUNC(sysdate) BETWEEN petf.effective_start_date and petf.effective_end_date 

Sample HDL file:

METADATA|ElementEligibility|ElementLinkId|ElementId|EffectiveStartDate
DELETE|ElementEligibility|300000175215375|300000175200627|1951/01/01
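
If you assemble the file outside BI Publisher, the extracted rows can be pipe-joined with a short script. A minimal sketch, where the rows list stands in for the query output above:

```python
# Build the ElementEligibility DELETE file from the extracted rows.
# Each tuple mirrors the BIP query output:
# (element_link_id, element_type_id, effective_start_date)
rows = [
    (300000175215375, 300000175200627, "1951/01/01"),
]

header = "METADATA|ElementEligibility|ElementLinkId|ElementId|EffectiveStartDate"
lines = [header] + [
    "DELETE|ElementEligibility|%s|%s|%s" % row
    for row in rows
]

# ElementEligibility is a child of PayrollElementDefinition, so the
# lines go into a PayrollElementDefinition.dat file.
with open("PayrollElementDefinition.dat", "w") as f:
    f.write("\n".join(lines) + "\n")
```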

Sometimes you may get an error stating that you can't delete the element eligibility because element entries exist for it. In this case, first identify the corresponding element entries, delete them, and then retry deleting the element eligibility.

Please check MOS note "When Attempting to Delete Element Eligibility Get Error 'The element eligibility record can't be deleted because it would invalidate existing element entries with effective start dates in the future. (PAY-1635756)'" (Doc ID 2686914.1) for a sample query and a sample file for element entry deletion.

Version Validated : 21C

BIP – Query to Extract Learner Completion Details

Use the below query to extract the learner completion details. The same query can be used to generate the Learning Record HDL format with small changes.

SELECT TO_CHAR(warf.EFFECTIVE_START_DATE,'RRRR/MM/DD') "Learning Record EffectiveStartDate"
	   ,warf.ASSIGNMENT_RECORD_NUMBER "Learning Record Number"
	   ,weaf.ASSIGNMENT_PROFILE_NUMBER "Assignment Number"
	   ,wlif.LEARNING_ITEM_TYPE "Learning ItemType"
	   ,wlif.LEARNING_ITEM_NUMBER "Learning ItemNumber"
	   ,warf.EVENT_TYPE "Assignment Type"
	   ,warf.EVENT_SUB_TYPE "Assignment SubType"
	   ,warf.ATTRIBUTION_TYPE "Assignment AttributionType"
	   ,warf.ATTRIBUTION_LOOKUP_CODE "Assignment AttributionCode"
       ,papf.PERSON_NUMBER "Learner Number"
	   ,warf.STATUS "Learning RecordStatus"
       ,TO_CHAR(NVL(warf.ASSIGNMENT_STARTED_ON_DATE,warf.EFFECTIVE_START_DATE),'RRRR/MM/DD') "Learning Record StartDate"
       ,TO_CHAR(warf.CALCULATED_DUE_DATE,'RRRR/MM/DD') "Due Date"	   
       ,TO_CHAR(warf.COMPLETION_DATE,'RRRR/MM/DD') "Completion Date"	   
       ,warf.ACTUAL_SCORE "Actual Score"
  FROM WLF_LEARNING_ITEMS_F wlif,
       WLF_ASSIGNMENT_RECORDS_F warf,
       WLF_EVENT_ASSIGNMENTS_F weaf,
       PER_ALL_PEOPLE_F papf	   
 WHERE to_char(warf.LEARNING_ITEM_ID) = to_char(wlif.LEARNING_ITEM_ID)
   AND warf.STATUS = 'ORA_ASSN_REC_COMPLETE'
   AND TRUNC(SYSDATE) BETWEEN warf.EFFECTIVE_START_DATE AND warf.EFFECTIVE_END_DATE
   AND TRUNC(SYSDATE) BETWEEN wlif.EFFECTIVE_START_DATE AND wlif.EFFECTIVE_END_DATE
   AND warf.learner_id = papf.person_id
   AND TRUNC(SYSDATE) BETWEEN papf.EFFECTIVE_START_DATE AND papf.EFFECTIVE_END_DATE
   AND weaf.EVENT_ASSIGNMENT_ID = warf.EVENT_ASSIGNMENT_ID
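
As noted above, the same output can be reshaped into an HDL format with small changes. A minimal sketch of that reshaping step, assuming CSV output from BIP (the component and attribute names here are illustrative and should be checked against the Learning Record HDL template):

```python
import csv
import io

# BIP output reduced to three illustrative columns; a real extract would
# carry all the captions defined in the query above.
sample_csv = io.StringIO(
    "Learner Number,Learning ItemNumber,Completion Date\n"
    "101,OLC251051,2021/03/15\n"
)

# Reshape each CSV row into a pipe-delimited HDL line.
lines = ["METADATA|LearningRecord|LearnerNumber|LearningItemNumber|CompletionDate"]
for row in csv.DictReader(sample_csv):
    lines.append("|".join([
        "MERGE", "LearningRecord",
        row["Learner Number"], row["Learning ItemNumber"], row["Completion Date"],
    ]))

print("\n".join(lines))
```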

HDL – Sample HDL to load Performance Rating

Historical performance ratings can be loaded into the Worker Talent Profile using the TalentProfile.dat business object.

Below is a sample HDL file (enhanced profiles, v2) to load performance rating data:

METADATA|ProfileItem|ProfileItemId|ProfileId|ProfileCode|ContentItemId|ContentTypeId|ContentType|CountryId|DateFrom|DateTo|RatingModelId1|RatingModelCode1|RatingLevelId1|RatingLevelCode1|SectionId|SourceSystemOwner|SourceSystemId
MERGE|ProfileItem|||PER_0055|||PERFORMANCE_RATING||2018/01/01|2018/12/31||XXX_PERFORMANCE||3.75|9933|HRC_SQLLOADER|PR_0055
MERGE|ProfileItem|||PER_0031|||PERFORMANCE_RATING||2018/01/01|2018/12/31||XXX_PERFORMANCE||3.90|9933|HRC_SQLLOADER|PR_0031

To make the loaded performance rating visible in the UI, the SourceType and QualifierId1 attributes should also be passed. Their values depend upon the source of the performance rating.

- Added manually from Talent Rating: SourceType = HRT, QualifierId1 = 33
- From Performance Management: SourceType = HRA, QualifierId1 = 31, SourceId = 103, SourceKey1 = EVALUATION_ID
- Talent Review Meeting (Conduct Meeting): SourceType = HRTR, QualifierId1 = 32, SourceKey1 = MEETING_ID
- Talent Review Meeting (Meeting Preparation): SourceType = HRRMGR, QualifierId1 = 35, SourceKey1 = MEETING_ID

Details of the source types can be found in the hrt_sources_vl table.

METADATA|ProfileItem|ProfileItemId|ProfileId|ProfileCode|ContentItemId|ContentTypeId|ContentType|CountryId|DateFrom|DateTo|RatingModelId1|RatingModelCode1|RatingLevelId1|RatingLevelCode1|SectionId|SourceSystemOwner|SourceSystemId|SourceType|QualifierId1
MERGE|ProfileItem|||PER_0055|||PERFORMANCE_RATING||2018/01/01|2018/12/31||XXX_PERFORMANCE||3.75|9933|HRC_SQLLOADER|PR_0055|HRT|33
MERGE|ProfileItem|||PER_0031|||PERFORMANCE_RATING||2018/01/01|2018/12/31||XXX_PERFORMANCE||3.90|9933|HRC_SQLLOADER|PR_0031|HRT|33

HDL – Loading Classroom Resources in Bulk

HCM Data Loader object ClassroomResource can be used to bulk upload classroom resources in Oracle Learning Cloud. Existing locations created as part of Global HR can also be designated as classroom resources.

Below is a sample file to upload classroom resources using HDL:

METADATA|ClassroomResource|ClassroomResourceNumber|Title|Description|Capacity|SetCode|LocationCode|SourceSystemOwner|SourceSystemId|OwnedByPersonNumber
MERGE|ClassroomResource|CLASS1001|Room 1|Room 1|10|COMMON|JPLoc004 Kyushu|HRC_SQLLOADER|1001|101
MERGE|ClassroomResource|CLASS1002|Room 2|Room 2|15|COMMON|JPLoc004 Kyushu|HRC_SQLLOADER|1002|101
MERGE|ClassroomResource|CLASS1003|Room 3|Room 3|20|COMMON|JPLoc004 Kyushu|HRC_SQLLOADER|1003|101

Once the data is loaded successfully, you can verify the results from the UI:

Navigation :- My Client Groups -> Learning -> Catalog Resources -> Classrooms

HDL – Sample File to DELETE Documents of Record

Use the below query to extract the document of record data which should be deleted. You can add additional filters, for example on person numbers.

select 'DELETE' 					"METADATA"
      ,'DocumentsOfRecord'			"DocumentsOfRecord"
	  ,hdor.documents_of_record_id	"DocumentsOfRecordId"
	  ,hdor.document_type_id	    "DocumentTypeId"
	  ,hdor.document_code			"DocumentCode"
	  ,hdor.document_name			"DocumentName"
	  ,hdor.document_number			"DocumentNumber"
	  ,hikm.source_system_owner		"SourceSystemOwner"
	  ,hikm.source_system_id		"SourceSystemId"
  from HR_DOCUMENTS_OF_RECORD hdor
      ,HRC_INTEGRATION_KEY_MAP hikm
 where hdor.documents_of_record_id = hikm.surrogate_id
   and hdor.documents_of_record_id = 300000217125443

Query for Person Number and Document Type:

select DISTINCT 'DELETE' 					"METADATA"
      ,'DocumentsOfRecord'			"DocumentsOfRecord"
	  ,hdor.documents_of_record_id	"DocumentsOfRecordId"
	  ,hdor.document_type_id	    "DocumentTypeId"
	  ,hdor.document_code			"DocumentCode"
	  ,hdor.document_name			"DocumentName"
	  ,hdor.document_number			"DocumentNumber"
	  ,hikm.source_system_owner		"SourceSystemOwner"
	  ,hikm.source_system_id		"SourceSystemId"
  from HR_DOCUMENTS_OF_RECORD hdor
      ,HRC_INTEGRATION_KEY_MAP hikm
	  ,PER_ALL_PEOPLE_F papf
	  ,HR_DOCUMENT_TYPES_TL hdtt
 where hdor.documents_of_record_id = hikm.surrogate_id
   --and hdor.documents_of_record_id = 300000217125443
   and hdor.person_id = papf.person_id
   and TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   and hdor.document_type_id = hdtt.document_type_id
   and hdtt.language = 'US'
   and hdtt.DOCUMENT_TYPE = 'Test Doc'
   and papf.person_number IN ('12','23')

Sample HDL File:

METADATA|DocumentsOfRecord|DocumentsOfRecordId|DocumentTypeId|DocumentCode|DocumentName|DocumentNumber|SourceSystemOwner|SourceSystemId
DELETE|DocumentsOfRecord|300000217125443|300000217168555|TMAD_001|Multiple Attachments||HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001

HDL – Update Goal Weight

Goal weightage can be updated using the GoalMeasurement metadata in the Goal business object. First, we need to extract the uploaded measurements. Use the below query to extract the details:

SELECT HDL_DATA
FROM (
SELECT 'METADATA|GoalMeasurement|MeasurementId|MeasurementName|StartDate|EndDate|TargetValue|TargetPercentage|UomCode|MeasureTypeCode|TargetType|GoalId|SourceSystemId|SourceSystemOwner' HDL_DATA, 1 SEQ
  FROM DUAL
UNION ALL
SELECT 'MERGE'
     || CHR (124)
     || 'GoalMeasurement'
     || CHR (124)	 
	 || hgm.measurement_id
     || CHR (124)
	 || hgm.measurement_name
     || CHR (124)
	 || TO_CHAR(hgm.start_date,'RRRR/MM/DD')
     || CHR (124)
	 || TO_CHAR(hgm.end_date,'RRRR/MM/DD')
     || CHR (124)
	 || hgm.target_value
     || CHR (124)
	 || hgm.target_percentage
     || CHR (124)
	 || hgm.uom_code
     || CHR (124)
	 || hgm.measure_type_code
     || CHR (124)
	 || hgm.target_type
     || CHR (124)
	 || hg.goal_id
     || CHR (124)
	 || hikm.source_system_id
     || CHR (124)
	 || hikm.source_system_owner HDL_DATA, 2 SEQ
  FROM hrg_goals hg
      ,hrg_goal_measurements hgm
      ,hrc_integration_key_map hikm
 WHERE hg.goal_id = hgm.goal_id
   AND hgm.measurement_id = hikm.surrogate_id
   AND hgm.uom_code = 'PERCENT'
   --AND hg.goal_id = 300000215856607
)
ORDER BY SEQ

Save the extracted data in a Goal.dat HDL file, update the target values as required, then save the .dat file and upload it.

Sample HDL:

METADATA|GoalMeasurement|MeasurementId|MeasurementName|StartDate|EndDate|TargetValue|TargetPercentage|UomCode|MeasureTypeCode|TargetType|GoalId|SourceSystemId|SourceSystemOwner
MERGE|GoalMeasurement|300000215856608|Customer Survey Results|2019/01/01|2019/12/31|80||PERCENT|QUANTITATIVE|MIN|300000215856607|300000215856608|FUSION
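
The target-value edit before re-upload can also be scripted. A minimal sketch, using the field positions from the METADATA line above (TargetValue is the seventh pipe-delimited field):

```python
# Update the TargetValue field in an extracted GoalMeasurement line.
# Per the METADATA line, TargetValue is the 7th pipe-delimited field (index 6).
def set_target_value(line, new_value):
    fields = line.split("|")
    if fields[0] == "MERGE" and fields[1] == "GoalMeasurement":
        fields[6] = str(new_value)
    return "|".join(fields)

line = ("MERGE|GoalMeasurement|300000215856608|Customer Survey Results|"
        "2019/01/01|2019/12/31|80||PERCENT|QUANTITATIVE|MIN|"
        "300000215856607|300000215856608|FUSION")
print(set_target_value(line, 85))
```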

BIP – Query to get Configure HCM Data Loader parameters

Configure HCM Data Loader is a task used to define the HCM Data Loader parameters. The parameters are broadly divided into the following categories: Availability, Diagnostic, File Definition, Performance, and Scheduling Defaults. These are the default settings applied to any HCM Data Loader run.

You can override some of these parameters at the individual .dat file level using SET commands, and some of the parameters can be overridden while submitting the Import and Load job.
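
For illustration, a file-level SET override is placed at the very top of the .dat file, before any METADATA line. A minimal sketch (PURGE_FUTURE_CHANGES is shown as an example parameter; confirm it is override-enabled in your environment):

```
SET PURGE_FUTURE_CHANGES Y
COMMENT The METADATA and MERGE lines for the object follow as usual
```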

Please note that the default values vary from HDL to HSDL (Spreadsheet Loader).

You can use the below query to get these details from the backend using a BIP report:

select PARAM_CATEGORY
      ,PARAM_NAME
      ,DEFAULT_VALUE
      ,HSDL_DEFAULT_VALUE
      ,HDL_ENABLED
      ,HDL_OVERRIDE_ENABLED
      ,HSDL_ENABLED
      ,HSDL_OVERRIDE_ENABLED
      ,VALUE_LOOKUP_TYPE
      ,CREATED_BY
      ,CREATION_DATE
      ,LAST_UPDATED_BY
      ,LAST_UPDATE_DATE
  from hrc_dl_all_parameters
 order by 1,2

Query to get list of overridden values:

select hdap.PARAM_CATEGORY
      ,hdap.PARAM_NAME
      ,hdap.DEFAULT_VALUE
      ,hdap.HSDL_DEFAULT_VALUE
      ,hdap.HDL_ENABLED
      ,hdap.HDL_OVERRIDE_ENABLED
      ,hdap.HSDL_ENABLED
      ,hdap.HSDL_OVERRIDE_ENABLED
      ,hdap.VALUE_LOOKUP_TYPE
      ,hdap.CREATED_BY
      ,hdap.CREATION_DATE
      ,hdap.LAST_UPDATED_BY
      ,hdap.LAST_UPDATE_DATE
      ,hdpo.OVERRIDE_LEVEL "OVERRIDDEN_AT_LEVEL"
      ,hdpo.OVERRIDE_VALUE "OVERRIDDEN_VALUE"
  from hrc_dl_all_parameters hdap
      ,hrc_dl_parameter_overrides hdpo
 where hdap.parameter_id = hdpo.parameter_id
 order by 1,2

BIP – Query to get element entry details

select DISTINCT peevf.element_entry_value_id
      ,peef.element_entry_id
      ,petf.base_element_name
      ,peevf.effective_start_date
      ,paam.assignment_number
      ,pivf.base_name
  from per_all_assignments_m paam
      ,pay_element_types_f petf
      ,pay_element_entries_f peef
      ,pay_element_entry_values_f peevf
      ,pay_input_values_f pivf
 where 1=1
   and paam.person_id = peef.person_id
   and peef.element_type_id = petf.element_type_id
   and pivf.element_type_id = petf.element_type_id
   and peef.element_entry_id = peevf.element_entry_id
   and paam.assignment_type in ('E')
   and paam.primary_assignment_flag = 'Y'
   and petf.base_element_name = 'Dental Plan'
   and pivf.base_name = 'Amount'
   and paam.assignment_number = 'E1111'
   and trunc(sysdate) between petf.effective_start_date and petf.effective_end_date
   and trunc(sysdate) between paam.effective_start_date and paam.effective_end_date
   and trunc(sysdate) between pivf.effective_start_date and pivf.effective_end_date

HDL – Sample File to Load extended Lookup Codes

Oracle HCM makes use of the extended lookups feature to support dependent lookup values. For example, while creating a visa or work permit record for a person in Singapore, the Category field depends upon the Type of pass chosen. The values of the Category field are derived from an extended lookup.

Navigate to Setup and Maintenance -> Manage Extended Lookup codes -> Visa Permit Type

Now, let us take an example where we need to load two category values based on the lookup code S Pass.

Prepare the HDL file in below format:

METADATA|ExtendedLookupCode|ExtendedLookupCodeId|LookupType|LookupCode|LegislationCode|ExtendedLookupCode|ExtendedLookupCodeName|SourceSystemOwner|SourceSystemId
MERGE|ExtendedLookupCode||PER_VISA_PERMIT_TYPE|SG_SP|SG|TEST_SP1|S Pass Holder – Test 1|HRC_SQLLOADER|TEST_1
MERGE|ExtendedLookupCode||PER_VISA_PERMIT_TYPE|SG_SP|SG|TEST_SP2|S Pass Holder – Test 2|HRC_SQLLOADER|TEST_2

Zip the file and upload it using HCM Data Loader from the Data Exchange work area.

On successful load, the new values can be verified from either of the following two places in the UI:

1. From Manage Extended Lookup Codes

2. From the Documents tab on the Person UI

HDL – Loading Multiple attachments for same document type

There are various cases where more than one attachment should be maintained against an employee document record. First, we will create a new document type that allows multiple attachments.

1. Navigate to Setup and Maintenance -> Document Types -> Click on Create.

2. Choose Multiple Occurrences as Yes.

3. Enter the relevant information and submit the record.

4. Prepare the HDL in the below format and load the data:

METADATA|DocumentsOfRecord|PersonNumber|DateFrom|DateTo|DocumentType|DocumentName|DocumentNumber|DocumentCode|Country|AttachmentEntityName|Comments|IssuedDate|IssuingAuthority|IssuingCountry|IssuingLocation|Publish|PublishDate|SourceSystemId|SourceSystemOwner
MERGE|DocumentsOfRecord|101|2021/01/01||Test Multi Attachments Doc|Multiple Attachments||TMAD_001||||||||||HRC_SQLLOADER_101_TMAD_001|HRC_SQLLOADER

METADATA|DocumentAttachment|SourceSystemOwner|SourceSystemId|PersonNumber|DocumentType|DocumentCode|Title|Description|DataTypeCode|Country|EntityName|File|FileName|MimeType|URL|URLorTextorFileName|DmVersionNumber
MERGE|DocumentAttachment|HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001_1|101|Test Multi Attachments Doc|TMAD_001|Test Multi Attachments Doc|Test Multi Attachments Doc 1|FILE|||Historical_rating.docx|Historical_rating.docx|||Historical_rating.docx|
MERGE|DocumentAttachment|HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001_2|101|Test Multi Attachments Doc|TMAD_001|Test Multi Attachments Doc|Test Multi Attachments Doc 2|FILE|||Historical_rating1.docx|Historical_rating1.docx|||Historical_rating1.docx|

5. Place the docx files in the BlobFiles folder and zip them together with the .dat file:
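
The zip layout (the .dat file at the archive root and the attachments inside a BlobFiles folder) can be assembled with a short script. A minimal sketch with illustrative file names, creating placeholder files so it runs standalone:

```python
import os
import zipfile

# Create a tiny sample .dat file and two placeholder attachments so the
# sketch runs standalone; in practice these already exist on disk.
with open("DocumentsOfRecord.dat", "w") as f:
    f.write("METADATA|DocumentsOfRecord|PersonNumber|DocumentType\n")
for name in ("Historical_rating.docx", "Historical_rating1.docx"):
    with open(name, "wb") as f:
        f.write(b"placeholder")

def build_hdl_zip(zip_name, dat_file, blob_files):
    """Dat file at the archive root; attachments under BlobFiles/."""
    with zipfile.ZipFile(zip_name, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(dat_file, arcname=os.path.basename(dat_file))
        for blob in blob_files:
            zf.write(blob, arcname="BlobFiles/" + os.path.basename(blob))

build_hdl_zip("DocumentsOfRecord.zip", "DocumentsOfRecord.dat",
              ["Historical_rating.docx", "Historical_rating1.docx"])
```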

6. Import and Load the file.

7. Once the data is loaded, verify from the UI:

HDL – Loading Contract History

Sample HDL file to load contract history:

METADATA|Worker|PersonId|EffectiveStartDate|EffectiveEndDate|PersonNumber|BloodType|CorrespondenceLanguage|StartDate|DateOfBirth|DateOfDeath|CountryOfBirth|RegionOfBirth|TownOfBirth|PersonDuplicateCheck|ActionCode|ReasonCode|GUID|SourceSystemOwner|SourceSystemId
MERGE|Worker||2016/08/27|4712/12/31|435282|||2016/08/27|1967/07/23||PH|Philippines|||HIRE|||LEGACY|PER_909090

METADATA|PersonName|PersonNameId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|NameType|FirstName|MiddleNames|LastName|Honors|KnownAs|PreNameAdjunct|MilitaryRank|PreviousLastName|Suffix|Title|CharSetContext|NameInformation1|NameInformation2|NameInformation3|NameInformation4|NameInformation5|NameInformation6|NameInformation7|NameInformation8|NameInformation9|NameInformation10|NameInformation11|NameInformation12|NameInformation13|NameInformation14|NameInformation15|NameInformation16|NameInformation17|NameInformation18|NameInformation19|NameInformation20|NameInformation21|NameInformation22|NameInformation23|NameInformation24|NameInformation25|NameInformation26|NameInformation27|NameInformation28|NameInformation29|NameInformation30|GUID|SourceSystemOwner|SourceSystemId
MERGE|PersonName||2016/08/27|4712/12/31|PER_909090||US|GLOBAL|||Ashwin Ravi||Ashwin Ravi|||||MR.|||||||||||||||||||||||||||||||||LEGACY|909090_PER_NAME

METADATA|PersonLegislativeData|PersonLegislativeId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|HighestEducationLevel|MaritalStatus|MaritalStatusDate|Sex|GUID|SourceSystemOwner|SourceSystemId
MERGE|PersonLegislativeData||2016/08/27|4712/12/31|PER_909090||US||M||M||LEGACY|909090_PER_LEG

METADATA|WorkRelationship|PeriodOfServiceId|LegalEmployerName|PersonId(SourceSystemId)|PersonNumber|DateStart|WorkerType|LegalEmployerSeniorityDate|ActualTerminationDate|LegalEntityId|Comments|EnterpriseSeniorityDate|LastWorkingDate|NotifiedTerminationDate|OnMilitaryServiceFlag|PrimaryFlag|ProjectedTerminationDate|RehireAuthorizerPersonId|RehireAuthorizor|RehireReason|RevokeUserAccess|WorkerNumber|RehireRecommendationFlag|NewStartDate|ActionCode|ReasonCode|DateOfDeath|HideUntilDate|DateForPrimaryFlagChange|CancelWorkRelationshipFlag|TerminateWorkRelationshipFlag|ReverseTerminationFlag|CorrectTerminationFlag|GUID|SourceSystemOwner|SourceSystemId|GlobalTransferFlag|ReadyToConvert|WorkerComments
MERGE|WorkRelationship||US Company|PER_909090|435282|2016/08/27|E||||||||N|Y|||||||||HIRE||||||||||LEGACY|455985_WR||||

METADATA|WorkTerms|AssignmentId|AssignmentNumber|EffectiveEndDate|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|LegalEmployerName|DateStart|WorkerType|AssignmentName|PeopleGroup|AssignmentStatusTypeId|AssignmentStatusTypeCode|AssignmentType|PersonTypeId|PersonTypeCode|SystemPersonType|BargainingUnitCode|BillingTitle|BusinessUnitId|BusinessUnitShortCode|ContractId|DateProbationEnd|WorkerCategory|AssignmentCategory|EstablishmentId|ReportingEstablishment|ExpenseCheckSendToAddress|GradeId|GradeCode|GradeLadderPgmId|GradeLadderPgmName|HourlySalariedCode|InternalBuilding|InternalFloor|InternalLocation|InternalMailstop|InternalOfficeNumber|JobId|JobCode|LabourUnionMemberFlag|LocationId|LocationCode|ManagerFlag|NormalHours|Frequency|NoticePeriod|NoticePeriodUOM|OrganizationId|DepartmentName|PositionId|PositionCode|PositionOverrideFlag|PrimaryWorkTermsFlag|ProbationPeriod|ProbationUnit|ProjectedEndDate|ProjectedStartDate|ProposedWorkerType|ProposedUserPersonType|ReasonCode|RetirementAge|RetirementDate|SpecialCeilingStepId|SpecialCeilingStep|StepEntryDate|TaxAddressId|EndTime|StartTime|VendorSiteId|WorkAtHomeFlag|WorkTermsAssignmentId|VendorId|FreezeStartDate|FreezeUntilDate|ActionCode|CategoryCode|TaxReportingUnitId|TaxReportingUnit|GUID|CollectiveAgreementId|CollectiveAgreementIdCode|FullPartTime|PermanentTemporary|SourceSystemOwner|SourceSystemId
MERGE|WorkTerms||ET435282|2017/07/31|Y|1|2016/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||HIRE|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|2018/08/26|Y|1|2017/08/01|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||ASG_CHANGE|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|2019/08/26|Y|1|2018/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||CONTRACT_EXTENSION|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|4712/12/31|Y|1|2019/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||CONTRACT_EXTENSION|||||||||LEGACY|52361_WT

METADATA|Assignment|AssignmentId|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|WorkTermsAssignmentId(SourceSystemId)|WorkTermsNumber|AssignmentType|AssignmentName|AssignmentNumber|PeopleGroup|AssignmentStatusTypeId|AssignmentStatusTypeCode|BargainingUnitCode|BillingTitle|BusinessUnitId|BusinessUnitShortCode|DateProbationEnd|WorkerCategory|AssignmentCategory|EstablishmentId|ReportingEstablishment|ExpenseCheckSendToAddress|GradeId(SourceSystemId)|GradeCode|GradeLadderPgmId|GradeLadderPgmName|HourlySalariedCode|InternalBuilding|InternalFloor|InternalLocation|InternalMailstop|InternalOfficeNumber|JobId(SourceSystemId)|JobCode|LabourUnionMemberFlag|LocationId|LocationCode|ManagerFlag|NormalHours|Frequency|NoticePeriod|NoticePeriodUOM|OrganizationId(SourceSystemId)|DepartmentName|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|DateStart|WorkerType|LegalEmployerName|PersonTypeId|PersonTypeCode|SystemPersonType|PositionId(SourceSystemId)|PositionCode|PositionOverrideFlag|PrimaryAssignmentFlag|PrimaryFlag|ProbationPeriod|ProbationUnit|ProjectTitle|ProjectedEndDate|ProjectedStartDate|ProposedWorkerType|ProposedUserPersonType|ReasonCode|RetirementAge|RetirementDate|SpecialCeilingStepId|SpecialCeilingStep|TaxAddressId|EndTime|StartTime|VendorSiteId|WorkAtHomeFlag|VendorId|FreezeStartDate|FreezeUntilDate|CategoryCode|CollectiveAgreementId|CollectiveAgreementIdCode|FullPartTime|PermanentTemporary|DefaultExpenseAccount|GUID|SourceSystemOwner|SourceSystemId|GspEligibilityFlag|OvertimePeriodName|SeniorityBasis|UnionId|UnionName
MERGE|Assignment||HIRE|2016/08/27|2017/07/31|1|Y|52361_WT|ET435282|E||E435282|121||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||ASG_CHANGE|2017/08/01|2018/08/26|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||CONTRACT_EXTENSION|2018/08/27|2019/08/26|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||CONTRACT_EXTENSION|2019/08/27|4712/12/31|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||

METADATA|Contract|ContractId|EffectiveStartDate|EffectiveEndDate|AssignmentId|AssignmentNumber|PersonNumber|ContractType|Description|Duration|DurationUnits|ExtensionPeriod|ExtensionPeriodUnits|ExtensionReason|PersonId(SourceSystemId)|GUID|SourceSystemOwner|SourceSystemId
MERGE|Contract||2016/08/27|2018/08/26||ET435282|435282|Flexible 2 Years|Rolling Contract|2|Y||||PER_909090||LEGACY|52361_CONTRACT
MERGE|Contract||2018/08/27|2019/08/26||ET435282|435282|Flexible 2 Years|Rolling Contract|||1|Y||PER_909090||LEGACY|52361_CONTRACT
MERGE|Contract||2019/08/27|4712/12/31||ET435282|435282|Flexible 2 Years|Rolling Contract|||1|Y||PER_909090||LEGACY|52361_CONTRACT

Version Tested – 20B

Reports (BIP) – Query to extract default Access Group data for a Course/Offering

In Oracle Learning Cloud, the learning administrator can define the default access at the course or offering level. Course.dat can be used to load this information.

To extract this information from the database in HDL format, use the below query:

SELECT 'MERGE' 	                                           "METADATA"       
      ,'CourseDefaultAccess' 		                   "CourseDefaultAccess"                  
      ,to_char(wapf.effective_start_date,'RRRR/MM/DD')     "EffectiveStartDate"           
      ,to_char(wapf.effective_end_date,'RRRR/MM/DD')       "EffectiveEndDate"       
      ,wapf.access_permission_number                       "DefaultAccessNumber"       
      ,wlif.learning_item_number                           "CourseNumber"       
      ,wapf.follow_spec 	                           "FollowSpecialization"       
      ,wapf.assign_as_manager                		   "MgrInitialStatus"       
      ,wapf.assign_for_self                                "SelfInitialStatus"        
      ,wlif.ss_view_mode                                   "SelfViewMode"    
 FROM WLF_ACCESS_PERMISSIONS_F wapf      
     ,WLF_LEARNING_ITEMS_F wlif 
WHERE 1=1   
  AND wapf.access_permission_id= wlif.access_permission_id   
--AND trunc(sysdate) between wapf.effective_start_date and wapf.effective_end_date   
  AND trunc(sysdate) between wlif.effective_start_date and wlif.effective_end_date   
  AND wlif.learning_item_number = 'OLC251051'

Also, learners can be assigned using access groups.

The below SQL can be used to extract this information:

SELECT wlif.learning_item_id
      ,wlif.learning_item_number course_Number
      ,wlif_tl.name course_Name
      ,wlif_tl.description_short description
      ,wlif_ag.learning_item_number access_group_number
      ,wlif_tl_ag.name access_group_name
  FROM wlf_learning_items_f 	wlif
      ,wlf_learning_items_f_tl 	wlif_tl
      ,wlf_learning_items_f 	wlif_ag
      ,wlf_learning_items_f_tl 	wlif_tl_ag
      ,wlf_li_ag_relations_f 	wlarf
 WHERE 1=1
   AND wlif_tl.learning_item_id 		= wlif.learning_item_id
   AND wlif_tl.name 				= 'Test 1234'
   AND wlif_ag.learning_item_type 		= 'ORA_ACCESS_GROUP'
   AND wlif_tl_ag.learning_item_id 		= wlif_ag.learning_item_id
   AND wlarf.access_learning_item_id 	= wlif_ag.learning_item_id
   AND wlarf.catalog_learning_item_id 	= wlif.learning_item_id   
   AND TRUNC(SYSDATE) BETWEEN wlif_tl_ag.effective_start_date AND wlif_tl_ag.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN wlif_ag.effective_start_date AND wlif_ag.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN wlif_tl.effective_start_date AND wlif_tl.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN wlif.effective_start_date AND wlif.effective_end_date

HDL – Query to get Assignment Grade Steps

SELECT 'MERGE' "METADATA"
      ,'AssignmentGradeSteps' "AssignmentGradeSteps"
      ,pagsf.EFFECTIVE_START_DATE "EffectiveStartDate"
      ,pagsf.EFFECTIVE_END_DATE "EffectiveEndDate"
      ,pagsf.ASSIGN_GRADE_STEP_ID "AssignGradeStepId"
      ,paam.assignment_number "AssignmentNumber"
      ,pav.ACTION_CODE "ActionCode"
      ,parv.ACTION_REASON_CODE "ReasonCode"
      ,pgsfv.name "GradeStepName"
      ,NULL "NewGradeStepName" --> to be supplied
 FROM PER_ASSIGN_GRADE_STEPS_F pagsf,
      PER_GRADE_STEPS_F_VL pgsfv,
      PER_ALL_ASSIGNMENTS_M paam,
      PER_ACTION_OCCURRENCES pao,
      PER_ACTIONS_VL pav,
      PER_ACTION_REASONS_VL parv
WHERE pagsf.GRADE_STEP_ID = pgsfv.GRADE_STEP_ID
  AND TRUNC(SYSDATE) BETWEEN pagsf.effective_start_date AND pagsf.effective_end_date
  AND TRUNC(SYSDATE) BETWEEN pgsfv.effective_start_date AND pgsfv.effective_end_date
  AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
  AND pagsf.assignment_id = paam.assignment_id
  AND paam.assignment_type = 'E'
  AND paam.effective_latest_change = 'Y'
  AND pagsf.ACTION_OCCURRENCE_ID = pao.ACTION_OCCURRENCE_ID
  AND pav.action_id = pao.action_id
  AND parv.action_reason_id = pao.action_reason_id

HDL – Sample File to Load Document Type Security Profile

The Manage Document Type Security Profile task can be used to define a security profile with a list of document types tagged to it. The document types can either be included in or excluded from the security profile.

HDL can be used to mass create and update the document type security profiles.

Check the below post on how to refresh the business objects to download the latest template for DocumentTypeSecurityProfile.dat:

HDL – Refresh Business Objects – Welcome to Fusion HCM Consulting

Below is a sample HDL file for the same:

METADATA|DocumentTypeSecurityProfile|DocumentTypeSecurityProfileId|EnabledFlag|IncludeExclude|Name|SourceSystemOwner|SourceSystemId|GUID
MERGE|DocumentTypeSecurityProfile||Y|I|TestDocSecurityProfile|HRC_SQLLOADER|HRC_SQLLOADER_TestDocSecurityProfile|

METADATA|DocumentType|DocumentTypeSecProfDocumentTypeId|DocumentTypeSecurityProfileId|DocumentTypeId|DocumentTypeName|DocumentTypeLegislationCode|DocumentTypeCountry|DocumentTypeSecurityProfileName|SourceSystemOwner|SourceSystemId|GUID
MERGE|DocumentType||||W_2|US|US|TestDocSecurityProfile|HRC_SQLLOADER|HRC_SQLLOADER_TestDocSecurityProfile_W_2|

Verify the uploaded security profile from UI:

HDL – Load Goal Library Data

Goal library goals are loaded as content items using ContentItem.dat. The various attributes and their mapping can be found in GoalLibraryAttributes.xlsx, available on My Oracle Support under the below notes:

Query To Get Goal Library Goals (Doc ID 2228128.1)

How to Load Library Goals (Doc ID 2201757.1)

Please note that ItemText2 (Priority Code) and ItemText12 (External ID) are required attributes. Although they are not marked as mandatory and the HDL load will complete successfully with NULL values for them, the loaded goals will not be visible in the UI.

ItemText12 can be any random numeric value.

Sample HDL File:

METADATA|ContentItem|ContentItemId|ContentTypeId|ContentItemCode|Name|ItemDescrlong|ItemTextTl11|DateFrom|DateTo|ItemText1|ItemText2|ItemDate1|ItemDate2|ItemDate3|ItemText3|ItemTextTl1|ItemText5|ItemTextTl2|ItemNumber1|ItemText6|ItemText7|RatingModelId|ItemText9|ItemText10|ItemText12|SourceSystemId|SourceSystemOwner
MERGE|ContentItem||124||Test Goal Library 123|Goal loaded into library using content item HDL|Schedule regular headcount and budget review meetings. Facilitate regular reporting distributions to senior management on a biweekly basis|2000/01/01||A|HIGH||||STRETCH||QUANTITATIVE||20|MAX|PERCENT||PERFORMANCE|ALL|1000090000|VIS09090909|VIS

HDL – Updating an attribute on Assignment

Technical consultants encounter many scenarios where they have to update one or two attributes on the assignment data after the initial data load. This can happen during any phase of a new implementation or post go-live. For example, there may be a need to update the assignment category for a set of workers. In such scenarios, the approach is to extract the already uploaded assignment data with the minimum required attributes, append the assignment category attribute, and upload it back to Fusion using HDL.

The key thing here is to extract the minimum required data for Assignment and WorkTerms. The mandatory fields required for an update using HDL can be checked from:

My Client Groups -> Data Exchange -> View Business Objects -> Worker -> WorkTerms -> Assignment

Validated Version :- 20D

Sample Query to extract the minimum required attributes for Assignments and Workterms in order to update assignment category (AssignmentCategory):

SELECT HDL_DATA
FROM (
SELECT 'METADATA'
|| CHR (124)
|| 'WorkTerms'
|| CHR (124)
|| 'AssignmentId'
|| CHR (124)
|| 'PeriodOfServiceId'
|| CHR (124)
|| 'EffectiveLatestChange'
|| CHR (124)
|| 'EffectiveSequence'
|| CHR (124)
|| 'EffectiveStartDate'
|| CHR (124)
|| 'EffectiveEndDate'
|| CHR (124)
|| 'SourceSystemId'
|| CHR (124)
|| 'SourceSystemOwner'
|| CHR (124)
|| 'ActionCode'
HDL_DATA
, 1 SEQUENCE
FROM DUAL
UNION ALL
select 'MERGE'
|| CHR (124)
|| 'WorkTerms'
|| CHR (124)
|| paam.assignment_id
|| CHR (124)
|| paam.period_of_service_id
|| CHR (124)
|| paam.effective_latest_change
|| CHR (124)
|| paam.effective_sequence
|| CHR (124)
|| to_char(paam.effective_start_date, 'YYYY/MM/DD')
|| CHR (124)
|| to_char(paam.effective_end_date, 'YYYY/MM/DD')
|| CHR (124)
|| hikm.source_system_id
|| CHR (124)
|| hikm.source_system_owner
|| CHR (124)
|| paam.action_code HDL_DATA,
2 SEQUENCE
FROM per_all_assignments_m paam
,hrc_integration_key_map hikm
WHERE 1 = 1
AND paam.assignment_type IN ('ET','CT','PT')
AND hikm.surrogate_id = paam.assignment_id
UNION ALL
SELECT 'METADATA'
|| CHR (124)
|| 'Assignment'
|| CHR (124)
|| 'AssignmentId'
|| CHR (124)
|| 'WorkTermsAssignmentId'
|| CHR (124)
|| 'EffectiveLatestChange'
|| CHR (124)
|| 'EffectiveSequence'
|| CHR (124)
|| 'EffectiveStartDate'
|| CHR (124)
|| 'EffectiveEndDate'
|| CHR (124)
|| 'SourceSystemId'
|| CHR (124)
|| 'SourceSystemOwner'
|| CHR (124)
|| 'ActionCode'
|| CHR (124)
|| 'AssignmentCategory' HDL_DATA
, 3 SEQUENCE
FROM DUAL
UNION ALL
SELECT 'MERGE'
|| CHR (124)
|| 'Assignment'
|| CHR (124)
|| paam.assignment_id
|| CHR (124)
|| paam.work_terms_assignment_id
|| CHR (124)
|| paam.effective_latest_change
|| CHR (124)
|| paam.effective_sequence
|| CHR (124)
|| to_char(paam.effective_start_date, 'YYYY/MM/DD')
|| CHR (124)
|| to_char(paam.effective_end_date, 'YYYY/MM/DD')
|| CHR (124)
|| hikm.source_system_id
|| CHR (124)
|| hikm.source_system_owner
|| CHR (124)
|| paam.action_code
|| CHR (124)
|| 'FR' HDL_DATA
,4 SEQUENCE
FROM per_all_assignments_m paam
,hrc_integration_key_map hikm
WHERE 1 = 1
AND paam.assignment_type IN ('E','C','P')
AND hikm.surrogate_id = paam.assignment_id
)
ORDER BY SEQUENCE
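Before zipping and uploading the generated file, it can help to sanity-check that every MERGE line carries the same number of fields as the METADATA line for its component, since a mismatch is a common cause of load failures. A minimal Python sketch of such a check (the sample lines are shortened, made-up examples):

```python
def check_field_counts(lines):
    """Return the line numbers of MERGE/DELETE lines whose field count
    does not match the most recent METADATA line for the same component."""
    expected = {}
    problems = []
    for n, line in enumerate(lines, 1):
        parts = line.split("|")
        kind, component = parts[0], parts[1]
        if kind == "METADATA":
            expected[component] = len(parts)
        elif kind in ("MERGE", "DELETE") and len(parts) != expected.get(component):
            problems.append(n)
    return problems

sample = [
    "METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|ActionCode",
    "MERGE|WorkTerms|300000066966135|300000066966134|HIRE",
]
print(check_field_counts(sample))  # -> []
```

An empty list means every data line lines up with its METADATA definition; any returned line numbers point at rows to fix before upload.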

You can modify the above query to include any attributes you want to update. For example, a sample file to update Notice Period will look like:

METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode
MERGE|WorkTerms|300000066966135|300000066966134|Y|1|2002/06/16|4712/12/31|300000066966135|FUSION|HIRE

METADATA|Assignment|AssignmentId|WorkTermsAssignmentId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode|NoticePeriod|NoticePeriodUOM
MERGE|Assignment|300000066966140|300000066966135|Y|1|2002/06/16|4712/12/31|300000066966140|FUSION|HIRE|3|M
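If you prefer to assemble such files outside BIP, the same structure can be generated with a small script. Below is a hedged Python sketch that rebuilds the WorkTerms portion of the sample above; the field lists and values are copied from the sample, and `ASSIGNMENT_FIELDS` is included only to show how extra attributes such as NoticePeriod are appended:

```python
# Sketch: rebuild the WorkTerms lines of the sample update file.
# Field lists follow the sample above; extend ASSIGNMENT_FIELDS with any
# attribute you want to update (e.g. NoticePeriod, NoticePeriodUOM).

WORKTERMS_FIELDS = [
    "AssignmentId", "PeriodOfServiceId", "EffectiveLatestChange",
    "EffectiveSequence", "EffectiveStartDate", "EffectiveEndDate",
    "SourceSystemId", "SourceSystemOwner", "ActionCode",
]
ASSIGNMENT_FIELDS = (WORKTERMS_FIELDS[:1] + ["WorkTermsAssignmentId"]
                     + WORKTERMS_FIELDS[2:] + ["NoticePeriod", "NoticePeriodUOM"])

def hdl_lines(component, fields, rows):
    """Yield a METADATA line followed by one MERGE line per data row."""
    yield "|".join(["METADATA", component] + fields)
    for row in rows:
        yield "|".join(["MERGE", component] + [str(row[f]) for f in fields])

# Values taken from the sample WorkTerms MERGE line above.
workterms = [{
    "AssignmentId": "300000066966135", "PeriodOfServiceId": "300000066966134",
    "EffectiveLatestChange": "Y", "EffectiveSequence": "1",
    "EffectiveStartDate": "2002/06/16", "EffectiveEndDate": "4712/12/31",
    "SourceSystemId": "300000066966135", "SourceSystemOwner": "FUSION",
    "ActionCode": "HIRE",
}]
lines = list(hdl_lines("WorkTerms", WORKTERMS_FIELDS, workterms))
print("\n".join(lines))
```

The same `hdl_lines` helper can then be called with `ASSIGNMENT_FIELDS` and the corresponding Assignment rows to produce the second half of the file.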


Configuration – How to disable Position Incumbent Validation?

Sometimes while loading worker assignments or assigning an employee to a new position, you get the error “The Selected Position Has No Open Headcount. Select A Different Position To Continue”. One resolution is to disable the Position Incumbent Validation at the enterprise level.

Follow the below steps to disable the position validation:

1. Navigate to Setup and Maintenance -> Search -> Manage Enterprise HCM Information:

2. Search for Position Incumbent Validation on the Enterprise screen. If it is not found, follow steps 3 – 8; otherwise move directly to step 9.

3. If Position Incumbent Validation is not available on the Enterprise screen, search for the ‘Manage Extensible Flexfields’ task and then search for ‘PER_ORGANIZATION_INFORMATION_EFF’:

4. Click on edit icon:

5. Expand Category, choose Enterprise and scroll down:

6. Click on Pages under ‘Enterprise Details’:

7. Click on the add icon under ‘Enterprise Details: Associated Context Details’, then select and add the ‘Position Incumbent Validation’ context:

8. A new context will be added. Click on Save and Close and deploy the flexfield.

9. On successful deployment of flexfield, go back to ‘Manage Enterprise HCM Information’ task and search for ‘Position Incumbent Validation’:

10. Click on the Edit button in Correction mode, check the checkbox and save the record:

HDL – Sample HDL to load GL information at Department level

Below is a sample HDL file to load GL costing information at the department level using the Organization.dat file. It is always advisable to work with the latest METADATA attributes. Check the below link to see the process of refreshing the HDL objects:

HDL – Refresh Business Objects

Once the latest .dat file is available, check the DFF attributes from attribute window in View business objects.

Sample HDL:

METADATA|OrgInformation|FLEX:PER_ORGANIZATION_INFORMATION_EFF|EFF_CATEGORY_CODE|recordIdentifier(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|_COMPANY_VALUESET(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|OrganizationId(SourceSystemId)|_COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|EffectiveStartDate|EffectiveEndDate|ClassificationCode|SourceSystemOwner|SourceSystemId
MERGE|OrgInformation|PER_GL_COST_CENTER_INFO|DEPARTMENT|XX1001|980007|Org_6115|139|1901/01/01|4712/12/31|DEPARTMENT|HRC_SQLLOADER|6115_OrgInformation_PER_GL_COST_CENTER_INFO

Save this as Organization.dat, zip the file and upload it.
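The save-and-zip step can be scripted as well. A minimal Python sketch, with the file names as in the text and the content copied from the sample above:

```python
# Sketch: write the sample HDL content as Organization.dat inside a zip,
# ready for upload via Data Exchange -> Import and Load Data.
import zipfile

hdl_lines = [
    "METADATA|OrgInformation|FLEX:PER_ORGANIZATION_INFORMATION_EFF|EFF_CATEGORY_CODE|recordIdentifier(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|_COMPANY_VALUESET(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|OrganizationId(SourceSystemId)|_COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|EffectiveStartDate|EffectiveEndDate|ClassificationCode|SourceSystemOwner|SourceSystemId",
    "MERGE|OrgInformation|PER_GL_COST_CENTER_INFO|DEPARTMENT|XX1001|980007|Org_6115|139|1901/01/01|4712/12/31|DEPARTMENT|HRC_SQLLOADER|6115_OrgInformation_PER_GL_COST_CENTER_INFO",
]

with zipfile.ZipFile("Organization.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("Organization.dat", "\n".join(hdl_lines) + "\n")

# Confirm the archive contains the single .dat entry HDL expects
with zipfile.ZipFile("Organization.zip") as zf:
    print(zf.namelist())  # -> ['Organization.dat']
```

The zip must contain the .dat file at the top level (no folders), which is what `writestr` produces here.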

Please Note:

  1. The GL Accounting setup should already have been completed by Financials in Fusion.
  2. recordIdentifier(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO): a unique value; it can be anything.
  3. _COMPANY_VALUESET(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO): pass the value set ID attached to the segment.
  4. _COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO): pass the actual segment value under this attribute.
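To make points 2 – 4 concrete, you can pair each MERGE value with its METADATA column and see which value lands under which attribute. A small Python sketch using the sample lines above:

```python
# Sketch: align the sample MERGE values with their METADATA columns
# (both lines copied verbatim from the sample HDL above).
metadata = ("METADATA|OrgInformation|FLEX:PER_ORGANIZATION_INFORMATION_EFF|EFF_CATEGORY_CODE|"
            "recordIdentifier(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|"
            "_COMPANY_VALUESET(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|"
            "OrganizationId(SourceSystemId)|"
            "_COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|"
            "EffectiveStartDate|EffectiveEndDate|ClassificationCode|SourceSystemOwner|SourceSystemId")
merge = ("MERGE|OrgInformation|PER_GL_COST_CENTER_INFO|DEPARTMENT|XX1001|980007|Org_6115|139|"
         "1901/01/01|4712/12/31|DEPARTMENT|HRC_SQLLOADER|"
         "6115_OrgInformation_PER_GL_COST_CENTER_INFO")

# Skip the first two tokens (instruction and component) and pair the rest.
mapping = dict(zip(metadata.split("|")[2:], merge.split("|")[2:]))

# The value set ID (980007) sits under _COMPANY_VALUESET; the segment value under _COMPANY.
print(mapping["_COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)"])  # -> 139
```

This kind of column-to-value pairing is also a quick way to debug any long HDL line whose values seem to be landing under the wrong attributes.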

Check the below post to extract the loaded data from HCM Cloud:

https://fusionhcmconsulting.com/2021/10/bip-query-to-extract-gl-cost-details-from-department/

You can run the below query to find the value set ID:

select *
  from FND_VS_VALUE_SETS
 where value_set_code = 'XXX_COMPANY'