HDL – Sample file to load Talent Profile data

In the enhanced profiles (Profile V2), bulk uploads still use the TalentProfile.dat file as before. However, while loading profile items against a profile, a new attribute called SectionId must be provided. The SectionId depends on the content item being used.

Check the below post on how to get the SectionId:

https://fusionhcmconsulting.com/2021/02/hdl-query-to-find-sectionid-for-talent-profile-item-load/
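
If the linked post is not handy, a minimal query sketch along the below lines can be used to list the available sections; it assumes the section definitions sit in HRT_PROFILE_TYP_SECTIONS (verify the table and column names in your environment):

SELECT hpts.section_id,
       hpts.section_name,           -- column name assumed; confirm against your pod
       hpts.content_type_id
  FROM hrt_profile_typ_sections hpts  -- table name assumed
 ORDER BY hpts.section_name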

Below is sample ProfileItem data for a language content item:

METADATA|ProfileItem|ProfileItemId|ProfileId|ProfileCode|ContentItem|ContentItemId|ContentType|ContentTypeId|CountryCountryCode|CountryGeographyCode|CountryId|DateFrom|Importance|InterestLevel|ItemClob1File|ItemClob2File|ItemClob3File|ItemClob4File|ItemClob5File|ItemDate1|ItemDate10|ItemDate2|ItemDate3|ItemDate4|ItemDate5|ItemDate6|ItemDate7|ItemDate8|ItemDate9|ItemDecimal1|ItemDecimal2|ItemDecimal3|ItemDecimal4|ItemDecimal5|ItemNumber1|ItemNumber10|ItemNumber2|ItemNumber3|ItemNumber4|ItemNumber5|ItemNumber6|ItemNumber7|ItemNumber8|ItemNumber9|ItemText20001|ItemText20002|ItemText20003|ItemText20004|ItemText20005|ItemText2401|ItemText24010|ItemText24011|ItemText24012|ItemText24013|ItemText24014|ItemText24015|ItemText2402|ItemText2403|ItemText2404|ItemText2405|ItemText2406|ItemText2407|ItemText2408|ItemText2409|ItemText301|ItemText3010|ItemText3011|ItemText3012|ItemText3013|ItemText3014|ItemText3015|ItemText302|ItemText303|ItemText304|ItemText305|ItemText306|ItemText307|ItemText308|ItemText309|Mandatory|QualifierCode1|QualifierId1|QualifierCode2|QualifierId2|QualifierSetCode1|QualifierSetCode2|RatingLevelCode1|RatingLevelId1|RatingLevelCode2|RatingLevelId2|RatingLevelCode3|RatingLevelId3|RatingModelCode1|RatingModelId1|RatingModelCode2|RatingModelId2|RatingModelCode3|RatingModelId3|SectionId|SectionName|SourceId|SourceKey1|SourceKey2|SourceKey3|SourceType|StateCountryCode|StateGeographyCode|StateProvinceId|DateTo|SourceSystemId|SourceSystemOwner

MERGE|ProfileItem|||PERS_300000002|English||LANGUAGE|||||2020/01/01|||||||||||||||||||||||||||||||||||||||||||||||||||||Y||||||||||||||||||||||3||1||2||LANGUAGE||LANGUAGE||LANGUAGE||300000001855526|||||||||||TPFPIPERS_300000002_LANG1|HRC_SQLLOADER

HDL – Sample File to Load Extended Lookup Codes

Oracle HCM uses the extended lookups feature to support dependent lookup values. For example, while creating a visa or work permit record for a person in Singapore, the Category field depends on the Type of pass chosen. The values of the Category field are derived from an extended lookup.

Navigate to Setup and Maintenance -> Manage Extended Lookup codes -> Visa Permit Type

Now, let us take an example where we need to load two category values that depend on the lookup code S Pass (SG_SP).
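
Before preparing the file, you may want to confirm that the parent lookup code (SG_SP) exists under the base lookup type. A quick sketch, assuming the standard FND lookup table and a US-language session (adjust the filters as needed):

SELECT flv.lookup_code,
       flv.meaning
  FROM fnd_lookup_values flv
 WHERE flv.lookup_type  = 'PER_VISA_PERMIT_TYPE'
   AND flv.language     = 'US'
   AND flv.enabled_flag = 'Y'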

Prepare the HDL file in the below format:

METADATA|ExtendedLookupCode|ExtendedLookupCodeId|LookupType|LookupCode|LegislationCode|ExtendedLookupCode|ExtendedLookupCodeName|SourceSystemOwner|SourceSystemId
MERGE|ExtendedLookupCode||PER_VISA_PERMIT_TYPE|SG_SP|SG|TEST_SP1|S Pass Holder – Test 1|HRC_SQLLOADER|TEST_1
MERGE|ExtendedLookupCode||PER_VISA_PERMIT_TYPE|SG_SP|SG|TEST_SP2|S Pass Holder – Test 2|HRC_SQLLOADER|TEST_2

Zip the file and upload it using HCM Data Loader from Data Exchange.

On successful load, the new values can be verified from either of the following two places on the UI:

  1. From Manage Extended Lookup codes:

2. From Documents Tab on Person UI:

Data Masking – Mask Salary Data in lower environment

There is a common requirement to mask salary data after P2T refreshes. This should be done to hide the actual salary figures, as salary is highly sensitive information.

Use the below query to generate data in HDL format in a test environment immediately after a P2T refresh. The query generates a random salary amount for each assignment returned. Save the downloaded data as a .dat file and upload it back to the instance.

SELECT 'METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom|DateTo|SalaryBasisId|SalaryId' Header, 1 data_flow_order
  FROM dual
UNION
SELECT 'MERGE|Salary'||'|'||
       paam.assignment_number||'|'||
       round(DBMS_RANDOM.VALUE(1, 15000), 2)||'|'||
       TO_CHAR(cs.date_from,'RRRR/MM/DD','nls_date_language=American')||'|'||
       TO_CHAR(cs.date_to,'RRRR/MM/DD','nls_date_language=American')||'|'||
       cs.salary_basis_id||'|'||
       cs.salary_id data_row,
       2 data_flow_order
  FROM cmp_salary cs,
       per_all_assignments_m paam
 WHERE cs.assignment_id = paam.assignment_id
   AND trunc(sysdate) BETWEEN paam.effective_start_date AND paam.effective_end_date
   AND paam.assignment_type IN ('E', 'C', 'P')
   AND paam.assignment_number = 'E788880' -- sample filter for a single assignment; remove to mask all workers
 ORDER BY data_flow_order
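
The generated file will look similar to the below; the ID values shown here are purely illustrative:

METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom|DateTo|SalaryBasisId|SalaryId
MERGE|Salary|E788880|10457.23|2020/01/01||300000001234567|300000007654321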

HDL – Loading Multiple attachments for same document type

There are various cases where more than one attachment needs to be maintained against an employee document record. First, we will create a new document type that allows multiple attachments.

  1. Navigate to Setup and Maintenance -> Document Types -> Click on Create

2. Choose Multiple Occurrences as Yes:

3. Enter the relevant information and submit the record.

4. Prepare the HDL file in the below format and load the data:

METADATA|DocumentsOfRecord|PersonNumber|DateFrom|DateTo|DocumentType|DocumentName|DocumentNumber|DocumentCode|Country|AttachmentEntityName|Comments|IssuedDate|IssuingAuthority|IssuingCountry|IssuingLocation|Publish|PublishDate|SourceSystemId|SourceSystemOwner
MERGE|DocumentsOfRecord|101|2021/01/01||Test Multi Attachments Doc|Multiple Attachments||TMAD_001||||||||||HRC_SQLLOADER_101_TMAD_001|HRC_SQLLOADER

METADATA|DocumentAttachment|SourceSystemOwner|SourceSystemId|PersonNumber|DocumentType|DocumentCode|Title|Description|DataTypeCode|Country|EntityName|File|FileName|MimeType|URL|URLorTextorFileName|DmVersonNumber
MERGE|DocumentAttachment|HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001_1|101|Test Multi Attachments Doc|TMAD_001|Test Multi Attachments Doc|Test Multi Attachments Doc 1|FILE|||Historical_rating.docx|Historical_rating.docx|||Historical_rating.docx|
MERGE|DocumentAttachment|HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001_2|101|Test Multi Attachments Doc|TMAD_001|Test Multi Attachments Doc|Test Multi Attachments Doc 2|FILE|||Historical_rating1.docx|Historical_rating1.docx|||Historical_rating1.docx|

5. Place the .docx files in the BlobFiles folder and zip them together with the .dat file:
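
The resulting zip structure would look similar to the below, using the file names from the sample above:

DocumentsOfRecord.zip
 |-- DocumentsOfRecord.dat
 |-- BlobFiles/
      |-- Historical_rating.docx
      |-- Historical_rating1.docx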

6. Import and Load the file.

7. Once the data is loaded, verify from the UI:

HDL – Loading Contract History

Sample HDL file to load contract history:

METADATA|Worker|PersonId|EffectiveStartDate|EffectiveEndDate|PersonNumber|BloodType|CorrespondenceLanguage|StartDate|DateOfBirth|DateOfDeath|CountryOfBirth|RegionOfBirth|TownOfBirth|PersonDuplicateCheck|ActionCode|ReasonCode|GUID|SourceSystemOwner|SourceSystemId
MERGE|Worker||2016/08/27|4712/12/31|435282|||2016/08/27|1967/07/23||PH|Philippines|||HIRE|||LEGACY|PER_909090

METADATA|PersonName|PersonNameId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|NameType|FirstName|MiddleNames|LastName|Honors|KnownAs|PreNameAdjunct|MilitaryRank|PreviousLastName|Suffix|Title|CharSetContext|NameInformation1|NameInformation2|NameInformation3|NameInformation4|NameInformation5|NameInformation6|NameInformation7|NameInformation8|NameInformation9|NameInformation10|NameInformation11|NameInformation12|NameInformation13|NameInformation14|NameInformation15|NameInformation16|NameInformation17|NameInformation18|NameInformation19|NameInformation20|NameInformation21|NameInformation22|NameInformation23|NameInformation24|NameInformation25|NameInformation26|NameInformation27|NameInformation28|NameInformation29|NameInformation30|GUID|SourceSystemOwner|SourceSystemId
MERGE|PersonName||2016/08/27|4712/12/31|PER_909090||US|GLOBAL|||Ashwin Ravi||Ashwin Ravi|||||MR.|||||||||||||||||||||||||||||||||LEGACY|909090_PER_NAME

METADATA|PersonLegislativeData|PersonLegislativeId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|HighestEducationLevel|MaritalStatus|MaritalStatusDate|Sex|GUID|SourceSystemOwner|SourceSystemId
MERGE|PersonLegislativeData||2016/08/27|4712/12/31|PER_909090||US||M||M||LEGACY|909090_PER_LEG

METADATA|WorkRelationship|PeriodOfServiceId|LegalEmployerName|PersonId(SourceSystemId)|PersonNumber|DateStart|WorkerType|LegalEmployerSeniorityDate|ActualTerminationDate|LegalEntityId|Comments|EnterpriseSeniorityDate|LastWorkingDate|NotifiedTerminationDate|OnMilitaryServiceFlag|PrimaryFlag|ProjectedTerminationDate|RehireAuthorizerPersonId|RehireAuthorizor|RehireReason|RevokeUserAccess|WorkerNumber|RehireRecommendationFlag|NewStartDate|ActionCode|ReasonCode|DateOfDeath|HideUntilDate|DateForPrimaryFlagChange|CancelWorkRelationshipFlag|TerminateWorkRelationshipFlag|ReverseTerminationFlag|CorrectTerminationFlag|GUID|SourceSystemOwner|SourceSystemId|GlobalTransferFlag|ReadyToConvert|WorkerComments
MERGE|WorkRelationship||US Company|PER_909090|435282|2016/08/27|E||||||||N|Y|||||||||HIRE||||||||||LEGACY|455985_WR||||

METADATA|WorkTerms|AssignmentId|AssignmentNumber|EffectiveEndDate|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|LegalEmployerName|DateStart|WorkerType|AssignmentName|PeopleGroup|AssignmentStatusTypeId|AssignmentStatusTypeCode|AssignmentType|PersonTypeId|PersonTypeCode|SystemPersonType|BargainingUnitCode|BillingTitle|BusinessUnitId|BusinessUnitShortCode|ContractId|DateProbationEnd|WorkerCategory|AssignmentCategory|EstablishmentId|ReportingEstablishment|ExpenseCheckSendToAddress|GradeId|GradeCode|GradeLadderPgmId|GradeLadderPgmName|HourlySalariedCode|InternalBuilding|InternalFloor|InternalLocation|InternalMailstop|InternalOfficeNumber|JobId|JobCode|LabourUnionMemberFlag|LocationId|LocationCode|ManagerFlag|NormalHours|Frequency|NoticePeriod|NoticePeriodUOM|OrganizationId|DepartmentName|PositionId|PositionCode|PositionOverrideFlag|PrimaryWorkTermsFlag|ProbationPeriod|ProbationUnit|ProjectedEndDate|ProjectedStartDate|ProposedWorkerType|ProposedUserPersonType|ReasonCode|RetirementAge|RetirementDate|SpecialCeilingStepId|SpecialCeilingStep|StepEntryDate|TaxAddressId|EndTime|StartTime|VendorSiteId|WorkAtHomeFlag|WorkTermsAssignmentId|VendorId|FreezeStartDate|FreezeUntilDate|ActionCode|CategoryCode|TaxReportingUnitId|TaxReportingUnit|GUID|CollectiveAgreementId|CollectiveAgreementIdCode|FullPartTime|PermanentTemporary|SourceSystemOwner|SourceSystemId
MERGE|WorkTerms||ET435282|2017/07/31|Y|1|2016/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||HIRE|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|2018/08/26|Y|1|2017/08/01|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||ASG_CHANGE|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|2019/08/26|Y|1|2018/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||CONTRACT_EXTENSION|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|4712/12/31|Y|1|2019/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||CONTRACT_EXTENSION|||||||||LEGACY|52361_WT

METADATA|Assignment|AssignmentId|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|WorkTermsAssignmentId(SourceSystemId)|WorkTermsNumber|AssignmentType|AssignmentName|AssignmentNumber|PeopleGroup|AssignmentStatusTypeId|AssignmentStatusTypeCode|BargainingUnitCode|BillingTitle|BusinessUnitId|BusinessUnitShortCode|DateProbationEnd|WorkerCategory|AssignmentCategory|EstablishmentId|ReportingEstablishment|ExpenseCheckSendToAddress|GradeId(SourceSystemId)|GradeCode|GradeLadderPgmId|GradeLadderPgmName|HourlySalariedCode|InternalBuilding|InternalFloor|InternalLocation|InternalMailstop|InternalOfficeNumber|JobId(SourceSystemId)|JobCode|LabourUnionMemberFlag|LocationId|LocationCode|ManagerFlag|NormalHours|Frequency|NoticePeriod|NoticePeriodUOM|OrganizationId(SourceSystemId)|DepartmentName|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|DateStart|WorkerType|LegalEmployerName|PersonTypeId|PersonTypeCode|SystemPersonType|PositionId(SourceSystemId)|PositionCode|PositionOverrideFlag|PrimaryAssignmentFlag|PrimaryFlag|ProbationPeriod|ProbationUnit|ProjectTitle|ProjectedEndDate|ProjectedStartDate|ProposedWorkerType|ProposedUserPersonType|ReasonCode|RetirementAge|RetirementDate|SpecialCeilingStepId|SpecialCeilingStep|TaxAddressId|EndTime|StartTime|VendorSiteId|WorkAtHomeFlag|VendorId|FreezeStartDate|FreezeUntilDate|CategoryCode|CollectiveAgreementId|CollectiveAgreementIdCode|FullPartTime|PermanentTemporary|DefaultExpenseAccount|GUID|SourceSystemOwner|SourceSystemId|GspEligibilityFlag|OvertimePeriodName|SeniorityBasis|UnionId|UnionName
MERGE|Assignment||HIRE|2016/08/27|2017/07/31|1|Y|52361_WT|ET435282|E||E435282|121||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||ASG_CHANGE|2017/08/01|2018/08/26|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||CONTRACT_EXTENSION|2018/08/27|2019/08/26|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||CONTRACT_EXTENSION|2019/08/27|4712/12/31|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||

METADATA|Contract|ContractId|EffectiveStartDate|EffectiveEndDate|AssignmentId|AssignmentNumber|PersonNumber|ContractType|Description|Duration|DurationUnits|ExtensionPeriod|ExtensionPeriodUnits|ExtensionReason|PersonId(SourceSystemId)|GUID|SourceSystemOwner|SourceSystemId
MERGE|Contract||2016/08/27|2018/08/26||ET435282|435282|Flexiable 2 Years|Rolling Contract|2|Y||||PER_909090||LEGACY|52361_CONTRACT
MERGE|Contract||2018/08/27|2019/08/26||ET435282|435282|Flexiable 2 Years|Rolling Contract|||1|Y||PER_909090||LEGACY|52361_CONTRACT
MERGE|Contract||2019/08/27|4712/12/31||ET435282|435282|Flexiable 2 Years|Rolling Contract|||1|Y||PER_909090||LEGACY|52361_CONTRACT

Version Tested – 20B

HDL – Query to get Assignment Grade Steps

SELECT 'MERGE' "METADATA"
      ,'AssignmentGradeSteps' "AssignmentGradeSteps"
      ,pagsf.EFFECTIVE_START_DATE "EffectiveStartDate"
      ,pagsf.EFFECTIVE_END_DATE "EffectiveEndDate"
      ,pagsf.ASSIGN_GRADE_STEP_ID "AssignGradeStepId"
      ,paam.assignment_number "AssignmentNumber"
      ,pav.ACTION_CODE "ActionCode"
      ,parv.ACTION_REASON_CODE "ReasonCode"
      ,pgsfv.name "GradeStepName"
      ,NULL "NewGradeStepName" --> to be supplied
 FROM PER_ASSIGN_GRADE_STEPS_F pagsf,
      PER_GRADE_STEPS_F_VL pgsfv,
      PER_ALL_ASSIGNMENTS_M paam,
      PER_ACTION_OCCURRENCES pao,
      PER_ACTIONS_VL pav,
      PER_ACTION_REASONS_VL parv
WHERE pagsf.GRADE_STEP_ID = pgsfv.GRADE_STEP_ID
  AND TRUNC(SYSDATE) BETWEEN pagsf.effective_start_date AND pagsf.effective_end_date
  AND TRUNC(SYSDATE) BETWEEN pgsfv.effective_start_date AND pgsfv.effective_end_date
  AND TRUNC(SYSDATE) BETWEEN paam.effective_start_date AND paam.effective_end_date
  AND pagsf.assignment_id = paam.assignment_id
  AND paam.assignment_type = 'E'
  AND paam.effective_latest_change = 'Y'
  AND pagsf.ACTION_OCCURRENCE_ID = pao.ACTION_OCCURRENCE_ID
  AND pav.action_id = pao.action_id
  AND parv.action_reason_id = pao.action_reason_id
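
The column aliases above mirror the AssignmentGradeSteps attribute names, so the extracted rows can be assembled into a file along the below lines. The values shown are illustrative only; validate the attribute list against the refreshed AssignmentGradeSteps template before loading:

METADATA|AssignmentGradeSteps|EffectiveStartDate|EffectiveEndDate|AssignGradeStepId|AssignmentNumber|ActionCode|ReasonCode|GradeStepName|NewGradeStepName
MERGE|AssignmentGradeSteps|2021/01/01|4712/12/31|300000012345678|E788880|ASG_CHANGE||Step 2|Step 3
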
HDL – Sample File to Load Document Type Security Profile

The Manage Document Type Security Profile task can be used to define a security profile with a list of document types tagged to it. The document types can either be included in or excluded from the security profile.

HDL can be used to mass create and update the document type security profiles.

Check the below post on how to refresh the business objects to download the latest template for DocumentTypeSecurityProfile.dat:

HDL – Refresh Business Objects

Below is the sample HDL for the same:

METADATA|DocumentTypeSecurityProfile|DocumentTypeSecurityProfileId|EnabledFlag|IncludeExclude|Name|SourceSystemOwner|SourceSystemId|GUID
MERGE|DocumentTypeSecurityProfile||Y|I|TestDocSecurityProfile|HRC_SQLLOADER|HRC_SQLLOADER_TestDocSecurityProfile|

METADATA|DocumentType|DocumentTypeSecProfDocumentTypeId|DocumentTypeSecurityProfileId|DocumentTypeId|DocumentTypeName|DocumentTypeLegislationCode|DocumentTypeCountry|DocumentTypeSecurityProfileName|SourceSystemOwner|SourceSystemId|GUID
MERGE|DocumentType||||W_2|US|US|TestDocSecurityProfile|HRC_SQLLOADER|HRC_SQLLOADER_TestDocSecurityProfile_W_2|

Verify the uploaded security profile from UI:

HDL – Load Goal Library Data

Goal Library data is loaded as content items using ContentItem.dat. The various attributes and their mappings can be found in GoalLibraryAttributes.xlsx, which is available on My Oracle Support (Metalink) under the below note IDs:

Query To Get Goal Library Goals (Doc ID 2228128.1)

How to Load Library Goals (Doc ID 2201757.1)

Please note that ItemText2 (Priority Code) and ItemText12 (External ID) are required attributes. Although they are not marked as mandatory, and the HDL load will complete successfully with NULL values for them, the loaded goals will not be visible in the UI.

ItemText12 – Can be any random numeric value.

Sample HDL File:

METADATA|ContentItem|ContentItemId|ContentTypeId|ContentItemCode|Name|ItemDescrlong|ItemTextTl11|DateFrom|DateTo|ItemText1|ItemText2|ItemDate1|ItemDate2|ItemDate3|ItemText3|ItemTextTl1|ItemText5|ItemTextTl2|ItemNumber1|ItemText6|ItemText7|RatingModelId|ItemText9|ItemText10|ItemText12|SourceSystemId|SourceSystemOwner
MERGE|ContentItem||124||Test Goal Libarary 123|Goal Loaded into libarary using content item HDL|Schedule regular headcount and budget review meetings. Facilitate regular reporting distributions to senior management on a biweekly basis|2000/01/01||A|HIGH||||STRETCH||QUANTITATIVE||20|MAX|PERCENT||PERFORMANCE|ALL|1000090000|VIS09090909|VIS
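
Once loaded, the library goals can be spot-checked from the backend with a sketch like the below; it assumes the content items land in HRT_CONTENT_ITEMS_VL (view and column names should be verified in your environment):

SELECT hci.content_item_id,
       hci.content_type_id,
       hci.name,                        -- column name assumed
       hci.date_from
  FROM hrt_content_items_vl hci         -- view name assumed
 WHERE hci.content_type_id = 124        -- content type id used in the sample above
 ORDER BY hci.content_item_id DESC
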
HDL – Updating an attribute on Assignment

Technical consultants encounter many scenarios where they have to update one or two attributes on assignment data after the initial data load. This can happen during any phase of a new implementation or post go-live. For example, there may be a need to update the assignment category for a set of workers. In such scenarios, the approach is to extract the already uploaded assignment data with the minimum required attributes, append the attributes for assignment category, and upload it back to Fusion using HDL.

The key thing here is to extract the minimum required data for Assignment and WorkTerms. The mandatory fields required for an update using HDL can be checked from:

My Client Groups -> Data Exchange -> View Business Objects -> Worker -> WorkTerms -> Assignment

Validated Version :- 20D

Sample query to extract the minimum required attributes for Assignment and WorkTerms in order to update the assignment category (AssignmentCategory):

SELECT HDL_DATA
FROM (
SELECT 'METADATA'
|| CHR (124)
|| 'WorkTerms'
|| CHR (124)
|| 'AssignmentId'
|| CHR (124)
|| 'PeriodOfServiceId'
|| CHR (124)
|| 'EffectiveLatestChange'
|| CHR (124)
|| 'EffectiveSequence'
|| CHR (124)
|| 'EffectiveStartDate'
|| CHR (124)
|| 'EffectiveEndDate'
|| CHR (124)
|| 'SourceSystemId'
|| CHR (124)
|| 'SourceSystemOwner'
|| CHR (124)
|| 'ActionCode'
HDL_DATA
, 1 SEQUENCE
FROM DUAL
UNION ALL
select 'MERGE'
|| CHR (124)
|| 'WorkTerms'
|| CHR (124)
|| paam.assignment_id
|| CHR (124)
|| paam.period_of_service_id
|| CHR (124)
|| paam.effective_latest_change
|| CHR (124)
|| paam.effective_sequence
|| CHR (124)
|| to_char(paam.effective_start_date, 'YYYY/MM/DD')
|| CHR (124)
|| to_char(paam.effective_end_date, 'YYYY/MM/DD')
|| CHR (124)
|| hikm.source_system_id
|| CHR (124)
|| hikm.source_system_owner
|| CHR (124)
|| paam.action_code HDL_DATA,
2 SEQUENCE
FROM per_all_assignments_m paam
,hrc_integration_key_map hikm
WHERE 1 = 1
AND paam.assignment_type IN ('ET','CT','PT')
AND hikm.surrogate_id = paam.assignment_id
UNION ALL
SELECT 'METADATA'
|| CHR (124)
|| 'Assignment'
|| CHR (124)
|| 'AssignmentId'
|| CHR (124)
|| 'WorkTermsAssignmentId'
|| CHR (124)
|| 'EffectiveLatestChange'
|| CHR (124)
|| 'EffectiveSequence'
|| CHR (124)
|| 'EffectiveStartDate'
|| CHR (124)
|| 'EffectiveEndDate'
|| CHR (124)
|| 'SourceSystemId'
|| CHR (124)
|| 'SourceSystemOwner'
|| CHR (124)
|| 'ActionCode'
|| CHR (124)
|| 'AssignmentCategory' HDL_DATA
, 3 SEQUENCE
FROM DUAL
UNION ALL
SELECT 'MERGE'
|| CHR (124)
|| 'Assignment'
|| CHR (124)
|| paam.assignment_id
|| CHR (124)
|| paam.work_terms_assignment_id
|| CHR (124)
|| paam.effective_latest_change
|| CHR (124)
|| paam.effective_sequence
|| CHR (124)
|| to_char(paam.effective_start_date, 'YYYY/MM/DD')
|| CHR (124)
|| to_char(paam.effective_end_date, 'YYYY/MM/DD')
|| CHR (124)
|| hikm.source_system_id
|| CHR (124)
|| hikm.source_system_owner
|| CHR (124)
|| paam.action_code
|| CHR (124)
|| 'FR' HDL_DATA
,4 SEQUENCE
FROM per_all_assignments_m paam
,hrc_integration_key_map hikm
WHERE 1 = 1
AND paam.assignment_type IN ('E','C','P')
AND hikm.surrogate_id = paam.assignment_id
)
ORDER BY SEQUENCE

You can modify the above query to include any attributes you want to update. For example, a sample file to update Notice Period will look like this:

METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode
MERGE|WorkTerms|300000066966135|300000066966134|Y|1|2002/06/16|4712/12/31|300000066966135|FUSION|HIRE

METADATA|Assignment|AssignmentId|WorkTermsAssignmentId|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|EffectiveEndDate|SourceSystemId|SourceSystemOwner|ActionCode|NoticePeriod|NoticePeriodUOM
MERGE|Assignment|300000066966140|300000066966135|Y|1|2002/06/16|4712/12/31|300000066966140|FUSION|HIRE|3|M


Configuration – How to disable Position Incumbent Validation?

Sometimes, while loading worker assignments or assigning an employee to a new position, you get the error “The Selected Position Has No Open headcount. Select A Different Position To Continue”. One resolution is to disable the Position Incumbent Validation at the enterprise level.

Follow the below steps to disable the position validation:

  1. Navigate to Setup and Maintenance -> Search -> Manage Enterprise HCM Information:

2. Search for Position Incumbent Validation on the Enterprise screen. If it is not found, follow steps 3–8; otherwise, move directly to step 9.

3. If Position Incumbent Validation is not available on the Enterprise screen, open the ‘Manage Extensible Flexfields’ task and search for ‘PER_ORGANIZATION_INFORMATION_EFF’:

4. Click on edit icon:

5. Under Category, expand Category and choose Enterprise and scroll down:

6. Click on Pages under ‘Enterprise Details’:

7. Click on add icon under ‘Enterprise Details: Associated Context Details’ and select & add ‘Position Incumbent Validation’ context:

8. A new context will be added:

Click on Save and Close and Deploy the flexfield.

9. On successful deployment of flexfield, go back to ‘Manage Enterprise HCM Information’ task and search for ‘Position Incumbent Validation’:

10. Click on the Edit button in Correction mode, check the checkbox, and save the record:

HDL – Sample HDL to load GL information at Department level

Below is a sample HDL to load GL costing information at the department level using the Organization.dat file. It is always advisable to get the latest METADATA attributes. Check the below link to see the process of refreshing the HDL business objects:

HDL – Refresh Business Objects

Once the latest .dat file is available, check the DFF attributes from the attributes window in View Business Objects.

Sample HDL:

METADATA|OrgInformation|FLEX:PER_ORGANIZATION_INFORMATION_EFF|EFF_CATEGORY_CODE|recordIdentifier(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|_COMPANY_VALUESET(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|OrganizationId(SourceSystemId)|_COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO)|EffectiveStartDate|EffectiveEndDate|ClassificationCode|SourceSystemOwner|SourceSystemId
MERGE|OrgInformation|PER_GL_COST_CENTER_INFO|DEPARTMENT|XX1001|980007|Org_6115|139|1901/01/01|4712/12/31|DEPARTMENT|HRC_SQLLOADER|6115_OrgInformation_PER_GL_COST_CENTER_INFO

Save this as Organization.dat, zip the file and upload it.

Please Note:-

  1. The GL Accounting should have been setup by financials in Fusion.
  2. recordIdentifier(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO) : This is a unique value. It can be anything.
  3. _COMPANY_VALUESET(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO) : Pass the value set Id attached to the segment.
  4. _COMPANY(PER_ORGANIZATION_INFORMATION_EFF=PER_GL_COST_CENTER_INFO) : Pass the actual value under this attribute.

Check the below post to extract the loaded data from HCM Cloud:

https://fusionhcmconsulting.com/2021/10/bip-query-to-extract-gl-cost-details-from-department/

You can run the below query to find the value set ID:

select *
  from FND_VS_VALUE_SETS
 where value_set_code = 'XXX_COMPANY'
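
To also list the values available under that value set (the value to be passed in the _COMPANY attribute), a sketch assuming the independent values are held in FND_VS_VALUES_B (verify the table and column names in your environment):

SELECT vsv.value_id,
       vsv.value                 -- column name assumed
  FROM fnd_vs_values_b vsv,
       fnd_vs_value_sets vs
 WHERE vs.value_set_id = vsv.value_set_id
   AND vs.value_set_code = 'XXX_COMPANY'
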
HDL – Refresh Business Objects

Refresh Business Objects is a process used to create the HDL template for a business object with the latest supported attributes.

When to run ‘Refresh Business Objects’ process:

It is advised to run this process:

  1. After every upgrade
  2. After any new DFF/EFF attributes are configured for any HDL supported business objects.

How to run ‘Refresh Business Objects’ process:

To run ‘Refresh Business Objects’ process, navigate to My Client Groups -> Data Exchange -> View Business Objects

On the View Business Objects UI, you will see 2 buttons:

a. Refresh Object – This is used when you want to refresh only one object at a time. This is usually run when new DFF attributes are set up for a particular object. Once you click on ‘Refresh Object’, you will see the below screen:

Click on Submit button to submit the process.

You can then click on ‘View Process Results’ button to check the progress:

Once the process completes, you can see the last refreshed date:

b. Refresh All Objects – This is used when you want to refresh all the business objects. This is usually run post release upgrade.

HDL – Email Data Obfuscation in Test environment

Post P2T refresh, it is mandatory to mask the actual email IDs of users in the cloned test environment. This is required to avoid any unwanted emails being triggered to end users.

The approach is already discussed in below post: https://fusionhcmconsulting.com/2020/12/hdl-query-to-delete-phones-data/

Use the below SQL to extract the data in HDL format:

select 'MERGE' "METADATA",
       'PersonEmail' "PersonEmail",
       pea.email_address_id "EmailAddressId",
       pea.person_id "PersonId",
       to_char(pea.date_from,'RRRR/MM/DD') "DateFrom",
       to_char(pea.date_to,'RRRR/MM/DD') "DateTo",
       pea.email_type "EmailType",
       -->pea.email_address "1EmailAddress",
       replace(replace(pea.email_address,'@','@invalid'),'.co','.xco') "EmailAddress", --> Change the format as per your requirement
      (select email_hrc.source_system_id
         from hrc_integration_key_map email_hrc
        WHERE pea.email_Address_id = email_hrc.surrogate_id) "SourceSystemId",
      (select email_hrc.source_system_owner
         from hrc_integration_key_map email_hrc
        WHERE pea.email_Address_id = email_hrc.surrogate_id) "SourceSystemOwner"
  from per_email_addresses pea
 where 1=1
   and upper(email_address) like '%XXCOMPANY%' --> Change the format as per your requirement

Query with Person Number:

select 'MERGE' "METADATA",
       'PersonEmail' "PersonEmail",
       pea.email_address_id "EmailAddressId",
       pea.person_id "PersonId",
       papf.person_number "PersonNumber",
       to_char(pea.date_from,'RRRR/MM/DD') "DateFrom",
       to_char(pea.date_to,'RRRR/MM/DD') "DateTo",
       pea.email_type "EmailType",
       -->pea.email_address "1EmailAddress",
       replace(replace(pea.email_address,'@','@invalid'),'.co','.xco') "EmailAddress", --> Change the format as per your requirement
      (select email_hrc.source_system_id
         from hrc_integration_key_map email_hrc
        WHERE pea.email_Address_id = email_hrc.surrogate_id) "SourceSystemId",
      (select email_hrc.source_system_owner
         from hrc_integration_key_map email_hrc
        WHERE pea.email_Address_id = email_hrc.surrogate_id) "SourceSystemOwner"
  from per_email_addresses pea, per_all_people_f papf
 where 1=1
   and pea.person_id = papf.person_id
   and trunc(sysdate) between papf.effective_start_date and papf.effective_end_date
   and upper(email_address) like '%XXCOMPANY%' --> Change the format as per your requirement

Once the data is generated, create the HDL file and load the Worker.dat.
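
A minimal sketch of the resulting Worker.dat content, with PersonEmail as the only component and purely illustrative values (the attribute list mirrors the query columns above):

METADATA|PersonEmail|EmailAddressId|PersonId|DateFrom|DateTo|EmailType|EmailAddress|SourceSystemId|SourceSystemOwner
MERGE|PersonEmail|300000012345678|300000087654321|2018/01/01||W1|jane.doe@invalidxxcompany.xcom|300000012345678|FUSION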

Post data load steps:

  1. Run the “Send Personal Data for Multiple Users to LDAP” process. This process syncs the latest email address from the worker record to the user account.
  2. Log in to the Security Console and verify a couple of users from the UI, or verify from the backend.
  3. Disable notifications (optional) – Navigate to Security Console -> User Categories -> Notification Preferences -> uncheck Enable Notifications:

4. Alternatively, you can redirect all notifications to a single email address. Search for the ‘Manage Approval Groups’ task in ‘Setup and Maintenance’:

  • Clicking on the above task will take you to the BPM Worklist page.
  • Click on the “Administration” button, search for Test Notification Email Address, and provide the address to which you want to redirect all notifications:

If you want to completely purge the email data, then refer to the below post:

https://fusionhcmconsulting.com/2022/10/hdl-delete-person-email-addresses/

HDL – Query to delete phones data

Sometimes it is required to delete uploaded phone data from Fusion HCM. The below query can be used to create a BIP report, which in turn can be used to generate an HDL file to delete the phone data.

The query can be modified to delete phone data for contacts as well as for other person types.

Supported Version :- 20D

Query:

SELECT 'METADATA|PersonPhone|PhoneId|PhoneType|PersonId|DateFrom|SourceSystemId|SourceSystemOwner' D_ROW, 1 D_ORDER
  FROM DUAL
UNION
SELECT 'DELETE|PersonPhone|'||
       pp.phone_id||'|'||
       pp.phone_type||'|'||
       pp.person_id||'|'||
       to_char(pp.date_from,'RRRR/MM/DD')||'|'||
       hikm.source_system_id||'|'||
       hikm.source_system_owner D_ROW,
       2 D_ORDER -- ensures the METADATA row is emitted before the DELETE rows
  from per_phones pp,
       hrc_integration_key_map hikm
 where pp.phone_id = hikm.surrogate_id
   and exists (select 1 from per_periods_of_service where person_id = pp.person_id)
 ORDER BY D_ORDER
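
The generated output can then be assembled into a PersonPhone delete file (loaded as part of Worker.dat); a sketch with purely illustrative values:

METADATA|PersonPhone|PhoneId|PhoneType|PersonId|DateFrom|SourceSystemId|SourceSystemOwner
DELETE|PersonPhone|300000045678901|W1|300000087654321|2017/05/01|300000045678901|FUSION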