BIP – Query to get Default Assignment Rules of the Course

Default assignment rules like Validity Period, Expiration, Renewal Options, Renewal Period, etc. are maintained at the course level in Oracle Learning Cloud.

These details are stored in the backend table ‘WLF_ASSIGNMENT_RULES’.

The query from the below post can be joined with WLF_ASSIGNMENT_RULES using the ASSIGNMENT_RULE_ID column:

https://fusionhcmconsulting.com/2021/03/reports-bip-query-to-get-assignee-details-for-a-course/?amp

and warf.ASSIGNMENT_RULE_ID = war.ASSIGNMENT_RULE_ID

Add this join condition to that query and you will get the required output.
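
For quick reference, below is a minimal join sketch. The assignee-side table and alias (WLF_ASSIGNMENT_RECORDS_F / warf) are assumptions based on the linked post, so adjust them to match the aliases used in your query:

-- Minimal sketch: pull the default assignment rule attributes for each assignment record
-- (warf / WLF_ASSIGNMENT_RECORDS_F is assumed from the linked post)
select warf.assignment_rule_id,
war.* -- validity, expiration and renewal attributes
from wlf_assignment_records_f warf,
wlf_assignment_rules war
where warf.assignment_rule_id = war.assignment_rule_id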

HDL – Loading CoverArt File at Course Level

While creating courses in Oracle Learning Cloud, there is a capability to load cover art for each course. If no cover art is loaded, the learner is presented with a default blue film-strip image.

HCM Data Loader provides an option to bulk upload the CoverArt files for multiple courses in one go. All cover art files (jpg, jpeg) should be included in the BlobFiles folder, and the name of the jpg/jpeg file should be provided in the dat file.

Below is the sample Course dat file:

METADATA|Course|CourseId|EffectiveStartDate|EffectiveEndDate|CourseNumber|Title|ShortDescription|Syllabus|PublishStartDate|PublishEndDate|MinimumExpectedEffort|MaximumExpectedEffort|CurrencyCode|MinimumPrice|MaximumPrice|CoverArtFile|CoverArtFileName|TrailerLiId|TrailerLiNumber|OwnedByPersonId|OwnedByPersonNumber|SourceType|SourceId|SourceInfo|SourceSystemOwner|SourceSystemId
MERGE|Course||2016/01/01||COURSEN_1866|Test Staff Orientation|||2016/01/01||||USD|||Capture.jpg|Capture.jpg||||1010|TALEO|1111|TALEO_1111|HRC_SQLLOADER|HRC_SQLLOADER_1111

Once the BlobFiles folder and the dat file are ready, zip them together and trigger the HDL Import and Load process:
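
The zip would look like the sketch below: the Course.dat file at the root alongside the BlobFiles folder, which holds the jpg/jpeg files named in the CoverArtFile column (the zip name itself is arbitrary; the file name here is taken from the sample above):

CourseCoverArt.zip
  Course.dat
  BlobFiles/
    Capture.jpg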

Once the Import and Load process is successful, log in to Learning Admin to verify:

My Client Groups -> Learning -> Learning Catalog -> Courses

Reports (BIP) – Offering Details Query
select
WLFI.LEARNING_ITEM_NUMBER
,WLFI.LEARNING_ITEM_ID
,TO_CHAR(WLFI.EFFECTIVE_END_DATE,'YYYY/MM/DD') "Offering Effective End Date"
,TO_CHAR(WLFI.EFFECTIVE_START_DATE,'YYYY/MM/DD') "Offering Effective Start Date"
,WLFT.NAME
,WLCLF.MAXIMUM_ATTENDEES
,WLCLF.MINIMUM_ATTENDEES
,TO_CHAR(WLCLF.ENROLMENT_END_DATE,'YYYY/MM/DD') "Offering Enrollment End Date"
,TO_CHAR(WLCLF.ENROLMENT_START_DATE,'YYYY/MM/DD') "Offering Enrollment Start Date"
,WLCLF.DELIVERY_MODE
,TO_CHAR(WLFI.END_DATE,'YYYY/MM/DD') "Offering Publish End Date"
,TO_CHAR(WLFI.START_DATE,'YYYY/MM/DD') "Offering Publish Start Date"
,TO_CHAR(WLFI.LI_START_DATE,'YYYY/MM/DD') "Offering Start Date"
,TO_CHAR(WLFI.LI_END_DATE,'YYYY/MM/DD') "Offering End Date"
FROM WLF_LEARNING_ITEMS_F WLFI,
WLF_LI_COURSES_F WLCF,
WLF_LI_CLASSES_F WLCLF,
WLF_LEARNING_ITEMS_F_TL WLFT
WHERE
WLFI.LEARNING_ITEM_ID = WLFT.LEARNING_ITEM_ID
AND WLFI.LEARNING_ITEM_ID = WLCLF.LEARNING_ITEM_ID
AND WLCLF.COURSE_LEARNING_ITEM_ID = WLCF.LEARNING_ITEM_ID
AND WLFT.LANGUAGE(+) = 'US'
AND WLFI.EFFECTIVE_START_DATE BETWEEN WLFT.EFFECTIVE_START_DATE AND WLFT.EFFECTIVE_END_DATE
AND WLFI.EFFECTIVE_END_DATE BETWEEN WLFT.EFFECTIVE_START_DATE AND WLFT.EFFECTIVE_END_DATE
AND WLFI.LEARNING_ITEM_TYPE = 'ORA_CLASS'
AND WLFI.LEARNING_ITEM_NUMBER = 'OLC245030'
HDL – Sample file to load Talent Profile data

In the enhanced profiles (Profile V2), bulk uploads still use the TalentProfile.dat file as before. However, while loading profile items against a profile, a new attribute, SectionId, must be provided. The SectionId depends on the content item being used.

Check the below post on how to get the SectionId:

https://fusionhcmconsulting.com/2021/02/hdl-query-to-find-sectionid-for-talent-profile-item-load/
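
If the linked post is not handy, the hedged sketch below shows the idea; the table HRT_PROFILE_TYP_SECTIONS and its columns are assumptions, so verify the names against your environment before using it:

-- Hedged sketch: list profile sections with their ids (table/column names assumed)
select section_id,
section_context,
content_type_id
from hrt_profile_typ_sections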

Below is the sample ProfileItem data for the Language content item:

METADATA|ProfileItem|ProfileItemId|ProfileId|ProfileCode|ContentItem|ContentItemId|ContentType|ContentTypeId|CountryCountryCode|CountryGeographyCode|CountryId|DateFrom|Importance|InterestLevel|ItemClob1File|ItemClob2File|ItemClob3File|ItemClob4File|ItemClob5File|ItemDate1|ItemDate10|ItemDate2|ItemDate3|ItemDate4|ItemDate5|ItemDate6|ItemDate7|ItemDate8|ItemDate9|ItemDecimal1|ItemDecimal2|ItemDecimal3|ItemDecimal4|ItemDecimal5|ItemNumber1|ItemNumber10|ItemNumber2|ItemNumber3|ItemNumber4|ItemNumber5|ItemNumber6|ItemNumber7|ItemNumber8|ItemNumber9|ItemText20001|ItemText20002|ItemText20003|ItemText20004|ItemText20005|ItemText2401|ItemText24010|ItemText24011|ItemText24012|ItemText24013|ItemText24014|ItemText24015|ItemText2402|ItemText2403|ItemText2404|ItemText2405|ItemText2406|ItemText2407|ItemText2408|ItemText2409|ItemText301|ItemText3010|ItemText3011|ItemText3012|ItemText3013|ItemText3014|ItemText3015|ItemText302|ItemText303|ItemText304|ItemText305|ItemText306|ItemText307|ItemText308|ItemText309|Mandatory|QualifierCode1|QualifierId1|QualifierCode2|QualifierId2|QualifierSetCode1|QualifierSetCode2|RatingLevelCode1|RatingLevelId1|RatingLevelCode2|RatingLevelId2|RatingLevelCode3|RatingLevelId3|RatingModelCode1|RatingModelId1|RatingModelCode2|RatingModelId2|RatingModelCode3|RatingModelId3|SectionId|SectionName|SourceId|SourceKey1|SourceKey2|SourceKey3|SourceType|StateCountryCode|StateGeographyCode|StateProvinceId|DateTo|SourceSystemId|SourceSystemOwner

MERGE|ProfileItem|||PERS_300000002|English||LANGUAGE|||||2020/01/01|||||||||||||||||||||||||||||||||||||||||||||||||||||Y||||||||||||||||||||||3||1||2||LANGUAGE||LANGUAGE||LANGUAGE||300000001855526|||||||||||TPFPIPERS_300000002_LANG1|HRC_SQLLOADER

Reports (BIP) – Query to get learning outcomes

Oracle Learning Cloud supports defining learning outcomes against each course.

For reporting and integration purposes, there is often a need to extract the learning outcomes assigned to each course. Learning outcomes are stored as a profile relationship against each course. The below query can be used to extract this data:

select wlifv.learning_item_number,
wlifv.name learning_item_name,
wlifv.learning_item_type,
wlifv.status,
wlifv.effective_start_date,
wlifv.effective_end_date,
hpi.content_type_id,
hpi.date_from,
hpi.date_to,
hpi.content_item_id competency_id,
hctt.content_type_name,
hcit.name content_item_name
from HRT_RELATION_CONFIG_B hrcb
,HRT_PROFILE_RELATIONS hpr
,HRT_CONTENT_TYPES_B hctb
,HRT_CONTENT_TYPES_TL hctt
,HRT_CONTENT_ITEMS_TL hcit
,HRT_PROFILE_ITEMS hpi
,HRT_PROFILES_B hpb
,WLF_LEARNING_ITEMS_F_VL wlifv
where hrcb.key_table_name = 'WLF_LEARNING_ITEMS_F_VL'
and hrcb.relation_code = 'LEARNING_ITEM'
and hrcb.relation_id = hpr.relation_id
and hpi.profile_id = hpr.profile_id
and hpi.content_type_id = hctb.content_type_id
and hctt.content_type_id = hctb.content_type_id
and hctb.context_name = 'COMPETENCY'
and hpi.profile_id = hpb.profile_id
and hpr.object_id = wlifv.learning_item_id
and hpb.profile_usage_code = 'L'
and hpi.content_item_id = hcit.content_item_id
and TRUNC(SYSDATE) BETWEEN wlifv.effective_start_date and wlifv.effective_end_date
and hctt.language = 'US'
and hcit.language = 'US'
ORDER BY 1
Lookups – Mass Upload Lookup Types

Lookups are commonly used across modules in SaaS. Sometimes the number of lookups is so large that creating them manually in the application takes a lot of time and effort. Oracle SaaS supports bulk upload of both lookup types and lookup codes.

In this post, we will see how to make use of the file-based loader to load lookup types.

  1. Prepare the lookup types file as given below:

LookupType|Meaning|Description|ModuleKey|ModuleType
TXX_MASS_UPLOAD|Mass Upload Lookup Definition|Test Lookup created for demo purpose|HcmCommonHrCore|LBA
TXX_MASS_UPLOAD_A|Mass Upload Lookup Def – A|Test Lookup created for demo purpose-A|HcmCommonHrCore|LBA

Of the above attributes, only Description is optional.

Both ModuleKey and ModuleType are required. To know what values should be passed, please check the below post:

Reports (BIP) – Query to find module type and key (fusionhcmconsulting.com)

2. Save the file as a csv with pipe (|) as the delimiter.

3. Once the file is ready, navigate to – Tools -> File Import and Export

4. Click on Add (+) and choose your file:

Select the account as: setup/functionalSetupManager/import

Click on Save and Close.

5. Navigate to Manage Common Lookups. Under Search Results click on Action and Import:

6. Monitor the import progress.

7. Once the import is complete, verify the uploaded values:
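
Besides the UI check, a quick BIP query can confirm that the lookup types were created. Below is a minimal sketch; it assumes the translated lookup type definitions sit in FND_LOOKUP_TYPES_TL, so adjust the table name if needed:

select lookup_type,
meaning,
description
from fnd_lookup_types_tl
where lookup_type like 'TXX_MASS_UPLOAD%'
and language = 'US'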

Follow the below post to see how to upload the lookup values in bulk:

Lookups – Mass Upload Lookup Values (fusionhcmconsulting.com)

HDL – Sample File to Load extended Lookup Codes

Oracle HCM uses the extended lookups feature to support dependent lookup values. For example, while creating a Visa or Work Permit record for a person in Singapore, the Category field depends on the Type of pass chosen. The values of the Category field are derived from an extended lookup.

Navigate to Setup and Maintenance -> Manage Extended Lookup codes -> Visa Permit Type

Now, let us take an example where we need to load two Category values based on the lookup code S Pass.

Prepare the HDL file in below format:

METADATA|ExtendedLookupCode|ExtendedLookupCodeId|LookupType|LookupCode|LegislationCode|ExtendedLookupCode|ExtendedLookupCodeName|SourceSystemOwner|SourceSystemId
MERGE|ExtendedLookupCode||PER_VISA_PERMIT_TYPE|SG_SP|SG|TEST_SP1|S Pass Holder – Test 1|HRC_SQLLOADER|TEST_1
MERGE|ExtendedLookupCode||PER_VISA_PERMIT_TYPE|SG_SP|SG|TEST_SP2|S Pass Holder – Test 2|HRC_SQLLOADER|TEST_2

Zip the file and upload it using HCM Data Loader from Data Exchange.

On successful load, the new values can be verified from either of the following two places in the UI:

  1. From Manage Extended Lookup codes:

2. From Documents Tab on Person UI:

Reports (BIP) – Query to find module type and key

While defining common lookups or value sets, you need to provide a module value. Each module has an associated module type, module key and product code.

These details are stored in the backend table FND_APPL_TAXONOMY.

Use the below query to find the module type, module key, etc. for a module:

select fat.MODULE_NAME
,fat.MODULE_TYPE
,fat.MODULE_KEY
,fat.PRODUCT_CODE
from FND_APPL_TAXONOMY fat
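
To narrow the output down to a single module, add a filter on the module name. The sketch below is only an illustration; replace the filter value with the module you are looking for:

select fat.MODULE_NAME
,fat.MODULE_TYPE
,fat.MODULE_KEY
,fat.PRODUCT_CODE
from FND_APPL_TAXONOMY fat
where upper(fat.MODULE_NAME) like '%HR%' -- illustrative filter value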
Lookups – Mass Upload Lookup Values

Lookups are commonly used to meet different requirements. Often a lookup has well over a hundred values, and in such cases adding the values one by one in the application is a tedious and error-prone job.

There is no HDL support to bulk upload lookup values. However, a file-based solution is available which is quick and easy to use.

We have already discussed how to bulk upload lookup types in the below post:

Lookups – Mass Upload Lookup Types (fusionhcmconsulting.com)

Follow the below steps to mass upload lookup values:

  1. Create a custom lookup from UI:

[N] – Setup and Maintenance -> Search -> Manage Common Lookups

2. Click on Add New (+) under search results:

3. Provide the details and Click on Save:

4. Prepare the lookup values file in below format:

LookupType|LookupCode|DisplaySequence|EnabledFlag|StartDateActive|EndDateActive|Meaning|Description|Tag
XXX_MASS_UPLOAD|MASS_01|1|Y|15/12/2001||Mass Upload Value 1|Mass Upload Value 1 Description|+GB

The following attributes in the above file are mandatory:

-> LookupType

-> LookupCode

-> EnabledFlag

-> Meaning

All other fields are optional.

The date format for the StartDateActive and EndDateActive attributes is DD/MM/RRRR.

File should be pipe (|) delimited.

Save the file as csv.

5. Once the file is ready, navigate to – Tools -> File Import and Export

6. Click on Add (+) and choose your file:

Select the account as: setup/functionalSetupManager/import

Click on Save and Close.

7. Navigate to Manage Common Lookups. Under Search Results click on Action and Import:

8. Select the account, provide the file name given in step 6, and click on the Upload button:

9. Monitor the import progress:

10. Once the import is complete, verify the uploaded values (a query sketch for this check is given after the steps):

11. Results can be verified from Import file log as well:
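
Below is the query sketch referenced in step 10; it is a minimal check against FND_LOOKUP_VALUES (the ENABLED_FLAG column is an assumption, adjust as needed):

select flv.lookup_code
,flv.meaning
,flv.enabled_flag
from fnd_lookup_values flv
where flv.lookup_type = 'XXX_MASS_UPLOAD'
and flv.language = 'US'
order by flv.lookup_code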

Both Lookup types and lookup codes can be loaded in one shot as well. Prepare both the files simultaneously and follow the same steps as given above.

Data Masking – Mask Salary Data in lower environment

There is a common requirement to mask salary data after P2T refreshes, in order to hide actual salary information, which is highly sensitive.

Use the below query to generate data in HDL format in a test environment immediately after a P2T refresh. The query generates a random salary amount for each row. Save the downloaded data in .dat file format and upload it back to the instance.

Select 'METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom|DateTo|SalaryBasisId|SalaryId' Header, 1 data_flow_order
from dual
UNION
SELECT 'MERGE|Salary'||'|'||
paam.assignment_number||'|'||
round(DBMS_RANDOM.VALUE (1,15000) , 2)||'|'||
TO_CHAR(cs.date_from,'RRRR/MM/DD', 'nls_date_language=American')||'|'||
TO_CHAR(cs.date_to,'RRRR/MM/DD', 'nls_date_language=American')||'|'||
cs.salary_basis_id||'|'||
cs.salary_id data_row,
2 data_flow_order
FROM cmp_salary cs,
per_all_assignments_m paam
WHERE cs.assignment_id= paam.assignment_id
AND trunc(sysdate) between paam.effective_start_date AND paam.effective_end_date
AND paam.assignment_type in ('E', 'C', 'P')
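-- Sample filter for a single assignment used while testing; remove the next condition to generate masked data for all workers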
AND paam.assignment_number ='E788880'
ORDER BY data_flow_order

Reports (BIP) – Query to extract seniority dates setup
select hr_general.decode_lookup('ORA_PER_SENIORITY_ITEMS',psds.seniority_date_code) seniority_date_code_meaning
,hr_general.decode_lookup('ORA_PER_SENIORITY_LEVELS',psds.level_code) level_code_meaning
,hr_general.decode_lookup('ORA_PER_TRIG_FIELDS',psds.triggering_field) triggering_field_meaning
,psds.active_flag
,psds.allow_edit_flag
,psds.display_in_ui
,psds.cumulative_flag
,psds.hours_in_year
,psds.hours_in_month
,psds.hours_in_day
,psds.object
,psds.source_field
,psds.seniority_version
,psds.record_creator
,psds.setup_seniority_basis
from per_seniority_dates_setup psds
Reports (BIP) – User Role Details
SELECT distinct pu.username
,prd.role_common_name
,prd.abstract_role
,prd.job_role
,prd.data_role
,prd.delegation_allowed
,prd.active_flag
,prdt.role_name
,prdt.description
FROM per_users pu
,per_roles_dn prd
,per_user_roles pur
,per_roles_dn_tl prdt
WHERE pu.user_id = pur.user_id
and pur.role_id = prdt.role_id
and pur.role_id = prd.role_id
and prdt.language = 'US'
HDL – Loading Multiple attachments for same document type

There are various cases where more than one attachment should be maintained against an employee document record. First, we will create a new document type that allows multiple attachments.

  1. Navigate to Setup and Maintenance -> Document Types -> Click on Create

2. Choose multiple occurrences as Yes:

3. Enter the relevant information and submit the record.

4. Prepare the HDL file in the below format and load the data:

METADATA|DocumentsOfRecord|PersonNumber|DateFrom|DateTo|DocumentType|DocumentName|DocumentNumber|DocumentCode|Country|AttachmentEntityName|Comments|IssuedDate|IssuingAuthority|IssuingCountry|IssuingLocation|Publish|PublishDate|SourceSystemId|SourceSystemOwner
MERGE|DocumentsOfRecord|101|2021/01/01||Test Multi Attachments Doc|Multiple Attachments||TMAD_001||||||||||HRC_SQLLOADER_101_TMAD_001|HRC_SQLLOADER

METADATA|DocumentAttachment|SourceSystemOwner|SourceSystemId|PersonNumber|DocumentType|DocumentCode|Title|Description|DataTypeCode|Country|EntityName|File|FileName|MimeType|URL|URLorTextorFileName|DmVersonNumber
MERGE|DocumentAttachment|HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001_1|101|Test Multi Attachments Doc|TMAD_001|Test Multi Attachments Doc|Test Multi Attachments Doc 1|FILE|||Historical_rating.docx|Historical_rating.docx|||Historical_rating.docx|
MERGE|DocumentAttachment|HRC_SQLLOADER|HRC_SQLLOADER_101_TMAD_001_2|101|Test Multi Attachments Doc|TMAD_001|Test Multi Attachments Doc|Test Multi Attachments Doc 2|FILE|||Historical_rating1.docx|Historical_rating1.docx|||Historical_rating1.docx|

5. Place the docx files in the BlobFiles folder and zip them with the .dat file:

6. Import and Load the file.

7. Once the data is loaded, verify from the UI:
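
The load can also be cross-checked with a query. The sketch below assumes the document records sit in HR_DOCUMENTS_OF_RECORD and that FND_ATTACHED_DOCUMENTS references them with entity name 'HR_DOCUMENTS_OF_RECORD'; verify these assumptions in your environment:

-- Hedged sketch: list the attachments loaded against the sample document record
select hdr.document_code,
hdr.document_name,
fdt.file_name
from hr_documents_of_record hdr,
fnd_attached_documents fad,
fnd_documents_tl fdt
where fad.entity_name = 'HR_DOCUMENTS_OF_RECORD'
and fad.pk1_value = to_char(hdr.documents_of_record_id)
and fad.document_id = fdt.document_id
and fdt.language = 'US'
and hdr.document_code = 'TMAD_001'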

Reports (BIP) – Query to get Department Manager Name
SELECT distinct houft.name, 
       ppnfv.full_name
FROM HR_ORGANIZATION_INFORMATION_F hoif,
     per_person_names_f_v ppnfv,
     hr_organization_units_f_tl houft
WHERE trunc(sysdate) BETWEEN hoif.EFFECTIVE_START_DATE AND hoif.EFFECTIVE_END_DATE
AND trunc(sysdate) BETWEEN ppnfv.effective_Start_date AND ppnfv.effective_end_date
AND ppnfv.person_id=hoif.ORG_INFORMATION2
AND houft.organization_id = hoif.organization_id
AND hoif.ORG_INFORMATION_CONTEXT = 'PER_ORG_MANAGER_INFO'
AND houft.language = 'US'
AND TRUNC(sysdate) BETWEEN houft.EFFECTIVE_START_DATE AND houft.EFFECTIVE_END_DATE
HDL – Loading Contract History Using HDL

Sample HDL file to load contract history:

METADATA|Worker|PersonId|EffectiveStartDate|EffectiveEndDate|PersonNumber|BloodType|CorrespondenceLanguage|StartDate|DateOfBirth|DateOfDeath|CountryOfBirth|RegionOfBirth|TownOfBirth|PersonDuplicateCheck|ActionCode|ReasonCode|GUID|SourceSystemOwner|SourceSystemId
MERGE|Worker||2016/08/27|4712/12/31|435282|||2016/08/27|1967/07/23||PH|Philippines|||HIRE|||LEGACY|PER_909090

METADATA|PersonName|PersonNameId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|NameType|FirstName|MiddleNames|LastName|Honors|KnownAs|PreNameAdjunct|MilitaryRank|PreviousLastName|Suffix|Title|CharSetContext|NameInformation1|NameInformation2|NameInformation3|NameInformation4|NameInformation5|NameInformation6|NameInformation7|NameInformation8|NameInformation9|NameInformation10|NameInformation11|NameInformation12|NameInformation13|NameInformation14|NameInformation15|NameInformation16|NameInformation17|NameInformation18|NameInformation19|NameInformation20|NameInformation21|NameInformation22|NameInformation23|NameInformation24|NameInformation25|NameInformation26|NameInformation27|NameInformation28|NameInformation29|NameInformation30|GUID|SourceSystemOwner|SourceSystemId
MERGE|PersonName||2016/08/27|4712/12/31|PER_909090||US|GLOBAL|||Ashwin Ravi||Ashwin Ravi|||||MR.|||||||||||||||||||||||||||||||||LEGACY|909090_PER_NAME

METADATA|PersonLegislativeData|PersonLegislativeId|EffectiveStartDate|EffectiveEndDate|PersonId(SourceSystemId)|PersonNumber|LegislationCode|HighestEducationLevel|MaritalStatus|MaritalStatusDate|Sex|GUID|SourceSystemOwner|SourceSystemId
MERGE|PersonLegislativeData||2016/08/27|4712/12/31|PER_909090||US||M||M||LEGACY|909090_PER_LEG

METADATA|WorkRelationship|PeriodOfServiceId|LegalEmployerName|PersonId(SourceSystemId)|PersonNumber|DateStart|WorkerType|LegalEmployerSeniorityDate|ActualTerminationDate|LegalEntityId|Comments|EnterpriseSeniorityDate|LastWorkingDate|NotifiedTerminationDate|OnMilitaryServiceFlag|PrimaryFlag|ProjectedTerminationDate|RehireAuthorizerPersonId|RehireAuthorizor|RehireReason|RevokeUserAccess|WorkerNumber|RehireRecommendationFlag|NewStartDate|ActionCode|ReasonCode|DateOfDeath|HideUntilDate|DateForPrimaryFlagChange|CancelWorkRelationshipFlag|TerminateWorkRelationshipFlag|ReverseTerminationFlag|CorrectTerminationFlag|GUID|SourceSystemOwner|SourceSystemId|GlobalTransferFlag|ReadyToConvert|WorkerComments
MERGE|WorkRelationship||US Company|PER_909090|435282|2016/08/27|E||||||||N|Y|||||||||HIRE||||||||||LEGACY|455985_WR||||

METADATA|WorkTerms|AssignmentId|AssignmentNumber|EffectiveEndDate|EffectiveLatestChange|EffectiveSequence|EffectiveStartDate|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|LegalEmployerName|DateStart|WorkerType|AssignmentName|PeopleGroup|AssignmentStatusTypeId|AssignmentStatusTypeCode|AssignmentType|PersonTypeId|PersonTypeCode|SystemPersonType|BargainingUnitCode|BillingTitle|BusinessUnitId|BusinessUnitShortCode|ContractId|DateProbationEnd|WorkerCategory|AssignmentCategory|EstablishmentId|ReportingEstablishment|ExpenseCheckSendToAddress|GradeId|GradeCode|GradeLadderPgmId|GradeLadderPgmName|HourlySalariedCode|InternalBuilding|InternalFloor|InternalLocation|InternalMailstop|InternalOfficeNumber|JobId|JobCode|LabourUnionMemberFlag|LocationId|LocationCode|ManagerFlag|NormalHours|Frequency|NoticePeriod|NoticePeriodUOM|OrganizationId|DepartmentName|PositionId|PositionCode|PositionOverrideFlag|PrimaryWorkTermsFlag|ProbationPeriod|ProbationUnit|ProjectedEndDate|ProjectedStartDate|ProposedWorkerType|ProposedUserPersonType|ReasonCode|RetirementAge|RetirementDate|SpecialCeilingStepId|SpecialCeilingStep|StepEntryDate|TaxAddressId|EndTime|StartTime|VendorSiteId|WorkAtHomeFlag|WorkTermsAssignmentId|VendorId|FreezeStartDate|FreezeUntilDate|ActionCode|CategoryCode|TaxReportingUnitId|TaxReportingUnit|GUID|CollectiveAgreementId|CollectiveAgreementIdCode|FullPartTime|PermanentTemporary|SourceSystemOwner|SourceSystemId
MERGE|WorkTerms||ET435282|2017/07/31|Y|1|2016/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||HIRE|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|2018/08/26|Y|1|2017/08/01|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||ASG_CHANGE|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|2019/08/26|Y|1|2018/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||CONTRACT_EXTENSION|||||||||LEGACY|52361_WT
MERGE|WorkTerms||ET435282|4712/12/31|Y|1|2019/08/27|455985_WR|PER_909090|435282|US Company|2016/08/27|E||||ACTIVE_PROCESS|ET||Employee|EMP||||US Company|||||||||||||||||||||||||||||||||Y||||||||||||||||||||||CONTRACT_EXTENSION|||||||||LEGACY|52361_WT

METADATA|Assignment|AssignmentId|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|WorkTermsAssignmentId(SourceSystemId)|WorkTermsNumber|AssignmentType|AssignmentName|AssignmentNumber|PeopleGroup|AssignmentStatusTypeId|AssignmentStatusTypeCode|BargainingUnitCode|BillingTitle|BusinessUnitId|BusinessUnitShortCode|DateProbationEnd|WorkerCategory|AssignmentCategory|EstablishmentId|ReportingEstablishment|ExpenseCheckSendToAddress|GradeId(SourceSystemId)|GradeCode|GradeLadderPgmId|GradeLadderPgmName|HourlySalariedCode|InternalBuilding|InternalFloor|InternalLocation|InternalMailstop|InternalOfficeNumber|JobId(SourceSystemId)|JobCode|LabourUnionMemberFlag|LocationId|LocationCode|ManagerFlag|NormalHours|Frequency|NoticePeriod|NoticePeriodUOM|OrganizationId(SourceSystemId)|DepartmentName|PeriodOfServiceId(SourceSystemId)|PersonId(SourceSystemId)|PersonNumber|DateStart|WorkerType|LegalEmployerName|PersonTypeId|PersonTypeCode|SystemPersonType|PositionId(SourceSystemId)|PositionCode|PositionOverrideFlag|PrimaryAssignmentFlag|PrimaryFlag|ProbationPeriod|ProbationUnit|ProjectTitle|ProjectedEndDate|ProjectedStartDate|ProposedWorkerType|ProposedUserPersonType|ReasonCode|RetirementAge|RetirementDate|SpecialCeilingStepId|SpecialCeilingStep|TaxAddressId|EndTime|StartTime|VendorSiteId|WorkAtHomeFlag|VendorId|FreezeStartDate|FreezeUntilDate|CategoryCode|CollectiveAgreementId|CollectiveAgreementIdCode|FullPartTime|PermanentTemporary|DefaultExpenseAccount|GUID|SourceSystemOwner|SourceSystemId|GspEligibilityFlag|OvertimePeriodName|SeniorityBasis|UnionId|UnionName
MERGE|Assignment||HIRE|2016/08/27|2017/07/31|1|Y|52361_WT|ET435282|E||E435282|121||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||ASG_CHANGE|2017/08/01|2018/08/26|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||CONTRACT_EXTENSION|2018/08/27|2019/08/26|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||
MERGE|Assignment||CONTRACT_EXTENSION|2019/08/27|4712/12/31|1|Y|52361_WT|ET435282|E||E435282|122||ACTIVE_PROCESS||||US Company||||||||103||||||||||413|N||US Company|N|8|D|||LEGACY_2856||455985_WR|PER_909090||2016/08/27|E|US Company||Employee|EMP|LEGACY_227441|||Y|Y||||||||||||||03:30|07:00||N|||||||||||LEGACY|52361_ASSIGN|||||

METADATA|Contract|ContractId|EffectiveStartDate|EffectiveEndDate|AssignmentId|AssignmentNumber|PersonNumber|ContractType|Description|Duration|DurationUnits|ExtensionPeriod|ExtensionPeriodUnits|ExtensionReason|PersonId(SourceSystemId)|GUID|SourceSystemOwner|SourceSystemId
MERGE|Contract||2016/08/27|2018/08/26||ET435282|435282|Flexiable 2 Years|Rolling Contract|2|Y||||PER_909090||LEGACY|52361_CONTRACT
MERGE|Contract||2018/08/27|2019/08/26||ET435282|435282|Flexiable 2 Years|Rolling Contract|||1|Y||PER_909090||LEGACY|52361_CONTRACT
MERGE|Contract||2019/08/27|4712/12/31||ET435282|435282|Flexiable 2 Years|Rolling Contract|||1|Y||PER_909090||LEGACY|52361_CONTRACT

Version Tested – 20B

BIP (Reports) – Query to extract Accrual details
select papf.person_number,
pl.name plan_name,
apae.begin_bal,
apae.accrued,
apae.used,
apae.accrual_period,
apae.end_bal,
apae.first_last_prd_in_pl_term
from per_all_people_f papf,
anc_per_accrual_entries apae,
anc_absence_plans_f_tl pl
where 1=1
and papf.person_id = apae.person_id
and apae.plan_id = pl.absence_plan_id
and trunc(sysdate) between papf.effective_start_date and papf.effective_end_date
and pl.language = 'US'
and trunc(sysdate) between pl.effective_start_date and pl.effective_end_date
and apae.accrual_period <= TO_DATE('2021/12/31','YYYY/MM/DD')
Reports (BIP) – Basic Query for Checklist E-sign Report

Oracle has provided new functionality to include e-signatures in checklist tasks. A custom report can be configured; upon e-signature and completion of the task, the signed report is saved to the worker's document of records.

You can follow document 2611795.1 on My Oracle Support for detailed steps.

Below is a basic SQL query to start the data model development. You can add additional tables/columns as per your needs.

SELECT DISTINCT Person.PERSON_ID
,PAT.ALLOCATED_CHECKLIST_ID AS P_ALLOCATED_CHECKLIST_ID
--,PAT.ALLOCATED_TASK_ID AS TASK_ID
,PersonName.FULL_NAME FULL_NAME
,PersonName.DISPLAY_NAME DISPLAY_NAME
,PersonName.first_name FIRST_NAME
,PersonName.last_name LAST_NAME
,PersonName.middle_names MIDDLE_NAMES
,to_char(Person.date_of_birth,'DD-MM-RRRR', 'nls_date_language=American') DOB
,(SELECT flv.meaning
FROM FND_LOOKUP_VALUES flv
WHERE flv.lookup_type = 'TITLE'
AND flv.language = 'US'
AND flv.lookup_code = PersonName.title) TITLE
--,PAT.SIGNER_NAME AS P_SIGNER_NAME
--,TO_CHAR(PAT.SIGN_DATE,'MM/DD/YY') AS P_SIGN_DATE
,:P_SIGNER_NAME P_SIGNER_NAME
,:P_SIGN_DATE P_SIGN_DATE
FROM PER_PERSONS Person,
PER_PERSON_NAMES_F PersonName,
PER_ALLOCATED_CHECKLISTS PAC,
PER_ALLOCATED_TASKS_VL PAT
WHERE PAC.ALLOCATED_CHECKLIST_ID = NVL(:P_ALLOCATED_CHECKLIST_ID,PAC.ALLOCATED_CHECKLIST_ID)
AND PAC.ALLOCATED_CHECKLIST_ID=PAT.ALLOCATED_CHECKLIST_ID
AND PAT.PERFORMER_ORIG_SYS_ID = Person.PERSON_ID
AND PersonName.PERSON_ID = Person.PERSON_ID
AND PersonName.NAME_TYPE = 'GLOBAL'
AND GREATEST(TRUNC(PAC.ACTION_DATE),TRUNC(SYSDATE)) BETWEEN LEAST(TRUNC(SYSDATE),PersonName.EFFECTIVE_START_DATE) AND PersonName.EFFECTIVE_END_DATE
AND PAT.STATUS='COM'
AND PAT.allocated_task_id=3000012222113131
BIP Reports – Query to get Position Profile Attachment details

While creating a position profile, there is an option to add attachments. To get the various details of an attachment, like its name, UCM content ID, etc., use the below query:

select hpb.PROFILE_TYPE_ID,
       PROFILE_CODE,
       PROFILE_STATUS_CODE,
       PROFILE_USAGE_CODE,
       fad.CATEGORY_NAME ,
       fdt.file_name,
       fdt.dm_version_number document_id,
       fdt.dm_document_id UCM_file
  from fnd_attached_documents fad, HRT_PROFILES_B hpb, fnd_documents_tl fdt
 where ENTITY_NAME = 'HRT_PROFILES_B'
   and hpb.profile_id = fad.PK1_VALUE
   AND fad.document_id = fdt.document_id
   AND fdt.language = 'US'

To download the attachments from UCM, the user should have the AttachmentsRead role assigned. Please check the below post on how to create the AttachmentsRead role:

https://fusionhcmconsulting.com/2021/03/security-role-to-view-document-of-record-attachments-on-ucm/

Configuration – Extract DFF Setup

Use the below query to extract the DFF setup defined in Fusion:

SELECT
 fdsv.descriptive_flexfield_code
      ,fdsv.context_code
      ,fdsv.segment_code
      ,fdsv.name
      ,fdsv.column_name
      ,fvvs.value_set_code
      ,fdsv.display_type
      ,fdsv.prompt
      ,fdsv.short_prompt
      ,fdsv.enabled_flag
      ,fdsv.required_flag
      ,fdsv.read_only_flag
      ,fdsv.description
      ,fdsv.sequence_number
      ,fdsv.derivation_value
      ,fdsv.bi_enabled_flag
  FROM
 fnd_df_segments_vl fdsv,
       fnd_vs_value_sets fvvs
 WHERE fdsv.value_set_id = fvvs.value_set_id(+)
   AND fdsv.descriptive_flexfield_code = 'PER_ASG_DF'