BIP – Query to find attachment details

There are multiple objects in Oracle HCM Cloud that support attachments, and there is often a need to extract the attachment details stored at the object level. All uploaded attachments are stored in the fnd_attached_documents table. The query below extracts the attachment details stored at the Service Request level in HR Helpdesk. You can replace the table name and the primary-key join to extract the data as per your requirement:

select ssr.sr_id,
       ssr.title,
       fad.CATEGORY_NAME,
       fdt.file_name,
       fdt.dm_version_number document_id,
       fdt.dm_document_id UCM_file
  from fnd_attached_documents fad, svc_service_requests ssr, fnd_documents_tl fdt
 where fad.ENTITY_NAME = 'SVC_SERVICE_REQUESTS'
   and ssr.sr_id = fad.PK1_VALUE
   and fad.document_id = fdt.document_id
   and fdt.language = 'US'

Query to get list of Jobs having an attachment:

 select pjft.name
   from per_jobs_f_tl pjft
  where pjft.language = 'US'
    and exists (select 1
                  from fnd_attached_documents fad, PER_JOBS_F pjf, fnd_documents_tl fdt
                 where fad.ENTITY_NAME = 'PER_JOBS_F'
                   and pjf.job_id = fad.PK1_VALUE
                   and fad.document_id = fdt.document_id
                   and fdt.language = 'US'
                   and pjf.job_id = pjft.job_id)

Refer to the link below for attachments on position profiles:

https://fusionhcmconsulting.com/2021/03/bip-reports-query-to-get-position-profile-attachment-details/

In order to download the attachments from UCM, the user should have the AttachmentsRead role assigned. Please check the post below on how to create the AttachmentsRead role:

https://fusionhcmconsulting.com/2021/03/security-role-to-view-document-of-record-attachments-on-ucm/

Configuration – Value set to get Employee Category list

Employee category (EMPLOYEE_CATG) is a delivered user-type lookup in Oracle HCM which can be extended. However, in a few cases there is a requirement to populate the employee category in DFF attribute(s); in that case, we need to create a table-defined value set.

FROM Clause: fnd_lookup_values
Value Attributes Table Alias:
*Value Column Name: substr(meaning,1,80)
Value Column Type: VARCHAR2
Value Column Length: 80
Description Column Name: substr(meaning,1,100)
Description Column Type: VARCHAR2
Description Column Length: 100
ID Column Name: substr(meaning,1,100)
ID Column Type: VARCHAR2
ID Column Length: 100
Enabled Flag Column Name:
Start Date Column Name:
End Date Column Name:
WHERE Clause: lookup_type = 'EMPLOYEE_CATG' and TAG is NULL
and language = 'US'
and enabled_flag = 'Y'
ORDER BY Clause: substr(meaning,1,100)
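
For quick validation, the value set definition above corresponds to the following standalone query, which you can run in a BIP data model before creating the value set to confirm the values returned:

select substr(meaning,1,80)  display_value,
       substr(meaning,1,100) description
  from fnd_lookup_values
 where lookup_type = 'EMPLOYEE_CATG'
   and TAG is NULL
   and language = 'US'
   and enabled_flag = 'Y'
 order by substr(meaning,1,100)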
BIP – Query to get Position costing details
SELECT hapft.name
      ,pcaa.segment1 
      ,pcaa.segment2 
      ,pcaa.segment3 
      ,pcaa.segment4 
      ,pcaa.segment5 
      ,pcaa.segment6 
      ,pcaa.segment7 
      ,pcaa.segment8 
  FROM PAY_COST_ALLOCATIONS_F pacf
      ,PAY_COST_ALLOC_ACCOUNTS pcaa
	  ,HR_ALL_POSITIONS_F hapf
	  ,HR_ALL_POSITIONS_F_TL hapft
 WHERE pacf.cost_allocation_record_id = pcaa.cost_allocation_record_id 
   AND pacf.source_type = 'POS' 
   AND pacf.source_id = hapf.position_id
   AND hapft.position_id = hapf.position_id
   AND hapft.language = 'US'
   AND TRUNC(SYSDATE) BETWEEN pacf.effective_start_date AND pacf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN hapf.effective_start_date AND hapf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN hapft.effective_start_date AND hapft.effective_end_date
Extract – Loopback Inbound Extract

Often in Cloud HCM, we encounter situations where we need to update some information on the worker assignment after worker creation, because the information was not available at the time of hiring the worker. One such example is employee category. Let's take a hypothetical example: the employee category should be auto-populated based on the worker's job. As there is no direct link between employee category and job, it becomes a pain to manually search for and enter the correct employee category while hiring. So, in this case, the worker is hired with a job but with no value for employee category.

A DFF is opened at the Job level which stores the corresponding employee category. So, in this case, we design a solution which will:

  • Read the worker's job and then the corresponding employee category from the job.
  • Generate the data for the WorkTerms and Assignment METADATA in HCM Data Loader format.
  • Use an HCM Extract to consume the data and trigger the HDL Import and Load process.
  • Schedule the HCM Extract to run daily, or as per the requirement.

Once the HCM Extract is run, the employee category will be populated automatically.

Steps to design the integration:

  1. Extract the Workterms and assignment data for all workers where the job is populated and employee category is NULL.
  2. Create a BI Publisher report to organize the data extracted in Step 1 into HCM Data Loader format. Copy the Global Reports Data Model (from path /Shared Folders/Human Capital Management/Payroll/Data Models/globalReportsDataModel) to a folder in /Shared Folders/Custom/HR. The folder can be named as per your naming conventions.
  3. Add a new data set in the globalReportsDataModel and paste your query in the new data set.
  4. Check below post on sample query to extract the data in HCM Data Loader format: https://fusionhcmconsulting.com/2021/01/hdl-updating-an-attribute-on-assignment/
  5. Create an eText template (a sample RTF template is available at the above link) and a report based on the above data model.
  6. The Global Data Model will look like this (G_3 is the new data set):
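
Step 1 above can be sketched with a query along these lines (a sketch only — it assumes the employee category DFF at the job level is stored in PER_JOBS_F.ATTRIBUTE1; replace that with your actual DFF segment):

select paam.assignment_number,
       pjf.ATTRIBUTE1 job_employee_category -- hypothetical DFF segment on the job
  from per_all_assignments_m paam,
       per_jobs_f pjf
 where paam.job_id = pjf.job_id
   and paam.assignment_type = 'E'
   and paam.employee_category is null
   and trunc(sysdate) between paam.effective_start_date and paam.effective_end_date
   and trunc(sysdate) between pjf.effective_start_date and pjf.effective_end_date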

Steps to create HCM Extract:

You can import the below sample extract in your environment:

  • Download the above file. Change the extension to xml.
  • Open the xml file with Notepad or Notepad++ and remove first two rows (these rows were added to make sure the file is uploaded here).
  • Navigate to My Client Groups -> Data Exchange -> HCM Extracts -> Extract Definitions:
  • Click on Import to import the xml file
  • Provide an Extract name. Uncheck the Changes Only checkbox and click on Ok:
  • Once the extract Import is complete, Click on pencil icon to edit:
  • Click on 'Extract Delivery Option' in the navigation tree on the left side. On the right side, under 'Extract Delivery Options', click on edit to update the path of the report created earlier. It should look like – /Custom/HR/AssignmentUpdateRPT.xdo
  • Make sure the default value for the parameter Auto Load is set to "Y".
  • Save the details. Click on Extract Execution Tree next and click All Formula:
  • Once the formulas are compiled, click on the Submit button.

The next step is to refine the extract in order to Submit the Import and Load process:

  • Navigate to My Client Groups -> Data Exchange -> HCM Extracts -> Refine Extracts. Search the extract and click on edit.
  • Select and Add – Initiate HCM Data Loader process
  • Click on Go Task for "Initiate HCM Data Loader", click Edit for "Data Loader Archive Action" and add the relevant parameters:

Parameter Basis – Bind to Flow Task
Basis Value – XX Assignment Update Integration, Submit, Payroll Process

  • Click Edit for "Data Loader Configurations" and add the relevant parameters:

Parameter Basis – Constant Bind
Basis Value – ImportMaximumErrors=100,LoadMaximumErrors=100,LoadConcurrentThreads=8,LoadGroupSize=100

  • Task sequence should look as follows:
  • Go to Review and click on Submit.

Your extract is now ready for submission. You can submit the extract and test it.

BIP – Query to Extract Position FTE values

When position hierarchy is enabled in Cloud HCM, the Manage Positions UI shows some dynamically populated fields along with the parent position.

For example:

Use the query below to extract the above details:

SELECT Positions.name 				"Position Name"
      ,Positions.FTE 				"Position Current FTE"
      ,Positions.INCUMBENT_FTE      "Current Incumbent FTE"
      ,(Positions.FTE - Positions.INCUMBENT_FTE)      "Difference FTE"
  FROM	  
(SELECT HAPFT.NAME,
        HAPF.FTE, 
	    (select SUM(PAWMF.VALUE)
           from PER_ALL_ASSIGNMENTS_M PAAM,
                PER_ASSIGN_WORK_MEASURES_F PAWMF
		  where 1=1
            AND PAAM.POSITION_ID = HAPF.POSITION_ID 
            AND TRUNC(SYSDATE) BETWEEN PAAM.EFFECTIVE_START_DATE AND PAAM.EFFECTIVE_END_DATE
            AND PAAM.ASSIGNMENT_TYPE = 'E'
		    AND PAAM.ASSIGNMENT_STATUS_TYPE = 'ACTIVE'
            AND PAAM.ASSIGNMENT_ID = PAWMF.ASSIGNMENT_ID
            AND PAWMF.UNIT = 'FTE') AS INCUMBENT_FTE
   FROM HR_ALL_POSITIONS_F HAPF, 
        HR_ALL_POSITIONS_F_TL HAPFT
  WHERE HAPF.POSITION_ID = HAPFT.POSITION_ID 
    AND USERENV('LANG') = HAPFT.LANGUAGE 
    AND TRUNC(SYSDATE)  BETWEEN HAPF.EFFECTIVE_START_DATE AND HAPF.EFFECTIVE_END_DATE
    AND TRUNC(SYSDATE)  BETWEEN HAPFT.EFFECTIVE_START_DATE AND HAPFT.EFFECTIVE_END_DATE
    AND HAPFT.NAME IN('Test Position')
  ORDER BY HAPFT.NAME ) Positions
BIP – Query to extract fast formula details

The query below extracts all custom fast formulas along with the formula name and the formula type.

select fff.formula_name
      ,fft.formula_type_name
	  ,fff.formula_text
	  ,fff.compile_flag
	  ,fff.legislation_code
  from ff_formulas_f fff
      ,ff_formula_types fft
 where fff.formula_type_id = fft.formula_type_id 
   --and fft.formula_type_name = 'Extract Criteria'
   and fff.LAST_UPDATED_BY <> 'SEED_DATA_FROM_APPLICATION'
   AND TRUNC(SYSDATE) BETWEEN fff.effective_start_date AND fff.effective_end_date 
ORDER BY 2,1

To extract the list of fast formulas used in HCM Extracts, run the query below:

SELECT pedv.definition_name
      ,pedv.description
      ,pedv.legislation_code
      ,pedv.xml_tag_name
      ,pedv.ext_type_code
      ,fff.formula_name
	  ,fft.formula_type_name
  FROM pay_rep_criteria_f prcf
      ,pay_report_blocks_vl prbv
      ,per_ext_definitions_vl pedv
	  ,ff_formulas_f fff
	  ,ff_formula_types fft
 WHERE prbv.ext_definition_id = pedv.ext_definition_id
   AND prcf.report_block_id = prbv.report_block_id
   AND fff.formula_id = prcf.formula_id
   AND TRUNC(SYSDATE) BETWEEN fff.effective_start_date AND fff.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN prcf.effective_start_date AND prcf.effective_end_date
   AND fff.formula_type_id = fft.formula_type_id 
HDL – Delete custom Action Reason Usages using HDL

Actions and action reasons are a very important part of any Cloud HCM implementation. Oracle provides a large number of actions and action reasons out of the box, but, if needed, additional actions and action reasons can be created from the UI as well as using HDL.

Each action is tied to an action type. Please note that action types are seeded and can't be created. You can create a custom action and attach existing action reasons to it. The details are stored in the PER_ACTION_REASON_USAGES table.

During implementation, there is a common requirement to delete some unwanted action reason usages which were created initially and are no longer required. In such cases, finding each reason and deleting it from the action is quite a painful task.

This can be achieved easily using HCM Data Loader.

Run the query below in BIP and save the output file as Actions.dat. Zip the file and kick off the Import and Load HCM Data Loader process. You can modify the extract criteria as per your requirement:

SELECT data
FROM (
SELECT 'METADATA|ActionReasonUsage|ActionCode|ActionReasonCode|StartDate|EndDate|SourceSystemId|SourceSystemOwner' data, 1 DATA_SEQ
  FROM DUAL
UNION ALL
SELECT 'DELETE|ActionReasonUsage|'
||
paru.action_code
||'|'||
paru.action_reason_code
||'|'||
to_char(paru.start_date, 'yyyy/mm/dd')
||'|'||
to_char(paru.end_date, 'yyyy/mm/dd')
||'|'||
hikm.source_system_id
||'|'||
hikm.source_system_owner data, 2 DATA_SEQ
from 
hrc_integration_key_map hikm,
PER_ACTION_REASON_USAGES paru
where 1=1
and paru.ACTION_REASON_USAGE_ID = hikm.surrogate_id
and paru.created_by <> 'SEED_DATA_FROM_APPLICATION'
)
ORDER BY DATA_SEQ

Sample File:

METADATA|ActionReasonUsage|ActionCode|ActionReasonCode|EndDate|StartDate|SourceSystemId|SourceSystemOwner
DELETE|ActionReasonUsage|TEST_NEW|CMP_CHG||2002/01/01|HRC_SQLLOADER_TEST_NEW|HRC_SQLLOADER

Query to check action reason codes for an action:

SELECT paru.action_code
      ,paru.action_reason_code
  FROM per_action_reason_usages paru
 WHERE paru.action_code like '%CHANGE_SALARY%'
Value Set – Table Value set to get list of employees

While extending DFF attributes for additional functionality, one common requirement is to get the list of workers in the system.

For this purpose, a table-based value set can be defined and attached to the DFF attribute.

FROM Clause: per_all_people_f papf, per_person_names_f ppnf
Value Attributes Table Alias:
*Value Column Name: papf.person_number
Value Column Type: VARCHAR2
Value Column Length: 30
Description Column Name: ppnf.full_name
Description Column Type:
Description Column Length:
ID Column Name: papf.person_number
ID Column Type: VARCHAR2
ID Column Length: 30
Enabled Flag Column Name:
Start Date Column Name:
End Date Column Name:
WHERE Clause: trunc(SYSDATE) between papf.effective_start_date AND papf.effective_end_date
AND trunc(SYSDATE) between ppnf.effective_start_date AND ppnf.effective_end_date
and ppnf.name_type = 'GLOBAL'
and ppnf.person_id = papf.person_id
and papf.person_id IN (select distinct person_id from per_periods_of_service ppos where NVL(ppos.actual_termination_date, trunc(sysdate)) >= trunc(sysdate))
ORDER BY Clause:
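
For validation, the value set above corresponds to the following query, which you can run in a BIP data model before configuring the value set:

select papf.person_number value_column,
       ppnf.full_name     description_column
  from per_all_people_f papf,
       per_person_names_f ppnf
 where trunc(sysdate) between papf.effective_start_date and papf.effective_end_date
   and trunc(sysdate) between ppnf.effective_start_date and ppnf.effective_end_date
   and ppnf.name_type = 'GLOBAL'
   and ppnf.person_id = papf.person_id
   and papf.person_id in (select distinct person_id
                            from per_periods_of_service ppos
                           where nvl(ppos.actual_termination_date, trunc(sysdate)) >= trunc(sysdate))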

The above value set will show only active employees. If you want to include inactive employees as well, modify the WHERE clause as below:

trunc(SYSDATE) between papf.effective_start_date AND papf.effective_end_date
AND trunc(SYSDATE) between ppnf.effective_start_date AND ppnf.effective_end_date
and ppnf.name_type = 'GLOBAL'
and ppnf.person_id=papf.person_id
and papf.person_id IN (select distinct person_id from per_periods_of_Service ppos)
BIP – Query to extract Business Units
 select hauft.NAME
   from HR_ORG_UNIT_CLASSIFICATIONS_F houcf,
        HR_ALL_ORGANIZATION_UNITS_F haouf,
        HR_ORGANIZATION_UNITS_F_TL hauft
  where haouf.ORGANIZATION_ID = houcf.ORGANIZATION_ID 
    AND haouf.ORGANIZATION_ID = hauft.ORGANIZATION_ID 
    AND haouf.EFFECTIVE_START_DATE BETWEEN houcf.EFFECTIVE_START_DATE AND houcf.EFFECTIVE_END_DATE 
    AND hauft.LANGUAGE = 'US'
    AND hauft.EFFECTIVE_START_DATE = haouf.EFFECTIVE_START_DATE 
    AND hauft.EFFECTIVE_END_DATE = haouf.EFFECTIVE_END_DATE 
    AND houcf.CLASSIFICATION_CODE = 'FUN_BUSINESS_UNIT' 
    AND trunc(SYSDATE) BETWEEN hauft.effective_start_date AND hauft.effective_end_date
HDL – Sample File to Load Succession Plan

You can make use of HCM Data Loader to bulk-upload succession plans (Incumbent, Job and Position plan types).

Use the below sample file to load Succession Plans:

METADATA|SuccessionPlan|PlanName|PlanType|Status|AccessTypeCode|Description|IncumbentPersonNumber|JobCode|JobSetCode|DepartmentName|DepartmentSetCode|GradeSetCode|GradeCode|PositionCode|BusinessUnitName|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlan|Test_Person_Succession_Plan|INCUMBENT|ACTIVE|PUBLIC|Sample description for Incument Plan Type.|111222|||||||||HRC_SQLLOADER_Test_Person_Succession_Plan_INCUMBENT|HRC_SQLLOADER
MERGE|SuccessionPlan|Test_Job_Succession_Plan|JOB|ACTIVE|PUBLIC|Sample description for Job Plan Type.||XX052|COMMON|Test HR Department|COMMON|COMMON|GRD_90||XX BU|HRC_SQLLOADER_Test_Job_Succession_Plan_JOB|HRC_SQLLOADER
MERGE|SuccessionPlan|Test_Position_Succession_Plan|POSITION|ACTIVE|PUBLIC|Sample description for Position Plan Type.||||Test HR Department|COMMON|||POS_0301|COMMON|HRC_SQLLOADER_Test_Position_Succession_Plan_POSITION|HRC_SQLLOADER

METADATA|SuccessionPlanOwner|PlanName|PlanOwnerPersonNumber|OwnerTypeCode|SourceSystemId|SourceSystemOwner
MERGE|SuccessionPlanOwner|Test_Person_Succession_Plan|99997|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99997_Test_Person_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Job_Succession_Plan|99997|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99997_Test_Job_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Position_Succession_Plan|99997|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99997_Test_Position_Succession_Plan|HRC_SQLLOADER
MERGE|SuccessionPlanOwner|Test_Position_Succession_Plan|99998|ADMINISTRATOR|HRC_SQLLOADER_ADMINISTRATOR_99998_Test_Position_Succession_Plan|HRC_SQLLOADER

Once the data is loaded, you can run a query against the HRM_PLANS table to check the loaded plans.
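
As a minimal check (assuming the standard audit columns present on most HCM tables), the most recently loaded plans can be listed with:

select *
  from hrm_plans
 order by creation_date desc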

HCM Extract – Using Multiple values in Parameter

A frequent requirement is an HCM Extracts parameter that accepts multiple comma-separated values. For example, in a Worker extract, we need a parameter called PER_NUMBER which should accept multiple comma-separated values.

To split the input values, we can use the REGEXP_SUBSTR function.

SELECT *
  FROM (SELECT trim(regexp_substr(per_num_param,'[^, ]+',1,LEVEL)) person_number
          FROM (SELECT pay_report_utils.get_parameter_value('PER_NUMBER') per_num_param
                  FROM DUAL)
       CONNECT BY regexp_substr(per_num_param,'[^, ]+',1,LEVEL) IS NOT NULL)
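
The split values are typically consumed inside the extract criteria; a sketch of such usage (the person filter shown is illustrative):

select papf.person_id
  from per_all_people_f papf
 where trunc(sysdate) between papf.effective_start_date and papf.effective_end_date
   and papf.person_number in
       (select trim(regexp_substr(pay_report_utils.get_parameter_value('PER_NUMBER'),'[^, ]+',1,LEVEL))
          from dual
        connect by regexp_substr(pay_report_utils.get_parameter_value('PER_NUMBER'),'[^, ]+',1,LEVEL) is not null)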
BIP – Query to get Worker Career Statement
SELECT papf.person_number, ITEM_TEXT2000_1 STATEMENT, ITEM_CLOB_1
FROM HRT_PROFILE_ITEMS ProfileItemPEO,
HRT_PROFILES_VL ProfilePEO,
HRT_PROFILE_TYP_SECTIONS ProfileTypeSectionPEO,
PER_ALL_PEOPLE_F papf
WHERE ProfilePEO.PROFILE_ID= ProfileItemPEO.PROFILE_ID
AND ProfileTypeSectionPEO.SECTION_CONTEXT = 'PERSON_CAREER_STATEMENT'
AND ProfileTypeSectionPEO.BUSINESS_GROUP_ID=ProfilePEO.BUSINESS_GROUP_ID
AND ProfileItemPEO.CONTENT_TYPE_ID=ProfileTypeSectionPEO.CONTENT_TYPE_ID
AND papf.person_id = ProfilePEO.person_id
AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
AND ITEM_TEXT2000_1 IS NOT NULL
BIP – Query to extract File details loaded through File Import and Export

Use the query below to extract files loaded through Tools -> File Import and Export, along with the UCM Content ID and the account used:

SELECT DDOCTITLE 		"File Name"
      ,DWEBEXTENSION	        "File Extension"
      ,DDOCACCOUNT		"Account"
      ,DDOCAUTHOR		"Owner"
      ,DINDATE			"Upload Date"
      ,DDOCNAME 		"Content Id"
      ,DDOCTYPE			"Doc Type"	
  FROM revisions
 WHERE DWEBEXTENSION <> 'log' 
   AND DDOCTITLE NOT LIKE 'ESS%'
 ORDER BY DCREATEDATE DESC
BIP – Query to get Configure HCM Data Loader parameters

Configure HCM Data Loader is a task used to define the HCM Data Loader parameters. The parameters are broadly divided into the following categories: Availability, Diagnostic, File Definition, Performance and Scheduling Default. These are the default settings which are applied to any HCM Data Load.

You can override some of these parameters at the individual .dat file level using SET commands, and some parameters can also be overridden while submitting the Import and Load job.
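
For illustration, a SET command is placed at the top of the .dat file, before the METADATA line. For example, to purge future-dated changes during the load (only parameters that are enabled for override can be set this way):

SET PURGE_FUTURE_CHANGES Y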

Please note that the default values vary from HDL to HSDL (Spreadsheet Loader).

You can use the query below to get the details from the backend using a BIP report:

select PARAM_CATEGORY
      ,PARAM_NAME
      ,DEFAULT_VALUE
      ,HSDL_DEFAULT_VALUE
      ,HDL_ENABLED
      ,HDL_OVERRIDE_ENABLED
      ,HSDL_ENABLED
      ,HSDL_OVERRIDE_ENABLED
      ,VALUE_LOOKUP_TYPE
      ,CREATED_BY
      ,CREATION_DATE
      ,LAST_UPDATED_BY
      ,LAST_UPDATE_DATE
  from hrc_dl_all_parameters
 ORDER BY 1,2

Query to get list of overridden values:

select hdap.PARAM_CATEGORY
      ,hdap.PARAM_NAME
      ,hdap.DEFAULT_VALUE
      ,hdap.HSDL_DEFAULT_VALUE
      ,hdap.HDL_ENABLED
      ,hdap.HDL_OVERRIDE_ENABLED
      ,hdap.HSDL_ENABLED
      ,hdap.HSDL_OVERRIDE_ENABLED
      ,hdap.VALUE_LOOKUP_TYPE
      ,hdap.CREATED_BY
      ,hdap.CREATION_DATE
      ,hdap.LAST_UPDATED_BY
      ,hdap.LAST_UPDATE_DATE
      ,hdpo.OVERRIDE_LEVEL "OVERRIDDEN_AT_LEVEL"
      ,hdpo.OVERRIDE_VALUE "OVERRIDDEN_VALUE"
  from hrc_dl_all_parameters hdap
      ,hrc_dl_parameter_overrides hdpo
 where hdap.parameter_id = hdpo.parameter_id
 ORDER BY 1,2
BIP – Specialization Details

In Oracle Learning Cloud, a specialization can be created as a combination of multiple sections, with each section comprising multiple courses.

A learning admin can setup the specialization by navigating to My Client Groups -> Learning -> Learning Catalog -> Specializations:

Example:

Use the query below to get the above details:

WITH specialization as
(SELECT
      LearningItemDEO.LEARNING_ITEM_ID,              
      LearningItemDEO.EFFECTIVE_START_DATE SPESD,         
      LearningItemDEO.EFFECTIVE_END_DATE SPEED,           
      LearningItemDEO.LEARNING_ITEM_TYPE SPLIT,
      LearningItemDEO.LEARNING_ITEM_SUB_TYPE SPLIST,          
      LearningItemDEO.LEARNING_ITEM_NUMBER,         
      LearningItemDEO.STATUS SP_STATUS,                    
      LearningItemDEO.START_DATE SPSD,                   
      LearningItemDEO.END_DATE SPED,                               
      to_date(TO_CHAR( LearningItemDEO.CREATION_DATE,'MM/DD/YYYY'),'MM/DD/YYYY') AS SPCreationDate,                
      LearningItemTranslationDEO.NAME,
      LearningItemDEO.LI_START_DATE as SP_LiStartDate,
      LearningItemDEO.LI_END_DATE as SP_LiEndDate
 FROM WLF_LEARNING_ITEMS_F LearningItemDEO,
      WLF_LEARNING_ITEMS_F_TL LearningItemTranslationDEO
WHERE LearningItemDEO.LEARNING_ITEM_TYPE IN ('ORA_SPECIALIZATION')
  AND LearningItemDEO.LEARNING_ITEM_ID          = LearningItemTranslationDEO.LEARNING_ITEM_ID
  AND TRUNC(SYSDATE) BETWEEN LearningItemTranslationDEO.EFFECTIVE_START_DATE AND LearningItemTranslationDEO.EFFECTIVE_END_DATE
  AND TRUNC(SYSDATE) BETWEEN LearningItemDEO.EFFECTIVE_START_DATE AND LearningItemDEO.EFFECTIVE_END_DATE
  AND LearningItemTranslationDEO.LANGUAGE = USERENV('lang')
  --AND LearningItemTranslationDEO.NAME LIKE ('Working From Home Guidelines and Tips')
 )
, Section as
(SELECT 
        LearningItemDEO.LEARNING_ITEM_ID,             
        LearningItemDEO.EFFECTIVE_START_DATE,         
        LearningItemDEO.EFFECTIVE_END_DATE,           
        LearningItemDEO.LEARNING_ITEM_TYPE,
        LearningItemDEO.LEARNING_ITEM_SUB_TYPE,          
        LearningItemDEO.LEARNING_ITEM_NUMBER,         
        LearningItemDEO.STATUS,                    
        LearningItemDEO.START_DATE,                   
        LearningItemDEO.END_DATE,                                
        to_date(TO_CHAR( LearningItemDEO.CREATION_DATE,'MM/DD/YYYY'),'MM/DD/YYYY') AS LearningItemDEOCreationDate,                
        LearningItemTranslationDEO.NAME,
        LearningItemTranslationDEO.DESCRIPTION,
        LearningItemDEO.LI_START_DATE as LearningItemDEOLiStartDate,
        LearningItemDEO.LI_END_DATE as LearningItemDEOLiEndDate
   FROM WLF_LEARNING_ITEMS_F LearningItemDEO,
        WLF_LEARNING_ITEMS_F_TL LearningItemTranslationDEO
  WHERE LearningItemDEO.LEARNING_ITEM_TYPE IN ('ORA_SPECL_SECTION')
    AND LearningItemDEO.LEARNING_ITEM_ID          = LearningItemTranslationDEO.LEARNING_ITEM_ID
    AND TRUNC(SYSDATE) BETWEEN LearningItemTranslationDEO.EFFECTIVE_START_DATE AND LearningItemTranslationDEO.EFFECTIVE_END_DATE
    AND TRUNC(SYSDATE) BETWEEN LearningItemDEO.EFFECTIVE_START_DATE AND LearningItemDEO.EFFECTIVE_END_DATE
    AND LearningItemTranslationDEO.LANGUAGE = USERENV('lang')
 )
, course as 
(
 SELECT 
      LearningItemDEO.LEARNING_ITEM_ID,             
      LearningItemDEO.EFFECTIVE_START_DATE,         
      LearningItemDEO.EFFECTIVE_END_DATE,           
      LearningItemDEO.LEARNING_ITEM_TYPE,
      LearningItemDEO.LEARNING_ITEM_SUB_TYPE,          
      LearningItemDEO.LEARNING_ITEM_NUMBER,         
      LearningItemDEO.STATUS,                    
       to_date(TO_CHAR( LearningItemDEO.CREATION_DATE,'MM/DD/YYYY'),'MM/DD/YYYY') AS LearningItemDEOCreationDate,                
       LearningItemTranslationDEO.NAME,
       LearningItemTranslationDEO.DESCRIPTION,
       LearningItemDEO.LI_START_DATE as LearningItemDEOLiStartDate,
       LearningItemDEO.LI_END_DATE as LearningItemDEOLiEndDate
       FROM WLF_LEARNING_ITEMS_F LearningItemDEO,
       WLF_LEARNING_ITEMS_F_TL LearningItemTranslationDEO

       WHERE  LearningItemDEO.LEARNING_ITEM_TYPE IN ('ORA_COURSE')
       AND LearningItemDEO.LEARNING_ITEM_ID          = LearningItemTranslationDEO.LEARNING_ITEM_ID
       AND TRUNC(SYSDATE) BETWEEN LearningItemTranslationDEO.EFFECTIVE_START_DATE AND LearningItemTranslationDEO.EFFECTIVE_END_DATE
       AND TRUNC(SYSDATE) BETWEEN LearningItemDEO.EFFECTIVE_START_DATE AND LearningItemDEO.EFFECTIVE_END_DATE
       AND LearningItemTranslationDEO.LANGUAGE = USERENV('lang')
)
, offering as
(
select WLFI.LEARNING_ITEM_NUMBER
      ,TO_CHAR(WLFI.EFFECTIVE_END_DATE,'YYYY/MM/DD')
      ,TO_CHAR(WLFI.EFFECTIVE_START_DATE,'YYYY/MM/DD')
      ,WLFT.DESCRIPTION
      ,WLFT.DESCRIPTION_LONG
      ,WLCF.ENABLE_CAPACITY
      ,WLCF.ENABLE_WAITLIST
      ,WLFI.LANGUAGE_CODE
	  ,WLFI.STATUS
      ,WLCLF.MAXIMUM_ATTENDEES
      ,WLCLF.MINIMUM_ATTENDEES
      ,TO_CHAR(WLCLF.ENROLMENT_END_DATE,'YYYY/MM/DD')
      ,TO_CHAR(WLCLF.ENROLMENT_START_DATE,'YYYY/MM/DD')
      ,WLCLF.DELIVERY_MODE
      ,TO_CHAR(WLFI.END_DATE,'YYYY/MM/DD')
      ,TO_CHAR(WLFI.START_DATE,'YYYY/MM/DD')
      ,WLFT.NAME
	  ,WLCF.LEARNING_ITEM_ID WLCF_LEARNING_ITEM_ID
 FROM WLF_LEARNING_ITEMS_F WLFI,
      WLF_LI_COURSES_F WLCF, --LEARNING_ITEM_ID course
      WLF_LI_CLASSES_F WLCLF,
      WLF_LEARNING_ITEMS_F_TL WLFT
WHERE WLFI.LEARNING_ITEM_ID = WLFT.LEARNING_ITEM_ID 
  AND WLFI.LEARNING_ITEM_ID = WLCLF.LEARNING_ITEM_ID 
  AND WLCLF.COURSE_LEARNING_ITEM_ID = WLCF.LEARNING_ITEM_ID
  AND WLFT.LANGUAGE = USERENV('lang')
  AND TRUNC(SYSDATE) BETWEEN WLFI.EFFECTIVE_START_DATE AND WLFI.EFFECTIVE_END_DATE 
  AND TRUNC(SYSDATE) BETWEEN WLCF.EFFECTIVE_START_DATE AND WLCF.EFFECTIVE_END_DATE
  AND TRUNC(SYSDATE) BETWEEN WLCLF.EFFECTIVE_START_DATE AND WLCLF.EFFECTIVE_END_DATE
  AND TRUNC(SYSDATE) BETWEEN WLFT.EFFECTIVE_START_DATE AND WLFT.EFFECTIVE_END_DATE
  AND WLFI.LEARNING_ITEM_TYPE = 'ORA_CLASS'
)  
SELECT specialization.name 						"Specialization Name"
      ,specialization.LEARNING_ITEM_NUMBER		"Specialization Number"
	  ,Section.name								"Section Name"
	  ,Section.LEARNING_ITEM_NUMBER				"Section Number"
	  ,WLF_LI_HIERARCHIES_F_SS.position			"Section Position"
	  ,course.name								"Course Name"
	  ,course.learning_item_number				"Course Number"
	  ,WLF_LI_HIERARCHIES_F_SC.HIERARCHY_NUMBER	"Activity Number"
	  ,WLF_LI_HIERARCHIES_F_SC.position			"Course Position"
      ,offering.name							"Offering Name"
      ,offering.LEARNING_ITEM_NUMBER			"Offering Number"
      ,offering.DELIVERY_MODE					"Offering Delivery Mode"
      ,offering.STATUS							"Offering Status"
  FROM WLF_LI_HIERARCHIES_F WLF_LI_HIERARCHIES_F_SS, 
	   WLF_LI_HIERARCHIES_F WLF_LI_HIERARCHIES_F_SC, 
       specialization, 
	   section,
	   course,
	   offering
 WHERE specialization.name =  'Essential Worker'
   AND WLF_LI_HIERARCHIES_F_SS.LEARNING_ITEM_ID = specialization.LEARNING_ITEM_ID
   AND WLF_LI_HIERARCHIES_F_SS.CHILD_LEARNING_ITEM_ID = section.LEARNING_ITEM_ID
   AND TRUNC(SYSDATE) BETWEEN WLF_LI_HIERARCHIES_F_SS.EFFECTIVE_START_DATE AND WLF_LI_HIERARCHIES_F_SS.EFFECTIVE_END_DATE
   AND WLF_LI_HIERARCHIES_F_SC.LEARNING_ITEM_ID = section.LEARNING_ITEM_ID
   AND WLF_LI_HIERARCHIES_F_SC.CHILD_LEARNING_ITEM_ID = course.LEARNING_ITEM_ID
   AND TRUNC(SYSDATE) BETWEEN WLF_LI_HIERARCHIES_F_SC.EFFECTIVE_START_DATE AND WLF_LI_HIERARCHIES_F_SC.EFFECTIVE_END_DATE
   AND course.LEARNING_ITEM_ID = offering.WLCF_LEARNING_ITEM_ID
ORDER BY specialization.name,WLF_LI_HIERARCHIES_F_SS.position,WLF_LI_HIERARCHIES_F_SC.position
HDL – Updating Work Day Information at Department Level

Organization.dat can be used to upload work day information at department level.

METADATA|OrgInformation|OrgInformationContext|SequenceNumber|FLEX:PER_ORGANIZATION_INFORMATION_EFF|EFF_CATEGORY_CODE|EffectiveStartDate|EffectiveEndDate|OrganizationName|ClassificationName|_STD_WORKING_HOURS(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_STD_WORKING_HOURS_FREQUENCY(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_STD_WORKING_HOURS_FREQUENCY_Display(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_WORK_END_TIME(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|_WORK_START_TIME(PER_ORGANIZATION_INFORMATION_EFF=PER_WORK_DAY_INFO)|OrganizationId
MERGE|OrgInformation|PER_WORK_DAY_INFO|1|PER_WORK_DAY_INFO|DEPARTMENT|1951/01/01||XYZ Markets|Department|40|W||18:00|09:00|5000000026311

Please verify the updated information from the UI.

Validated Version – 21A

HDL – Inactivating Worker Unions

HCM Data Loader can be used to bulk upload worker unions. In Cloud HCM, Worker Unions are just another type of Organization.

You can run the query at the link below to find the different organization types and their codes:

https://fusionhcmconsulting.com/2021/01/reports-bip-organization-classifications-in-fusion/

Below is the sample file to inactivate a worker union:

METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate|Name|ClassificationCode|ClassificationName|InternalAddressLine|ActionReasonCode|LocationSetId|LocationCode|LocationSetCode|EstablishmentId|EstablishmentName|GUID|SourceSystemOwner|SourceSystemId
MERGE|Organization||1951/01/01|2021/05/12|Test WU2|ORA_PER_UNION||||||||||FUSION|300000217254520
MERGE|Organization||2021/05/13|4712/12/31|Test WU2|ORA_PER_UNION||||||||||FUSION|300000217254520

METADATA|OrgUnitClassification|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|OrganizationId(SourceSystemId)|OrganizationName|ClassificationCode|ClassificationName|CategoryCode|SetCode|Status|LegislationCode|GUID|SourceSystemOwner|SourceSystemId
MERGE|OrgUnitClassification||1951/01/01|2021/05/12|300000217254520|Test WU2|ORA_PER_UNION||||A|IN||FUSION|300000217254521
MERGE|OrgUnitClassification||2021/05/13|4712/12/31|300000217254520|Test WU2|ORA_PER_UNION||||I|IN||FUSION|300000217254521

The file above makes use of source keys. You can also use surrogate keys.

The sample below uses surrogate keys:

METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate
MERGE|Organization|300000217254520|1951/01/01|2021/05/12
MERGE|Organization|300000217254520|2021/05/13|4712/12/31

METADATA|OrgUnitClassification|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|OrganizationId|Status
MERGE|OrgUnitClassification|300000217254521|1951/01/01|2021/05/12|300000217254520|A
MERGE|OrgUnitClassification|300000217254521|2021/05/13|4712/12/31|300000217254520|I

Sample query to extract the data for the Organization METADATA in HDL format:

SELECT 'METADATA|Organization|OrganizationId|EffectiveStartDate|EffectiveEndDate|OrganizationName' DATAROW, 1 sequence
FROM DUAL
UNION ALL
SELECT 'MERGE'
       ||CHR (124)
	   ||'Organization'
	   ||CHR (124)
	   ||haou.organization_id
	   ||CHR (124)
	   ||to_char(haou.effective_start_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||to_char(haou.effective_end_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||haout.name DATAROW, 2 sequence
FROM hr_all_organization_units haou,hr_all_organization_units_tl haout
where haou.organization_id = haout.organization_id
and haout.language ='US'
--AND haou.organization_id = 300000047274447
UNION ALL
SELECT 'METADATA|OrgUnitClassification|OrgUnitClassificationId|EffectiveStartDate|EffectiveEndDate|OrganizationId|Status' DATAROW, 3 sequence
FROM DUAL
UNION ALL
SELECT 'MERGE'
       ||CHR (124)
	   ||'OrgUnitClassification'
	   ||CHR (124)
	   ||houcf.org_unit_classification_id
	   ||CHR (124)
	   ||to_char(houcf.effective_start_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||to_char(houcf.effective_end_date,'RRRR/MM/DD')
	   ||CHR (124)
	   ||haou.organization_id
	   ||CHR (124)
	   ||houcf.status DATAROW, 4 sequence
FROM hr_all_organization_units haou,hr_org_unit_classifications_f houcf
where haou.organization_id = houcf.organization_id
--AND haou.organization_id = 300000047274447
ORDER BY sequence
BIP – Query to get element entry details
Select DISTINCT peevf.element_entry_value_id
      ,peef.element_entry_id
      ,petf.base_element_name
      ,peevf.effective_start_date
      ,paam.assignment_number
      ,pivf.base_name
  from per_all_assignments_m paam
      ,pay_element_types_f petf
      ,pay_element_entries_f peef
      ,pay_element_entry_values_f peevf
      ,pay_input_values_f pivf
 where 1=1
   and paam.person_id = peef.person_id
   and peef.element_type_id = petf.element_type_id
   and pivf.element_type_id = petf.element_type_id
   and peef.element_entry_id = peevf.element_entry_id
   and paam.ASSIGNMENT_TYPE in ('E')
   and paam.primary_assignment_flag = 'Y'
   and petf.base_element_name = 'Dental Plan'
   and pivf.base_name = 'Amount'
   and paam.assignment_number = 'E1111'
   and trunc(sysdate) between petf.effective_start_date and petf.effective_end_date
   and trunc(sysdate) between paam.effective_start_date and paam.effective_end_date
   and trunc(sysdate) between pivf.effective_start_date and pivf.effective_end_date

BIP – Query to get position profile details
select hapf.position_code
      ,hpb.profile_code
      ,hpt.description profile_desc
      ,hpt.summary
      ,hpeiv.DESCRIPTION
      ,hpeiv.RESPONSIBILITIES
      ,hpeiv.QUALIFICATIONS
  from HRT_PROFILE_ITEMS hpi
      ,HRT_PROFILES_B hpb
      ,HRT_PROFILES_TL hpt
      ,HRT_PROFILE_RELATIONS hpr
      ,HRT_PROFILE_EXTRA_INFO_VL hpeiv
      ,HR_ALL_POSITIONS_F hapf
 where hpi.profile_id = hpb.profile_id
   and hpb.profile_usage_code = 'M'
   and hpi.profile_id = hpr.profile_id
   and hpi.profile_id = hpt.profile_id
   and hpi.profile_id = hpeiv.profile_id
   and hapf.position_id = hpr.object_id
   and trunc(sysdate) between hapf.effective_start_date and hapf.effective_end_date
   and hapf.position_code = '1099'
   and hpt.language = 'US'

Query to extract only the profile data:

select hpb.profile_code
      ,hpt.description
      ,hpt.summary
      ,hpeiv.DESCRIPTION desc1
      ,hpeiv.RESPONSIBILITIES
      ,hpeiv.QUALIFICATIONS
      ,hikm.source_system_id
      ,hikm.source_system_owner
  from HRT_PROFILES_B hpb
      ,HRT_PROFILES_TL hpt
      ,HRT_PROFILE_EXTRA_INFO_VL hpeiv
      ,HRC_INTEGRATION_KEY_MAP hikm
 where hpeiv.profile_id = hpb.profile_id
   and hpb.profile_usage_code = 'M'
   and hpt.language = 'US'
   and hpb.profile_code like '%TEST%'
   and hikm.surrogate_id = hpeiv.PROFILE_EXTRA_INFO_ID
 order by hpb.creation_date desc
BIP – Query to get Community Member details in OLC

Use the query below to get the details of community members in Oracle Learning Cloud:

SELECT asg.status
      ,asg.EVENT_TYPE
      ,asg.LEARNER_ID
      ,asg.EVENT_SUB_TYPE
      ,itm_tl.name learning_community_name
      ,(select full_name
          from per_person_names_f ppnf
         where name_type = 'GLOBAL'
           and person_id = asg.LEARNER_ID
           and trunc(sysdate) between effective_start_date and effective_end_date) member_name
      ,(select person_number
          from per_all_people_f papf
         where person_id = asg.LEARNER_ID
           and trunc(sysdate) between effective_start_date and effective_end_date) member_number
  FROM wlf_assignment_records_f asg
      ,wlf_learning_items_f itm
      ,wlf_learning_items_f_tl itm_tl
      ,WLF_LI_COMMUNITIES_F wlcf
 WHERE TRUNC(SYSDATE) BETWEEN asg.effective_start_date AND asg.effective_end_date
   and asg.status <> 'ORA_ASSN_REC_WITHDRAWN'
   and itm.learning_item_id = asg.learning_item_id
   and itm.learning_item_type in ('ORA_COMMUNITY')
   and itm.learning_item_id = wlcf.learning_item_id
   and itm.learning_item_id = itm_tl.learning_item_id
   and itm_tl.language = 'US'
   and TRUNC(SYSDATE) BETWEEN itm.effective_start_date AND itm.effective_end_date
   and TRUNC(SYSDATE) BETWEEN itm_tl.effective_start_date AND itm_tl.effective_end_date
   and itm_tl.name = 'Essential Workers'
 ORDER by 5