Shell Script to Check the Status of an Informatica Workflow

We have two Informatica jobs that run in parallel.

One starts at 11:40 CET and contains around 300 Informatica workflows, one of which is fact_sales.

The other runs at 3:40 CET and contains around 115 workflows, many of which depend on fact_sales for data consistency.

The problem is that fact_sales should finish before certain workflows in process 2 start for the data to be accurate, but this generally doesn't happen.

What we are trying to do is split process 2 in such a way that the fact_sales-dependent workflows run only after fact_sales has finished.

Can you suggest a way to write a Unix shell script that checks the status of fact_sales, kicks off the dependent workflows if it succeeded, and sends a failure mail if it did not? Thanks.

I don't see the need to write a custom shell script for this. Most of this is standard functionality that can be implemented using Command tasks and Event Waits.

**Process 1 - runs at 11:50**
fact_sales workflow. **Add a Command task at the end that drops a flag file, say, fact_sales_0430.done.**

All the dependent workflows will then have an Event Wait that waits on this .done file. Since there are multiple dependent workflows, make sure none of them deletes the file right away. You can delete the .done file at the end of the day, or when the load starts the next day.

dependent_workflow1 -- Event Wait on fact_sales_0430.done (do not delete the file).
dependent_workflow2 -- Event Wait on fact_sales_0430.done (do not delete the file).
dependent_workflow3 -- Event Wait on fact_sales_0430.done (do not delete the file).
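The Command task body can be as simple as a `touch`. A minimal sketch, assuming a shared flag directory (the `/tmp` default here is a placeholder; point it at a path every dependent workflow's node can see):

```shell
#!/bin/sh
# Hypothetical flag location; set FLAG_DIR to a directory visible to the
# Integration Service node(s) running the dependent workflows.
FLAG_DIR=${FLAG_DIR:-/tmp}
FLAG_FILE="${FLAG_DIR}/fact_sales_0430.done"

# Command task at the end of the fact_sales workflow: drop the flag.
touch "${FLAG_FILE}"

# Housekeeping (end of day, or before the next day's load) so the
# Event Waits do not fire on a stale flag:
# rm -f "${FLAG_FILE}"
```

The Event Wait in each dependent workflow then simply watches for `${FLAG_DIR}/fact_sales_0430.done`.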

Alternatively, you can add a Command task at the end of the last workflow in process 1. Once all the sessions in that workflow complete, the Command task runs a script that starts the process 2 workflows. You can search for pmcmd in the Informatica help menu for the command to kick off a workflow.

A second approach can be as follows -

You must be running some kind of scheduler to launch these workflows, since Informatica can't schedule multiple workflows as a set; it only handles that level of dependency management for worklets/sessions within a workflow.

From the scheduler, create a dependency between the sales fact load workflow and the other dependent workflows.

Get Workflow Status Using the Command Line: I can use a command to start WF1, but I cannot get the workflow status. I have to write a single shell script such that 200+ Informatica workflows execute one after another (I don't mind entering the workflow names into the script manually), and if one fails the script should halt, displaying which workflow failed. Since this is a production environment, I will not be able to share the details here.

I think the script below will work for you. Please update the parameters.

    WAIT_LOOP=1
    while [ ${WAIT_LOOP} -eq 1 ]
    do
        WF_STATUS=`pmcmd getworkflowdetails -sv $INFA_INTEGRATION_SERVICE -d $INFA_DOMAIN -uv INFA_USER_NAME -pv INFA_PASSWORD -usd Client -f $FOLDER_NAME fact_sales | grep "Workflow run status:" | cut -d'[' -f2 | cut -d']' -f1`
        echo ${WF_STATUS} | tee -a $LOG_FILE_NAME
        case "${WF_STATUS}" in
            "Running"|"Scheduled") sleep $WAIT_SECONDS ;;
            *) WAIT_LOOP=0 ;;
        esac
    done
    if [ "${WF_STATUS}" = "Succeeded" ]
    then
        pmcmd startworkflow -sv $INFA_INTEGRATION_SERVICE -d $INFA_DOMAIN -uv INFA_USER_NAME -pv INFA_PASSWORD -usd Client -f $FOLDER_NAME -paramfile $PARAMETER_FILE dependent_one | tee -a $LOG_FILE_NAME
    else
        (echo "fact_sales did not succeed; please find the attached run logs." ; uuencode $LOG_FILE_NAME $LOG_FILE_NAME) | mailx -s "fact_sales failure - execution logs" $EMAIL_LIST
        exit 1
    fi
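For the related question above about running 200+ workflows one after another and halting on the first failure, note that `pmcmd startworkflow -wait` blocks until the workflow finishes and returns a non-zero exit code on failure, which keeps the loop simple. A sketch under that assumption; `run_all_workflows` and the `PMCMD` override are my own names (the override only exists so the function can be exercised without a live Integration Service):

```shell
#!/bin/sh
# Run the listed workflows sequentially; halt and report on the first failure.
# Connection parameters are expected in the environment, as in the script above.
run_all_workflows() {
    for WF in "$@"
    do
        echo "Starting workflow ${WF}"
        ${PMCMD:-pmcmd} startworkflow -sv "$INFA_INTEGRATION_SERVICE" \
            -d "$INFA_DOMAIN" -uv INFA_USER_NAME -pv INFA_PASSWORD \
            -usd Client -f "$FOLDER_NAME" -wait "$WF"
        if [ $? -ne 0 ]
        then
            echo "Workflow ${WF} failed; halting." >&2
            return 1
        fi
    done
    return 0
}
```

Usage would be one call with the manually maintained list, e.g. `run_all_workflows wf_stage_orders wf_load_orders fact_sales`.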


You can fire a query against the repository database, using views such as REP_SESS_LOG, to check whether fact_sales has succeeded. Only then do you proceed with the second job.
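A sketch of that approach, polling the MX view REP_WFLOW_RUN for the latest run (RUN_STATUS_CODE 1 should mean Succeeded, but verify the view and code values against your repository version; `check_fact_sales`, the `SQL_CLIENT` override, and the REP_* connection placeholders are my own names):

```shell
#!/bin/sh
# Query the repository for the latest fact_sales run status.
# SQL_CLIENT defaults to sqlplus; REP_USER/REP_PASS/REP_DB are placeholders.
check_fact_sales() {
    STATUS=$(${SQL_CLIENT:-sqlplus -s $REP_USER/$REP_PASS@$REP_DB} <<EOF
set heading off feedback off
SELECT run_status_code
  FROM rep_wflow_run
 WHERE workflow_name = 'fact_sales'
   AND workflow_run_id = (SELECT MAX(workflow_run_id)
                            FROM rep_wflow_run
                           WHERE workflow_name = 'fact_sales');
EOF
)
    # In the MX views, RUN_STATUS_CODE 1 = Succeeded.
    [ "$(echo ${STATUS} | tr -d ' ')" = "1" ]
}
```

The caller would loop on `check_fact_sales` with a sleep, exactly as the pmcmd-based script does.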


  • Are you saying that Informatica doesn't support job/script/flow dependencies as a basic function? Good luck.
  • It does, and I never said it doesn't, but the system is such that we can't use that functionality.
  • I was all guns for this approach, but the dependencies are huge and we are dealing with about 200 workflows. The problem is that we have to split up process 2 according to fact_sales. Using a Command task and an Event Wait on every dependent workflow doesn't seem practical, though it is the safest.
  • Understood. In that case, you should probably build a custom scheduler based on Unix cron. You can have tables such as WKF_STATUS, WKF_LOGS, and WKF_DEPENDENCIES. Your script runs as a background process, checks for any parent processes that are complete (STATUS="COMPLETE"), and submits all the dependent processes that have not yet run (STATUS="NOTRUN"). Takes time, but doable. Take a look at… and see if it helps. Your script would be on similar lines, except for the daemon process.
  • Consider that the Sales Fact table load needs to finish before 20 workflows (in the second set) can start (just an example number; yours may differ). Like Rajesh said, create a file on your file system after the first load finishes. In the second set, the dependent workflows should have this as an extra dependency: wait for the file to be there, but don't delete it. Sounds fairly simple to me.
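The table-driven scheduler idea in the comment above can be prototyped with flat files before moving the state into real WKF_STATUS/WKF_DEPENDENCIES tables. A minimal sketch, assuming one "child parent" pair per line in the dependencies file and one "workflow STATUS" line per workflow in the status file (the file layout and the `ready_workflows` helper are my own):

```shell
#!/bin/sh
# Print every NOTRUN workflow whose parent is COMPLETE; the cron job would
# then submit each printed workflow via pmcmd and update its status line.
STATUS_FILE=${STATUS_FILE:-wkf_status.txt}
DEPS_FILE=${DEPS_FILE:-wkf_dependencies.txt}

ready_workflows() {
    while read CHILD PARENT
    do
        grep -q "^${CHILD} NOTRUN" "${STATUS_FILE}" || continue
        grep -q "^${PARENT} COMPLETE" "${STATUS_FILE}" || continue
        echo "${CHILD}"
    done < "${DEPS_FILE}"
}
```

A cron entry running this every few minutes, plus a status update after each submission, gives the daemon behavior described above without a long-running process.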