Logs of a command task in Informatica

How can I see the logs of a Command task? Unlike sessions, Informatica doesn't show logs for Command tasks. Any help would be appreciated.

In order to see a log of a Command task, you have to create it this way, redirecting the output to a file:
your command >path on your server
For example:
C:\Script\backup.bat >C:\Logs\MyLog.txt
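If the command can fail, consider capturing stderr as well, since error messages usually go there rather than to standard output. A minimal sketch using the same hypothetical paths:
C:\Script\backup.bat >C:\Logs\MyLog.txt 2>&1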

Related

How to add a shell script after AWS EMR has started

Currently I am using a transient cluster; whenever my shell script encounters a failure in "add_step", the cluster shuts down. I have started an EMR cluster to debug this, but I don't know where to add and test my script after it has launched.
I clicked on Steps, selected "Custom JAR", and gave the S3 path of my shell script, but it fails. How can I execute the script while the EMR cluster is running?
Thanks,
Xi
Here are the detailed steps on how to add a step that runs a script:
https://emr-etl.workshop.aws/spark_etl/steps.html
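For reference, a shell script is usually submitted as a step through script-runner.jar rather than as a plain Custom JAR. A sketch using the AWS CLI, with placeholder cluster ID and bucket names:
aws emr add-steps \
  --cluster-id j-XXXXXXXXXXXXX \
  --steps Type=CUSTOM_JAR,Name="Run shell script",ActionOnFailure=CONTINUE,Jar="s3://us-east-1.elasticmapreduce/libs/script-runner/script-runner.jar",Args=["s3://my-bucket/myscript.sh"]
script-runner.jar downloads the script from S3 and executes it on the master node while the cluster is running.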

Scheduling Autorep command to run daily

I am a newbie to AutoSys. I am trying to get job execution information into a CSV file on a daily basis. For this, I am trying to write an AutoSys job that I can schedule to run daily. Below is a snippet of the code:
insert_job: job_run_time job_type: CMD
box_name: box_job_run_time
command: autorep -J box_job1 -r -1
But this is giving the below error:
'autorep' is not recognized as an internal or external command,
operable program or batch file.
Please help with a solution.
Shanky, before you can run the autorep command directly on the terminal, the AutoSys client must first be configured to read the AutoSys DB that stores the run details.
Mostly it is a matter of a few environment variables and instance names.
Please check with the scheduling team on how to configure the AutoSys client on the server.
Thanks Piyush and Mansi for your replies!
Mansi, where can this command autorep -J box_job1 -r -1 >> Output.csv be configured so that it can be scheduled to run daily?
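One way to set this up (a sketch, assuming the AutoSys client is installed; the profile path and output location are placeholders your scheduling team would confirm) is to wrap the command in a shell script that sources the AutoSys environment before calling autorep:
#!/bin/sh
# Load the AutoSys client environment (site-specific path -- an assumption)
. /opt/CA/WorkloadAutomationAE/autouser/autosys.sh
# Append yesterday's run details for the box to the CSV
autorep -J box_job1 -r -1 >> /data/reports/Output.csv
The CMD job's command: attribute would then point at this script, and the job's start_times attribute would control the daily schedule.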

How to trigger Workflow B to start right after Workflow A completes?

I am new to Informatica Power Center.
My task is to trigger/start Workflow B right after Workflow A completes, using the infacmd command.
The suggestion is to add, after all the sessions in Workflow A, a Command task with "infacmd.sh startworkflow" to start Workflow B with all the options.
I've tried some guides, but they were for versions that are too old. I'm using Informatica 10.1.1.
Thank you.
For the Command task, you can use the following command:
pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFRAREPO_USERID -p $INFRAREPO_PASSWD -f $INFA_FOLDER -wait $INFA_WORKFLOW
Replace the variables according to your domain, folder, workflow name, etc.
Otherwise, you can create a shell script that calls the workflow using the above command, and invoke that script from your last session's 'Post-Session Success Command'.
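A minimal sketch of such a script (connection values are placeholders to replace for your environment):
#!/bin/sh
# Connection details -- placeholders, not real values
INFA_SERVICE=IS_DEV
INFA_DOMAIN=Domain_DEV
INFRAREPO_USERID=infa_user
INFRAREPO_PASSWD=secret
INFA_FOLDER=MY_FOLDER
INFA_WORKFLOW=wf_WorkflowB

# Start Workflow B and wait for completion; pmcmd returns
# a non-zero exit code if the workflow fails
pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
  -u "$INFRAREPO_USERID" -p "$INFRAREPO_PASSWD" \
  -f "$INFA_FOLDER" -wait "$INFA_WORKFLOW"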
Consider creating a Command task that touches a file, and having Workflow B start together with Workflow A, with an Event-Wait task in Workflow B that waits for the file and deletes it as the last step.
This way you don't need to invoke pmcmd with a hardcoded username and password.
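A sketch of that flag-file pattern (the shared path is an assumption): the last Command task in Workflow A runs
touch /shared/flags/wf_A_done.flag
while Workflow B's Event-Wait task watches /shared/flags/wf_A_done.flag, and a following Command task cleans it up with
rm -f /shared/flags/wf_A_done.flag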

Sample script to use SQL commands in Informatica Cloud pre/post processing

I need a sample script to run SQL commands through pre/post-processing commands. I tried using sqlcmd but I'm getting this:
CMN_1949 Error: [Pre/Post Session Command] Process id 5540. The shell command failed with exit code 1.
Use the Pre SQL and Post SQL properties of your Target, defined in the session's Mapping tab. There you can use pure SQL statements directly; there's no need for any shell scripting at all.
Try running your script in bash to see what the real error is.
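For instance, running the same command interactively (server, database, credentials, and file names here are hypothetical) often reveals the underlying failure:
sqlcmd -S myserver -d mydb -U etl_user -P 'secret' -i /scripts/post_load.sql -b
echo $?
The -b flag makes sqlcmd exit with a non-zero code when the SQL itself fails, and running it by hand shows the error text that the generic "exit code 1" in the Informatica log hides.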

AWS Elastic Beanstalk - Starting SWF Background Workers

I have been trying to find out the best way to run background jobs using PHP on AWS Elastic beanstalk, and after many hours searching on Google and SO, I believe that one good solution is using SWF and activity workers.
I found this example buried in the aws-sdk-for-php: https://github.com/amazonwebservices/aws-sdk-for-php/tree/master/_samples/AmazonSimpleWorkflow/cron
The read-me file says:
To run this sample, you need to execute three scripts from the command line in separate terminal/console windows
and
Note that the start_cron_example_workflow.php script will exit quickly
while the decider and activity worker scripts keep running until you
manually terminate them.
The decider and activity worker will loop "forever", and running these in EB is what I'm having trouble with.
In my .ebextensions directory I have a file that executes these files:
container_commands:
  01background_task:
    command: "php -f start_cron_example_activity_workers.php"
  02background_task:
    command: "php -f start_cron_example_workflow_workers.php"
But I get the following error messages:
ERROR
Failed to deploy application version.
ERROR
Some instances have not responded to commands. Responses were not received from [i-a5417ed4].
Any way I can do this using config files? How can I make this work in AWS EB without introducing a single point of failure?
Thank you.
You might consider using a service like IronWorker — this is specifically designed for what you are trying to do and will probably work better than putting together your own solution on a micro instance.
I have not used Iron.io yet, but was evaluating it as I am looking to move my stuff over to AWS so I need to have cron jobs handled as well.
Have you taken a look at the Fat Controller? It can daemonise anything. There's documentation and examples on the website: http://fat-controller.sourceforge.net
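If you would rather stay within .ebextensions, note that container_commands must finish before a deployment can complete, which is likely why the never-returning worker loops produced the "instances have not responded" error. One possible workaround (a sketch, not from this thread; log paths are assumptions) is to background the workers so each command returns immediately:
container_commands:
  01background_task:
    command: "nohup php -f start_cron_example_activity_workers.php > /var/log/activity_worker.log 2>&1 &"
  02background_task:
    command: "nohup php -f start_cron_example_workflow_workers.php > /var/log/workflow_worker.log 2>&1 &"
Keep in mind that backgrounded processes are not supervised or restarted, which is exactly the single-point-of-failure concern raised above; a proper process supervisor or a dedicated worker tier is more robust.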