How to auto-rerun a failed session in Informatica

I am using Informatica and I want to automatically rerun a failed session. How can I achieve this?
I tried enabling the 'Auto recover terminated tasks' property in the workflow, but it is not working: the session is not rerun on failure. Can anyone help?

There's no such out-of-the-box feature, but there are several possible solutions:
Ideally you'd use a separate tool or script to run the workflow, check its state, and re-run it if it failed. This can be achieved with a second workflow containing a Command task, for example (see the sketch below). But first:
Why don't you try to resolve the underlying issues and make sure the workflow runs successfully in the first place?
Yet another option is to have a touch command executed whenever any session in the workflow fails, and a second, continuously running workflow with an Event Wait task that executes the first workflow once the specified file shows up.
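A minimal sketch of the script-based approach, assuming a Unix shell; the Integration Service, domain, credentials, folder, and workflow names are placeholders:

# Try the workflow up to 3 times; -wait blocks until the run
# completes, and pmcmd exits non-zero if the workflow fails.
MAX_TRIES=3
for i in $(seq 1 $MAX_TRIES); do
    pmcmd startworkflow -sv INT_SVC -d DOMAIN -u user -p password \
        -f MY_FOLDER -wait wf_MyWorkflow && break
    echo "Run $i failed, retrying..."
done

Such a script could be called from a Command task in a second workflow, or from an external scheduler.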

Related

Wrong Google Cloud Build trigger is activated from GitLab merge web hook

I'm working on using GitLab web hooks to activate Google Cloud Build triggers. The problem I'm facing is that my push-trigger web hook works fine, but the merge trigger never starts; instead, the push trigger is activated. On the GitLab side the web hook URLs are different and point to the correct trigger URLs.
While troubleshooting I found the following: a dummy build trigger with an inline default step (an ubuntu image running echo hello world) works fine, but when I add substitution variables I get an error:
Your build failed to run: generic::invalid_argument: generic::invalid_argument: invalid value for 'build.substitutions': key in the template "URL" is not a valid built-in substitution
_URL is set to $(body.project.git_ssh_url), which according to the docs should be fine for a merge request event, so this could mean there is a difference in event type.
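For reference, a rough sketch of how such a webhook trigger and substitution could be created; the trigger name, secret path, and config file are illustrative, and the exact flags may vary by SDK version (see gcloud beta builds triggers create webhook --help):

# User-defined substitution keys must start with an underscore;
# a key without it (e.g. "URL") is rejected as an invalid built-in
# substitution, which matches the error above.
gcloud beta builds triggers create webhook \
    --name=merge-trigger \
    --secret=projects/PROJECT/secrets/webhook-secret/versions/1 \
    --inline-config=cloudbuild.yaml \
    --substitutions='_URL=$(body.project.git_ssh_url)'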
Please advise or suggest a direction for debugging, such as how to get logs of trigger build events.
Best,
Alex
P.S. I've already used/read the docs for web hooks and substitutions, so please don't recommend reading the general docs again.

Is it possible to re-run a job in Google Cloud Dataflow after it succeeded?

Maybe the question sounds stupid, but I was wondering: once a job has finished successfully and has an ID, is it possible to start the same job again?
Or is it necessary to create another one?
Otherwise I would end up with jobs of the same name throughout the list.
I just want to know if there is a way to restart a job without recreating it.
It's not possible to run the exact same job again, but you can create a new job with the same name that runs the same code. It will just have a different job ID and show up as a separate entry in the job list.
If you want to make running repeated jobs easier, you can create a template. This will let you create jobs from that template via a gcloud command instead of having to run your pipeline code.
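For instance, once a template has been staged, starting a new job is a one-liner; a sketch with illustrative bucket, template, and parameter names:

# Launch a new Dataflow job from a previously staged template.
gcloud dataflow jobs run my-job-name \
    --gcs-location=gs://my-bucket/templates/my-template \
    --parameters=input=gs://my-bucket/input.txt

Each launch still gets its own job ID, but you don't have to re-run your pipeline code to start it.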
Cloud Dataflow does have a restart function. See the SDK here. One suggested pattern (to help with deployment) is to create a template for the graph you want to run repeatedly AND execute the template.

Can gcloud builds submit run in the background?

I'm trying to automate my builds using Google Cloud Build. For various reasons the primary use flow will be to manually kick off builds rather than using a GitHub trigger.
Right now it looks like when you run
gcloud builds submit .
it kicks off the build process, but the gcloud command must stay running for the build to continue. If I Ctrl-C it, the entire build process stops.
I could run it in the background but presumably if I get disconnected from the network or my laptop goes to sleep that will also interrupt the build process(?)
Clearly the build can run in the background, since that's what a GitHub trigger does; is there any way I can do the same thing from the command line?
If you add --async to your gcloud builds ... command, the job will be run asynchronously, as a long-running operation.
You may query the state of this operation in order to determine the state of the build.
https://cloud.google.com/sdk/gcloud/reference/builds/submit
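A minimal sketch of that flow (BUILD_ID is a placeholder; gcloud prints the real ID when you submit):

# Kick off the build without blocking the terminal.
gcloud builds submit --async .
# Later, list builds that are still in progress...
gcloud builds list --ongoing
# ...or query the state of one specific build by its ID.
gcloud builds describe BUILD_ID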
Alternatively, you may use something like Linux screen to keep any job running after you detach.

Informatica - Trigger next workflow upon completion of the first workflow

I am working on Informatica to automatically run Workflow B upon completion of Workflow A. I did some research on how to do this, and the best approach I encountered uses pmcmd, but I cannot find the pmcmd.exe file in the installation folder of my Informatica PowerCenter. I am using version 8.1.1 and I don't know if pmcmd is available in this version. Kindly advise on alternative solutions. Thank you in advance.
It's possible with the pmcmd utility, but there's another option. You can use an Event Wait task in Workflow B, right after the Start task, and make it wait for a flat file, e.g. workflowA.done. Then add a Command task as the last one in Workflow A to perform a touch workflowA.done command. Use the appropriate path for your case (it might be $PMTargetFileDir, for example).
Start both workflows at the same time; Workflow B will proceed with its tasks once the control file gets created.
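A sketch of both variants; paths, connection details, and workflow names are illustrative:

# Command task at the end of Workflow A (event-wait approach):
touch $PMTargetFileDir/workflowA.done

# Alternative: chain the workflows from a shell script using pmcmd.
# -wait makes the first call block until Workflow A completes,
# and the second call only runs if Workflow A succeeded.
pmcmd startworkflow -sv INT_SVC -d DOMAIN -u user -p password -f FOLDER -wait wf_A &&
pmcmd startworkflow -sv INT_SVC -d DOMAIN -u user -p password -f FOLDER wf_B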
pmcmd.exe is available in the Informatica installation folder on the Informatica server.
On my system it was at the following path:
/infa/server/bin
Usually this is controlled by an external, independent scheduler.

Trigger build in Jenkins/Hudson using hashtag in commit-message

Is it possible to trigger a Hudson/Jenkins build only when a certain string appears in a commit-message?
For instance, I want to trigger a build that rolls out my application to the dev environment by writing a commit message like:
MYPROJECT-123 Fixed NPE in MyClass.java #deploy:DEV
The general idea is described in this great talk on Continuous Deployment, but I couldn't find any information on how to do this in Hudson.
I would prefer to have this behavior in Hudson itself and not in an external system like commit-hooks or web-hooks.
I don't know of an out-of-the-box way to parse the SCM message as part of the trigger. You have a couple of options that might achieve what you want, though:
Write your own Hudson SCM plugin
Chain your jobs together into a build pipeline. The first job could simply look for that message in the changelog.xml to determine whether the next build is triggered.
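A rough sketch of the second option as a shell build step in the first job, assuming Git and Jenkins' remote build-trigger URL (the job name and credentials are placeholders):

# Inspect the latest commit message and fire the downstream job
# only if it contains the deploy hashtag.
if git log -1 --pretty=%B | grep -q '#deploy:DEV'; then
    curl -X POST "$JENKINS_URL/job/deploy-dev/build" --user user:apitoken
fi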
If you are looking at building a pipeline of build jobs, check out the build-pipeline-plugin. http://www.centrumsystems.com.au/blog/?p=121
Anyone got a more elegant solution??
Cheers,
Geoff
There is a plugin called Commit Message Trigger Plugin, but it has had only a 0.1 release.
Maybe the easiest way is to use a version-control post-commit (or post-push) trigger to start a Hudson job. You'd need one anyway to automatically start your build.