I'm checking out from 2 Mercurial repositories in the same Bamboo build plan and I'd like to know the revision number of each repository.
Is there any way to get this information?
I configured each checkout operation in a different stage, but ${bamboo.repository.revision.number} always returns the revision of the first checkout.
Thanks
Given that I didn't find a way to retrieve this info using Bamboo variables, I created a task that injects a custom variable for each checkout operation:
echo myChangeset=`hg id --id` > myRepo.properties
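For two repositories checked out into separate subdirectories, the same idea extends to one properties file per repository. This is only a sketch: the repoA/repoB directory names and variable keys are illustrative, and I'm assuming Bamboo's underscore-style environment variable for the working directory.

# Script task (sketch): capture the revision of each checked-out repository.
# repoA/repoB are illustrative checkout subdirectories.
cd "${bamboo_build_working_directory}/repoA"
echo myChangesetA=`hg id --id` > repoA.properties
cd "${bamboo_build_working_directory}/repoB"
echo myChangesetB=`hg id --id` > repoB.properties

Each properties file can then be fed to an "Inject Bamboo variables" task with its own namespace, so the two revisions end up in distinct variables.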
Any other options would be appreciated.
I'd like to ask how I can migrate mappings, worklets, and workflows from Informatica PowerCenter Integ to Prod.
The Integ and Prod environments are on different servers, so I can't just move folders between them.
Is it possible? I can't find any reference or tutorial.
Thank you in advance.
In PowerCenter, it's possible to copy from one environment to another. Ask everyone to check in their objects first and log off from both the source and target repositories.
Open Repository Manager, connect to the source repository and select the folder you want to copy.
Click Edit > Copy.
Connect to the target repository with the same user account used to connect to the source repository. If you do not have the same user, you need to use a deployment group/deployment folder.
In the Navigator, select the target repository and click Edit > Paste. You will get many options, such as replacing objects, using the latest version, checking out, etc. The link below walks through these steps.
https://docs.informatica.com/data-integration/powercenter/10-5/repository-guide/copying-folders-and-deployment-groups/copying-or-replacing-a-folder/steps-to-copy-or-replace-a-folder.html
Now, my preference would be to use a deployment group or deployment folder. It's easy to use and easy to control: for example, if you want to replace 10 objects out of hundreds, create a standard process for future migrations, or deploy automatically using a command task, a deployment group supports all of that.
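To illustrate the automated route, here is a rough sketch using pmrep's DeployDeploymentGroup command. The repository, domain, group, and control-file names are all placeholders, and the control file is an XML file (per depcntl.dtd) that specifies the source/target folders and copy options.

# Connect to the source repository, then deploy the group to Prod (sketch;
# all names below are placeholders for your environment).
pmrep connect -r Integ_Repo -d Integ_Domain -n admin -x MyPassword
pmrep deploydeploymentgroup -p My_Deployment_Group -c deploy_control.xml -r Prod_Repo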
I am trying out AWS CodePipeline and currently have it hooked up to our GitHub account, where it can check out the master branch with no issues, as this is set in the Source settings.
However, this is obviously quite restrictive, and I'd like to be able to specify a version tag from GitHub to check out, but I cannot see any way of achieving this.
Ideally, I want to specify a version number (a tag in GitHub) before the pipeline runs, so I can use one pipeline to check out, build, test, and deploy the codebase for a specific version tag. Again, I cannot find any information on how to achieve this.
This is not natively supported at the moment.
But you could configure your CodePipeline source to output a full Git clone instead of just an artifact, and then pass that to a CodeBuild project where you can use git to check out the specific tag, as outlined in the AWS documentation.
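As a rough sketch of the CodeBuild side, assuming the source action has the full-clone output enabled and a RELEASE_TAG environment variable (a name I'm making up here) is passed to the build, the build-phase commands could be along these lines:

# buildspec build-phase commands (sketch; RELEASE_TAG is an assumed
# environment variable holding the tag to build, e.g. v1.2.3)
git fetch --tags
git checkout "tags/${RELEASE_TAG}"

Note that with the full-clone option, the CodeBuild service role also needs permission to use the connection to GitHub.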
I am trying to list Google Cloud builds and filter by source.repo.commit_sha as specified in the Viewing build results documentation, but my list is coming back with no items. I am using the following command:
gcloud builds list --filter "source.repo.commit_sha='${LONG_COMMIT_SHA}'"
I have tried both the short and the long commit SHA-1, but I am not getting any results. The SHA-1 value is from the commit that was pushed to GitHub. I am using a trigger to initiate the build, and the trigger works correctly.
I have searched the internet for information about filtering by a given commit SHA-1, but I have been unable to find anything useful.
Can someone please help with a command to filter by a given commit SHA?
This looks like a gcloud SDK issue; I found an error report on the public issue tracker about similar behavior with the filters.
I think it is better to continue in the public issue tracker.
While this issue remains unresolved, there is a somewhat hacky way to filter by commit SHA. Assuming your builds are invoked by triggers, a fair amount of build execution information is exposed through built-in substitutions, notably the commit SHA and branch name. See the link below for the full list of built-in substitutions. Even if you do not currently use triggers and just launch your builds manually through the CLI, you can still create a trigger and run it manually through the UI to expose these built-ins.
https://cloud.google.com/build/docs/configuring-builds/substitute-variable-values
You can then create a new tag for your builds and use an exposed built-in substitution (the branch name or commit SHA) as the tag value. See about halfway down the link below for adding tags to your cloudbuild file.
https://cloud.google.com/build/docs/view-build-results#filtering_build_results_using_queries
You can then filter on the tag being set to a specific commit SHA / branch name.
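Putting it together, a minimal sketch (the commit- tag prefix is just a convention I'm assuming here): add something like tags: ['commit-$COMMIT_SHA'] to your cloudbuild.yaml, then filter on that tag:

gcloud builds list --filter "tags='commit-${LONG_COMMIT_SHA}'"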
For example, on the GDB Buildbot instance http://gdb-build.sergiodj.net, I want to check the builds for Git commit 68b975af7ef47a9d28f21f4c93431f35777a5109 (Git tag binutils-2_25, from Dec 2014) to compare with my local results.
Is that possible?
Things which complicate that feature:
there may be multiple builds per commit, e.g. for different platforms. Fine, then give me a list of all of them.
some commits don't have builds.
That buildbot seems configured to run only every few commits to save CPU.
In that case, I would like to see the nearest built parent in my search results.
There has also been some discussion for this on Google Groups for Chromium, but I couldn't find a good solution: https://groups.google.com/a/chromium.org/forum/#!topic/infra-dev/T_7S9HXLWlo
I have also opened a feature request at: http://trac.buildbot.net/ticket/3320
For GDB in particular, I know about the gdb-testers mailing list https://sourceware.org/ml/gdb-testers/ , which seems to get daily automated emails from Buildbot. I'm not very satisfied with it because there is not one email per build (there is more than one daily build), but searching that list is a possible workaround.
Besides the automated email results, the GDB maintainers have also set up one Git repository per test environment, containing one commit for every build, browsable through a cgit web interface at http://gdb-build.sergiodj.net/cgit
For example, for Debian x86_64, you can clone it with:
git clone http://gdb-build.sergiodj.net/git/Debian-x86_64-m64/.git
Each commit contains the gdb.sum and gdb.log, which are the main outputs of DejaGnu's tests.
Note however that those repositories are pretty big, 350M currently, and take a long time to clone.
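If the upstream GDB commit ID is recorded in each result commit's message (an assumption worth verifying against the cgit interface), the results for a given commit could be located with something like:

# Sketch: find the build results for one GDB commit in a per-environment
# results repository, assuming the upstream SHA appears in commit messages.
git clone http://gdb-build.sergiodj.net/git/Debian-x86_64-m64/.git
cd Debian-x86_64-m64
git log --oneline --grep=68b975af7ef47a9d28f21f4c93431f35777a5109
git show <result-commit>:gdb.sum   # <result-commit> taken from the log output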
The Buildbot configuration for the Git and email outputs seems to be on the personal Git server of the responsible developer: http://git.sergiodj.net/?p=gdb-buildbot.git;a=summary as mentioned on the ml announcement: https://sourceware.org/ml/gdb/2015-01/msg00043.html
I have to set up a project in Jenkins which has multiple active branches. The source repository is SVN. What is the best strategy for building different branches in Jenkins with SVN? Should I create a job for every branch, or is there some way to build every branch in one job?
Yes, creating a job for every branch seems to be the way people do it for now; see e.g. http://zeroturnaround.com/rebellabs/continuous-integration-and-feature-branches/#!/ and the links under "material used".
It seems a common thing to do is to have some sort of script create the jobs for you, either using the API or by modifying the config.xml files directly on the filesystem.
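As a rough sketch of the API route (the URLs, credentials, and the template-job name are all placeholders): copy a template job's config.xml for each SVN branch and POST it to Jenkins' createItem endpoint.

# Sketch: create one Jenkins job per SVN branch from a template job.
# JENKINS_URL, AUTH, the repo URL, and "template-job" are all assumptions.
JENKINS_URL=http://jenkins.example.com
AUTH=user:apitoken
for BRANCH in $(svn ls http://svn.example.com/repo/branches/); do
  BRANCH=${BRANCH%/}   # strip the trailing slash from "svn ls" output
  curl -s -u "$AUTH" "$JENKINS_URL/job/template-job/config.xml" \
    | sed "s|branches/TEMPLATE|branches/$BRANCH|" \
    | curl -s -u "$AUTH" -X POST -H "Content-Type: application/xml" \
        --data-binary @- "$JENKINS_URL/createItem?name=build-$BRANCH"
done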