How to use regex in Jenkins advanced configuration to trigger builds based on changes in specific project folders

We have a microservice architecture in which a single project repository contains several folders, each holding the files for a specific API. We would like to keep it as a single repo but configure a separate Jenkins job for each API folder. In other words, the jobs should all check out the same repo, but each should be triggered only by commits that touch its own folder. I know Jenkins supports regex to include and exclude paths, but I would like to know how best to use that.
Say, for example, I have a project sample-project with 3 folders: abc, def and xyz.
We now have a job in Jenkins that checks out sample-project. We would like that job configured so that it is triggered only when something inside the abc folder is changed or committed, and not otherwise. How is this best implemented?
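For illustration, one common way to do this is the Git plugin's "Additional Behaviours" > "Polling ignores commits in certain paths" option, whose Included/Excluded Regions fields take one regular expression per line, matched against the paths of changed files (field names may differ slightly between plugin versions). For the abc job, the Included Regions field would contain something like:
abc/.*
Alternatively, you can leave Included Regions empty and blacklist the other folders in Excluded Regions:
def/.*
xyz/.*
With SCM polling enabled, a commit that touches only def or xyz will then not trigger the abc job.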

Related

Workflow migration from repo to repo

I'd like to ask how I can migrate mappings, worklets and workflows from Informatica PowerCenter Integ to Prod.
The Integ and Prod environments are on different servers, so I can't just move things from folder to folder.
Is it possible? I can't find any reference or tutorial.
Thank you in advance.
In PowerCenter it's possible to copy from one environment to another. Ask everyone to check in their objects first and log off from both the source and target repositories.
Open Repository Manager, connect to the source repository and select the folder you want to copy.
Click Edit > Copy.
Connect to the target repository, using the same user account you used to connect to the source repository. If you do not have the same user, you need to use a deployment group/deployment folder.
In the Navigator, select the target repository and click Edit > Paste. You will get many options, such as replacing objects, using the latest version, checking out, etc. You can follow the link below for help.
https://docs.informatica.com/data-integration/powercenter/10-5/repository-guide/copying-folders-and-deployment-groups/copying-or-replacing-a-folder/steps-to-copy-or-replace-a-folder.html
Now, my preference would be to use a deployment group or deployment folder. It's easy to use and easy to control: if you want to replace 10 objects out of hundreds, create a standard process for future migrations, or deploy automatically using a command task, you can do that as well.
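If you later want to script such migrations, PowerCenter also ships the pmrep command-line tool. A rough sketch (the repository, domain, user, group, and control-file names are placeholders, and the exact flags vary by PowerCenter version, so verify them with pmrep help):
pmrep connect -r PROD_REPO -d Domain_Prod -n deploy_user -x deploy_password
pmrep deploydeploymentgroup -p DG_PROD -c deploy_control.xml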

How can I publish a serverless application on AWS faster?

I do not know whether this question is logical, but if there is any way to solve this issue it will save me a lot of wasted time.
I have an ASP.NET Core application that consists of many libraries like jQuery, Modernizr, etc. All of them are stored in the lib folder inside the wwwroot folder.
When I start publishing on AWS (with the AWS Toolkit), it starts zipping and publishing to the server as usual.
The point is that zipping all of the libraries takes a lot of time, yet these libraries never change during the project; I only change some pages or classes.
Is there any way to skip zipping some folders so publishing is faster?
You can add this to your AWS serverless template to exclude unwanted folders from the bundle:
package:
  exclude:
    - scripts/**
    - dynamodb/tables/**
    - policies/**
    - dynamodb/seeds/**
If you are using a CI/CD pipeline, you can have the build service run a script kept in the root of the repository to execute your package resolvers and so on. Please refer to the documentation.
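For example, if that pipeline is AWS CodeBuild, the script hook is a buildspec file at the repository root. A minimal sketch, assuming an ASP.NET Core project (the restore/publish commands are placeholders for whatever your package resolvers actually are):
version: 0.2
phases:
  install:
    commands:
      - dotnet restore
  build:
    commands:
      - dotnet publish -c Release -o ./publish
artifacts:
  files:
    - publish/**/*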

AWS CodeCommit GetDifferences for many directories

I would like to build some integration with the Amazon CodeCommit service. I would like to receive a notification to my Lambda function on every push to the master branch, and to use the GetDifferences API method to check commit details, but only in certain directories. I can call it multiple times, once for each directory I am interested in, but I would like to know if it is possible to fetch differences for all directories in one call using the afterPath parameter. It works smoothly when fetching the diff for one directory.
There are two tasks here.
Trigger the Lambda function.
Have the Lambda function interact with Git to find the changed files in a certain directory.
CodeCommit Trigger:
http://docs.aws.amazon.com/codecommit/latest/userguide/how-to-notify-lambda.html
npm git module:
While there are a lot of npm modules available, we use simple-git to achieve what you want to do.
https://www.npmjs.com/package/simple-git
It can walk through a Git repository and do whatever you need with the repo.
Hope it helps.
As far as GetDifferences goes, it looks like it will get differences in the root directory (and all subdirectories) if you don't specify afterPath; if you do, it limits the scope to the directory you provide (and the subdirectories therein). I don't think there's a way to provide multiple specific directories to afterPath in a single call, though, so making multiple calls is going to be your best bet.
Docs: http://docs.aws.amazon.com/codecommit/latest/APIReference/API_GetDifferences.html
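A minimal sketch of that multiple-call approach in Python with boto3, assuming the Lambda function is wired up as a CodeCommit trigger (the watched directory names are placeholders, and pagination via NextToken is omitted):
import boto3

codecommit = boto3.client('codecommit')

# Hypothetical directories of interest in the repository.
WATCHED_DIRS = ['abc', 'def', 'xyz']

def handler(event, context):
    # A CodeCommit trigger event carries the pushed commit id and the repo ARN.
    record = event['Records'][0]
    commit_id = record['codecommit']['references'][0]['commit']
    repo_name = record['eventSourceARN'].split(':')[5]

    for directory in WATCHED_DIRS:
        # One GetDifferences call per directory, scoped via afterPath.
        # Pass beforeCommitSpecifier (e.g. the parent commit) as well if you
        # want only this push's changes rather than the whole tree.
        response = codecommit.get_differences(
            repositoryName=repo_name,
            afterCommitSpecifier=commit_id,
            afterPath=directory,
        )
        for diff in response['differences']:
            print(directory, diff['changeType'])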

Using Container Builder Build Triggers in repository with multiple projects

I have a single Cloud Source Repository with multiple projects. I am able to create a cloudbuild.yaml file in the repo root that builds all projects. However, I don't want to have a build trigger that rebuilds all of the projects since most commits will be for a single project. Ideally I would like to have a cloudbuild.yaml file in each project subdirectory and a build trigger that detects changes in the project subdirectory of the repository. Is something like this possible?
As a possible workaround, I believe I may be able to keep my cloudbuild.yaml in the repository root and create a custom step that will get the commit sha (via the COMMIT_SHA substitution) and then get the list of files committed (via "git show --name-only --pretty=format: $COMMIT_SHA") to determine which project should be built and what image should be created. An alternative may be to have a tagging naming convention that will contain the project name and basing the trigger on that but I don't want to tag each commit.
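A rough sketch of that custom step's decision logic in Python (the project directory names are placeholders; it assumes the step runs inside the checked-out repository with git available):
import subprocess
import sys

# Hypothetical top-level project directories in the mono repo.
PROJECTS = {'abc', 'def', 'xyz'}

def changed_projects(commit_sha):
    # List the files touched by the commit, as suggested above.
    output = subprocess.check_output(
        ['git', 'show', '--name-only', '--pretty=format:', commit_sha],
        text=True,
    )
    touched = {line.split('/')[0] for line in output.splitlines() if line.strip()}
    return sorted(touched & PROJECTS)

if __name__ == '__main__':
    # COMMIT_SHA would be passed in via the build's substitution.
    for project in changed_projects(sys.argv[1]):
        print(project)  # e.g. hand each name to a per-project build step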
Note, it seems like build triggers work very well when you have multiple repos but we made the decision to go with a mono repo and I don't want to rehash that debate in this question. I'd like to understand how to best use the Build Triggers in a mono repo.

Django: pre-deployment

Question 1:
I am about to deploy my first Django website and I was wondering what tools are recommended for gathering all your Django files.
For example, I don't need my Sass and CoffeeScript files; I just want the compiled CSS and JS files. I also want to use the correct production settings file.
Question 2:
Do I put these files ready for deployment into their own version control repository? I guess the advantage is that you can easily roll back changes?
Question 3:
Do I run my tests before gathering the files or before deploying?
Shell scripts could be a solution, but maybe there is a better way? I looked at Jenkins/Hudson, but that seems more like a tool that sits on top of the tools I am looking for.
For questions one and two, I'd recommend using a version control system for this. I'm sure you're already using some sort of version control, so you can just say which branch of your repository you would like to deploy. And yes, this makes rollbacks incredibly easy. Probably the most popular method for Django deployment is to package your files using git, and then deploy these files and run any deployment scripts using fabric.
Using git, packaging your files using your local repository would look something like:
git archive --format=tar HEAD | gzip > my_repo.tar.gz
Alternatively, you can first push your changes to a GitHub repository, and then have your deployment script simply clone the repository onto your production server.
For your third question, if you use this version control method for packaging your files, then just make sure when you are testing you have the deployment branch checked out.
I'll typically use Fabric for deploying most Django projects:
http://docs.fabfile.org/en/1.0.0/?redir
It has a decent api for communicating with remote servers and it's all in Python – bonus!
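A minimal fabfile sketch using the Fabric 1.x API (the host, project path, and reload command are placeholder assumptions):
from fabric.api import cd, env, run

env.hosts = ['user@production-server']  # hypothetical host

def deploy():
    with cd('/srv/myproject'):  # hypothetical project path
        run('git pull')
        run('pip install -r requirements.txt')
        run('python manage.py collectstatic --noinput')
        run('touch myproject/wsgi.py')  # hypothetical reload trigger for mod_wsgi
Run it with fab deploy once your SSH access is configured.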
You don't need to store your concatenated media files in a separate repo. They're only needed for production. In that case I've found libraries like django-mediasync and django-compress to be useful. They both provide template tags/settings that can concatenate and cache your static files for you depending on the DEBUG setting/environments (production vs development).
You can run your tests whenever. Some people run them as a version control hook to prevent broken code from being checked in, or during deployment, aborting the deployment if any tests fail.