Moving files in workspace and commit Jenkins SVN - unit-testing

The thing is that I have two folders in my repository: one for my development code and one for my preproduction code. After a job finishes checking my dev code, I need to promote those files from the development environment to preproduction. How can I do this with Jenkins jobs?
I mean, making a kind of commit that moves the files from one folder to the other?
Thanks!!

In your checking job, go to Build → Add build step → Execute shell (or Execute Windows batch command) and add the commands you would use on the command line there.
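Since both folders live in the same SVN repository, the promotion can even be done with server-side copies, so no working copy is needed. A minimal sketch for the Execute shell step, assuming the repository URL and folder names below (they are placeholders, adjust them to your layout):

REPO_URL=https://svn.example.com/repo
MSG="Promote dev to preprod (Jenkins build $BUILD_NUMBER)"
# Drop the old preprod folder, then recreate it as a copy of dev.
# Both commands operate directly on the server, each creating one commit.
svn delete "$REPO_URL/preprod" -m "$MSG (remove old preprod)"
svn copy "$REPO_URL/dev" "$REPO_URL/preprod" -m "$MSG"

Because $BUILD_NUMBER comes from Jenkins, the commit messages record which build performed the promotion.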

Instead of keeping both the development and pre-production files in the same repository, you can use two different repositories (rather than two folders in one repository).
Pushing the files to the pre-production repository on a successful build is then straightforward, and it keeps things more organised.
The post section defines actions which will be run at the end of the Pipeline run. A number of additional condition blocks are supported within the post section: always, changed, failure, success, and unstable. These blocks allow for the execution of steps at the tail end of the Pipeline run, depending on the status of the Pipeline.
Check this link:
https://jenkins.io/doc/book/pipeline/syntax/#post
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
    post {
        success {
            echo 'You can checkout your pre-production repository here and push files on a successful build'
        }
    }
}
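The echo under success is only a placeholder; in practice that is where you would run the promotion commands, for example via an sh step. A minimal sketch of what that step's shell could do (all URLs and paths are assumptions):

# Inside post { success { sh '...' } } -- URLs and paths below are hypothetical.
svn checkout https://svn.example.com/repo/preprod preprod-wc
cp -r build-output/* preprod-wc/
cd preprod-wc
svn add --force .                 # pick up any new files
svn commit -m "Promote Jenkins build ${BUILD_NUMBER} to pre-production"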

Related

How can I have two script commands in one build process for Azure DevOps YAML build definitions?

(There's controversy going on, but this should be well-enough defined to ask the question.)
We find ourselves needing three script jobs in one build path (that is, one YAML (.yml) file). I tried:
steps:
- script: step1 1>&2
# Do this after other tasks such as building
- task: DotNetCoreCLI@2
  displayName: 'Run Unit Tests'
  inputs:
    command: test
    projects: 'TEST/*/*.csproj'
    arguments: '--configuration Release'
- script: step2 1>&2
- script: step3 1>&2
But the build upchucks when there's more than one - script under steps. Normally I'd just pile them into one script and all that, but I can't because there's a test job in the middle. How can I make this thing run?
I copied and pasted it back and it started working. It must have been a non-printable character in the file.
I had copy/pasted the original script lines from an MSDN document. On adding the second script line I got a nonsense error out of the YAML parser (which I didn't capture), with the location of the second script indicator given as the location of the error and part of the script line as the error body. I then copy/pasted the YAML into Stack Overflow to ask this question.
On copy/pasting it back from Stack Overflow, the error no longer appears. The only reasonable explanation is that a non-printable character was picked up in the original copy/paste from MSDN and got eaten in the subsequent steps.
In a pipeline job, the scripts you set in script tasks are packaged as corresponding script files to run on the agent. The script file for each script task is independent; the pipeline will not package all of the scripts into one script file.
Normally I'd just pile them into one script and all that, but I can't because there's a test job in the middle.
In your case, you can try executing the dotnet test command directly in the script task instead of using the .NET Core CLI task.
That way, you can put the commands for step1, dotnet test, step2 and step3 into one script and execute them with a single script task in the pipeline.
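A minimal sketch of that single-task layout, assuming a Linux agent (the step commands mirror the question; the loop over test projects is an assumption standing in for the .NET Core CLI task):

steps:
- script: |
    step1 1>&2
    # Direct CLI calls replacing the DotNetCoreCLI@2 task.
    for p in TEST/*/*.csproj; do dotnet test "$p" --configuration Release; done
    step2 1>&2
    step3 1>&2
  displayName: 'Run step1, tests, step2 and step3 in one script'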

GitHub Cloud Build Integration with multiple cloudbuild.yamls in monorepo

GitHub's Google Cloud Build integration does not detect a cloudbuild.yaml or Dockerfile if it is not in the root of the repository.
When using a monorepo that contains multiple cloudbuild.yamls, how can GitHub's Google Cloud Build integration be configured to detect the correct cloudbuild.yaml?
File paths:
services/api/cloudbuild.yaml
services/nginx/cloudbuild.yaml
services/websocket/cloudbuild.yaml
Cloud Build integration output: (screenshot omitted)
You can do this by adding a cloudbuild.yaml in the root of your repository with a single gcr.io/cloud-builders/gcloud step. This step should:
Traverse each subdirectory or use find to locate additional cloudbuild.yaml files.
For each found cloudbuild.yaml, fork and submit a build by running gcloud builds submit.
Wait for all the forked gcloud commands to complete.
There's a good example of one way to do this in the root cloudbuild.yaml within the GoogleCloudPlatform/cloud-builders-community repo.
If we strip out the non-essential parts, basically you have something like this:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    for d in */; do
      config="${d}cloudbuild.yaml"
      if [[ ! -f "${config}" ]]; then
        continue
      fi
      echo "Building $d ... "
      (
        gcloud builds submit $d --config=${config}
      ) &
    done
    wait
We are migrating to a monorepo right now, and I haven't found any CI/CD solution that handles this well.
The key is to detect not only changes, but also any services that depend on those changes. Here is what we are doing:
Require every service to have a Makefile with a build target.
Put a cloudbuild.yaml at the root of the monorepo.
Run a custom build step with this little tool (old but it still seems to work), https://github.com/jharlap/affected, which lists all packages that have changed and all packages that depend on those packages, and so on.
The shell script then runs make build on any service that is affected by the change; a simplified variant is sketched below.
So far it is working well, but I totally understand if this doesn't fit your workflow.
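As a hedged illustration of that shell script, here is a dependency-unaware, git-only variant (it catches directly changed services but not their dependents, which is what affected adds; the two-level services/<name> directory layout is an assumption):

#!/bin/bash
# Rebuild every service whose directory contains changes in the triggering commit.
# $COMMIT_SHA is the substitution Cloud Build provides for the commit being built.
changed_dirs=$(git diff-tree --no-commit-id --name-only -r "$COMMIT_SHA" \
  | cut -d/ -f1-2 | sort -u)
for d in $changed_dirs; do
  if [ -f "$d/Makefile" ]; then
    echo "Building $d ..."
    make -C "$d" build
  fi
done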
Another option many people use is Bazel. Not the most simple tool, but especially great if you have many different languages or build processes across your mono repo.
You can create a build trigger for your repository. When setting up a trigger with a cloudbuild.yaml build configuration, you need to provide the path to the cloudbuild.yaml within the repository, so you can create one trigger per service.
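For instance, one such trigger per service could be created from the command line; a hedged sketch (the repository details are placeholders, and the exact flags depend on your gcloud release):

gcloud beta builds triggers create github \
  --repo-owner=your-org --repo-name=your-monorepo \
  --branch-pattern='^master$' \
  --build-config=services/api/cloudbuild.yaml \
  --included-files='services/api/**'

The --included-files filter additionally restricts the trigger to fire only when files under that service change.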

Google Container Registry build trigger on folder change

I can set up a build trigger on GCR to build my Docker image every time my Git repository is updated. However, I have a single repository with multiple folders, and a Dockerfile in each folder.
Ex:
my_app
-- service-1
     Dockerfile-1
-- service-2
     Dockerfile-2
How do I only build Dockerfile-1 when the service-1 folder gets updated?
This is a variation on this GitHub feature request -- in your case, differential behavior based on the changed files (folders) rather than the branch.
We are considering this feature as part of the development of support for more advanced workflow control and will post back on that GitHub issue when it becomes available.
The work-around available to you today is to use a bash script that conditionally builds (or doesn't) based on an inspection of the files changed in the $COMMIT_SHA that triggered the build. Note that the git builder can be used to get the list of files changed via git diff-tree --no-commit-id --name-only -r $COMMIT_SHA.
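That inspection might look something like this inside a build step (the image name, service path, and the way $COMMIT_SHA and $PROJECT_ID reach the shell are assumptions):

#!/bin/bash
# Build service-1 only if the triggering commit touched files under service-1/.
changed=$(git diff-tree --no-commit-id --name-only -r "$COMMIT_SHA")
if echo "$changed" | grep -q '^service-1/'; then
  docker build -f service-1/Dockerfile-1 -t "gcr.io/$PROJECT_ID/service-1" service-1
else
  echo "service-1 unchanged; skipping build."
fi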

How to force GitHub Pages build?

Every GitHub repository can have (or be) a GitHub Pages website, which can be built with Jekyll. GitHub builds the site every time you push a new commit.
Is there a way to force a refresh of the GitHub Pages website without pushing a new commit?
From GitHub support, 2014-06-07:
It's not currently possible to manually trigger a rebuild, without pushing a commit to the appropriate branch.
Edit:
As Andy pointed out in the comments, you can push an empty commit with the command:
git commit -m 'rebuild pages' --allow-empty
git push origin <branch-name>
Edit 2:
Thanks to GitHub Actions, it's fairly easy to trigger a daily publish: https://stackoverflow.com/a/61706020/4548500.
If you want a quick script solution, here it is. Do the following setup tasks only once, then run the script whenever you want to rebuild your GitHub page.
1. Create a personal access token for the command line:
Follow the official help here to create a personal access token. Basically, you have to log in to your GitHub account and go to: Settings > Developer settings > Personal access tokens > Generate new token.
Tick the repo scope.
Copy the token.
2. Create the following script:
Create a file called RebuildPage.sh and add the lines:
#!/bin/bash
curl -u yourname:yourtoken -X POST https://api.github.com/repos/yourname/yourrepo/pages/builds
Here,
Replace yourname with your GitHub username.
Replace yourtoken with your copied personal access token.
Replace yourrepo with your repository name.
3. Run the script:
If you use Windows 10:
You need to set up Windows Subsystem for Linux, if that is not already done. Follow this to do so.
Remove the first line (#!/bin/bash) from the script and save it as RebuildPage.bat (i.e., replace .sh with .bat in the script file name).
As an alternative to the above point, to get a double-click-to-run feature for the .sh file:
Set bash.exe as the default program for .sh files.
Open regedit.exe and edit HKEY_CLASSES_ROOT\Applications\bash.exe\shell\open\command. Set the (Default) value to:
"C:\Windows\System32\bash.exe" -c " \"./$(grep -oE '[^\\]+$' <<< '%L')\";"
Now double-click the script whenever you want to rebuild your GitHub page. Done!
If you use Linux/Mac, running the script is the same as running any other script. Done!
Additional notes for the solution:
This solution uses the GitHub REST API v3. Here is the official documentation for the API.
Now that GitHub Actions are available, this is trivial to do:
# File: .github/workflows/refresh.yml
name: Refresh
on:
  schedule:
    - cron: '0 3 * * *' # Runs every day at 3am
jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger GitHub pages rebuild
        run: |
          curl --fail --request POST \
            --url https://api.github.com/repos/${{ github.repository }}/pages/builds \
            --header "Authorization: Bearer $USER_TOKEN"
        env:
          # You must create a personal token with repo access as GitHub does
          # not yet support server-to-server page builds.
          USER_TOKEN: ${{ secrets.USER_TOKEN }}
Sample repo that does this: https://github.com/SUPERCILEX/personal-website/actions
Pages API: https://developer.github.com/v3/repos/pages/#request-a-page-build
I had this problem for a while, and pushing to the master branch didn't change anything on myapp.github.io, for two reasons:
1 - Build
No matter how many times I tried to push my work to master, the build would not start. I found a workaround by modifying a file in GitHub's online editor (open your index.html, edit it on the GitHub website, then commit).
2 - Caching issues
Even after a successful build, I would still see the exact same page on myapp.github.io, and hard-reloading with Ctrl + Shift + R wouldn't solve it. Instead, if you use Chrome, inspect your page, head into the Application tab, select "Clear storage" in the left menu, and click "Clear site data" at the bottom of the menu.
Even after I pushed my changes to the GitHub repository, I was not able to see the changes today. I checked my repository settings for more information, and there I could see that the build had been failing all along, which was why I could not see the changes.
You may also see a message such as "Your site is having problems building: Unable to build page. Please try again later."
I went through my recent commits and tried to find out what was causing the issue, and in the end I was able to fix it.
There was an additional comma in the tags (,) and that caused the issue.
You will not get relevant error messages if there are issues in your .md files. I recommend checking the build status and comparing recent changes if you are facing the same issue.
This is doable as of v3 of the GitHub API, though it is currently in preview
https://developer.github.com/v3/repos/pages/#request-a-page-build
POST /repos/:owner/:repo/pages/builds
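While the API is in preview, the request also needs the preview media type in the Accept header (the same one the Postman answer below uses); a minimal curl sketch with placeholder credentials:

curl -X POST \
  -H "Accept: application/vnd.github.mister-fantastic-preview+json" \
  -u yourname:yourtoken \
  https://api.github.com/repos/yourname/yourrepo/pages/builds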
The empty commit didn't work for me, but based on benett's answer, this worked for me:
Open Postman and create a new request with the URL https://api.github.com/repos/[user_name]/[repo_name]/pages/builds (replace with your name and repo), and select the POST method.
Before you run it, go to the Headers tab and add a new key Accept with the value application/vnd.github.mister-fantastic-preview+json.
Now you can run it and visit your pages again.
I was having trouble getting the site to refresh even though GitHub Actions showed that my site had been deployed.
Toggling the publishing source did the trick for me: I switched the publishing source from master to content and then back to master. You can check how to change the publishing source of the branch here.
I went through the same problem; to solve it I developed a GitHub Action that runs on a schedule and supports updating multiple gh-pages sites at the same time:
https://github.com/marketplace/actions/jekyll-update-github-pages-without-new-commit. The action updates gh-pages without generating new commits.
name: Update all github pages
on:
  schedule:
    - cron: "30 0 * * *"
jobs:
  github-pages:
    runs-on: ubuntu-latest
    name: Update Github Pages Initiatives
    steps:
      - name: Jekyll update github pages without new commit
        uses: DP6/jekyll-update-pages-action@v1.0.1
        with:
          DEPLOY_TOKEN: ${{ secrets.GH_PAGES_DEPLOY_TOKEN }}
          USER: ${{ secrets.GH_PAGES_USER }}
          FILTER: 'is%3Apublic%20org%3Adp6'
Action log: (screenshot omitted)
Alternative Solution
You may have received an email from GitHub telling you that Jekyll did not succeed in building your site when you pushed it to your gh-pages branch. If this is the case, you can try to force-push to trigger another build.
If you use a dedicated folder for the final website, let's say a public folder, you can try rebuilding the folder and adding it to your committed changes. After that, you need to split those files into your gh-pages branch and force-push them to trigger another build, even if the files did not change at all. The rest of the code below just removes the commit for the public folder for convenience and removes the folder from the local filesystem.
Code
git add public
git commit -am ":bug: triggering another jekyll build"
git push origin $(git subtree split --prefix public master):gh-pages --force
git reset HEAD~1
rm -rf public
Tips
If there are uncommitted changes that are not part of the final site, you can stash them with the following command:
git stash
Then run the above commands to manually force the Jekyll build, and unstash your changes with:
git stash pop
I surmise from other answers that this was once difficult?
Go to Settings → Pages.
Just under "Change theme" you'll see a link to the actual GitHub Action, labeled "pages build and deployment workflow".
Click "Re-run all jobs".

Committing data to Mercurial at the end of a build

I have a build script that is triggered by Jenkins.
First Jenkins gets the latest version from the repo (Bitbucket), and then it initiates the build script.
Now, if the build script is started in 'release' mode, the script makes changes to some files (to keep track of version numbers and build dates, and to create a tag on the repo).
These changes need to be pushed back up to the remote repo.
How do I implement this?
The build takes a couple of minutes, so if someone pushes to the remote repo during the build, the push will fail because a merge is needed first. And if nobody has pushed, the merge will fail because there is nothing to merge...
Consider having Jenkins do its commits in a named branch all its own. This has a lot of advantages -- the biggest being that Jenkins never has to worry about someone else pushing a change to the release branch -- only Jenkins will be. Your Jenkins build script could look something like this:
hg clone --updaterev release http://path/to/repo
hg merge default || true # merge the latest from the default branch
...build here...
hg commit -m "Auto commit from Jenkins for build $BUILDNUMBER" || true
hg tag build_$BUILDNUMBER
hg push
With a setup like that you get some advantages:
failed builds don't create new commits
Jenkins's push will always succeed
Jenkins's tag commits are on the 'release' branch, but still accessible from the default branch
Notice that the || true tells Jenkins not to fail the build on non-zero exit codes when there is nothing to merge or nothing to commit.
Instead of cloning fresh each time you could just hg pull; hg update -C release, but for repos of reasonable size I like to start with a guaranteed clean slate.
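That incremental variant might look like this (a sketch under the same assumptions as the script above):

# Reuse the existing workspace clone instead of cloning fresh.
hg pull
hg update -C release     # discard local modifications and sync to the release branch
hg merge default || true # pick up the latest default, if there is anything to merge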