Google Container Registry build trigger on folder change

I can set up a build trigger on GCR to build my Docker image every time my Git repository is updated. However, I have a single repository with multiple folders and a Dockerfile in each folder.
Ex:
my_app
-- service-1
   Dockerfile-1
-- service-2
   Dockerfile-2
How do I only build Dockerfile-1 when the service-1 folder gets updated?

This is a variation on this GitHub feature request -- in your case, differential behavior based on the changed files (folders) rather than the branch.
We are considering this feature as part of the development of support for more advanced workflow control and will post back on that GitHub issue when it becomes available.
The work-around available to you today is to use a bash script that conditionally builds (or doesn't) based on an inspection of the files changed in the $COMMIT_SHA that triggered the build. Note that the git builder can be used to get the list of files changed via git diff-tree --no-commit-id --name-only -r $COMMIT_SHA.
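For illustration, a minimal sketch of that work-around for service-1 (the image name, Dockerfile path, and the use of Cloud Build's COMMIT_SHA and PROJECT_ID substitutions are assumptions based on the layout above, not part of the original answer):

#!/bin/bash
# Build service-1 only if the triggering commit touched files under service-1/.
changed=$(git diff-tree --no-commit-id --name-only -r "$COMMIT_SHA")
if echo "$changed" | grep -q '^service-1/'; then
  docker build -t "gcr.io/$PROJECT_ID/service-1" \
    -f service-1/Dockerfile-1 service-1
fi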

Related

Redeploy Google Cloud Function from command line using Source Repositories

I have a fairly simple Google Cloud Function that I'm deploying from Cloud Source Repositories.
I'm using the Google Cloud Shell as my development machine.
When I make updates to the function as I'm developing, I use the CLI to push updates to my Source Repository. However, running the gcloud functions deploy ... command from the command line doesn't seem to force GCF to pull in the latest source.
Occasionally, the deploy command after pushing new source code will simply state "Nothing to update." (which is incorrect.)
More often, it will go through the deployment process but the function will still run the previous version of the code.
When this happens the only way I can get the function to update is to use the dashboard, "Edit" the function, and then hit the Deploy button (even though I didn't change anything.)
Am I forgetting to do some kind of versioning or tagging that is required? Is there a way to force the CLI to pull the most current commit from the source repo?
I think you're looking for the --source=SOURCE gcloud functions deploy option to point to a source repository instead of the current directory (the default):
--source=SOURCE
Location of source code to deploy. Location of the source can be one of the following three options:
- Source code in Google Cloud Storage (must be a .zip archive),
- Reference to source repository or,
- Local filesystem path (root directory of function source).
Note that if you do not specify the --source flag:
- Current directory will be used for new function deployments.
- If the function is previously deployed using a local filesystem path, then the function's source code will be updated using the current directory.
- If the function is previously deployed using a Google Cloud Storage location or a source repository, then the function's source code will not be updated.
The value of the flag will be interpreted as a Cloud Storage location if it starts with gs://.
The value will be interpreted as a reference to a source repository if it starts with https://.
Otherwise, it will be interpreted as the local filesystem path. When deploying source from the local filesystem, this command skips files specified in the .gcloudignore file (see gcloud topic gcloudignore for more information). If the .gcloudignore file doesn't exist, the command will try to create it.
The minimal source repository URL is:
https://source.developers.google.com/projects/${PROJECT}/repos/${REPO}
By using the URL above, sources from the root directory of the repository on the revision tagged master will be used.
If you want to deploy from a revision different from master, append one of the following three sources to the URL:
- /revisions/${REVISION},
- /moveable-aliases/${MOVEABLE_ALIAS},
- /fixed-aliases/${FIXED_ALIAS}.
If you'd like to deploy sources from a directory different from the root, you must specify a revision, a moveable alias, or a fixed alias, as above, and append /paths/${PATH_TO_SOURCES_DIRECTORY} to the URL.
Overall, the URL should match the following regular expression:
^https://source\.developers\.google\.com/projects/(?<accountId>[^/]+)/repos/(?<repoName>[^/]+)(((/revisions/(?<commit>[^/]+))|(/moveable-aliases/(?<branch>[^/]+))|(/fixed-aliases/(?<tag>[^/]+)))(/paths/(?<path>.*))?)?$
An example of a validly formatted source repository URL is:
https://source.developers.google.com/projects/123456789/repos/testrepo/moveable-aliases/alternate-branch/paths/path-to=source
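Putting that together, a redeploy pinned to a branch and subdirectory might look like this (the function name, project number, repo, and path are placeholders, not taken from the question):

# Hypothetical example: redeploy from the master branch of a source repo,
# using a subdirectory as the function root.
gcloud functions deploy my-function \
  --source=https://source.developers.google.com/projects/123456789/repos/my-repo/moveable-aliases/master/paths/functions/my-function \
  --trigger-http

Because master is a moveable alias, each deploy should pick up the latest commit on that branch instead of reusing previously uploaded source.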

GitHub Cloud Build Integration with multiple cloudbuild.yamls in monorepo

GitHub's Google Cloud Build integration does not detect a cloudbuild.yaml or Dockerfile if it is not in the root of the repository.
When using a monorepo that contains multiple cloudbuild.yamls, how can GitHub's Google Cloud Build integration be configured to detect the correct cloudbuild.yaml?
File paths:
services/api/cloudbuild.yaml
services/nginx/cloudbuild.yaml
services/websocket/cloudbuild.yaml
You can do this by adding a cloudbuild.yaml in the root of your repository with a single gcr.io/cloud-builders/gcloud step. This step should:
Traverse each subdirectory or use find to locate additional cloudbuild.yaml files.
For each found cloudbuild.yaml, fork and submit a build by running gcloud builds submit.
Wait for all the forked gcloud commands to complete.
There's a good example of one way to do this in the root cloudbuild.yaml within the GoogleCloudPlatform/cloud-builders-community repo.
If we strip out the non-essential parts, basically you have something like this:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    # Submit a separate build for every top-level directory that has its
    # own cloudbuild.yaml, in parallel, then wait for all of them.
    for d in */; do
      config="${d}cloudbuild.yaml"
      if [[ ! -f "${config}" ]]; then
        continue
      fi
      echo "Building $d ... "
      (
        gcloud builds submit "$d" --config="${config}"
      ) &
    done
    wait
We are migrating to a mono-repo right now, and I haven't found any CI/CD solution that handles this well.
The key is to not only detect changes, but also any services that depend on that change. Here is what we are doing:
Requiring every service to have a Makefile with a build command.
Putting a cloudbuild.yaml at the root of the mono-repo.
We then run a custom build step with this little tool (old but still seems to work), https://github.com/jharlap/affected, which lists all packages that have changed and all packages that depend on those packages, and so on.
A shell script then runs make build on any service that is affected by the change.
So far it is working well, but I totally understand if this doesn't fit your workflow.
Another option many people use is Bazel. Not the most simple tool, but especially great if you have many different languages or build processes across your mono repo.
You can create a build trigger for your repository. When setting up a trigger with cloudbuild.yaml for build configuration, you need to provide the path to the cloudbuild.yaml within the repository.
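A hedged sketch of that setup with the gcloud CLI (repo name, branch pattern, and paths are placeholders; --included-files restricts the trigger to changes under one service directory):

# One trigger per service; each fires only when files under its own
# directory change.
gcloud beta builds triggers create cloud-source-repositories \
  --repo=my-monorepo \
  --branch-pattern='^master$' \
  --build-config=services/api/cloudbuild.yaml \
  --included-files='services/api/**'

Repeat the command for services/nginx and services/websocket with their respective paths.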

Using Bitbucket for existing project

I have an existing Django project on my local machine (in a virtualenvwrapper environment). How do I add it to Bitbucket?
Let's say my Django project is like this:
project
-- manage.py
-- apps
-- db.sqlite3
...
So should I do 'git init' under 'project' directory?
Since it is developed in a virtualenvwrapper environment, I think only the project files will be pushed to Bitbucket, is that right? If I want to develop the project on a different computer and pull the project files from Bitbucket, how should I do it? I mean, should I create another virtual environment on my new machine and install Django and the necessary packages before importing the files from Bitbucket?
I am new to git, so I don't know the best way to do this.
So should I do 'git init' under 'project' directory?
Yes, but after that, don't add everything.
Create a .gitignore file first, where you declare the files that shouldn't be versioned (the ones that are generated).
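For example, a minimal starter .gitignore for a Django project might look like this (these patterns are common suggestions, not from the answer; adjust to your project):

# Create a starter .gitignore with typical Django exclusions.
cat > .gitignore <<'EOF'
*.pyc
__pycache__/
db.sqlite3
.env
EOF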
Then add and commit: that updates a local repo.
You can then easily link it to an existing empty Bitbucket repo:
git remote add origin ssh://git@bitbucket.org/username/myproject.git
git push -u origin master   # to push changes for the first time
Normally, you wouldn't store a binary like db.sqlite3 in a source repo.
But this blog post suggests a way to do so through a custom diff driver:
In a .gitattributes or .git/info/attributes file, give Git a filename pattern and the name of a diff driver, which we'll define next. In my case, I added:
db.sqlite3 diff=sqlite3
Then in .git/config or $HOME/.gitconfig, define the diff driver. Mine looks like:
[diff "sqlite3"]
textconv = dumpsqlite3
I chose to define an external dumpsqlite3 script, since this can be useful elsewhere.
It just dumps SQL to stdout for the filename given by its first argument:
#!/bin/sh
# Dump the SQLite database named by the first argument as SQL text on stdout.
sqlite3 "$1" .dump
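To wire it up, something like this should work (installing to /usr/local/bin is just one option; any directory on your PATH will do):

# Make the helper executable and visible to git, then diff the database.
chmod +x dumpsqlite3
sudo mv dumpsqlite3 /usr/local/bin/
git diff db.sqlite3   # now shows SQL-level changes instead of a binary diff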

How to force GitHub Pages build?

Every GitHub repository can have (or be) a GitHub Pages website, that can be built with Jekyll. GitHub builds the site every time you push a new commit.
Is there a way to force the refresh of the Github Pages website without pushing a new commit?
From GitHub support, 2014-06-07:
It's not currently possible to manually trigger a rebuild, without pushing a commit to the appropriate branch.
Edit:
As Andy pointed out in the comments, you can push an empty commit with the command:
git commit -m 'rebuild pages' --allow-empty
git push origin <branch-name>
Edit 2:
Thanks to GitHub Actions, it's fairly easy to trigger a daily publish: https://stackoverflow.com/a/61706020/4548500.
If you want a quick script solution, here it is. Just do the following tasks only once, and run the script whenever you want to rebuild your GitHub page.
1. Create a personal access token for the command line:
Follow the official help here to create a personal access token. Basically, you have to log in to your GitHub account and go to: Settings > Developer settings > Personal access tokens > Generate new token.
Tick repo scope.
Copy the token.
2. Create the following script:
Create a file called RebuildPage.sh and add the lines:
#!/bin/bash
curl -u yourname:yourtoken -X POST https://api.github.com/repos/yourname/yourrepo/pages/builds
Here,
Replace yourname with your GitHub username.
Replace yourtoken with your copied personal access token.
Replace yourrepo with your repository name.
3. Run the script:
If you use Windows 10:
You need to set up Windows Subsystem for Linux, if not already done. Follow this to do so.
Remove the first line (#!/bin/bash) from the script and save the script as RebuildPage.bat. (i.e., replace .sh with .bat in the script file name)
Alternative to the above point: To get the double-click feature for running the .sh file:
Set bash.exe as the default program for .sh files.
Open regedit.exe and edit HKEY_CLASSES_ROOT\Applications\bash.exe\shell\open\command. Set the (Default) value to:
"C:\Windows\System32\bash.exe" -c " \"./$(grep -oE '[^\\]+$' <<< '%L')\";"
Now double-click the script whenever you want to rebuild your GitHub page. Done!
If you use Linux/Mac, running the script is the same as running any other script. Done!
Additional notes for the solution:
This solution uses the GitHub REST API v3. Here is the official documentation for the API.
Now that GitHub Actions are available, this is trivial to do:
# File: .github/workflows/refresh.yml
name: Refresh
on:
  schedule:
    - cron: '0 3 * * *' # Runs every day at 3am
jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger GitHub pages rebuild
        run: |
          curl --fail --request POST \
            --url https://api.github.com/repos/${{ github.repository }}/pages/builds \
            --header "Authorization: Bearer $USER_TOKEN"
        env:
          # You must create a personal token with repo access as GitHub does
          # not yet support server-to-server page builds.
          USER_TOKEN: ${{ secrets.USER_TOKEN }}
Sample repo that does this: https://github.com/SUPERCILEX/personal-website/actions
Pages API: https://developer.github.com/v3/repos/pages/#request-a-page-build
I had this problem for a while, and pushing to the master branch didn't change anything on myapp.github.io, for two reasons:
1 - Build
No matter how many times I tried to push my work to master, the build would not start. I found a workaround by modifying a file in GitHub's online editor (open your index.html, edit it on the GitHub website, then commit).
2 - Caching issues
Even after a successful build, I would still see the exact same page on myapp.github.io, and hard reloading with Ctrl + Shift + R wouldn't solve it. Instead, if using Chrome, inspect your page, head into the Application tab, select "Clear storage" in the left menu, and click on "Clear site data" at the bottom of the menu.
Even after I pushed my changes to my GitHub repository, I was not able to see the changes today. Then I checked my repository settings for more information, and there I could see that the build had been failing all along, which was why I was not able to see the changes.
You may also see a message as "Your site is having problems building: Unable to build page. Please try again later."
Then I checked my recent commits and tried to find out what caused the issue. In the end I was able to fix it.
There was an additional comma in the tags (,) and that caused this issue.
You will not get relevant error messages if there are any issues in your .md files. I recommend checking the build status and comparing recent changes if you are facing the same issue.
This is doable as of v3 of the GitHub API, though it is currently in preview:
https://developer.github.com/v3/repos/pages/#request-a-page-build
POST /repos/:owner/:repo/pages/builds
The empty commit didn't work for me, but based on @benett's answer, this worked for me:
Open Postman, create a new request with this URL: https://api.github.com/repos/[user_name]/[repo_name]/pages/builds (replace with your name and repo), and select POST method.
Before you run it, go to the headers tab and add a new key Accept with the value application/vnd.github.mister-fantastic-preview+json
Now you can run it and visit your pages again.
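If you prefer the command line to Postman, the equivalent curl call might look like this (bracketed values are placeholders):

# Same preview API call as above, expressed with curl.
curl -X POST \
  -u [user_name]:[personal_access_token] \
  -H "Accept: application/vnd.github.mister-fantastic-preview+json" \
  https://api.github.com/repos/[user_name]/[repo_name]/pages/builds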
I was having trouble refreshing even though my GitHub Actions run showed that my site had been deployed.
Toggling the publishing source did the trick for me. I switched the publishing source from master to content and then back to master. You can check how to change the publishing source of the branch here
I went through the same problem. To solve it, I developed a GitHub Action that runs on a schedule and supports updating multiple gh-pages sites at the same time:
https://github.com/marketplace/actions/jekyll-update-github-pages-without-new-commit. The action updates gh-pages without generating new commits.
name: Update all github pages
on:
  schedule:
    - cron: "30 0 * * *"
jobs:
  github-pages:
    runs-on: ubuntu-latest
    name: Update Github Pages Initiatives
    steps:
      - name: Jekyll update github pages without new commit
        uses: DP6/jekyll-update-pages-action@v1.0.1
        with:
          DEPLOY_TOKEN: ${{ secrets.GH_PAGES_DEPLOY_TOKEN }}
          USER: ${{ secrets.GH_PAGES_USER }}
          FILTER: 'is%3Apublic%20org%3Adp6'
Alternative Solution
You may have received an email from GitHub telling you that Jekyll did not succeed at building your site when you pushed it to your gh-pages branch. If this is the case, you can try to force-push to trigger another build.
If you use a dedicated folder for the final website, let's say a public folder, you can try to rebuild that folder and add it to your committed changes. After that, you'll need to split those files into your gh-pages branch and force-push them to trigger another build, even if the files did not change at all. The rest of the code below just removes the commit for the public folder for convenience and deletes it from the local filesystem.
Code
git add public
git commit -am ":bug: triggering another jekyll build"
git push origin $(git subtree split --prefix public master):gh-pages --force
git reset HEAD~1
rm -rf public
Tips
If there are uncommitted changes that are not part of the final site, you can stash them with the following command.
git stash
Then run the commands above to manually force the Jekyll build, and unstash your changes afterwards.
git stash pop
References
Online Git Manual
I surmise from other answers that this was once difficult?
Go to Settings->Pages
Just under "Change theme" you'll see a link to the actual Github action labeled "pages build and deployment workflow".
Click Re-run all jobs

get operation in fabric to use hostname

I am pretty new to Fabric and am trying to set up a deployment in the below fashion:
Main repo -> Local_repo -> Deployment server
I want to
(1) push the build from the main repo to the local repo
(2) have the deployment server pull the available code from the local repo
I did the first step successfully using put, but I am not able to do the 2nd step using the get operation.
I tried using git pull, but I get an error stating it's not a git repo, and the same goes for hg pull as well.
Is there a way to combine the get operation with the host name? For example:
get('username@localrepo/local_repo_build_path', deployment_server_local_path)
If you want to use a git pull, you most likely have to use the cd/lcd context managers to move into the directory of the repo. Also, you can't specify the username/host like that: it's set in the @hosts or @roles definition for the task, and Fabric picks that up automatically. Note also that get is not going to pull down a full directory; you'd need to use the contrib rsync helper (fabric.contrib.project.rsync_project) for something like that.