I have a Django web application whose code is on GitHub. From time to time, I make necessary updates and arrangements in the repository. Each time, I have to pull the project, make adjustments to the Docker setup, and run it on my machine.
Is there a way to run Docker in sync with the code in my GitHub repository? When I make a change on GitHub, I want Docker to pull it automatically and try to run the project without interruption.
Using webhooks, you can wire Git and Docker together through Jenkins.
That is: whenever you push changes to Git, a Jenkins job is triggered; Jenkins pulls the changes, builds a new Docker image, and pushes that image to your Docker registry.
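A minimal sketch of what that Jenkins build step could run as a shell step; the image name, registry, and container name are assumptions, and Jenkins has already checked out the latest commit into the workspace:
#!/bin/bash
# build and publish an image tagged with the commit Jenkins checked out
# (GIT_COMMIT is set by the Jenkins Git plugin)
docker build -t registry.example.com/myapp:${GIT_COMMIT} .
docker push registry.example.com/myapp:${GIT_COMMIT}
# replace the running container with the new image
docker stop myapp || true
docker rm myapp || true
docker run -d --name myapp -p 8000:8000 registry.example.com/myapp:${GIT_COMMIT}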
I use the py-staticmaps repository to generate static map images. Normally I run it via python main.py and it works, but when I use the repo inside Django and Docker, I have a problem with cache files and get the following error. What can I do?
When I build it locally everything works normally, but on the server I run it via docker-compose, and that seems to be where the cache problem comes from.
I have a Django project which is deployed in a Docker container. I created a pipeline in Jenkins triggered via GitHub webhooks. Everything works fine, but I have some user files in the project directory which I want to back up before Jenkins pulls the repository. Is there a way to add a pre-build step to my pipeline script, or to avoid deleting the files when running the git pull command?
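One hedged sketch of such a pre-build step, as shell commands run before the checkout; the userfiles/ location and backup path are assumptions:
#!/bin/bash
# pre-build: copy user files to a safe location before the workspace is refreshed
rsync -a /srv/myproject/userfiles/ /srv/backup/userfiles/
# ... Jenkins checks out / pulls the repository here ...
# post-checkout: restore the user files into the fresh working copy
rsync -a /srv/backup/userfiles/ /srv/myproject/userfiles/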
I had a few questions about automatic git pulls on a remote server. I am aware there are several questions like this, but I wasn't sure what steps to take exactly, and I don't want to mess up my current setup with a mistake :/
To wit, the environment is on a Google Cloud VM. I am running a Flask-based website that renders each page with the render_template() function.
The website resides inside its git folder, i.e. I never set up a bare repo and copied stuff. When I set it up a couple years ago, I just did git clone repo-url, then inside the repo directory, did flask run. Then I set up nginx to connect to the site's socket created with uwsgi inside the repo directory.
--
It has been working fine. I make changes locally to the content, push to github, then log in to the VM, and perform a git pull.
I want to do this automatically. I tried adding a cron job to do this, where the job basically ran a script, and the script did the git pull. Script content was:
cd /repo
git pull
Running the script in the server worked, but cron never managed to do the pull.
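A likely culprit: cron runs jobs with a minimal environment, so git may not be on the PATH and any ssh-agent keys are unavailable, which is a common reason a script works interactively but not from cron. A crontab entry with absolute paths and a log file at least makes the failure visible (the log path is an assumption):
# run every 5 minutes; redirect output so failures show up in the log
*/5 * * * * cd /repo && /usr/bin/git pull >> /home/user/gitpull.log 2>&1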
--
I have been reading about web hooks, and there is a bunch of stuff about post-receive hooks, post-update hooks, and making bare repos. At this point, I am embarrassed to say I have no idea what I should be doing.
Any help is greatly appreciated.
Another option would be to consider a GitHub Action, which, from GitHub, could interact with your Google Cloud VM.
For example, actions-hub/gcloud.
- uses: actions-hub/gcloud@master
  env:
    PROJECT_ID: test
    APPLICATION_CREDENTIALS: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
  with:
    args: cp your-file.txt gs://your-bucket/
    cli: gsutil
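For the git-pull use case specifically, a similar step could run the pull on the VM over SSH. A sketch, assuming the instance name, zone, and repository path, and assuming the service account is allowed to SSH to the VM:
- uses: actions-hub/gcloud@master
  env:
    PROJECT_ID: test
    APPLICATION_CREDENTIALS: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
  with:
    cli: gcloud
    args: compute ssh my-instance --zone=us-central1-a --command="cd /repo && git pull"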
I'm developing an Aurelia Single Page App that will talk to a REST api built with Django Rest Framework.
Without containers, I would have two buildpacks, one for Node that runs a script in the package.json file and another for Python that builds the Django app.
If I push a container image, then what mechanism replaces the node buildpack that calls the script in package.json to trigger webpack to create the asset bundles?
what mechanism replaces the node buildpack that calls the script in package.json
You're not really giving any info regarding your current setup and what you've tried already, so I'll assume you already know how to run docker on heroku, and that you got your current setup to work on heroku without docker.
If you've got a script called build in your package.json that kicks off the webpack build, and start that starts a node.js express app to serve your app from the webpack output folder, you'd do something like this in your Dockerfile:
FROM node:8.9.4
WORKDIR /app
# copy the dependency manifest first so this layer caches between builds
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
CMD npm run start
Of course this doesn't account for any permission setting or more selective copying you may need to do, but that depends on your project setup.
The important bit is that you're essentially running the thing as a node app, and you need the appropriate scripts in your package.json to which you can delegate the building and running, so you only need to call one or two of those scripts from your Dockerfile. You don't want to be doing too much npm stuff there directly.
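For reference, the package.json scripts section this assumes might look something like the following; the webpack config and server entry point are assumptions:
{
  "scripts": {
    "build": "webpack --config webpack.config.js",
    "start": "node server.js"
  }
}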
I am using django-haystack and the whoosh search engine in my Django app. Everything is working alright, except that when I git push a new version to my OpenShift server, search stops working. It simply does not return any results. If I run ./manage.py update_index it starts working again.
I have whoosh_index/ in my .gitignore file. I checked with git ls-files and the whoosh_index folder is not there, so my local files should not be overwriting any whoosh_index files.
Currently I use post_deploy script:
echo "Executing 'python ${OPENSHIFT_REPO_DIR}wsgi/app/manage.py update_index'"
python "$OPENSHIFT_REPO_DIR"wsgi/app/manage.py update_index
But is there another way, so that I do not have to run update_index every time I push a new version of my app? What am I missing?
From Modifying Applications
All OpenShift applications are built around a Git source control workflow - you code locally, then push your changes to the server. The server then runs a number of hooks to build and configure your application, and finally restarts your application. Optionally, applications can elect to be built using Jenkins, or run using hot deployment, which speeds up the deployment of code to OpenShift.
There are five phases the changes go through:
Pre-Receive
Pre-Build
Build
Deploy
Post-Deploy
You can add the index update operation to the build phase by adding it to the file:
.openshift/action_hooks/build
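A minimal sketch of that hook, reusing the same command as your post_deploy script above:
#!/bin/bash
# .openshift/action_hooks/build
# rebuild the whoosh index during the build phase instead of after deploy
python "$OPENSHIFT_REPO_DIR"wsgi/app/manage.py update_index
Remember to make the hook executable (chmod +x .openshift/action_hooks/build) before committing it.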
Alternatively, you can skip this whole rebuild-and-restart cycle on OpenShift by requiring hot deploy mode:
$ touch .openshift/markers/hot_deploy
With hot deployment the changes to application code are applied without restarting the application cartridge, resulting in increased deployment speed and minimized application downtime.