At our company we have very powerful Linux-based build servers (dual Xeon, 40 cores) and not-so-powerful Windows 7 laptops. We build our product in C/C++ for an esoteric CPU, and the compiler only exists on Linux. I can edit my git repo with Qt Creator, which works well and is quite fast, but I can't build the source on our laptops. We have a main git repo, and I can clone that same repo to my laptop and to our build server. What I want is that when I press the build button, my code magically builds on the build server. I did a proof-of-concept solution where my build script does a git diff on my repo, scps the diff to the build server, sshes in to apply the diff to the server's repo, then starts the compilation and waits for it. But that solution is not very foolproof, and I suspect a better approach exists. So how can I build my git repo on an external server?
If you can push to a bare repo on the build server, then you can associate with that bare repo a post-receive hook (hooks/post-receive, since a bare repo has no .git directory) which will:
check out the code:
#!/bin/sh
git --work-tree=/var/www/domain.com --git-dir=/var/repo/site.git checkout -f
trigger the compilation.
That way, you don't have to handle the diff yourself.
You only have to wire the build button to the action of pushing your branch to the bare repo on the build server, and the post-receive hook will do the rest.
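To make that concrete, here is a minimal sketch of such a post-receive hook that checks out the pushed code and then kicks off the build; the paths and the make invocation are assumptions about your project, not something from the question.

#!/bin/sh
# hooks/post-receive on the build server (paths assumed)
set -e
WORK_TREE=/srv/build/worktree
GIT_DIR=/srv/build/project.git

# 1. check out the pushed code into the work tree
git --work-tree="$WORK_TREE" --git-dir="$GIT_DIR" checkout -f

# 2. trigger the compilation with the cross-toolchain's build script
cd "$WORK_TREE" && make -j40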
You could switch to a forking Workflow, where each developer in the company has a personal public bare repo, which is a fork of the official central repository.
Then, when you want to build your changes, you push them to (a branch or the master of) your own personal public repo.
The build server not only clones the official central repository, but also your public repo. So when you push to your personal public repo, the build server merges the changes and does a personal build for you, just like it presumably already does for the official central repository.
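On the build server side, a hedged sketch of what that per-developer build could look like (the remote name, URL, and build command are placeholders):

# one-time setup on the build server
git remote add alice ssh://git@gitserver.example.com/home/alice/project.git

# run by a hook or a polling job whenever alice pushes
git fetch origin
git fetch alice
git checkout -B build-alice origin/master
git merge --no-edit alice/master    # merge alice's changes onto the central mainline
make -j40                           # personal build for alice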
Note that this is not too different from @VonC's answer; it just focuses a bit more on the workflow. The personal public repo may well be on the build server, as @VonC suggests, or it could be somewhere else, as long as it's some place public enough that the build server, you, and your colleagues can find it.
Consider integrating http://jenkins-ci.org/ into your workflow to take care of the build process, using a git post-receive hook to trigger the build (as suggested by @VonC).
If you want to use the forking workflow suggested by @flup, you can take a look at http://gitlab.com, which provides an easy way to manage pull/merge requests, fork repositories, and add hooks.
Related
I'm writing my own build scripts and setting up Jenkins. Jenkins provides plugins to zip the build results, but I can also zip the results in my own build scripts and call those scripts from Jenkins. Which way is better?
If you are in a corporate environment where a number of teams share the same Jenkins master, each plugin you add increases the probability of a plugin failure when you upgrade Jenkins. In addition, a bad plugin can bring down your entire CI server. So, in a shared-master scenario, be very conservative about adding plugins; don't add one unless it is absolutely necessary. For something as simple as creating zips, any build tool worth its salt has a task that can zip the contents of a given folder. Read through Maven and Gradle for a start.
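For example, a zip step inside the job's own build script can be as small as this (the output directory and archive name are assumptions; BUILD_NUMBER is an environment variable Jenkins provides to every build):

# at the end of the build script, archive the build output
cd build/output
zip -r "../myproduct-${BUILD_NUMBER}.zip" .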
I am deploying a React app on AWS Elastic Beanstalk. I bundle the app using webpack. However, I'm slightly confused about what the best practices are for the production build process. Should I build the app locally (with NODE_ENV=production) using webpack, and then just upload the resulting bundle.js file, along with all node_modules, to the Elastic Beanstalk instance? Or should I upload all the source files and run webpack on the AWS server itself during deployment?
You should never build for production locally (unless you're the only developer).
Ideally, you have a build process that gets triggered manually or automatically from a git commit which then builds your project for production for you.
By using a centralized build process, you can then be sure that all your builds are built the same way (e.g. same node version, same npm or yarn version).
Neither approach is really good, to be honest. Building locally is not the best way to produce anything you want to run in production: packages installed on your machine may affect what you're building, and the same goes for the OS you're building on.
The same applies to building during deployment. As the name suggests, a deployment should just deploy: place your application on the server so it can serve requests as intended.
That's where CI/CD comes in. Such a setup guarantees that every build runs the same steps on the same stack. You want no difference between builds, because that lets you assume that any bug or deviation from the design comes from the code, not from the environment it was built in.
Assuming that you're the only developer here (since you're the one asking), full CI/CD might be definite overkill, so just create a shell script with the build steps and use Docker as the build environment, so it stays identical between builds. That's the closest you can get to CI/CD without the hassle.
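As a minimal sketch of that idea (the Node image tag and paths are assumptions, not something from the question), a wrapper script could run the production webpack build inside a pinned container so every build uses the same toolchain:

#!/bin/sh
# build.sh - run the production build inside a pinned Node container (image and paths assumed)
set -e
docker run --rm -v "$(pwd)":/app -w /app node:18-alpine \
  sh -c "npm ci && NODE_ENV=production npx webpack --mode production"

The resulting bundle can then be handed to the Elastic Beanstalk deployment step, so the instance never has to build anything itself.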
We are trying to build into an open-source OCaml project a way to automatically kick off a Travis CI build-and-test run when any of the third-party OPAM packages the project depends on changes. If there were a clean way to get a change notification, we could programmatically touch a file in a test branch and open a pull request, which would start the Travis CI process to test compatibility, so that our end users don't trip over the issue. We're trying to avoid wasting OPAM resources on polling.
Thank you for your time!
opam.ocaml.org is a mirror of the OPAM repository https://github.com/ocaml/opam-repository, so you can periodically pull it and check for new commits.
You may have to be careful about silent source-code changes in packages, since their sources live outside the OPAM repository. If packages are registered in OPAM without checksums, you have to periodically check their sources themselves, too.
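A minimal polling sketch along those lines (the clone path and branch are assumptions), for example run from cron:

#!/bin/sh
# check-opam-updates.sh - detect new commits in a local clone of opam-repository
set -e
cd /srv/opam-repository          # local clone (path assumed)
old=$(git rev-parse HEAD)
git pull --quiet origin master
new=$(git rev-parse HEAD)
if [ "$old" != "$new" ]; then
    # list the package files that changed, then touch the test branch
    # and open the pull request that triggers Travis CI
    git diff --name-only "$old" "$new" -- packages/
fi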
I'm new to Django and my very first project is my blog. I wonder how Django developers who use PyDev normally synchronize with their remote hosting server when updating their sites.
I would also like to know how you combine the use of git with a Django project. Should I just make a repository for the entire project?
At my company we've got an entire git repository for each project, including the Django sources, which are put on the PYTHONPATH for each project, making the Django version project-dependent. The folder structure is something like:
/.git
/projectname/app1
/projectname/app2
/projectname/manage.py
/django-lib/django/...
As django-lib is not a Python module, we include both / and /django-lib in the PYTHONPATH; an example is sketched below. If your project is becoming large, you might want to consider using git submodules for your apps.
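For illustration, with the example layout above checked out under an assumed path, that PYTHONPATH setup would look roughly like:

# repository checked out at /srv/myrepo (path assumed)
export PYTHONPATH=/srv/myrepo:/srv/myrepo/django-lib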
We've also set up several servers to support the developers. There's a testing server running a central testing database and a setup with Apache and WSGI, to make testing on a real server possible, which is sometimes a bit different from the local manage.py server the developers use before committing their changes.
The testing server is updated from the master branch of our git repository. We've made several scripts that allow all developers to do this without logging in to the server via SSH, but that is just during pre-release. After release, that server will become our staging server, and we'll remove all the scripts from it to make it just like our production server.
Every developer has set up their local project to communicate with the central testing database, which contains test data. I push my changes from the command line myself, but you could also use EGit for this.
When we've got a release, we put it in a separate branch called 'release' (obviously), and the production server pulls only from that branch. This is done via SSH, but I don't know what your server setup looks like, so that last step is entirely up to you.
I hope that this has helped you a bit. I won't say that this is the best workflow possible, but it works for us and you should figure out what works for you.
Most experienced Django developers use pip (or distribute) and virtualenv to deal with all the Python packages you might need for your Django projects (including Django itself).
Personally, all I keep in my project's git repository is a bunch of segregated requirements lists generated by pip:
. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip freeze > ./docs/requirements/main.list
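Restoring such an environment on another machine is then just the reverse, along these lines (the paths follow the example above):

. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip install -r ./docs/requirements/main.list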
I'm fairly sure most Django developers are familiar with Fabric, which I use for the tasks below (a rough shell sketch of them follows the list):
streamlining local interaction with git,
pushing to our central repository,
pulling from our production or test server
touching the wsgi on the relevant server
and pretty much any other kind of task you might find yourself using an SSH terminal session for.
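Roughly, those Fabric tasks wrap commands like the following; the hostnames and paths here are hypothetical, and in practice Fabric runs the remote parts over SSH for you:

# push local work to the central repository
git push origin master

# update the test (or production) server and reload the WSGI application
# (the .wsgi file location is an assumption about your mod_wsgi setup)
ssh deploy@testserver.example.com 'cd /srv/myproject && git pull origin master && touch apache/myproject.wsgi'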
For those cases where I need to make changes to someone else's Django application in order to make it work or suit our purposes, I:
fork it on GitHub,
clone from my forked repo
make the changes
push it up to my own repo
and provide merge requests to the original repo owner
This way, I have a repo that my pip requirements lists can keep pulling from until the original application owner gets their own repo updated.
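A rough sketch of that fork-and-track flow (URLs and branch names are placeholders):

# fork on GitHub via the web UI, then:
git clone git@github.com:myuser/someapp.git
cd someapp
git remote add upstream git@github.com:originalowner/someapp.git

# make the changes, commit, and push them to the fork
git commit -am "Fix for our use case"
git push origin master

# later, pull in upstream changes once the owner has merged
git fetch upstream
git merge upstream/master

A requirements list can then point at the fork with a line such as git+https://github.com/myuser/someapp.git#egg=someapp until the upstream repo catches up.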
This is specific to my current project. But maybe the answers will reveal some more generic solutions.
Here is the situation:
I develop the Django project on my Windows box
I use SVN to commit to an SVN repository
while developing I use the development server that comes with Django
There is a testing server (Apache) that runs somewhere else, and every time I finish something I need to manually copy my work over with WinSCP/PuTTY and make sure it works on the testing server
The testing server is accessible for our testers to use, test, and report bugs
I would like to automate this process, as it is very painful. It involves exporting the whole repository, copying it to the testing server, getting rid of the .pyc files, sometimes restarting Apache, and using the correct settings.py (usually some renaming).
I would like the testing server to automatically retrieve new files after each SVN commit. I could write a custom script to do all this, but something tells me that there are easy-to-use solutions I could adopt to make my workflow less painful.
One extra bonus: there is a designer who works on the HTML/CSS templates directly on the testing server. I need to check whether he made changes, transfer them to my computer, and then commit them to the SVN repository. My boss thinks it's too dangerous to give him SVN access. Any ideas to help me out with this, too?
deployment:
I would say it's better to do the deployment the same way you do it for production. Fabric is a good solution.
SVN way:
If you want to do it the SVN way, create a branch called testing. Once you have a working version of your code that is ready for testing, merge the development branch into the testing branch. Make sure the permissions on the testing branch restrict everyone else from merging into it. After the merge, the test team should update to a specific revision.
.pyc
It is unnecessary to remove .pyc files manually; you can set the svn:ignore property so that Subversion ignores .pyc files. Create a file .svnignore containing
*.pyc
and run this command:
svn propset -R svn:ignore -F .svnignore .
If you’ve already got yourself into the mess of versioning your compiled files, then you can do either of these things:
find . -name "*.pyc" -exec svn revert {} \;
find . -name "*.pyc" -exec svn delete {} \;
Django settings file
You can set the DJANGO_SETTINGS_MODULE environment variable, through which Django picks up the corresponding settings file; see the Django settings documentation.
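For example (the settings module name here is assumed, not from the question):

# on the testing server, before starting Apache or manage.py
export DJANGO_SETTINGS_MODULE=mysite.settings_testing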
Designer
Well, a designer working directly on the test server is not a bonus point. :) It's a headache. In an ideal environment no one should touch the code on the testing server. Create a separate branch for the designer, or have him commit to the dev branch, which all the developers can merge from.
One option is to create a read-only svn user and have it check out the svn repository on the Apache server. Then to do a build you run 'svn update'. You can check whether the designer modified files by running 'svn status'.
If your svn repository is on the same machine as your QA Django instance, you could use a post-commit hook to run svn update after each commit, and bounce Apache if needed. See http://subversion.tigris.org/faq.html#website-auto-update
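A minimal sketch of such a hook (the working-copy path is an assumption; the script lives in hooks/post-commit inside the repository and receives the repository path and revision as arguments):

#!/bin/sh
# hooks/post-commit - update the QA working copy after every commit
REPOS="$1"
REV="$2"
# update the checkout that Apache serves (path assumed)
svn update --quiet /var/www/qa-django-site
# optionally bounce Apache when Python code changed
# /usr/sbin/apachectl graceful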