I am researching how to set up CI and continuous deployment for a small team project, a Django-based web application. Here are our needs:
Developers check code into a hosted SVN server (unfuddle.com).
A CI server detects the new check-in, checks out the source, builds it, and runs functional tests.
If all tests pass, it deploys the code to the web server on Amazon EC2.
For now, the CI server is also responsible for running the functional tests. I figured out that I can use Hudson as the CI server, Selenium to run the functional tests, and Fabric to deploy the build to the remote web server in the Amazon cloud.
I am new to Django development and not very familiar with open-source tools. My questions are:
I can find some information on integrating Hudson with Selenium, but I couldn't find much on integrating Fabric with Hudson. Is this setup viable? Do you see any problems?
How do I integrate and deploy database changes? In the early stages we will most likely change the database schema very often, along with code changes. I used to use Visual Studio, where the database project made deployment very simple. I wonder if there is an "established, well-supported" way to do that.
Thanks!!
Can't help you very much with continuous integration/deployment - at my work we used CruiseControl and it acted very much as you describe, but I didn't have anything to do with setting it up.
However in answer to your second question, on database changes, the current state of the art in Django is to use South, which has just released version 0.7. It works well in a CI environment, as db migrations are stored alongside the code for each app, so the CI server can be set to run them automatically before running the tests.
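For concreteness, here is a minimal fabfile sketch of how a Hudson job could drive this, assuming Fabric 1.x and South 0.7; the host, project path and restart command below are placeholders, not anything from your setup:

    # fabfile.py -- a sketch only; adjust host, paths and restart command to your project.
    from fabric.api import local, run, sudo, cd, env

    env.hosts = ['deploy@ec2-host.example.com']    # placeholder EC2 host

    def ci():
        """What Hudson runs after checkout: migrate the test DB, then run the tests."""
        local('python manage.py syncdb --noinput')
        local('python manage.py migrate')          # applies all pending South migrations
        local('python manage.py test')

    def deploy():
        """Run only if the build and the Selenium tests passed."""
        with cd('/srv/myproject'):                 # placeholder path on the EC2 box
            run('svn update')
            run('python manage.py migrate')
            sudo('/etc/init.d/apache2 reload')     # or however the site is served

Hudson would then just have two shell build steps, "fab ci" and "fab deploy", with the second gated on the first succeeding.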
I created a project on GCP. It has a Postgres database, a Node App Engine web app, and some other stuff. Now I am developing the app, and when everything is set up and running nicely I'd like to somehow clone this project and create a staging and a production environment/project.
So my project is now called dev-awesomeapp. Can I somehow make a staging-awesomeapp for staging and an awesomeapp for production from my existing dev-awesomeapp?
Edit: there is another question from 2017 that asks the same thing, but maybe it's possible now, 2.5 years later?
You can't, but if you don't want to configure everything from the beginning each time, you can use "infrastructure as code" with tools like Deployment Manager or Terraform.
This can help you replicate your infrastructure; moreover, it can be really helpful for automating architectural changes if you use it in a CI/CD pipeline, making your release phase quicker and more reliable :)
For a Django project, I'm looking for a tool that would:
update the code on my server from a given branch in a remote repository (example: a master branch from Bitbucket)
run basic Django commands (migrate, collectstatic, etc.)
restart the project
notify me that all went ok (on Slack for instance)
I've seen many possible ways of doing this (Ansible, DeployBot, Pipelines, etc.), but I was wondering if there is a tool you would recommend for a simple app?
Generally, on my Django projects I use Fabric (http://www.fabfile.org) for all the activities you mention: deploying from a particular branch, running Django commands (e.g. collectstatic), restarting the server, etc.
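As a rough illustration (a sketch only, assuming Fabric 1.x; the host, paths, branch name and Slack webhook URL are all placeholders), a fabfile covering your four points might look like:

    # fabfile.py -- sketch; adapt host, paths and the restart command to your project.
    import requests
    from fabric.api import run, sudo, cd, env

    env.hosts = ['deploy@myserver.example.com']    # placeholder host

    def deploy(branch='master'):
        with cd('/srv/myproject'):                 # placeholder project path
            run('git fetch origin && git checkout %s && git pull origin %s' % (branch, branch))
            run('venv/bin/python manage.py migrate --noinput')
            run('venv/bin/python manage.py collectstatic --noinput')
            sudo('systemctl restart gunicorn')     # or however the project is served
        notify('Deployed branch %s' % branch)

    def notify(message):
        # Placeholder incoming-webhook URL -- create one in your Slack workspace.
        requests.post('https://hooks.slack.com/services/XXX/YYY/ZZZ',
                      json={'text': message})

Running "fab deploy:branch=master" then covers the update, migrate/collectstatic, restart and Slack notification in one command.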
I am searching for a solution for continuous deployment in a cloud environment, more specifically in an Amazon AWS environment.
The code to be deployed is mainly Microsoft ASP and PHP, so the framework should work on both platforms. As I have an auto-scaling environment, the framework needs to work by pulling the new code, the way Puppet does.
My first thought was to deploy directly from the VCS, but I ran into the problem that all the repository information gets mirrored to the servers, which is how Git, for instance, works. This is a problem because the repository keeps growing and the servers will demand more and more space.
I found Ansible, which works the way I need but does not work in a Windows environment. It only sends the production code to the servers, not the VCS repository, and keeps track of which servers are updated.
Without an easy-to-set-up framework like this, I will need to build a Puppet + Jenkins + VCS pipeline, where Jenkins creates the package from the VCS source code and Puppet delivers it.
Does anybody know of a small framework that fits my needs, or is Puppet + Jenkins + VCS the way to go?
Consider CloudMunch (www.cloudmunch.com) for this. The platform is built exactly to solve this kind of polyglot requirement.
Disclaimer: I work for CloudMunch
I'm looking at setting up a continuous integration solution for a Django framework project hosted in AWS. There are client and server tests. The client tests fire up a headless browser, but it would also be good if they could fire up real browsers like Chrome and Firefox.
The repo is hosted on Bitbucket. What would be great is this: when code is committed to a branch, the CI setup pulls the changes and runs the tests in an environment that closely matches production.
Can you help me set up the best possible stack given the above constraints?
I think Fabric would be suitable for this. Fabric allows you to run code on your local machine and the server. Once you learn the basics of Fabric, you will see that you can run a command like 'fab deploy' and Fabric will commit your changes to a branch, then pull the changes onto the server.
You could have steps in between to launch a new server, prepare the environment, then run tests against the code. I have been using Fabric to simplify some deployments and have found it a great experience. Because of the flexibility Fabric gives you, you can pretty much deploy however you want, whether it's to a test server, QA, production, etc.
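To make that concrete, here is a minimal sketch of the kind of "fab deploy" task I mean (assuming Fabric 1.x; the host, project path, branch and restart command are placeholders):

    # fabfile.py -- sketch only
    from fabric.api import local, run, sudo, cd, env

    env.hosts = ['ubuntu@my-aws-box.example.com']  # placeholder AWS host

    def deploy(branch='develop'):
        local('git push origin %s' % branch)       # push local work to the shared repo
        with cd('/home/ubuntu/myproject'):         # placeholder path on the server
            run('git pull origin %s' % branch)     # pull the same branch onto the server
            run('venv/bin/python manage.py test')  # run the tests in the server environment
            sudo('service apache2 restart')        # or whatever serves the project

You could split this into separate tasks for the test, QA and live servers simply by defining different host lists for each.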
Recently, I have started deploying my work-in-progress Django site from my local machine to the server. But I have been doing it manually, which is ugly, unorganized, and error-prone.
I am looking for a way to automate and streamline the following deployment tasks:
Make sure all changes are committed and pushed to the remote source repository (Mercurial) and tag the release.
Deploy the release to the server (including any required 3rd-party apps missing from the server)
Apply the model changes to the database on the server
For 2), I have two further questions. Should the source of the deployment be my local env or the source repository? Do I need a differential or full deployment?
For 3), I use South locally to apply model changes to the database. Do I do the same on the server? If so, how do I apply multiple migrations at once?
I think Fabric is the de facto lightweight Python deployment tool (http://docs.fabfile.org/en/1.3.4/index.html). It is very simple and will help you keep your deployment organized and streamlined. It allows for easy scp or rsync, and it is easy to integrate with Django tests.
For my smaller projects I just make the source of my deployments my local env: I check out a clean copy and deploy from there. It would probably be better to integrate this with version control for a quick rollback if there are any errors once I deploy.
I have never used South, but I'd imagine you could just write a fab command to sync your production server. If you're using South in dev, I can't imagine why you wouldn't want to use it in production too.
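For example (a sketch only, assuming Fabric 1.x and South; the host and project path are placeholders), that fab command could be as small as:

    # fabfile.py -- sketch only
    from fabric.api import run, cd, env

    env.hosts = ['deploy@myserver.example.com']    # placeholder host

    def migrate():
        with cd('/srv/myproject'):                 # placeholder project path
            # A bare "migrate" applies every pending South migration for every app
            # in one go, so multiple migrations don't need separate commands.
            run('python manage.py migrate')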