Searching for a project skeleton for Chef + Django on Linux

Is there a pre-existing, best practices project skeleton for Chef + Django web applications on Linux (Ubuntu preferably)?
For production Django systems our preferred setup is Supervisor, Nginx, Ubuntu and Uwsgi. Additionally we use Chef to do configuration management and Vagrant + Chef to do development environment management.
While this system is great once it's all up and running, it can be very time-consuming to set up properly.
My ideal solution would be a pre-made Chef GitHub repository containing a skeleton for a best-practices Django deployment. (It would come with a chef-solo.rb ready to deploy to some cloud Ubuntu instance and a Vagrantfile ready to create a Vagrant dev machine.) Basically, all you would have to do is add a Chef cookbook to deploy your application code and tweak a few settings.
Does anything like that ideal solution exist?

Here's a typical Chef-based configuration setup:
One git repo holds the chef-repo. You can use knife solo init <repo-name> to create it, or just clone the skeleton from Opscode's git repo.
One git repo per cookbook. You can use berks cookbook <your-cookbook-name> to generate a full cookbook scaffold, including the cookbook itself plus Test Kitchen, Vagrant, and Berkshelf files. Install Berkshelf first via gem install berkshelf.
For any other cookbooks from the community site or a git repo, you can use Berkshelf to download them and manage them as local cookbooks.
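A generated cookbook then declares its dependency sources in a Berksfile. A minimal sketch (the cookbook names and version pins here are illustrative, and the source line varies by Berkshelf version):

```ruby
# Berksfile — minimal sketch; cookbook names and pins are illustrative
source 'https://api.berkshelf.com'   # Berkshelf 2.x syntax; 1.x used `site :opscode`

metadata   # resolve dependencies declared in this cookbook's metadata.rb

# a community cookbook with a version constraint
cookbook 'nginx', '~> 2.0'

# a cookbook pulled straight from a git repo
cookbook 'my-internal-lib', git: 'git://example.com/my-internal-lib.git'
```

Running berks install then resolves and downloads everything into Berkshelf's local store.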

Related

Setup puppetDB with puppet opensource on AWS

I have a working setup of puppet open-source 3.8 with a puppet master and several nodes.
I would like to install puppetDB and a dashboard so I can get a good overview about my nodes.
To avoid destroying anything in the current setup, I wish to install PuppetDB on a separate server. (Everything is running on AWS EC2 instances.)
I was trying to use the following descriptions and let puppet install puppetDB:
PuppetDB 2.3.8-1.el6 - I believe this is the latest version compatible with puppet 3.8
I've managed to install puppetDB on the DB node, however I can't connect my puppet master to the node.
Based on this documentation:
Connecting Puppet Masters to PuppetDB
I need to install puppetdb-terminus. However, I'm using Puppet open source, so "sudo puppet resource package puppetdb-terminus ensure=latest" fails on the puppet agent with a dependency error.
That's fine, but I also have issues with the alternative solution:
Download the PuppetDB source code, unzip it and navigate into the resulting directory in your terminal.
Run sudo cp -R puppet/lib/puppet/ /usr/lib/ruby/site_ruby/1.8/puppet.
There is no puppet/lib/puppet directory in either the puppetdb-2.3.8.tar.gz or the puppetdb-3.2.4.tar.gz archive.
As my last hope, I went to GitHub to grab the source:
https://github.com/puppetlabs/puppetlabs-puppetdb/tree/master/lib/puppet
I've copied these files to /usr/lib/ruby/site_ruby/1.8/puppet.
/var/log/messages:Mar 18 13:08:03 ip-10-84-4-172 puppet-master[25616]: Could not configure routes from /etc/puppet/routes.yaml: Could not find terminus puppetdb for indirection facts
At this point I'm completely stuck. How can I verify my puppetdb-terminus installation? If this approach is not good, how can I install it on my puppet master?
(I'm using RHEL 6 and Puppet open source 3.8, and I've made all the other changes on the puppet master based on the documentation.)
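For reference, the routes.yaml the error complains about should look like this according to the "Connecting Puppet Masters to PuppetDB" documentation — the facts terminus it names is exactly the one the log says it cannot find:

```yaml
# /etc/puppet/routes.yaml on the puppet master
master:
  facts:
    terminus: puppetdb
    cache: yaml
```

If this file is correct but the error persists, the terminus files are most likely not on the master's Ruby load path.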
Just for the sake of completeness, here is my puppetDB puppet manifest:
class { 'puppetdb::globals':
  version => '2.3.8-1.el6',
}
class { 'puppetdb::database::postgresql':
  listen_addresses => $postgres_host,
}
class { 'puppetdb::server':
  database_host => $puppetdb_host,
}
I've just migrated my PuppetDB from the puppet master server to a standalone one. To handle the installation of PuppetDB I used this module from Puppet Labs.
It was straightforward. The DB migration was done with puppetdb export on the master server and puppetdb import on the new server. The last step was changing the PuppetDB address in the puppet master config.
The Puppet repo pages are tricky: there is a separate repo for the pre-4.0 open-source binaries:
https://docs.puppetlabs.com/guides/puppetlabs_package_repositories.html#pre-40-open-source-repositories
After using this repo, there were no more issues during installation.

docker unit test setup

I want to set up a unit test environment for my product. I have a web application built on nginx with Lua that uses MySQL and Redis. I think Docker will be good for this, although I am new to Docker. My application runs on a CentOS server (the production server).
I am planning to set up separate containers for MySQL, Redis, and the web app, and then write a UT application (unit tests for Lua using the Busted framework) on my Mac (my development machine is a Mac) or in a VM to test it. The UT application will talk to the nginx container, and nginx will use the MySQL and Redis containers. Is this a good approach? If yes, can someone guide me on how to do this, maybe with a good link? If not, what would be a better way? I have already tried Vagrant, but that took too much time, which shouldn't be the case for unit tests.
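To make the idea concrete, here is a minimal docker-compose.yml sketch for such a stack. The image names, versions, and ports are assumptions to adjust for your app; the nginx + Lua image in particular depends on how you build your web application:

```yaml
# docker-compose.yml — minimal sketch; images, versions, and ports are assumptions
web:
  build: .            # your nginx + Lua application image
  ports:
    - "8080:80"       # expose nginx so the UT application on the Mac can reach it
  links:
    - mysql
    - redis
mysql:
  image: mysql:5.6
  environment:
    MYSQL_ROOT_PASSWORD: secret
redis:
  image: redis:2.8
```

The Busted tests would then run from the Mac against host port 8080, while nginx talks to MySQL and Redis over the container links.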
For an example of how we set up our project template, you may have a look at phundament/app and its testing setup.
We are using a dockerized GitLab installation with a customized runner, which is able to execute docker-compose.
Note! The runner itself is running on a separate Docker host.
We are using docker-compose.yml to define the services in a stack with adjustments for development and testing.
The CI configuration is optimized to handle multiple concurrent tests of isolated stacks; this is done simply by specifying a custom COMPOSE_PROJECT_NAME.
Some in-depth documentation about our testing process and useful information about docker-compose and dockerized CI:
#testing README
#testing Docs
CI builds
Extending services and Compose files
Docker-in-Docker for CI?
Finally, Travis CI has also supported Docker for a while, but I haven't tested this approach at all.
If you are new to Docker based CI, please look at Drone:
Official page
Github repo
Tutorial
There are some drawbacks to this solution (like the size of the images), but it will get you off the ground.

AWS, OpsWorks and Chef dependencies: what's the cleanest solution?

I've got a Chef project that, locally with Vagrant, works really nicely. I'm using librarian-chef, which means I can specify my dependencies in a Cheffile like this:
site 'http://community.opscode.com/api/v1'
cookbook 'jenkins'
When I then run librarian-chef install, it pulls down jenkins and all the cookbooks it depends on into a cookbooks directory.
There's also another directory, site-cookbooks, which is where I'm writing all of my own custom cookbooks and recipes.
In the Vagrantfile, you can then tell it to look at two different paths for cookbooks:
config.vm.provision "chef_solo" do |chef|
  chef.cookbooks_path = ["cookbooks", "site-cookbooks"]
  # snip
end
This works perfectly when I run vagrant up. However, it doesn't seem to play nicely with AWS OpsWorks – as this requires all cookbooks to be at the top level of the Chef repository.
My question then, is: what's the nicest way to use Chef with OpsWorks without including all of the dependencies at the top level of my repository?
OpsWorks doesn't play nice with a number of tools created for Chef. I tried using it not long after it came out and gave up on it (OpsWorks was using Chef 0.9 at the time).
I suggest you either move from OpsWorks to Enterprise Chef, or try the following:
1. Create a separate repo for your cookbooks
Keep all cookbooks in a separate repo. If you want them to be part of a larger repository, you can include them as a git submodule. Git submodules are generally a bad thing, but cookbooks are a separate entity that can live independently of the rest of your project, so it actually works quite well in this case.
To add a cookbooks repository inside another repo, use:
git submodule add git://github.com/my/cookbooks.git ./cookbooks
2. Keep your cookbooks together with community cookbooks
You can either clone the cookbooks into your repository, add them as submodules, or try using librarian-chef/Berkshelf to manage them. You could try this method; it should work with librarian-chef: https://sethvargo.com/using-amazon-opsworks-with-berkshelf/
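A sketch of the Berkshelf-based approach, assuming your own cookbooks live in site-cookbooks/ as in the question (cookbook names are illustrative): declare everything in one top-level Berksfile, then vendor the resolved set into a flat directory that OpsWorks can consume, with something like berks install --path cookbooks.

```ruby
# Berksfile at the repo root — sketch; cookbook names are illustrative
source 'https://api.berkshelf.com'

# community dependency (pulls in its own transitive deps)
cookbook 'jenkins'

# your own cookbooks, kept out of the top level in site-cookbooks/
cookbook 'myapp', path: 'site-cookbooks/myapp'
```

The vendored cookbooks/ directory is then the only thing the OpsWorks repo needs at its top level.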
As of recently, OpsWorks supports Chef 11.10 and Berkshelf, giving you a much nicer way of managing cookbook dependencies.

Berkshelf loading wrong dependencies

I'm trying to create a cookbook to install my development environment in Vagrant.
I am using Vagrant 1.4.3 on OS X 10.9 and Berkshelf 2.0.13.
The cookbook I'm developing is in github (https://github.com/Batou99/console-development)
When I run berks install inside my cookbook folder, everything is fine: Berkshelf downloads the right dependencies. I am especially interested in 7even/oh-my-zsh, which Berkshelf downloads just fine.
But when I use my new cookbook in a Vagrant machine I want to provision, I load it in the Berksfile using
cookbook 'console-development', git: 'http://github.com/Batou99/console-development'
Somehow Berkshelf ends up loading the oh-my-zsh cookbook stored on Opscode, which is completely different.
What am I doing wrong? Why does Berkshelf ignore the path I set in my cookbook?
I've been trying to fix this for several hours and it's driving me crazy; any help would be much appreciated.
This is a known issue with Berkshelf. For a full list of open issues, please see: https://github.com/berkshelf/berkshelf/issues
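Until it is fixed, a common workaround is to pin the transitive dependency explicitly in the Vagrant machine's Berksfile, since an explicit top-level entry takes precedence over community resolution. A sketch using the repos from the question:

```ruby
# Berksfile for the Vagrant machine — pin both cookbooks explicitly
cookbook 'console-development',
  git: 'http://github.com/Batou99/console-development'

# force the 7even fork instead of the community oh-my-zsh cookbook
cookbook 'oh-my-zsh',
  git: 'git://github.com/7even/oh-my-zsh.git'
```

This makes the desired source explicit rather than relying on the dependency being resolved through the inner cookbook's Berksfile.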

Spree Starting Server

I cloned the Git repository onto a local machine and onto Amazon Web Services.
I've tried using script/server and rails s.
The code was cloned using git://github.com/spree/spree.git
How do you start the server?
I'd rather use the full Git source so I can ultimately change the template.
Check out the section on the GitHub page under "Working With Edge Source":
https://github.com/spree/spree
Clone the Git repo
git clone git://github.com/spree/spree.git
cd spree
Install the gem dependencies
bundle install
Create a sandbox Rails application for testing purposes (and automatically perform all necessary database setup)
bundle exec rake sandbox
Start the server
cd sandbox
rails server
I just checked it out and it works, but edge currently isn't working. You might just want to fork it and use your own fork as a gem for a standard Spree store setup.
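If you take the fork route, the Gemfile of a standard Rails app can point at your fork; a sketch where the URL and branch are placeholders for your own:

```ruby
# Gemfile — point the spree gem at your own fork (URL and branch are placeholders)
gem 'spree', git: 'git://github.com/yourname/spree.git', branch: 'master'
```

After bundle install, your app pulls Spree from the fork, so template changes live in your own repository.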