Is it possible to create multi-level environment definitions in Rails? - ruby-on-rails-4

Let's say I have an app that has the usual environments: development, staging, and production.
Then let's say I have a set of tasks that I need to run in an environment where a specific set of configuration options has been overridden -- let's say the DB host -- and these scripts (and their overrides) need to run in each environment.
One solution that comes to mind is to create a whole set of environments for each of these special cases, i.e. dboverride-development.rb, dboverride-staging.rb, and dboverride-production.rb. Each of these environments would inherit from its main environment, but then override the necessary configuration options. But this seems cumbersome and involves a lot of code duplication.
Are there existing strategies or conventions for this use case in Rails (v4 specifically)?

You can use environment variables to override specific options, as described in the Rails guides: if you have a config/database.yml but no ENV['DATABASE_URL'], that file is used to connect to your database, while if both config/database.yml and ENV['DATABASE_URL'] are set, Rails merges the two configurations. When duplicate connection information is provided, the environment variable takes precedence, as the sketch below illustrates.
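A minimal sketch of that merge behaviour, with illustrative values rather than the guide's own example:

    $ cat config/database.yml
    development:
      adapter: postgresql
      database: some_other_database
      host: localhost

    $ echo $DATABASE_URL
    postgresql://db-override-host/real_database

For the duplicated keys (database and host), the values from DATABASE_URL win, so in development the app connects to real_database on db-override-host. For the original question, this means the special tasks can be run in any of the existing environments simply by exporting DATABASE_URL before invoking them (or, since config/database.yml is ERB-processed, a narrower variable such as host: <%= ENV['DB_HOST'] %> if only the DB host needs to change), with no extra environment files.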

Related

How to run two separate Django instances on the same server/domain?

To elaborate, we have one server that we have set up to run Django. The issue is that we need to establish a "public" test server that our end users can test before we push the changes to production.
Now, normally we would have production.domain.com and testing.domain.com and run them separately. However, due to conditions outside our control we only have access to one domain. We will call it program.domain.com for now.
Is there a way to set up two entirely separate Django instances (i.e., we do not want the admin of the production version to be able to access demo data, and vice versa) such that we have program.domain.com/production and program.domain.com/development environments?
I tried looking over Django's "sites" framework, but as far as I can see, all it can do is separate domains, not paths, and both "sites" can still access the same data.
However, as I stated, we want to keep our testing data and our production data separate, yet give our end-user testers access to a version they can tinker with, keeping the production, public-test, and local-development (runserver command) versions separate.
I would use the /production or /development path prefix to select which database to use; a sketch of one way to wire that up follows below. You can read more about multi-tenancy here: https://books.agiliq.com/projects/django-multi-tenant/en/latest/
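A minimal sketch of that idea, assuming a single Django instance serves both URL prefixes from one URLconf, with two database aliases named "production" and "development" and synchronous workers; module, class, and database names are illustrative, not from the answer:

    # myproject/db_router.py
    import threading

    _state = threading.local()


    class EnvFromPathMiddleware:
        """Remember whether the request came in under /development or /production."""

        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            _state.db_alias = (
                "development" if request.path.startswith("/development") else "production"
            )
            return self.get_response(request)


    class PathBasedRouter:
        """Send every read and write to the database chosen by the middleware."""

        def db_for_read(self, model, **hints):
            return getattr(_state, "db_alias", "production")

        def db_for_write(self, model, **hints):
            return getattr(_state, "db_alias", "production")


    # settings.py would then contain something like:
    #   DATABASES = {"default": {}, "production": {...}, "development": {...}}
    #   DATABASE_ROUTERS = ["myproject.db_router.PathBasedRouter"]
    #   MIDDLEWARE should also include "myproject.db_router.EnvFromPathMiddleware"

Note that this keeps a single code base and a single admin site, so it does not give the hard separation of admin access the question asks for.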

Django on a single server. Web and Dev environments with web/dev databases

In a couple of months, I'm receiving a single (physical) Ubuntu LTS server to host a corporate, intranet-only web tools site. I've been trying to figure out a framework to go with. My preference at this point would be Django. I've used RoR, CF, and PHP heavily in the past.
My Django concern right now is how to have both a separate '/web/' and '/dev/' environment when I'm only getting a single server. Of course this would also mean needing separate 'web' and 'dev' databases (either separated by database name, or by having two different database instances running on the single server).
Option 1: I know I could set up only a 'web' (production) environment on Ubuntu and then use my corporate Windows laptop to develop Django tools. I've read this works fine, except that a lot of third-party Django packages don't work on Windows. My other concern would be making code changes and then pushing them to the Ubuntu server, where I might introduce problems that didn't show up in the local Windows development environment.
Option 2: Somehow set up separate Django 'web' and 'dev' environments on the same server. I've seen a lot of different and confusing information on this. Also adding to the complication is what I assume would be the need to have two database instances running on the same server. Or, how could you have two different Django environments for 'web' and 'dev' and have them point to different databases by name instead of needing two different database instances running?
Thanks for any advice. I'm actually having trouble relaxing and learning Django, not knowing how bad this is going to be to deal with. I could easily just put up with the pain of developing in basic PHP if this is too complicated. With plain PHP it's dead simple to have a '/web/' and a '/dev/' path and separate databases just by checking the URL or file path for '/web/' or '/dev/' (and then pointing to the right database, for example 'mytool_dev_v1' / 'mytool_web_v1').
There are multiple ways to solve this problem:
You can run two separate Django instances on the same server in different virtual environments. You can configure them in multiple ways: using environment variables, or just separate 'production' and 'dev' config files plus a switch that chooses which one is loaded (see the sketch after this answer).
You can use Docker containers to serve the different Django instances; I think that is the best way. You can configure them in the same way: with environment variables or with separate 'dev' and 'prod' config files.
If you want to serve two (or more) sites on the same server, you'll probably need to configure nginx to route requests to the separate containers or Django instances depending on the domain name or something else (the URL path, for example).
As far as I know, there is no problem configuring a separate database for each instance. You can also run your PostgreSQL or MySQL instance in a container, and nginx in the same way.
I can't recommend developing your app on the same server where the production app is running. I'm convinced that development should happen on the developer's own computer, but yeah... Windows is not the best platform for Django development, although it mostly works. Otherwise I can recommend dual-booting, or at least VirtualBox with Ubuntu.
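A minimal sketch of the environment-variable option mentioned above, assuming one settings module shared by both instances; the variable names (DJANGO_ENV, DB_NAME, and so on) and defaults are illustrative:

    # settings.py
    import os

    DJANGO_ENV = os.environ.get("DJANGO_ENV", "dev")  # "web" for production, "dev" otherwise

    DEBUG = DJANGO_ENV != "web"

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            # one PostgreSQL server, two databases selected by name,
            # e.g. mytool_dev_v1 and mytool_web_v1 as in the question
            "NAME": os.environ.get("DB_NAME", "mytool_%s_v1" % DJANGO_ENV),
            "USER": os.environ.get("DB_USER", "mytool"),
            "PASSWORD": os.environ.get("DB_PASSWORD", ""),
            "HOST": os.environ.get("DB_HOST", "localhost"),
            "PORT": os.environ.get("DB_PORT", "5432"),
        }
    }

Each instance (one gunicorn or uWSGI process per environment, fronted by nginx) is then started with its own DJANGO_ENV=web or DJANGO_ENV=dev, so a single database server with two differently named databases is enough.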

Terraform Multiple State Files Best Practice Examples

I am trying to build out our AWS environments using Terraform but am hitting some issues scaling. I have a repository of just modules that I want to use repeatedly when building my environments and a second repository just to handle the actual implementations of those modules.
I am aware of HashiCorp's GitHub page that has an example, but there each environment is one state file. I want to split environments out but then have multiple state files within each environment. When the state files get big, applying small updates takes way too long.
In every example I've seen where multiple state files are used, the Terraform files are extremely un-DRY, which is not ideal.
I would prefer to be able to have different variable values between environments but have the same configuration.
Has anyone ever done anything like this? Am I missing something? I'm a bit frustrated because Terraform examples are never at scale, and that makes it hard for a n00b such as myself to start down the right path. Any help or suggestions are very much appreciated!
The idea of an "environment" unfortunately tends to mean different things to different people and organizations.
To some, it's simply creating multiple copies of some infrastructure -- possibly only temporary, or possibly long-lived -- to allow for testing and experimentation in one without affecting another (probably production) environment.
For others, it's a first-class construct in a deployment architecture, with the environment serving as a container into which other applications and infrastructure are deployed. In this case, there are often multiple separate Terraform configurations that each have a set of resources in each environment, sharing data to create a larger system from smaller parts.
Terraform has a feature called State Environments that serves the first of these use-cases by allowing multiple named states to exist concurrently for a given configuration, and allowing the user to switch between them using the terraform env commands to focus change operations on a particular state.
The State Environments feature alone is not sufficient for the second use-case, since it only deals with multiple states in a single configuration. However, it can be used in conjunction with other Terraform features, making use of the ${terraform.env} interpolation value to deal with differences, to allow multiple state environments within a single configuration to interact with a corresponding set of state environments within another configuration.
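A minimal sketch of that first pattern, written for the Terraform 0.9-era syntax the answer refers to (terraform env / ${terraform.env}); resource and bucket names are illustrative:

    # main.tf -- one configuration, applied once per state environment
    resource "aws_s3_bucket" "artifacts" {
      bucket = "myapp-artifacts-${terraform.env}"

      tags = {
        Environment = "${terraform.env}"
      }
    }

    # $ terraform env new staging      # create and switch to the "staging" state
    # $ terraform apply                # creates myapp-artifacts-staging
    # $ terraform env select default
    # $ terraform apply                # creates myapp-artifacts-default

Per-environment differences (instance sizes, counts, and so on) can then be handled with map variables looked up by terraform.env, which is what keeps the configuration itself DRY across environments.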
One "at scale" approach (relatively-speaking) is described in my series of articles Terraform Environment+Application Pattern, which describes a generalization of a successful deployment architecture with many separate applications deployed together to form an environment.
In that pattern, the environments themselves (which serve as the "container" for applications, as described above) are each created with a separate Terraform configuration, allowing each to differ in the details of how it is configured, but they each expose data in a standard way to allow multiple applications -- each using the State Environments feature -- to be deployed once for each environment using the same configuration.
This compromise leads to some duplication between the environment configurations -- which can be mitigated by using Terraform modules to share patterns between them -- but these then serve as a foundation that allows other configurations to be generalized and deployed multiple times without such duplication.

What is the correct way to use different const data in local dev and production?

There are many things that differ between local development and production. For example, when using the Facebook API, I need to change the application id (because there are different ids for testing and production) every time I push an update to the app.
I only update the app, so what do Django developers usually do in this case? Possibly saving a variable to settings.py and then reading it from there, or creating a separate file in the virtual-environment folder, which in my case is at least also kept separate?
There is no official way of splitting your Django settings for prod and dev -- developers are encouraged to find a way that works for them. The Django docs list out several good options here: https://code.djangoproject.com/wiki/SplitSettings
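A common variant of the approaches listed there is a settings package with a shared base module plus one small module per environment; a minimal sketch, with illustrative module and constant names:

    # myproject/settings/dev.py
    from .base import *  # noqa: F401,F403 -- everything shared lives in base.py

    DEBUG = True

    # Constants that differ between environments, e.g. the Facebook app id
    # from the question (the value here is a placeholder).
    FACEBOOK_APP_ID = "test-app-id"

The environment is then selected per machine, e.g. DJANGO_SETTINGS_MODULE=myproject.settings.dev locally and myproject.settings.prod on the server, so the production id never has to be edited by hand when pushing an update.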

How to specify custom database-connection parameters for testing purposes in Play Framework v2?

I want to run my tests against a distinct PostgreSQL database, as opposed to the in-memory database option or the default database configured for the local application setup (via the db.default.url configuration variable). I tried using the %test.db and related configuration variables (as seen here), but that didn't seem to work; I think those instructions are intended for Play Framework v1.
FYI, the test database will have its schema pre-defined and will not need to be created and destroyed with each test run. (Though I don't mind if it is re-created and destroyed with each test run, I don't want to use "evolutions" to do so; I have a single SQL schema file I'm using at this point.)
Use alternative configuration files during local development to override the DB credentials (and other settings), i.e. as described in the other answer (Update 1).
Tip: using different kinds of databases in development and production quickly leads to errors and bugs, so it's better to install the same DB locally for development and testing.
We were able to implement Play 1.x style configs on top of Play 2.x - though I bet the creators of Play will cringe when they hear this.
The code is not quite shareable, but basically, you just have to override the "configuration" method in your GlobalSettings: http://www.playframework.org/documentation/api/2.0.3/scala/index.html#play.api.GlobalSettings
You can check for some config setting such as "environment.tag=%test" and then rewrite every "%test.foo=bar" entry into "foo=bar". A sketch of that idea follows below.
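A sketch of that rewriting, written against the Play 2.1-style onLoadConfig hook rather than the 2.0 "configuration" method mentioned above; treat the hook choice and all names here as assumptions, not the answerer's actual code:

    // app/Global.scala
    import java.io.File
    import scala.collection.JavaConverters._
    import com.typesafe.config.ConfigFactory
    import play.api.{Configuration, GlobalSettings, Mode}

    object Global extends GlobalSettings {
      override def onLoadConfig(config: Configuration, path: File,
                                classloader: ClassLoader, mode: Mode.Mode): Configuration = {
        // e.g. run tests with -Denvironment.tag=%test
        val tag = config.getString("environment.tag").getOrElse("")
        if (tag.isEmpty) config
        else {
          val prefix = tag + "."
          // copy every "%test.foo = bar" entry over the plain "foo" key
          val overrides = config.keys
            .filter(_.startsWith(prefix))
            .map(key => key.stripPrefix(prefix) -> config.underlying.getAnyRef(key))
            .toMap
          config ++ Configuration(ConfigFactory.parseMap(overrides.asJava))
        }
      }
    }

Because Configuration's ++ gives precedence to the right-hand side, the "%test."-prefixed values (for example a %test.db.default.url pointing at the dedicated PostgreSQL test database) override the defaults whenever the tag is set.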