Is there a way to run Pinax without virtualenv?
I want to run it without virtualenv because I want to run it in a Django container on Media Temple's Grid hosting service. Their containers can scale up to 1 GB of dedicated memory, so I wouldn't have to worry about my own VPS or scaling issues. But their response was:
" because of the way the DjangoContainer works, you won't be able to configure your server to use your virtualenv. Essentially the DjangoContainer is a virtualized server (to which you don't have access other than the AccountCenter tools, or the 'mtd' command line tool) with the specific purpose of serving your Django applications. It mounts your django container folder so that it has your application code, but you cannot modify the version or location of python it uses. This probably means you'll have to use Pinax without virtualenv support, as the general idea of using virtualenv in this way would be to create a custom environment for your Pinax application, which as I mentioned here is impossible to instruct the server to use. "
As of 0.9a1, Pinax can be used without pinax-boot.py, which was where the virtualenv dependency came from (we bundled it). Requirements are project-level and must be installed with pip. However, setup_project does enforce a virtual environment when installing requirements (it calls pip for you as a convenience; I would be open to not enforcing a virtual environment here). You can pass --no-reqs to setup_project to force it to skip dependency installation, then run pip yourself and install the requirements however you like.
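If you go that route, the flow might look something like this; it's only a sketch, and it assumes your Pinax install exposes setup_project through the pinax-admin script and that the generated project keeps a requirements.txt at its root (check your actual layout):

```bash
# Create the project but skip the enforced virtualenv/pip step,
# then install the project-level requirements yourself.
pinax-admin setup_project --no-reqs mysite
cd mysite
pip install -r requirements.txt   # file name/location depends on your project template
```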
Technically yes, but you would have to change out quite a bit of the configuration that is handed out and hand-install a lot of libraries. Pinax has virtualenv as a very low-level built-in assumption.
You can; all you need to do is find out what is in the virtualenv. Set one up, install yolk in it, and run yolk -l to see what you need to install to get it working.
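For example, inside a working Pinax virtualenv:

```bash
# List everything installed in the virtualenv that Pinax set up,
# so you can reproduce the same packages outside of a virtualenv.
pip install yolk
yolk -l
```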
Related
Hopefully this isn't too stupid of a question, concerning the use of pipenv for the same Django project on different computer systems. The scenario is that I'm using pipenv with a test Django project on one laptop, everything works fine, using VS Code and it's using the proper pipenv environment for the Python environment within VS Code.
The project, however, lives in Dropbox, so when I'm using a different laptop (which I do sometimes; one is my work laptop, the other is my personal one at home) I can pick up the same project wherever I left off. So you can probably deduce the issue I'm having.
I'm using pipenv environment A on my work laptop for the Django project. But when I open the project in VS Code on my personal laptop at home, I have to keep specifying the proper pipenv environment to use, which is obviously different from the one on my work laptop.
Maybe I shouldn't be working this way and should just work on one laptop for the project, but I imagine others have done the same with similar setups before.
Is there a "proper" way to do this, using different pipenv environments on different machines for the same Django project? Should I copy over the pipenv environment to each machine as well? Or am I missing something really simple here?
First, use version control. (git is the de facto standard these days and there are plenty of services to choose from.)
For using pipenv, you want to keep your Pipfile and Pipfile.lock in your repository (these are the only environment-related files you should copy from system to system).
The local environment is then built on each machine where it is needed by using pipenv sync, or pipenv sync --dev to install dev-specific requirements.
So the related virtualenv will be created on each system in which you are developing your project.
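A minimal sketch of that flow on a second machine (the repository URL is a placeholder):

```bash
# On the other machine: get the code, then rebuild the environment
# from the committed Pipfile.lock.
git clone https://github.com/you/yourproject.git
cd yourproject
pipenv sync --dev   # recreates the local virtualenv from Pipfile.lock
pipenv --venv       # prints the environment path, e.g. to point VS Code at it
```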
Assume the Django project repository is on GitHub and I have had no interaction with it previously.
So: I cd to a new directory on my computer.
I clone the repository.
If the Django project is using PostgreSQL, do I have to have PostgreSQL installed on my local machine?
Do I have to be running in a virtual environment to use a specific interpreter?
Thanks Peter
Database
You can actually use another database on your local copy if you choose, although in general it's a good idea to use the same database locally.
If you're going to be using Postgres locally, yes, you'll need to install it and then create your local database. Once you have your local database set up, you'll need to change some config values in the DATABASES setting in your settings.py.
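On a Debian/Ubuntu-style machine, for example, the local setup might look roughly like this (the package name, database name, and psycopg2 driver are assumptions; adjust for your OS and project):

```bash
# Install Postgres locally and create a database for the project.
sudo apt-get install postgresql
sudo -u postgres createdb myproject_dev

# Install the Python driver into your environment.
pip install psycopg2-binary

# Then edit DATABASES in settings.py so that ENGINE is
# 'django.db.backends.postgresql' and NAME/USER/PASSWORD/HOST/PORT
# match the local database you just created.
```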
Packages
Your project will also have some dependencies, which should be listed in a requirements.txt file in the root directory. If there isn't one, you'll need to find out which packages need to be installed by running pip freeze in the production console.
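Roughly, assuming you have console access to the production environment:

```bash
# On the production console: capture exactly what is installed there,
# then commit the resulting file at the root of the repository.
pip freeze > requirements.txt
```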
Virtual Env
You should use a virtual environment, but it's not strictly necessary to get your project up and running. Virtualenvs let you keep separate package installs and runtimes for different projects.
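Creating one for the cloned project is only a few commands (directory names here are arbitrary):

```bash
# Create and activate an isolated environment for this project,
# then install its dependencies into it.
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```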
Other
Every project is different, and there will most likely be some other things that pop up. However, this should get you going in the right direction.
This is a Django and Python and maybe just a general web development question.
What's the difference between using virtualenv vs Vagrant vs VirtualBox, etc.?
I'm kinda confused as to when to use which one :/ I've been using virtualenv this whole time and creating new virtual environments for different projects...
Is this the right way to do it?
One virtualenv per project?
I'm not really sure when and where Vagrant comes into play... Am I supposed to set up Vagrant and then use virtualenv?
This is probably a silly question, but... if I were working on this project with other people, would they too have to set up a virtualenv just to collaborate?
Wouldn't it make more sense for us all to work on our local machines and then push to the main branch? I'm just kinda confused... I feel like I'm doing it all wrong...
Thanks for the replies everybody!
Virtualenv sets up a local sandbox for you to install Python modules into.
Vagrant is an automation tool for creating Virtual Machines.
VirtualBox is a free, open source environment for running virtual machines, like those created by Vagrant.
Virtualenv is really about all you'll need to do sandboxed development on your local machine. We use Vagrant at my work to automate the creation of VMs. This way new developers coming on to a project have basically zero configuration to do in order to start working.
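If you do reach for Vagrant later, getting a throwaway VM up is only a couple of commands (the box name below is just an example of a published Ubuntu image):

```bash
# Download a base box definition and boot a VM from it under VirtualBox.
vagrant init ubuntu/focal64   # writes a Vagrantfile
vagrant up                    # creates and boots the VM
vagrant ssh                   # opens a shell inside it
```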
If you're collaborating with other devs, they don't need to do any of the above to work on your Django project, but if there's a lot of configuration involved that can't be done with pip and a requirements.txt, then you might look at Vagrant to ease some of that automation.
But you are correct in your assumption that you can all just work on a local branch and push back to the repo. Everything else is just icing.
Virtualenv is a Python construct that holds a specific set of packages, separate from your system packages. (The version of Python and the packages that came with your OS, or that you installed globally, are your "system packages".)
VirtualBox is totally different -- it runs a VM, an entire operating system in a box.
I'm not familiar with Vagrant.
All you need is virtualenv. Create a new virtualenv for each project (they're very lightweight!). You need to do this because the whole point of virtualenv is to isolate the exact packages, and versions of those packages, that your project needs. Then activate the virtualenv and use pip install to install the packages you need, presumably starting with Django itself.
Once you have all the packages you need, use pip freeze > requirements.txt to create a file called requirements.txt that records all of the packages you've decided to use.
When other people collaborate on your project, they can start a virtualenv, pull your code into it, and run pip install -r requirements.txt to replicate your environment. They can even modify requirements.txt, push that back to you via your version control system, and you can run pip install -r requirements.txt yourself to modify your environment to match their changes.
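From a collaborator's point of view, that round trip is just a handful of commands (the repository URL and the extra package are placeholders):

```bash
# Collaborator: recreate the author's environment, then record any changes.
git clone https://github.com/you/yourproject.git && cd yourproject
python3 -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt    # replicate the environment
pip install some-new-package       # hypothetical new dependency
pip freeze > requirements.txt      # record it so everyone else can sync up
```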
This is all essential because without virtualenv, the problem of, for instance, having one project on your computer that requires Django 1.4 and one that requires Django 1.5 becomes very complicated.
Virtualenv is not an entire operating system in a box, just a python environment, so even if you are using it, you are still working on your local machine.
We use virtualenv and a Ubuntu virtual machine. Here's why:
virtualenv allows us to have isolated Python environments on a given operating system instance
Using Ubuntu desktop in a virtual machine for our Python development mimics what it will look like when deployed on the server, which also runs Ubuntu. This means that we understand precisely the external OS package dependencies and configuration. You don't get this easily when you use OS X or Windows for development and Linux for deployment.
One important point is that a virtual machine is portable. You can take a snapshot and deploy it elsewhere easily. With the Vagrant and Ansible combination you can automate a remote deployment.
I have multiple Django projects running different Django versions in their own virtualenvs. I want to use the sphinx-apidoc command to generate API docs for the Django projects. However, I don't want to install Sphinx directly in the system and would like to install it in a separate virtualenv.
Since only one virtualenv can be activated at a time, I am not able to use sphinx-apidoc. Is there a way to use sphinx-apidoc with Sphinx and Django in independent virtualenvs, or is installing Sphinx directly in the system the only way to go?
The API documentation for your code can only be generated with proper access to your code, so the answer will be "no, you'll need to have them both in the same virtualenv".
Some extra thoughts:
If your code's virtualenv isn't isolated from the system's Python packages, you could install Sphinx globally, but you probably don't want that, and shouldn't.
I'd just add Sphinx to your code's virtualenv. I don't think you'll have to worry about the overhead of a few extra kilobytes.
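In practice that just means installing Sphinx into each project's virtualenv and running sphinx-apidoc from there; the paths below are placeholders:

```bash
# Inside the Django project's own virtualenv:
pip install sphinx
sphinx-apidoc -o docs/api/ myproject/   # output dir and package path are examples
```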
I have to deploy a Django application onto a SuSE Linux Enterprise 11 system. Corporate rules say I need to deploy using RPMs only. While I can use ./setup.py bdist_rpm for each dependency, it's not really sane, since RPM doesn't record all of the dependencies yet. I'd therefore have no real advantage in using RPMs, and managing dependencies manually is somewhat cumbersome; I would like to avoid it.
Now I had the following idea: While building a package, I could create a virtualenv, install all my dependencies via pip there and then package it up with the rest of the code into one solid RPM.
How sensible is this approach?
I've been using this approach for about a year now and it has worked out pretty well.
One gotcha is that you'll want to check the shebang ("bang") lines in any Python scripts written to the virtualenv's bin directory. These will contain the full paths used in your build environment, which probably won't be the same directory where you end up installing the virtualenv, so you may need to add some sed calls in your RPM's post-install scriptlet to adjust the paths.
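For example, a post-install snippet along these lines rewrites the build-time interpreter paths to the final install location; the /opt/myapp/venv prefix is an assumption for illustration:

```bash
# Rewrite shebang lines in the bundled virtualenv's bin/ scripts so they
# point at the installed prefix instead of the RPM build directory.
# /opt/myapp/venv is an assumed install location.
for f in /opt/myapp/venv/bin/*; do
  # Only touch text scripts whose first line is a Python shebang.
  if head -n 1 "$f" | grep -q '^#!.*python'; then
    sed -i '1s|^#!.*python.*$|#!/opt/myapp/venv/bin/python|' "$f"
  fi
done
```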