This is a beginner question, but I haven't found the answer. I'd like to transfer my Django project from a virtual machine running Ubuntu 18.04 to another virtual machine, also running Ubuntu 18.04.
This is the example directory structure:
pd_videowebapp/
├── db.sqlite3
├── env
│ ├── bin
│ ├── lib
│ └── pyvenv.cfg
├── manage.py
├── media
│ ├── images
│ └── video
├── mysite
│ ├── core
│ ├── __init__.py
│ ├── __pycache__
│ ├── settings.py
│ ├── static
│ ├── templates
│ ├── urls.py
│ └── wsgi.py
├── Pipfile
├── requirements.txt
└── static
├── admin
├── style2.css
└── style3.css
The env directory contains a Python virtual environment.
Before transferring, I would run:
$ pip freeze > requirements.txt
Then I would zip the whole directory structure except for db.sqlite3 and the media directory.
Then unzip it on the other VM.
Then copy db.sqlite3 and the media directory to the right place.
Then create a virtual environment on the other VM.
Then run
$ pip install -r requirements.txt
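The steps above can be sketched end to end as a shell script; the directory names mirror the question's layout, and a throwaway copy of the project is built first so the commands are runnable as-is:

```shell
work=$(mktemp -d)            # scratch workspace standing in for both VMs
src="$work/pd_videowebapp"   # stand-in for the real project directory
mkdir -p "$src/env/bin" "$src/media/images" "$src/mysite"
touch "$src/db.sqlite3" "$src/manage.py" "$src/requirements.txt"

# On the old VM: archive everything except env, db.sqlite3 and media
tar -C "$src" --exclude=env --exclude=db.sqlite3 --exclude=media \
    -czf "$work/project.tar.gz" .

# On the new VM: extract, then copy db.sqlite3 and the media directory
# into place, create a fresh virtualenv, and pip install -r requirements.txt
dst="$work/new_vm"
mkdir -p "$dst"
tar -C "$dst" -xzf "$work/project.tar.gz"
```

The archive ends up without the virtualenv or the database, which is exactly what gets recreated or copied separately on the target machine.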
Or should I rather copy the whole project, including the env directory, from the beginning? Which is better? Did I omit something? Or is there a better approach?
It is better not to copy the env directory; exclude it.
There are lots of ways to do this. I suggest you use Git. For this:
create a Git repository from the current project
use a proper .gitignore file to ignore the env directory and other environment-related stuff:
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Other stuff
clone the project from the other VM and configure the virtual environment there
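A runnable sketch of the Git route, using made-up file names and a local-path clone to stand in for cloning between VMs:

```shell
work=$(mktemp -d)
proj="$work/pd_videowebapp"
mkdir -p "$proj/env" "$proj/mysite"
touch "$proj/manage.py" "$proj/requirements.txt" "$proj/env/pyvenv.cfg"
cd "$proj"

git init -q .
printf 'env/\ndb.sqlite3\n' > .gitignore     # ignore the virtualenv and db
git add .
git -c user.email=you@example.com -c user.name=you commit -qm "initial"

# "Other VM": a local-path clone here stands in for cloning over SSH
# or from a shared Git server; note the clone contains no env directory.
git clone -q "$proj" "$work/clone"
```

Because env/ is ignored, it is never committed, so the clone on the other VM arrives without it and you recreate the virtualenv there.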
Simpler Way:
zip your whole project while manually excluding the env directory and the other ignored stuff.
move the zip file to the other VM and configure the virtual environment there
You can use Docker for a simpler transfer,
using the Dockerfile below:
# syntax=docker/dockerfile:1
FROM python:3.9
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
RUN pip install --upgrade pip
COPY requirements.txt /code/
RUN pip install -r requirements.txt
#COPY daemon.json /etc/docker/daemon.json
COPY . /code/
EXPOSE 80
You can find more on dockerizing a Django app here: https://docs.docker.com/samples/django/
Hope you find this helpful!
You're thinking the right way. You may compress the project without the env directory.
Create a virtual environment on the new system: assuming that you have already installed python3-pip, python3-dev, etc. as required, set up the project on your new system by going to the project dir in the terminal and performing these commands (as you know):
# install virtual env
pip install virtualenv
# create virtual env
virtualenv -p /usr/bin/python3 venv
# activate virtual env
source venv/bin/activate
# install project dependencies
pip install -r requirements.txt
Then you're all good. Besides, you may create a remote repository (e.g. on GitHub) and ignore the virtual environment dir env via the .gitignore file.
If you're looking for a solution that doesn't require reconfiguring the new system, I highly recommend using Docker.
Each and every answer has an element that is missing from the others. I'll try and compile all of them into one actionable and reasonable plan.
Do not copy virtual env directories
The Python virtual environment directory is optimized to use links rather than copies, so that if you create similar virtual environments on a single machine, you won't bloat your disk with copies of the same pip packages and Python files.
Copying an environment breaks this model; it might not work or might misbehave.
Use git and .gitignore to specify volatile resources
If you manage your project using git, you should definitely have a .gitignore that says which resources are auto-generated, user-specific, volatile, etc. and should not be part of the shared repo.
You should set it up like @hamraa suggested before continuing to the next step.
Use git to relocate the project
Assuming both machines have an internet connection, why zip anything and move it around?
If both machines have access to the same Git server, you can just push your changes to the shared repo and pull them from the other VM.
If that's not the case, but the target VM has an SSH connection to the source machine, you can just call:
git clone ssh://<username>@<hostname>/<path-to-repo>
Use git-bundle to create a moveable bundle from your repo
If there's no network connection between the machines, you can use a handy Git feature for moving repos around as a single file.
The command is as simple as:
git bundle create app.bundle <commit/tag/branch>
You can then move this bundle to any other machine and use:
git clone app.bundle
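The whole bundle round trip can be sketched with a scratch repository (substitute your real repo and branch):

```shell
work=$(mktemp -d)
repo="$work/app"
mkdir -p "$repo" && cd "$repo"
git init -q .
echo "demo" > manage.py
git add . && git -c user.email=you@example.com -c user.name=you commit -qm "initial"

# On the source machine: pack the current branch into a single file
branch=$(git symbolic-ref --short HEAD)
git bundle create "$work/app.bundle" "$branch"

# Move app.bundle on a USB stick, via scp, etc.; on the target machine:
git clone -q -b "$branch" "$work/app.bundle" "$work/restored"
```

The restored clone is a normal Git repository; you can later ship incremental updates as new bundles and git fetch from them.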
Consider using Docker
I won't argue against Docker; I love Docker. But(!) I don't think it has to be the solution. With good bring-up scripts, you can have a very easy-to-use Git repo that you can set up quickly and easily. There are rarely cases where there's a single right way, and I don't think this is one of them.
Since you are transferring from one VM to another, and assuming you are on AWS, you can use an AMI to create a snapshot of your current instance (VM) and then launch a new instance (VM) from the saved AMI. The advantage of this is that it retains all your current settings, codebase, and data, so you don't have to configure the new machine from scratch.
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/creating-an-ami-ebs.html
I am new to Django and to programming overall, and I am trying to deploy my personal website developed with Django to heroku.
I feel like I am at the last step but one error keeps bugging me and I cannot solve it on my own or with documentation.
My project folders are like this: (screenshot omitted)
My Procfile: (screenshot omitted)
My wsgi file: (screenshot omitted)
And in the heroku logs I get the following:
2020-01-04T15:45:47.211072+00:00 app[web.1]: ModuleNotFoundError: No module named 'blog'
There are many blue lines, but I believe this is the problematic one (i.e. it cannot find the settings file).
Is there anything else I can provide if this is not clear?
Thanks for any help
I welcome any questions
UPDATE!
So, I updated the database from SQLite3 to PostgreSQL and I still get a Server Error 500.
Here is my manage.py where I put the PostgreSQL settings: (screenshot of the DATABASES setting omitted)
The app runs when I do manage.py runserver, but it does not when Heroku deploys it.
The heroku log looks like this: (screenshot omitted)
To summarise the process of investigating the startup problems, there were two issues:
1. The project layout was different for local development and for production on Heroku: the codebase was set up with the top-level mysite as the home path everywhere except the wsgi.py file, which caused extra time spent on troubleshooting.
.
├── Procfile <- web: gunicorn mysite.mysite.wsgi
├── mysite
│ ├── __init__.py
│ ├── blog
│ ├── db.sqlite3
│ ├── manage.py <- python manage.py runserver
│ ├── mysite
│ ├── static
│ ├── staticfiles
│ └── templates
├── requirements.txt
└── venv
Solution:
Fix the codebase to use mysite as the home path, and run gunicorn with either the --chdir or --pythonpath setting so that gunicorn has the proper path settings.
--chdir: Chdir to specified directory before apps loading:
web: gunicorn --chdir mysite mysite.wsgi
--pythonpath: Add directories to the Python path
web: gunicorn --pythonpath mysite mysite.wsgi
2. Migrations: no migrations were being applied.
Solution:
Adding a release phase to the Procfile, which applies migrations every time the app is deployed:
release: cd mysite && python manage.py migrate
After solving these two issues, the app was up and running.
I have the file hierarchy below:
├── my_project
│ ├── Dockerfile
│ └── my_files
│ ├── test.py
│ └── main.py
and the Dockerfile is:
FROM alpine:latest
MAINTAINER JohnSmith "johnhdf@gmail.com"
I have created an image under the my_project folder by:
docker build -t my_container .
and then started a container by running:
docker run -it <image id> /bin/sh
However, in the container, I could not find the folder my_files and main.py / test.py
Am I missing some steps? I thought running docker build with . would load the files in the current directory into the container somewhere, but I'm not sure where.
Mount the folder on your host into the folder in the container. To do that, update your run command as follows:
docker run -it -v <fullpath>/my_project/my_files:/root/my_files <image id> /bin/sh
Then you can see all the files from my_files under /root/my_files in your container.
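The volume mount above makes the host files visible at run time only. If you want the files baked into the image itself, note that docker build . merely sends the directory to the daemon as build context; nothing lands in the image unless the Dockerfile copies it in with COPY or ADD. A minimal sketch of such a Dockerfile (paths as in the question):

```dockerfile
FROM alpine:latest
WORKDIR /root
# Copy my_files from the build context into the image
COPY my_files/ /root/my_files/
```

With this, a plain docker run of the image already contains my_files, no -v mount needed.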
I want to use Django and create virtual environments. I don't quite understand the initializing steps documentation on the virtualenvwrapper website. I've installed virtualenvwrapper in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages. I've already installed Xcode, Homebrew and Postgres as well.
The documentation tells me to:
$ export WORKON_HOME=~/Envs
$ mkdir -p $WORKON_HOME
$ source /usr/local/bin/virtualenvwrapper.sh
$ mkvirtualenv env1
I'm especially confused about the first line. Is it telling me that I need to create a project folder named 'WORKON_HOME' and export it into another folder called 'Envs'? (I've searched for both folders on my mac but didn't find them). And then in the second line I make another directory 'WORKON_HOME'?
If you have suggestions or links to better explanations/tutorials, I would greatly appreciate it. Thanks.
Place these 3 lines in your ~/.bash_profile file:
export WORKON_HOME=$HOME/.virtualenvs
export PROJECT_HOME=$HOME/work
source `which virtualenvwrapper.sh`
The $HOME environment variable points to your user's home directory, also known as the tilde "~", i.e. /Users/your_osx_username/.
WORKON_HOME is the new environment variable that you are assigning by using the export call in your ~/.bash_profile file. This is where all your newly created virtualenv directories will be kept.
PROJECT_HOME is where you normally place all your custom project directories manually. It has nothing to do with your virtualenvs per se, but is just an easy reference point for you to cd to using the cd $PROJECT_HOME syntax.
which virtualenvwrapper.sh points to the location of the bash script virtualenvwrapper.sh, so when you source it, the functions in that script become available for your mkvirtualenv calls.
Whenever you open a "new shell" (new tab, close your current tab after you first update your ~/.bash_profile file), all these environment variables and bash functions will be thus available in your shell.
When you create a new virtualenv using mkvirtualenv -p python2.7 --distribute my_new_virtualenv_1, what actually happens is that a new directory called my_new_virtualenv_1, containing a symlink to your global python2.7 and a new python site-packages sub-directory, is created in your ~/.virtualenvs/ directory. Reference:
calvin$ mkvirtualenv -p python2.7 --distribute my_new_virtualenv_1
Running virtualenv with interpreter /opt/local/bin/python2.7
New python executable in my_new_virtualenv_1/bin/python
Installing distribute..........................................................................................................................................................................................................done.
Installing pip................done.
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/predeactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/postdeactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/preactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/postactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/get_env_details
So if you do
cd ~/.virtualenvs/my_new_virtualenv_1
calvin$ tree -d
.
├── bin
├── include
│ └── python2.7 -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7
└── lib
└── python2.7
├── config -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/config
├── distutils
├── encodings -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/encodings
├── lib-dynload -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload
└── site-packages
├── distribute-0.6.28-py2.7.egg
│ ├── EGG-INFO
│ └── setuptools
│ ├── command
│ └── tests
├── pip-1.2.1-py2.7.egg
│ ├── EGG-INFO
│ └── pip
│ ├── commands
│ └── vcs
└── readline
You will see this directory structure in it.
Note of course that you are using Envs while I am using .virtualenvs as the directory that holds the virtual envs.
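As an aside, Python 3's stdlib venv module (the modern successor to this workflow, not part of virtualenvwrapper) shows the same link-back idea: the created directory records the base interpreter in a pyvenv.cfg file instead of copying all of Python. A minimal sketch, assuming python3 is on your PATH:

```shell
work=$(mktemp -d)
# Create a throwaway environment without pip to keep it fast
python3 -m venv --without-pip "$work/demo_env"
# pyvenv.cfg records the "home" (base interpreter) the env points back to
cat "$work/demo_env/pyvenv.cfg"
```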
I want to include a Django app into the project I'm working on. The app is hosted on Github ( https://github.com/lmorchard/django-badger ). As well as the app's directory containing the goodies, there are some files in the root - README, LICENCE, TODO and setup.py. If I clone the app into my project's root directory, the app folder will be in the correct place, but those root files will be in my project's root. How can I add the app while still tracking the upstream code in Github?
I had a similar issue where I was working on two independent projects, each in its own repo, and one of them used the other as an app:
Create a virtualenv and install all dependencies for both projects. I usually like to have a virtualenv for each project/repo, but in this case you need one env which can execute Python from both repos.
Clone both repos to independent locations. Do not clone the depending app inside the other project. Your file structure might then look like this (assuming the Django 1.3 project layout):
project/
manage.py
project/
__init__.py
settings.py
...
...
app/
README
...
app/
__init__.py
models.py
...
And the final step is to create a symlink (or shortcut on Windows) from the app directory which has __init__.py in it into the project path.
$ ln -s /abs/path/to/app/app /abs/path/to/project/
Now you can use the virtualenv to run the project!
The final result is that you have two independent repos however one of projects is using the other project without directly copying the code, hence allowing you to maintain two repos.
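The symlink approach can be sketched end to end with throwaway directories standing in for the two repos (names here are illustrative only):

```shell
work=$(mktemp -d)
# "app" repo: README etc. at the top level, the actual package one level down
mkdir -p "$work/app/app" "$work/project"
touch "$work/app/README"
echo "name = 'badger'" > "$work/app/app/__init__.py"

# Symlink only the inner package into the project directory
ln -s "$work/app/app" "$work/project/app"

# The project now imports the app without the repo's top-level files
cd "$work/project" && python3 -c "import app; print(app.name)"   # prints: badger
```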
You can install it by running
python setup.py install
or through pip
sudo pip install -e git+https://github.com/lmorchard/django-badger#egg=django-badger
Clone the repository from GitHub using git://github.com/lmorchard/django-badger.git. Then open the cloned folder in a terminal and install the app using the command sudo python setup.py install. This works fine. If you want the app included in your project, create a folder named badger (or anything you wish) and copy the installed app from dist-packages into the created folder.
I started to create a sample project in Django. The first command:
django-admin.py startproject test
gives me:
- root
- test
- __init__.py
- settings.py
- urls.py
- wsgi.py
- manage.py
Now I create first app:
python manage.py startapp foo
it created the folder root/foo for me,
so how should I understand my root/test folder? Is this folder for the global config of my project and nothing more? (similar to the Symfony 2 app folder)
I am confused because Django docs tells:
The inner mysite/ directory is the actual Python package for your
project
but manage.py startapp foo creates the app under root, not under root/test (the mysite equivalent)
[EDIT]
Two commands:
python manage.py startapp app
and:
django-admin.py startapp app
gives me app inside project root, not under root/name_of_generated_project
Django 1.4
[EDIT] 2
Sorry guys, my fault; now everything is ok.
[EDIT] 3
I want to create another project again:
django-admin.py startproject jobeet
my initial structure is similar to above.
Now I want to try creating an app (inside the jobeet folder):
django-admin.py startapp jobs
and I end up with jobeet/jobs, not jobeet/jobeet/jobs,
again :/
so inside my project root I have:
- jobeet
- jobs
- manage.py
another command:
python manage.py startapp app
gives me the same result
So let's say you create a new Django project testproject:
django-admin.py startproject testproject
This creates a new project with the following minimal set of files:
testproject/
├── __init__.py
├── manage.py
├── settings.py
└── urls.py
To create a new app for your first site mysite1, go into the testproject directory and run:
python manage.py startapp mysite1
which results in a new directory mysite1 for the mysite1 app.
I.e. with just these two commands you would arrive at this hierarchy:
testproject/
├── __init__.py
├── manage.py
├── mysite1
│ ├── __init__.py
│ ├── models.py
│ ├── tests.py
│ └── views.py
├── settings.py
└── urls.py
Refer to the django-admin.py and/or manage.py individual commands here.
In Django there is a one-to-many relationship between a project and an app. An app is usually one individual site component (e.g. comments, ratings), whereas a project is an organisation of several apps and can power many different sites. That's why you get the sites framework. In practice, one project usually serves one website, but with Django sites app with one project you can serve as many websites as you like, for reusability's sake.
P.S. I think creating a project simply called test is not good practice, because Django unit tests at the app level go into a file called tests.py or a folder called tests.
UPDATE for Django 1.4
As @zeantsoi has commented below, my answer:
applies to Django 1.3 and prior. Per the docs, beginning in 1.4, base
configuration files (including settings.py, urls.py, and wsgi.py) are
abstracted into a subdirectory with the same name as the project.
"test" is the top level of your project. I've never used symphony 2 so I can't comment on that, but it seems like you have a grasp on it. The files that live in there are basically all global config files. Inside your "test" folder you should also have one or more app folders. Inside these app folders live the apps specific models, views, urls, etc.
It seems you've got something a little wrong: your foo app should live in root/test/foo, not root/foo.
Now, in my projects I tend to have things like a virtualenv folder living in the root dir, but you definitely shouldn't have apps at that level (it just won't work).
manage.py doesn't provide a startproject command - that's usually a django-admin command. I'd check which manage.py you're executing, and ideally, use the manage.py from the project directory you've created.