initializing virtualenvwrapper on Mac (10.6.8) for Django - django

I want to use Django and create virtual environments. I don't quite understand the initialization steps in the documentation on the virtualenvwrapper website. I've installed virtualenvwrapper in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages. I've already installed Xcode, Homebrew and Postgres as well.
The documentation tells me to:
$ export WORKON_HOME=~/Envs
$ mkdir -p $WORKON_HOME
$ source /usr/local/bin/virtualenvwrapper.sh
$ mkvirtualenv env1
I'm especially confused about the first line. Is it telling me that I need to create a project folder named 'WORKON_HOME' and export it into another folder called 'Envs'? (I've searched for both folders on my Mac but didn't find them.) And then in the second line I make another directory 'WORKON_HOME'?
If you have suggestions or links to better explanations/tutorials, I would greatly appreciate it. Thanks.

Place these 3 lines in your ~/.bash_profile file:
export WORKON_HOME=$HOME/.virtualenvs
export PROJECT_HOME=$HOME/work
source `which virtualenvwrapper.sh`
The $HOME environment variable points to your user's home directory, also known as the tilde "~", i.e. /Users/your_osx_username/.
WORKON_HOME is the new environment variable that you are assigning by using the export call in your ~/.bash_profile file. This is where all your newly created virtualenv directories will be kept.
PROJECT_HOME is where you normally place all your custom project directories manually. It has nothing to do with your virtualenvs per se, but is just an easy reference point for you to cd to using the cd $PROJECT_HOME syntax.
which virtualenvwrapper.sh points to the location where the bash script virtualenvwrapper.sh is located, and hence when you source it, the functions in that bash script become available for your mkvirtualenv calls.
Whenever you open a "new shell" (new tab, close your current tab after you first update your ~/.bash_profile file), all these environment variables and bash functions will be thus available in your shell.
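If you don't want to open a new tab, you can also reload the profile in your current shell and check that everything is wired up. A minimal sketch (the echoed path is only an example and will differ per machine):
source ~/.bash_profile
echo $WORKON_HOME             # e.g. /Users/your_osx_username/.virtualenvs
which virtualenvwrapper.sh    # the same path that gets sourced above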
When we create a new virtualenv using mkvirtualenv -p python2.7 --distribute my_new_virtualenv_1, what actually happens is that a new directory called my_new_virtualenv_1 is created in your ~/.virtualenvs/ directory, containing a symlink to your global python2.7 and a new python site-packages sub-directory. Reference:
calvin$ mkvirtualenv -p python2.7 --distribute my_new_virtualenv_1
Running virtualenv with interpreter /opt/local/bin/python2.7
New python executable in my_new_virtualenv_1/bin/python
Installing distribute..........................................................................................................................................................................................................done.
Installing pip................done.
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/predeactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/postdeactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/preactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/postactivate
virtualenvwrapper.user_scripts creating /Users/calvin/.virtualenvs/my_new_virtualenv_1/bin/get_env_details
So if you do
cd ~/.virtualenvs/my_new_virtualenv_1
calvin$ tree -d
.
├── bin
├── include
│   └── python2.7 -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7
└── lib
    └── python2.7
        ├── config -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/config
        ├── distutils
        ├── encodings -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/encodings
        ├── lib-dynload -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload
        └── site-packages
            ├── distribute-0.6.28-py2.7.egg
            │   ├── EGG-INFO
            │   └── setuptools
            │       ├── command
            │       └── tests
            ├── pip-1.2.1-py2.7.egg
            │   ├── EGG-INFO
            │   └── pip
            │       ├── commands
            │       └── vcs
            └── readline
You will see this directory structure in it.
Note of course that you are using Envs and I am using .virtualenvs to act as the virtual env holding directory.
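Day-to-day usage then looks something like this sketch (the virtualenv name is the one from the example above):
workon my_new_virtualenv_1    # activate the virtualenv
pip install django            # installs into its site-packages, not the global ones
deactivate                    # drop back to the system python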

Related

Transfer a django project to another machine

This is a beginner question, but I haven't found the answer. I'd like to transfer my Django project from a virtual machine with Ubuntu 18.04 to another virtual machine with Ubuntu 18.04.
This is the example directory structure
pd_videowebapp/
├── db.sqlite3
├── env
│   ├── bin
│   ├── lib
│   └── pyvenv.cfg
├── manage.py
├── media
│   ├── images
│   └── video
├── mysite
│   ├── core
│   ├── __init__.py
│   ├── __pycache__
│   ├── settings.py
│   ├── static
│   ├── templates
│   ├── urls.py
│   └── wsgi.py
├── Pipfile
├── requirements.txt
└── static
    ├── admin
    ├── style2.css
    └── style3.css
In the env directory there is a Python virtual environment.
Before I transfer it I would run
$ pip freeze > requirements.txt
Then I would zip the whole directory structure, except for db.sqlite3 and the media directory.
Then unzip it on another VM.
Then copy the db.sqlite3 and media directory to the right place.
Then create a virtual environment on another VM.
Then run
$ pip install -r requirements.txt
Or should I rather copy the whole project with env directory in the beginning? What is better? Did I omit something? Or is there a better approach?
It is better not to copy the env directory. Exclude this directory.
There are lots of ways to do this. I suggest you use Git. For this:
create a git repository from the current project
use a proper .gitignore file to ignore the env directory and other environment-related stuff:
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Other stuff
clone the project on the other VM and configure the virtual environment in that VM
Simpler Way:
zip your whole project while manually excluding the env directory and other ignored stuff (a sketch follows below).
move the zip file to the other VM and configure the virtual environment in that VM
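For example, a minimal zip command that skips the virtualenv plus the items you plan to copy separately (paths are taken from the question's layout; adjust as needed):
cd pd_videowebapp/
zip -r ../project.zip . -x "env/*" "db.sqlite3" "media/*"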
You can use Docker for a simpler transfer, with a Dockerfile like the one below:
# syntax=docker/dockerfile:1
FROM python:3.9
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
RUN pip install --upgrade pip
COPY requirements.txt /code/
RUN pip install -r requirements.txt
#COPY daemon.json /etc/docker/daemon.json
COPY . /code/
EXPOSE 80
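A quick way to try it out (image name and port mapping are only examples; since the Dockerfile defines no CMD, the command is passed explicitly):
docker build -t mysite .
docker run -p 80:80 mysite python manage.py runserver 0.0.0.0:80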
You can find more on dockerizing a Django app here: https://docs.docker.com/samples/django/
Hope you find this helpful!
You're thinking the right way. You may compress the project without the env directory.
Create a virtual environment on the new system: assuming you have already installed python3-pip, python3-dev etc. as required, set up the project on the new system by going to the project dir in the terminal and running these commands (as you know) -
# install virtual env
pip install virtualenv
# create virtual env
virtualenv -p /usr/bin/python3 venv
# activate virtual env
source venv/bin/activate
# install project dependencies
pip install -r requirements.txt
Then you're all good. Besides, you may create a remote repository (e.g. on GitHub) and ignore the virtual environment dir env via the .gitignore file.
If you're looking for a solution that doesn't require reconfiguring the new system, I highly recommend using Docker
Each and every answer has an element that is missing from the others. I'll try and compile all of them into one actionable and reasonable plan.
Do not copy virtual env directories
The Python virtual environment directory is optimized to use links rather than copies, so that if you create similar virtual environments on the same machine, you won't bloat up your disk with copies of the same pip packages and Python files.
Copying an environment breaks this model. It might not work, or it might misbehave.
Use git and .gitignore to specify volatile resources
If you manage your project using git, you should definitely have a .gitignore that says which resources are auto-generated, user-specific, volatile, etc. and should not be part of the shared repo.
You should set it up like @hamraa suggested before continuing to the next step.
Use git to relocate the project
Assuming you have an internet connection on both machines, why do you have to zip anything and move it?
If both machines have access to the same Git server, you should just push your changes to the global repo, and pull those from the other VM.
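In its simplest form (remote and branch names are only examples):
# on the source VM
git push origin master
# on the target VM
git clone <url-of-the-shared-repo>
# or, if the project is already cloned there: git pull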
If that's not the case, but the target VM has an SSH connection to the source machine, you could just call:
git clone ssh://<username>@<hostname>/<path-to-repo>
Use git-bundle to create a moveable bundle from your repo
If there's no network connection between the machines, you can use a cool Git feature of moving repos easily using a single file.
The command is as simple as:
git bundle create app.bundle <commit/tag/branch>
You can then move this bundle to any other machine and use:
git clone app.bundle
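If the bundle was created from a specific branch rather than HEAD, naming that branch when cloning avoids a dangling-HEAD warning (branch and target directory here are just examples):
git clone -b master app.bundle myproject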
Consider using Docker
I won't argue against Docker, I love Docker. But(!) I don't think it has to be the solution. With good bring-up scripts, you can have a very easy-to-use Git repo that you can set up quickly and easily. There are rarely cases where there's a single right way, and I don't think this is one of them.
Since you are transferring from one VM to another, assuming you are on AWS, you can use an AMI to create a snapshot of your current instance (VM) and then launch a new instance (VM) from the saved AMI. The advantage of this is that it retains all your current settings, codebase, and data, so you don't have to configure the new machine from scratch.
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/creating-an-ami-ebs.html
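If you prefer the CLI over the console, the equivalent is roughly (instance ID and image name are placeholders):
aws ec2 create-image --instance-id i-0123456789abcdef0 --name "my-django-vm-snapshot"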

Django i18n translations not working in production (Heroku)

My translations are working locally, but in production at Heroku, my site remains in its default language (English) after changing the language.
These are in my settings.py file:
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
USE_I18N = True
USE_L10N = True
LOCALE_PATHS = [os.path.join(BASE_DIR, 'locale')]
This is my structure:
myproject
├── Procfile
├── locale
│   └── fr
│       └── LC_MESSAGES
│           ├── django.mo
│           └── django.po
├── myproject
│   ├── __init__.py
│   └── settings.py
I thought it was a path issue so I SSH'd into my Heroku app and printed LOCALE_PATHS:
>>> from myproject.settings import LOCALE_PATHS
>>> print(LOCALE_PATHS)
['/app/locale']
And pwd in locale/ returns /app/locale.
What did I do wrong?
I found the issue:
my django.mo file was ignored by .gitignore as I use the default GitHub Python gitignore file.
The problem is that .mo files (compiled translations) are not present in the repo, and therefore not packaged to be deployed together with the rest of the application during the Heroku build process.
The possible solutions are to:
Add them to the repository
Generate .mo files during the build
I suggest generating them during the build, for these reasons:
It automates compilation, one less manual step
It will ensure that the translations are always up to date
The compiled files are not source, and therefore should not be in the repository
To generate them, you can use the post compile hook of the Heroku build:
Create a file bin/post_compile (no extension, like the Procfile) with the following line:
python ./manage.py compilemessages
Optionally you can add the specific language (e.g. python ./manage.py compilemessages -l nl)
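Putting it together, the file could look like this minimal sketch (fr comes from the locale tree in the question; replace or extend it as needed, or drop -l to compile every language):
#!/usr/bin/env bash
# bin/post_compile -- run by the Heroku Python buildpack after the dependencies are installed
python ./manage.py compilemessages -l fr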
I got this last part from this answer to a similar question.

Deploy private github repo golang app on elasticbeanstalk

I have been struggling for a couple of days trying to deploy my Golang app to AWS EB.
I am trying to deploy my app to an EB environment (Preconfigured Docker - Go 1.4 running on 64bit Debian/2.9.2) using the EB CLI, via the command eb deploy in my app folder.
After a couple of seconds, I get an error message saying that my app wasn't deployed because of an error.
Looking at the eb-activity.log, here is what I can see:
/var/log/eb-activity.log
-------------------------------------
Fetching https://golang.org/x/crypto?go-get=1
Parsing meta tags from https://golang.org/x/crypto?go-get=1 (status code 200)
golang.org/x/crypto (download)
Fetching https://golang.org/x/sys/unix?go-get=1
Parsing meta tags from https://golang.org/x/sys/unix?go-get=1 (status code 200)
get "golang.org/x/sys/unix": found meta tag main.metaImport{Prefix:"golang.org/x/sys", VCS:"git", RepoRoot:"https://go.googlesource.com/sys"} at https://golang.org/x/sys/unix?go-get=1
get "golang.org/x/sys/unix": verifying non-authoritative meta tag
Fetching https://golang.org/x/sys?go-get=1
Parsing meta tags from https://golang.org/x/sys?go-get=1 (status code 200)
golang.org/x/sys (download)
github.com/randomuser/private-repo (download)
# cd .; git clone https://github.com/randomuser/private-repo /go/src/github.com/randomuser/private-repo
Cloning into '/go/src/github.com/randomuser/private-repo'...
fatal: could not read Username for 'https://github.com': No such device or address
package github.com/Sirupsen/logrus
imports golang.org/x/crypto/ssh/terminal
imports golang.org/x/sys/unix
imports github.com/randomuser/private-repo/apis: exit status 128
package github.com/Sirupsen/logrus
imports golang.org/x/crypto/ssh/terminal
imports golang.org/x/sys/unix
imports github.com/randomuser/private-repo/app
imports github.com/randomuser/private-repo/app
imports github.com/randomuser/private-repo/app: cannot find package "github.com/randomuser/private-repo/app" in any of:
/usr/src/go/src/github.com/randomuser/private-repo/app (from $GOROOT)
/go/src/github.com/randomuser/private-repo/app (from $GOPATH)
I suppose there is an issue when the server tries to install the app; it seems to be trying to fetch the code from my private repo on GitHub ...
I referenced my app's sub-packages as github.com/randomuser/private-repo/subpackage, and I suppose this is why it behaves like that.
Is there a way to deploy all my code, forcing my private repo to be populated within the GOROOT src/github.com/randomuser/private-repo/ so the server doesn't have to try to get it?
I didn't find any proper example in the Amazon docs (for multi-package apps) nor on GitHub.
Am I missing anything? Is there a better solution?
On a side note, I also tried to deploy my compiled binary directly (create a folder containing only the binary, zip it, and upload it to the EB environment), but that didn't work either ... Maybe this option requires yet another env config (if so, which one?).
Thanks for your help :)
Configuration
Golang app with the following folders:
├── Dockerfile
├── server.go
├── Gopkg.lock
├── Gopkg.toml
├── Makefile
├── apis
│   ├── auth.go
│   ├── auth_test.go
│   ├── ...
├── app
│   ├── config.go
│   ├── init.go
│   ├── logger.go
│   ├── scope.go
│   ├── transactional.go
│   └── version.go
├── config
│   ├── dev.app.yaml
│   ├── errors.yaml
│   └── prod.app.yaml
├── daos
│   ├── auth.go
│   ├── auth_test.go
│   ├── ...
├── errors
│   ├── api_error.go
│   ├── api_error_test.go
│   ├── errors.go
│   ├── errors_test.go
│   ├── template.go
│   └── template_test.go
├── models
│   ├── identity.go
│   ├── ...
├── services
│   ├── auth.go
│   ├── auth_test.go
│   ├── ...
├── util
│   ├── paginated_list.go
│   └── paginated_list_test.go
Here is the content of my server.go
package main
import (
    "flag"
    "fmt"
    "net/http"
    "github.com/jinzhu/gorm"
    _ "github.com/jinzhu/gorm/dialects/mysql"
    "github.com/randomuser/private-repo/apis"
    "github.com/randomuser/private-repo/app"
    "github.com/randomuser/private-repo/daos"
    "github.com/randomuser/private-repo/errors"
    "github.com/randomuser/private-repo/services"
)
func main() {
    // getting env from command line
    // env is either prod, preprod or dev
    // by default, env is prod
    env := flag.String("env", "prod", "environment: prod, preprod or dev")
    flag.Parse()
    ...
    router.To("GET,HEAD", "/ping", func(c *routing.Context) error {
        c.Abort() // skip all other middlewares/handlers
        return c.Write("OK " + app.Version)
    })
    ...
    // Serve on port 5000
My Dockerfile content:
FROM golang:1.4.2-onbuild
ADD . /go/src/github.com/randomuser/private-repo
RUN go install github.com/randomuser/private-repo
EXPOSE 5000
ENTRYPOINT /go/bin/private-repo
I finally managed to make it work.
So I created a brand new EB app (without Docker).
I then figured out that my app somehow wasn't able to retrieve the env vars set in the console ...
So what I did was force the env variables to be passed to my app at startup from my production config file, using the build.sh script below:
#!/bin/bash -xe
# See http://tldp.org/LDP/abs/html/options.html
# -x -> Print each command to stdout before executing it, expand commands
# -e -> Abort script at first error, when a command exits with non-zero status
# (except in until or while loops, if-tests, list constructs)
# $GOPATH isn't set by default, nor do we have a usable Go workspace :'(
GOPATH="/var/app/current"
APP_BUILD_DIR="$GOPATH/src/to-be-defined" # We will build the app here
APP_STAGING_DIR="/var/app/staging" # Current directory
DEP_VERSION="v0.3.2" # Use specific version for stability
ENV_VAR_PREFIX="TO_BE_DEFINED_"
# Install dep, a Go dependency management tool, if not already installed or if
# the version does not match.
if ! hash dep 2> /dev/null ||\
   [[ $(dep version | awk 'NR==2{print $3}') != "$DEP_VERSION" ]]; then
    # /usr/local/bin is expected to be on $PATH.
    curl -L \
        -s https://github.com/golang/dep/releases/download/$DEP_VERSION/dep-linux-amd64 \
        -o /usr/local/bin/dep
    chmod +x /usr/local/bin/dep
fi
# Remove the $APP_BUILD_DIR just in case it was left behind in a failed build.
rm -rf $APP_BUILD_DIR
# Setup the application directory
mkdir -p $APP_BUILD_DIR
# mv all files to $APP_BUILD_DIR
# https://superuser.com/questions/62141/how-to-move-all-files-from-current-directory-to-upper-directory
mv * .[^.]* $APP_BUILD_DIR
cd $APP_BUILD_DIR
# Pull in dependencies into vendor/.
dep ensure
# Build the binary with jsoniter tag.
go build -o application -tags=jsoniter .
# Modify permissons to make the binary executable.
chmod +x application
# Move the binary back to staging dir.
# Along with the configuration files.
mkdir $APP_STAGING_DIR/bin
# By default, `bin/application` is executed. This way, a Procfile isn't needed.
mv application $APP_STAGING_DIR/bin
cp -r config $APP_STAGING_DIR
# TODO: Fix the viper not working with env var
# Generate prod config from env variables
/opt/elasticbeanstalk/bin/get-config environment --output YAML | sed s/${ENV_VAR_PREFIX}//g > $APP_STAGING_DIR/config/prod.app.yaml
# Copy .ebextensions back to staging directory.
# cp -r .ebextensions $APP_STAGING_DIR
# Clean up.
rm -rf $APP_BUILD_DIR
echo "Build successful!!"
My build.sh file is called by Elastic Beanstalk using this Buildfile:
make: ./build.sh
Et voilà! Everything works properly now :)

Transform a list of md files to html in Jekyll

I am building a console application that goes out to GitHub via Octokit and fetches all the matching readme.md files. Then I save these .md files to the _posts folder in my Jekyll project.
I used the jekyll build command to build at the _posts dir level. It created a _site folder that only contains .md files but no .html. What am I missing here?
jekyll build shouldn't run inside the _posts folder; it should be executed at the root level, i.e., typically where your _config.yml is.
.
├── 404.html
├── about.md
├── _config.yml
├── Gemfile
├── Gemfile.lock
├── index.md
└── _posts
   └── 2017-08-06-welcome-to-jekyll.markdown
Then your generated website will be located at /_site.
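For example, assuming a standard setup like the tree above:
cd /path/to/your/site       # the directory that contains _config.yml
bundle exec jekyll build    # or just: jekyll build
ls _site                    # the converted .html files end up here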

Merging an independent Git repo with another Git repo that is a conduit with Subversion: avoiding duplication when merging

I am happily developing a Django project in my own Git repo on my localhost. I am creating branches, committing and merging happily. The path is something like:
/path/to/git/django/
and the structure is:
project
├── README
├── REQUIREMENTS
├── __init__.py
├── fabfile.py
├── app1
├── manage.py
├── app2
├── app3
├── app4
└── project
The rest of my development team still use Subversion, which is one giant repo with multiple projects. When I am working with that on my localhost I am still using Git (via git-svn). The path is something like
/path/to/giant-svn-repo/
Projects live under this like:
giant-svn-repo
├── project1
├── project2
├── project3
└── project4
When I want to work with the latest changes from the remote svn repo I just do a git-svn rebase. Then for new features I create a new branch, develop, commit, check out master, merge the branch into master, delete the branch, and then do a final git-svn dcommit. Cool. Everything works well.
These two repositories (let's call them git-django and git-svn) are completely independent right now.
Now I want to add git-django into the git-svn repo as a new project (i.e. in a child directory called djangoproject). I have this working pretty well, using the following workflow:
cd into git-svn repo
Create a new branch in the git-svn repo
Make a new directory to host my django project
Add a new remote that links to my original Django project
Merge the remote into my local directory
Read-tree with the prefix set to the relative path djangoproject so that it puts the codebase into the correct location relative to the root of the git-svn repo
Commit the changes so everything gets dumped into the correct place
From the command line this looks like:
> cd /path/to/giant-svn-repo
> git checkout -b my_django_project
> mkdir /path/to/giant-svn-repo/djangoproject
> git remote add -f local_django_dev /path/to/git/django/project
> git merge -s ours --no-commit local_django_dev/master
> git read-tree --prefix=djangoproject -u local_django_dev/master
> git commit -m 'Merged local_django_dev into subdirectory djangoproject'
This works, but in addition to the contents of the django git repo being in /path/to/giant-svn-repo/djangoproject, it is also in the root of the repository tree!
project
├── README
├── REQUIREMENTS
├── __init__.py
├── fabfile.py
├── djangoproject
│   ├── README
│   ├── REQUIREMENTS
│   ├── __init__.py
│   ├── fabfile.py
│   ├── app1
│   ├── manage.py
│   ├── app2
│   ├── app3
│   ├── app4
│   └── project
├── app1
├── manage.py
├── app2
├── app3
├── app4
└── project
I seem to have polluted the parent directory where all the projects of the giant-svn-repo are located.
Is there any way I can stop this happening?
(BTW this has all been done in a test directory structure - I haven't corrupted anything yet. Just trying to figure out the best way to do it)
I am sure it just requires (re)defining one more argument to git merge, git read-tree or git commit, but I am pretty much at the limit of my git kung-fu.
Thanks in advance.