I have been trying to follow this particular guide on setting up a workon function where I am able to quickly switch from project to project within a virtualenv...
https://python-guide.readthedocs.org/en/latest/dev/virtualenvs/
I'm trying to re-organize my virtualenvs into one particular folder (I don't know if this is a smart or dumb idea...):
Structure:
-master_folder/ #I'm currently in this folder
    -virtual_enviornments/
        -project1/
        -project2/
    -projects/
        -project1/
        -project2/
It so happens that I am stuck at the third step:
$ pip install virtualenvwrapper
$ export WORKON_HOME=~/Envs
$ source /usr/local/bin/virtualenvwrapper.sh #STUCK HERE
ERROR: -bash: /usr/local/bin/virtualenvwrapper.sh: No such file or directory
I know the file does not exist within the directory, so am I supposed to create it some other way? I'm really confused... #.#
After running which virtualenv:
/Library/Frameworks/Python.framework/Versions/2.7/bin/virtualenv
Did I install my virtualenv incorrectly?
I did just do pip install virtualenv....
Thanks for all the help guys!
I don't think you have installed virtualenvwrapper correctly.
Please go through this link
http://ayarshabeer.com/post/50973941605/install-multiple-django-version-using-virtualenvwrapper
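As a rough check (a sketch, not a guaranteed fix: it assumes pip placed virtualenvwrapper.sh next to your Python installation rather than in /usr/local/bin), you can search for where the script actually landed and source it from that location instead:
$ find /Library/Frameworks/Python.framework -name virtualenvwrapper.sh 2>/dev/null
$ find /usr/local -name virtualenvwrapper.sh 2>/dev/null
# point the guide's command at whatever path turns up, for example:
$ export WORKON_HOME=~/Envs
$ source /Library/Frameworks/Python.framework/Versions/2.7/bin/virtualenvwrapper.sh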
I'll try to explain this as simply as I can. I tried to include and build package "A" in my Yocto image, but package "A" depends on libftdi and ftdi-eeprom, and "ftdi-eeprom" itself depends on "libftdi".
In newer versions of "libftdi" the tarball also includes the ftdi-eeprom sources, and building libftdi builds both packages. However, because of the way package "A" is configured, I need a separate recipe for each of the dependencies.
Long story short, I wrote the two BitBake recipes as best I could and successfully built "libftdi". Now when I run the "ftdi-eeprom" recipe, it wants to populate files into the sysroot that libftdi has already installed there. This is where the error occurs... duplicates!
Apparently I need to set the SSTATE_DUPWHITELIST variable and declare that these duplicate files are safe to replace the old ones in the image (this overwrite must happen). Can someone please help me with configuring SSTATE_DUPWHITELIST? I am not very experienced with Yocto.
Errors that I get on screen are uploaded in Dropbox
Thanks in advance!
The answer is to not use SSTATE_DUPWHITELIST for this at all. Instead, in the libftdi recipe's do_install (or do_install_append, if the recipe itself doesn't define its own do_install) you should delete the duplicate files from within ${D} and then they won't get staged and the error won't occur.
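A minimal sketch of that approach (the file names below are placeholders; substitute whichever files the error actually reports as duplicated):
# In the libftdi recipe: remove the files that the ftdi-eeprom recipe
# also installs, so they only ever get staged once.
do_install_append() {
    rm -f ${D}${bindir}/ftdi_eeprom        # placeholder for a duplicated program
    rm -f ${D}${includedir}/ftdi-eeprom.h  # placeholder for a duplicated header
}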
I got it to work by using:
SSTATE_DUPWHITELIST = "/"
Don't forget the quotes. Here's my bb excerpt:
SSTATE_DUPWHITELIST = "/"
DEPENDS = ""
do_unpack() {
    mkdir -pv ${S}
    tar xvf ${DL_DIR}/${FILENAME}.tar -C ${S}
}

do_install() {
    install -d -m 755 ${D}${includedir}
    install -m 644 ${S}/${MYPATH}/inc/myHeader1.h ${D}${includedir}
    install -m 644 ${S}/${MYPATH}/inc/myHeader2.h ${D}${includedir}
    install -m 644 ${S}/${MYPATH}/inc/myHeader3.h ${D}${includedir}
}
I managed to solve this problem by adding the SSTATE_DUPWHITELIST to the bitbake recipe of the package as follows:
SSTATE_DUPWHITELIST = "${TMPDIR}/PATH/TO/THE/FILES"
I added the absolute paths of all six or seven files that had the conflict to the list. I did that because they were basically coming from the same source, so it was safe to do. Correct me if there is a better way, though.
Hope this helps someone!
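In recipe terms that looks roughly like the following (the machine name and file paths are placeholders; list the exact files from your own error output):
# One whitespace-separated entry per conflicting file reported by the error.
SSTATE_DUPWHITELIST += " \
    ${TMPDIR}/sysroots/mymachine/usr/include/ftdi.h \
    ${TMPDIR}/sysroots/mymachine/usr/lib/libftdi.so \
"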
I am a brand new noob and have no idea how to make this mkvirtualenv environment install the proper requirements for a tutorial I forked here.
I created the environment with mkvirtualenv django-angular-tutorial, but when I try to install the requirements (which are in the forked folder on my desktop) it gives me an error saying that the directory is not found.
Do I have to manually put this forked folder into the Mkvirtualenv folder for it to find it? Or how does this work?
"No such file or directory: 'requirements.txt'". So, there is no requirements file.
From the ~ in your prompt, it appears you're still in your home directory, and never cd-ed into the cloned/forked folder. Your requirements file is probably there.
To answer my own question:
I was not in the proper directory, so the terminal came back with an error saying that it could not find "requirements.txt".
What I had to do was first navigate to the folder that "requirements.txt" was located in (the folder was on my desktop).
In terminal I did this:
1) Navigated to my Forked Folder
Perez-Austin:~ perau$ cd /Users/perau/Desktop/folderThatHasRequirements
2) Now you are pointed at the right directory, and your prompt should look like this:
Perez-Austin:folderThatHasRequirements perau$
3) "Create" or "Workon" your MkVirtualEnv (mine is called django-angular-tutorial)
Perez-Austin:folderThatHasRequirements perau$ workon django-angular-tutorial
4) Then install the requirements.txt into your virtualenv:
(django-angular-tutorial)Perez-Austin:folderThatHasRequirements perau$ pip install -r requirements.txt
5) It should install
I installed miniconda in the following directory:
/home/arturo/Documents/project1/pwd
but then I deleted it by typing:
rm -r pwd/
Now I can't run python anymore (from any directory). Not really sure what happened. I get this error:
bash: /home/arturo/Documents/project1/pwd/bin/python: No such file or directory
You have at least two choices:
1) Reinstall miniconda into the same location as before.
2) Clear the executable cache with hash -r to eliminate the stale entry that ties the python command to the non-existent /home/arturo/Documents/project1/pwd/bin/python (cf. What is the purpose of the hash command?).
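A quick sketch of the second option (assuming bash, with the path taken from the question):
$ type python   # shows the stale cached entry, e.g. /home/arturo/Documents/project1/pwd/bin/python
$ hash -r       # clear bash's cache of command locations
$ type python   # should now resolve to the system python, e.g. /usr/bin/python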
I'm working on a github repo which I just cloned. I have a new virtual environment and I'd like to add all of the packages from the requirements.txt file to the virtual env.
For some reason it is not finding my requirements.txt file.
Edit the first line of /Users/byrd/Desktop/Github Repositories/herokusite/venv/bin/pip file to correct the path to python. You can obtain this path by calling which python. I think it should be:
#!/Users/byrd/Desktop/Github\ Repositories/herokusite/venv/bin/python
EDIT: It seems this is a known bug on Unix-like systems - you can't use spaces in a shebang line.
Also try this workaround, it may help you.
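A minimal sketch of another way to sidestep the broken shebang (assuming the virtualenv itself was created successfully) is to run pip through the environment's Python interpreter, which does not read pip's shebang line at all:
# the quotes keep the space in "Github Repositories" from splitting the path
$ "/Users/byrd/Desktop/Github Repositories/herokusite/venv/bin/python" -m pip install -r requirements.txt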
Do not use spaces in any component of the path where your virtual environment is stored.
It causes problems for the bootstrapping process.
Create a new blank environment, in a directory that has no spaces in its path:
$ cd # this takes you to your home directory; on OS X it is /Users/yourlogin
$ cd Desktop
$ virtualenv myvenv
$ source myvenv/bin/activate
(myvenv) $ pip install -r /path/to/requirements.txt
First, execute which pip after activating the environment. If you find a space in any of the folder names in the path it prints (for example a folder named "2nd july", as seen in this link), that space is the problem.
Next, delete the new virtualenv (in my case envname) and rename the folder that has the space in its name. Then create a new virtual environment and install the requirements with
pip install -r requirements.txt
from the folder location that contains the requirements file.
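As a sketch of those steps (the folder and environment names here are only examples taken from this answer; substitute your own paths):
$ which pip                     # e.g. /Users/you/2nd july/envname/bin/pip  <- note the space
$ deactivate
$ mv "2nd july" 2nd_july        # rename the offending folder so the path has no space
$ rm -rf 2nd_july/envname       # delete the broken environment
$ virtualenv 2nd_july/envname   # recreate it at a space-free path
$ source 2nd_july/envname/bin/activate
(envname) $ pip install -r requirements.txt   # run from the folder that holds requirements.txt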
I am trying to install a homegrown package that will be used, but Python package installation is still a bit of a quagmire for me, and I haven't gotten this to work.
I created a package using setup.py sdist, which I uploaded to a repository.
I am trying to install my package on another machine. I tried three methods, each time on an entirely clean machine. But none are doing what I want them to.
Method 1
easy_install http://mysite/mypkg.zip
RESULT: mypkg.egg gets added to \Python27\Lib\site-packages. But none of my folder structure is there
Method 2
pip install http://mysite/mypkg.zip
RESULT: two folders, mypkg and mypkg-1.0-py2.7.egg-info, get added to Python27\Lib\site-packages. All of the files seem to be there. But when I go to import or run nosetests on the folder, I get all sorts of import errors that reference mypkg modules. I have played with PATH and PYTHONPATH to get all variations of including the folder, but nothing has worked.
Method 3
download .zip
extract locally
add folder to PATH
run easy_install . in the local dir
RESULT: unpacks pkg locally. When I run nosetests on this folder, everything runs as expected.
Thing is, I don't want each user to have to do all of the steps in Method 3. I will eventually be running nosetests in a .bat file that does various things with the output. I don't want every user to have to modify the .bat file to indicate where the testsuite is located. Which is why Python27\Lib\site-packages appealed to me.
Any insight as to why these three methods behave so differently would be very helpful!