I'm using a Python file as a data file (because it is easier to use constants defined in a Python file this way).
Since my other data files are organized under a static directory, I'd like to turn those data directories into Python packages (by putting an __init__.py in each) so that I can import the data.py under those directories as a Python module.
Would you advise against it?
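For concreteness, the layout I have in mind is something like this (directory and constant names are made up):

# static/somedir/__init__.py   (empty; marks the directory as a package)

# static/somedir/data.py
MAX_ITEMS = 100              # constants that other code imports directly
DEFAULT_NAME = "example"

# elsewhere in the project:
# from static.somedir.data import MAX_ITEMS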
According to your last comment, you probably want to use JSON fixtures: https://docs.djangoproject.com/en/1.5/howto/initial-data/#providing-initial-data-with-fixtures
Assume I have the following files,
pkg/
pkg/__init__.py
pkg/main.py # import string
pkg/string.py # print("Package's string module imported")
Now, if I run main.py, it says "Package's string module imported".
This makes sense and it works as per this statement in this link:
"it will first look in the package's directory"
Assume I modified the file structure slightly (added a core directory):
pkg/
pkg/__init__.py
pkg/core/__init__.py
pkg/core/main.py # import string
pkg/string.py # print("Package's string module imported")
Now, if I run python core/main.py, it loads the built-in string module.
In the second case too, if it has to comply with the statement "it will first look in the package's directory" shouldn't it load the local string.py because pkg is the "package directory"?
My sense of the term "package directory" is that it specifically means the root folder of a collection of folders with __init__.py files. So in this case, pkg is the "package directory". It applies to main.py and also to files in sub-directories like core/main.py, because they are part of this "package".
Is this technically correct?
PS: What follows after # in the code snippet is the actual content of the file (with no leading spaces).
Packages are directories with an __init__.py file, yes, and are loaded as a module when found on the module search path. So pkg is only a package that you can import and treat as a package if its parent directory is on the module search path.
But by running the pkg/core/main.py file as a script, Python added the pkg/core directory to the module search path, not the parent directory of pkg. You do have an __init__.py file on your module search path now, but that's not what defines a package. You merely have a __main__ module; there is no package relationship to anything else, and you can't rely on implicit relative imports.
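You can see this directly with a quick sketch; the print just inspects the search path:

# pkg/core/main.py
import sys
print(sys.path[0])   # run as "python pkg/core/main.py": prints .../pkg/core

import string        # pkg/ itself is not on the path, so the stdlib module is found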
You have three options:
Do not run files inside packages as scripts. Put a script file outside of your package, and have that import your package as needed. You could put it next to the pkg directory, or make sure the pkg directory is first installed into a directory already on the module search path, or have your script calculate the right path to add to sys.path.
Use the -m command line switch to run a module as if it is a script. If you use python -m pkg.core, Python will look for a pkg/core/__main__.py file and run that as a script. The -m switch will add the current working directory to your module search path, so you can use that command when you are in the right working directory and everything will work. Or have your package installed in a directory already on the module search path.
Have your script add the right directory to the module search path (based on os.path.abspath(__file__) to get a path to the current file); see the sketch below. Take into account that your script is always named __main__, and importing pkg.core.main would add a second, independent module object; you'd have two separate namespaces.
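A sketch of that third option for the layout above (with the __main__ caveat just mentioned still applying):

# pkg/core/main.py, when it must stay runnable as a plain script
import os.path
import sys

# Three dirname() calls walk from .../pkg/core/main.py up to the directory
# containing pkg/, which is what needs to be on the module search path.
here = os.path.abspath(__file__)
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(here))))

from pkg import string  # now resolved against the package, not the stdlib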
I also strongly advise against using implicit relative imports. You can easily mask top-level modules and packages by adding a nested package or module with the same name. pkg/time.py would be found before the standard-library time module if you tried to use import time inside the pkg package. Instead, use the Python 3 model of explicit relative module references: add from __future__ import absolute_import to all your files, and then use from . import <name> to be explicit about where your module is being imported from.
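A minimal sketch of what that looks like for the first layout above, run with python -m pkg.main from the directory containing pkg/:

# pkg/main.py
from __future__ import absolute_import  # Python 2 only; this is the default in Python 3

from . import string as pkg_string  # explicitly the sibling pkg/string.py
import string                       # explicitly the standard-library module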
I'm working on a system where I can't add a new module by adding its path to sys.path. Instead, I want to place the module in the same folder as the files using it, and then import the module at runtime using imp or importlib (or similar).
I've tried to use both imp and importlib, but cannot get it to work; it may just be that I'm misunderstanding which parameters to pass to either of the two libraries.
The folder structure for my project is defined like this:
root-folder-in-sys-path/
- file1.py
- file2.py
- file3.py
- my-module/
--- __init__.py
--- helper1.py
--- helper2.py
As my example indicates, the root folder is part of sys.path. The files (file1.py etc.) are part of the system and are where I need to access the module from. Only files that contain classes of a specific type are added, so it will not be possible just to put the module files in the root to load them, as they would be ignored. Best case would be if helper1.py through helper-n.py were all made available; otherwise it is OK if only one is loaded.
Thanks.
I was able to come up with a solution that loads any one of the helpers. I guess it could easily be made into a package with many more modules this way as well.
For example, to access helper1.py from file1.py, this will work:
import os.path, imp

# Resolve the helper's path relative to this file, not the current working directory.
path = os.path.abspath(os.path.join(os.path.dirname(__file__), "my-module/helper1.py"))
# Load the file as a module named "helper1".
im = imp.load_source("helper1", path)
If you find a better solution, then please let me know!
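For anyone reading this later: imp is deprecated in Python 3, and the same thing can be done with importlib (a sketch, using the same hypothetical layout):

import importlib.util
import os.path

path = os.path.abspath(os.path.join(os.path.dirname(__file__), "my-module/helper1.py"))

# Build a module spec from the file location, create the module, and execute it.
spec = importlib.util.spec_from_file_location("helper1", path)
helper1 = importlib.util.module_from_spec(spec)
spec.loader.exec_module(helper1)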
During a pytest fixture, what is the best way to robustly get the location of a text file when users may run the tests from different working directories? For example, someone on the command line in the test fixture's directory should find the file, as should an integration server that runs from the project root. Can I somehow include the text file in a module? What are best practices for including, and getting access to, non-.py files?
I am aware of BASE_DIR = os.path.dirname(os.path.dirname(__file__)), but I am not sure if this will always refer to the same directory given a particular way of running the test suite.
os.path.dirname(os.path.abspath(__file__)) (which is what I think you meant above) has worked fine for me so far; it should work as long as Python can figure out the path of the file, and with pytest I can't imagine a scenario where that wouldn't be true.
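A minimal sketch of that pattern (the file name data/sample.txt is made up; put the file next to the tests):

# conftest.py or the test module itself
import os.path
import pytest

# Anchor data paths to this file, not to the runner's working directory.
HERE = os.path.dirname(os.path.abspath(__file__))

@pytest.fixture
def data_file():
    return os.path.join(HERE, "data", "sample.txt")

def test_can_read_data(data_file):
    with open(data_file) as f:
        assert f.read()  # found whether run from the test dir or the project root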
I have an Ember app, and a folder with a file playGame/game.js. This file contains the game logic, and I want to import it for asset compilation.
If this file is under app/playGame/game.js and my Brocfile is like this:
app.import('app/playGame/game.js')
this gives the error "path or pattern app/playGame/game.js didn't match any files".
but if I put the file under bower_components/playGame/game.js and my Brocfile:
app.import('bower_components/playGame/game.js'), it compiles successfully.
What is the problem and solution here?
There are two parts to this:
Where should I put my file to import it as an asset?
Why isn't putting it in my app-folder working?
The way to do what you want is to create a folder called vendor in your root, put the file somewhere in there, and then import it in your Brocfile.js like so:
app.import('vendor/playGame/game.js');
This is documented on ember-cli.com, although somewhat hidden.
You could also put it in bower_components, but that folder is for things installed with Bower, and could theoretically be deleted (in fact, this is a common recommendation for various issues). Things in bower_components are also not checked into version control by default, and in this case you probably want the file under version control.
This should solve your issue.
Now, why doesn't it work to put it in /app?
app is a special folder. From the documentation:
Contains your Ember application’s code. Javascript files in this
folder are compiled through the ES6 module transpiler and concatenated
into a file called app.js.
This is what makes it possible for you to import stuff from within your app. The folders in app are available directly under your <appname> namespace, along with some other files and folders like config/environment.
Example:
import myWidget from 'my-app/widgets/my-widget';
The referenced file is /app/widgets/my-widget.js.
The ember-cli website has some more resources for how to use modules. Read those if this doesn't make any sense.
To sum up:
You could put your file in app, but that would make it part of your transpiled package, and you'd have to use it that way internally with an export and everything else that comes with it. It would end up as part of <appname>.js
You could put your file in vendor and import it in your Brocfile.js as explained above. It would be part of vendor.js and load before your app code.
I am venturing into the land of creating C/C++ bindings for Python using pybindgen. I've followed the steps outlined under "Building it ( GCC instructions )" to create bindings for the sample files:
http://packages.python.org/PyBindGen/tutorial.html#a-simple-example
Running make produces a .so file. If I understand how .so files work, I should be able to import the classes in the shared object into Python. However, I'm not sure where to place the file or how to let Python know where it is. Additionally, do the original C/C++ source files need to accompany the .so file?
So far I've tried placing the file in /usr/local/lib and adding that path to DYLD_LIBRARY_PATH in my .bash_profile. When I try to import the module from within the Python interpreter, an error is thrown stating that the module cannot be found.
So, my question is: What needs to be done with the generated .so file in order for it to be used by a Python program?
Python looks for .so modules in the same directories where it searches for pure-Python ones. So you have to install it as you would a normal Python module, either somewhere that is on Python's sys.path by default (/usr/share/python/site-lib or something like that; it's distribution-dependent) or by adding the directory to the PYTHONPATH environment variable.
It's Python that loads the module using dlopen, not the dynamic linker, so LD_LIBRARY_PATH (note: there is no DY) won't help you.
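For example, a minimal sketch (the module name mymodule and the install location are assumptions):

import sys

# Suppose the build dropped the extension at /usr/local/lib/mymodule.so.
sys.path.append("/usr/local/lib")

import mymodule  # found via sys.path; Python itself dlopen()s the .so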
Same as all other Python modules. It must be within one of the locations given in sys.path.