I am experimenting with C++ modules, using Clang 5.0, and I am trying to understand how I can export from one module something that I have imported from another module. Is that even possible?
For example, I'd like to have something like this:
// root.hehe.cppm
export module root.hehe;
class hehe
{
};
and this:
// root.cppm
export module root;
import root.hehe;
export class hehe; // ... doesn't work!
export hehe; // Also doesn't work!
export import root.hehe; // No dice!
So that in the end I can do something like
import root;
// ...
hehe myhehe;
Is such a thing possible? I also tried figuring out if there could be a way to import all the submodules of root, like import root.*, but that didn't work either.
In C++20 (not whatever prototype version in Clang), you can use either of
export using ::hehe;
export using hehe = hehe;
to do this, with two caveats:
The first form must always use a qualified name (because the syntax was introduced long ago to copy names between namespaces).
You must first be able to use the name you imported, which is not the case in your example because root.hehe did not export it. (For the type alias approach, it’s sufficient to be able to name it via decltype or so.)
You can also use export import root.hehe; to reexport everything exported by the module being imported. There is no wildcard import syntax: module names with dots have no semantics whatsoever (in C++20).
I have a python structure like this:
mymodule/
globalconfig.py # variables to set environment, etc
work.py # has: from mymodule.globalconfig import *
__init__.py
tests/
testspart1/
test_work.py # has: from mymodule.work import *
From inside work.py, all is well and I can access my global config variables and functions.
From inside test_work.py, I cannot access those variables, even if I add a second import,
from mymodule.globalconfig import *
Why is this? I wanted to use the same syntax as used in my modules.
thank you!
I am using py2.7 and, to get nice rspec-style outputs and verbose diffs,
pytest --spec -vv
Ref:
1. This answer reminded me that I could use another form of import. If there are no other answers I will post my workaround: how to share a variable across modules for all tests in py.test
The import syntax that worked for me was to import the nested python file directly, in addition to importing the file under test.
from mymodule.work import *
import mymodule.globalconfig as myconfigs
I assume it was a name clash or an import-circularity issue, but I could not figure out exactly what the problem was. It took me a while, so I wanted to post the solution for future me and others.
I am using Ember-CLI, and I have run into a problem importing AmplifyJS in my project. I downloaded Amplify using Bower; however, the library is not in ES6 module format, so when I try to use it in my project I simply can't import it.
Basically I would want to do:
import Amplify from amplify;
//use amplify here
Brocfile.js
app.import('bower_components/amplify/lib/amplify.js');
Since a lot of libraries are not in ES6 format yet, my question is: "Is there an easy way to import or use ES5 libraries in ES6?"
If not, what is the recommended way of doing that in Ember?
You can't import Amplify from amplify; because it's not a module.
You've almost got it; just don't try to import the library. You need to reference it as a global, the way you would outside of an ember-cli app.
From the docs:
Provide the asset path as the first and only argument:
app.import('bower_components/moment/moment.js');
From here you would use the package as specified by its documentation, usually a global variable. In this case it would be:
import Ember from 'ember';
/* global moment */
// No import for moment, it's a global called `moment`
// ...
var day = moment('Dec 25, 1995');
Note: Don’t forget to make JSHint happy by adding a /* global MY_GLOBAL */ to your module, or by defining it within the predefs section of your .jshintrc file.
-- http://www.ember-cli.com/#standard-non-amd-asset
If you look at line 15 of the code (https://github.com/mikehostetler/amplify/blob/master/lib/amplify.js#L15), the library attaches itself to the global object, which is passed in here: https://github.com/mikehostetler/amplify/blob/master/lib/amplify.js#L124
So you can simply use the global version of the library anywhere, e.g. amplify.subscribe(...).
I am quite new to emberjs and ember-cli.
And I have always been wondering how a statement like this works:
import Ember from 'ember'
Does 'ember build' look for 'ember' in node_modules?
I understand statements like this with relative paths:
import ENV from './config/environment'
but not the ones referred without a path.
This question arises in connection with Could not find module ember-validations, in an effort to find its root cause.
The short answer is that Ember-CLI registers the global objects directly with the module system. Take a look at the code here. While it's wrapped in a little helper code, they essentially do this:
define('ember', [], function() {
return {
'default': window.Ember,
};
});
Then, Ember-CLI converts your import statement during compilation:
import Ember from 'ember';
Gets converted to:
var Ember = require('ember')['default'];
Keep in mind that this is how it's done when using a transpiler to use AMD modules. I'm not 100% sure how that code would work if we were using a native ES6 implementation, although I know that the syntax supports this kind of thing.
I have a file utils.py containing a function called f1().
From another Python script I can import utils or execfile('utils.py') and have access to f1(). What are the differences between the two methods?
There are many differences, but from your point of view the most significant is probably that import gives you more control over the namespace in which the objects defined in utils.py end up.
Let's consider three variants on import. The first is the one you asked about:
import utils
utils.f1()
utils is the only symbol that has been added to your workspace—any pre-existing f1 in your base workspace will not have been overwritten, and if there is none, then f1() on its own will not be recognized. For code I intend to maintain, I greatly prefer this way of importing, because it makes it easy for me to search my source file for all the places in which it depends on utils.
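As a self-contained sketch of that point (the module is written to a temporary directory purely so the example runs on its own; the module and function names are made up):

```python
import os
import sys
import tempfile

# Create a throwaway "demo_utils" module defining f1().
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "demo_utils.py"), "w") as f:
    f.write("def f1():\n    return 'from demo_utils'\n")
sys.path.insert(0, tmpdir)

def f1():
    return "pre-existing"

import demo_utils

# Only the name 'demo_utils' was added; the local f1 is untouched.
print(demo_utils.f1())  # from demo_utils
print(f1())             # pre-existing
```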
But if saying utils.f1() every time is too verbose then you can do this:
from utils import f1
f1()
Now if you say f1() that will call utils.f1(), because that is the code object that you have now associated with the name f1 in your workspace. It's now slightly harder to get an overview of where your code is reliant on the utils module. But at least this type of import statement gives you precise control over which symbols were imported and which not. You can even rename symbols during this process:
from utils import f1 as EffOne
EffOne()
Finally you can choose to lose control over the namespace entirely:
from utils import *
Now, who knows what symbols have been imported: basically everything that utils has to offer the world (or, if the utils developer took the trouble to specify an __all__ attribute, then everything listed there). I'd advise you to use import * only for quick-and-dirty programming, if at all.
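A minimal sketch of how __all__ interacts with import * (the throwaway module, its name, and its contents are made up for illustration; writing it to a temporary directory just keeps the example self-contained):

```python
import os
import sys
import tempfile

# Write a throwaway "demo_all" module with an __all__ list.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "demo_all.py"), "w") as f:
    f.write(
        "__all__ = ['f1']\n"
        "def f1():\n"
        "    return 'f1 result'\n"
        "def helper():\n"
        "    return 'not exported'\n"
    )
sys.path.insert(0, tmpdir)

# 'from demo_all import *' copies only the names listed in __all__.
namespace = {}
exec("from demo_all import *", namespace)
print("f1" in namespace)      # True: listed in __all__
print("helper" in namespace)  # False: not listed, so not copied
```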
This is actually the importing style that is closest to execfile from the namespace point of view: execfile('utils.py') does much the same as from utils import * in that it dumps all symbols defined by utils willy-nilly into your workspace. One slight difference is that execfile won't even limit itself to the symbols in __all__ if that is defined—in fact, the __all__ symbol itself will just get dumped in your lap along with everything else.
Beyond namespaces, there are still plenty of differences between from utils import * and execfile('utils.py'). One is caching: a second import call on utils will be very fast (the code will not be re-run), but a second call to execfile('utils.py') may take just as long as the first because the code will be re-run. Also, there may be some code (often test code) inside utils.py that the utils author does not want to run at import time, but only when the file is executed via execfile. Such code is placed inside an if __name__ == '__main__': clause.
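The caching difference is easy to observe. A sketch (all file names are made up; note that execfile is Python 2 only, so exec(open(...).read()) stands in for it here):

```python
import os
import sys
import tempfile

# A throwaway module whose body logs every time it is run.
tmpdir = tempfile.mkdtemp()
mod_path = os.path.join(tmpdir, "demo_cached.py")
log_path = os.path.join(tmpdir, "runs.log")
with open(mod_path, "w") as f:
    f.write("with open(%r, 'a') as log:\n    log.write('ran\\n')\n" % log_path)
sys.path.insert(0, tmpdir)

import demo_cached   # module body runs here...
import demo_cached   # ...but not here: the second import hits the cache

with open(log_path) as log:
    print(len(log.readlines()))   # 1 -- imported twice, ran once

# The execfile-style approach re-runs the module body on every call.
exec(open(mod_path).read())
exec(open(mod_path).read())

with open(log_path) as log:
    print(len(log.readlines()))   # 3 -- each exec ran the body again
```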
I have written two modules m1.py and m2.py each of which uses various modules from the standard library.
For example
#m1.py
import sys
#.
#.
and
#m2.py
import os
#.
#.
What "bothers" me is that when I import in main.py the two modules m1.py and m2.py
I can use the functions defined in sys and os like this:
#main.py
print m1.sys.version
print m2.os.listdir()
Is this normal, or there is something I should consider when importing modules in my code?
Usually you don't need to worry about what is accessible in your namespace. Anyone who messes around with stuff that's not part of the module's documented API deserves whatever trouble they get; Python assumes that its programmers are responsible adults.
The exception is when you specifically want to allow other code to use the otherwise discouraged from mymodule import * syntax. Then you want to limit what is public, so that you don't clutter up your importer's namespaces with your own internal stuff.
Here's how you can do that:
Names that begin with an underscore (e.g. _foo) are assumed to be private, and won't be imported with from mymodule import *. This isn't "real" privacy, as anyone who does a normal import will still be able to access them via mymodule._foo (but they probably shouldn't!).
If you want to make the modules you're importing private, use an as clause to give them a "private" name as described above. That is, use import os as _os.
Or, rather than messing around with underscores, you can create an __all__ variable that explicitly lists the module's public names. Only the names in the list will be imported with a from mymodule import * statement. Note that an __all__ sequence is required in packages if you want the submodules to be importable via from mypackage import *. That's because Python can't trust the filesystem not to mess with the capitalization of the filenames the package contains.
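Putting the underscore convention together, here is a minimal sketch (the module name, its contents, and the temporary-directory setup are all made up so the example is self-contained):

```python
import os
import sys
import tempfile

# A throwaway module that hides its own imports behind underscore names.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "demo_mod.py"), "w") as f:
    f.write(
        "import os as _os   # private alias: hidden from 'import *'\n"
        "def public():\n"
        "    return _os.sep\n"
        "def _internal():\n"
        "    return 'private'\n"
    )
sys.path.insert(0, tmpdir)

# With no __all__, 'import *' copies every name that does not start
# with an underscore -- so _os and _internal stay hidden.
namespace = {}
exec("from demo_mod import *", namespace)
print(sorted(n for n in namespace if not n.startswith("__")))
# ['public']
```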