On clang version 6.0.0 (tags/RELEASE_600/final) I cannot import std submodules such as std.vector. Importing the whole std module works fine, but import std.vector does not. I'm using the libc++ module map, which defines this submodule properly.
Edit
The same problem exists with a custom module map:
module test {
  explicit module sub {
    header "test.hpp"
    export *
  }
}
It cannot load the module test.sub, but it reports that the symbol foo could be found in it.
Edit 2
clang 5.0.2 behaves in the same way.
Edit 3
6.0.1-rc1 behaves the same.
Is there any known issue related to this, or any information indicating that it is not yet supported?
Importing modules or submodules does not work perfectly everywhere in clang 6; this is expected to improve in clang 7.
You can try setting the -fmodules-cache-path=<your-cache-path> flag explicitly; you will see that, whenever a module is involved in the build, clang populates that directory with precompiled module files (normally *.pcm) as it builds.
The clang documentation describes the Module Map Language, which lets you create your own modules that include some headers and export them, and then import those modules. As the documentation notes, this is not yet stable, so you will have to experiment.
At least for the moment, you can use import std as a temporary workaround.
Related
I've got a C++ project called core that compiles successfully and produces a DLL and a LIB file when built on Windows. Now I need to import functionality from core in another project called main. core has a file called core.module.ifc:
export module core;
export import :mod;
export import :egx;
export import :utils;
In my main project, I have a single demo.cpp which looks like this:
#include "someOtherLib.h"
import std.core; // error occurs here
import core;
// ... some other code
However, main does not compile with error:
1>C:\Users\main\Desktop\Projects\demo\src\demo.cpp(8,11): error C2230: could not find module 'core'
I am using VS 16 2019 to compile, with /std:c++latest and platform toolset v142. The core.lib file is correctly given as input to the linker in the project's properties. From what I understand, the compiler has no way of knowing that core is an outside library, so it looks for export module core within the demo project (which obviously fails); it needs a file containing all the declarations of the core library. Am I correct in this assumption? If so, what would this file look like?
So I believe a summary of my question would be, how do I import a module that is exported from a library into my project?
import MySQLdb

try:
    dbcon = MySQLdb.connect(host=host_name, user=user_name,
                            passwd=password, db=db_name)
except MySQLdb.Error:
    pass
I'm getting this pylint warning:
Module 'MySQLdb' has no 'Error' member (no-member)
The Best:
Using the extension-pkg-whitelist option:
A comma-separated list of package or module names from where C extensions may be loaded. Extensions are loading into the active Python interpreter and may run arbitrary code
--extension-pkg-whitelist=_mysql
PyLint parses (by default) the source files, but in Python the shape of a module can change at runtime from the shape defined in the source file. This option tells PyLint to actually import the specified module, and use the runtime definition.
Note that since the MySQLdb package wraps a C extension, you have to pass the name of the C extension (_mysql) instead of the package name (MySQLdb). (source)
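The "shape of a module can change at runtime" point can be illustrated in plain Python. This is a minimal sketch with a made-up module name (dynamic_demo): a member injected into a module object at runtime is invisible to a source-only parse, which is exactly why a static check can miss MySQLdb.Error.

```python
# A member added to a module at runtime is invisible to static parsing.
# `dynamic_demo` is a made-up module name for this sketch.
import sys
import types

mod = types.ModuleType("dynamic_demo")
mod.Error = type("Error", (Exception,), {})  # member injected at runtime
sys.modules["dynamic_demo"] = mod

import dynamic_demo  # resolves to the runtime-built module above
print(issubclass(dynamic_demo.Error, Exception))  # prints True
```

No source file for dynamic_demo exists anywhere, yet the import succeeds and the member is there; a parser reading files from disk could never discover it.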
Not Bad:
Using the unsafe-load-any-extension option:
Allow loading of arbitrary C extensions. Extensions are imported into the active Python interpreter and may run arbitrary code.
--unsafe-load-any-extension=yes
You could use the unsafe-load-any-extension option, but that would load every available extension, together with its (potentially dangerous) initialization code. extension-pkg-whitelist is safer, because it only loads the specified modules.
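For a persistent setup, the two options above can also live in a configuration file. A hypothetical .pylintrc fragment:

```ini
# .pylintrc (hypothetical fragment)
[MASTER]
# Safer: only the named C extension is imported for introspection.
extension-pkg-whitelist=_mysql

# Broader, riskier alternative (runs every extension's init code):
# unsafe-load-any-extension=yes
```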
The Worst:
Using the disable option:
# pylint: disable=no-member
It doesn't really solve the issue, but only makes PyLint silent.
Thanks to @PCManticore, the maintainer of PyLint. Here's the comment of the maintainer.
Thanks to @ZevSpitz, the author of the best answer and of the not-bad one.
It may help to use the --extension-pkg-whitelist option:
--extension-pkg-whitelist=_mysql
pylint parses (by default) the source files, but in Python the shape of a module can change at runtime from the shape defined in the source file. This option tells pylint to actually import the specified module, and use the runtime definition.
Note that since the MySQLdb package wraps a C extension, you have to pass the name of the C extension (_mysql) instead of the package name (MySQLdb). (source)
You could use the unsafe-load-any-extension option, but that would load every available extension, together with its (potentially dangerous) initialization code. extension-pkg-whitelist is safer, because it only loads the specified modules.
I have some (not so) old code in which I use pyximport, but the code fails right at
import pyximport; pyximport.install()
with
ImportError: No module named pyximport
I've made a few changes to my system since I last ran this code, so perhaps it was removed or not migrated; but I can't find this package anywhere and
pip search pyximport
yields no results.
What happened to pyximport? Where can I find it and, failing that, what should I use instead?
pyximport is part of Cython; install it with:
$ pip install cython
You can find its description here. In short, pyximport provides an import hook which allows you to import Cython files (and compile them on the fly) as though they were Python modules.
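A hedged sketch of the usage: the try/except lets this run even where Cython is absent, and fast_math is a hypothetical module name standing in for a real .pyx file.

```python
# pyximport ships with Cython (pip install cython), not as a standalone
# package. install() registers an import hook that compiles .pyx files
# on first import.
try:
    import pyximport
    pyximport.install()
    HAVE_CYTHON = True
except ImportError:
    HAVE_CYTHON = False

# With the hook installed, `import fast_math` would build fast_math.pyx
# on the fly (fast_math is a hypothetical module name).
print("pyximport available:", HAVE_CYTHON)
```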
Let's say I want to use Immutable in my project (or any given npm package). I have npm installed it, so it is in node_modules. Of course, it has CommonJS exports there. I, however, want to use es6 modules in my project.
I am using Webpack to compile it all together, with the 6to5-loader to deal with es6 module syntax.
In my source file, I write import Immutable from 'immutable'; but this causes a problem, because the es6 import looks for an es6 default export, which doesn't exist (for Immutable, or probably almost any other npm package). The compiled code ends up looking like this: var Immutable = require('immutable')["default"]; which of course throws an error, since there is no default property to find.
Can I consume the npm packages with es6 modules?
Babel.js contributor here. You're looking for the following:
import * as Immutable from 'immutable';
// compiles to:
var Immutable = require('immutable');
Interactive demo
Note: This is with either the common or commonInterop modules option. For others, see: https://babeljs.io/docs/usage/modules/
Just figured it out. (The solution is tool-specific, but es6 modules only exist now insofar as they are tool-enabled, so I think that's enough of an "answer".)
6to5's default module transpilation uses the common option, which results in the very problem I griped about above. But there is another option, commonInterop, which must have been built to deal with exactly the situation I'm in. See https://6to5.github.io/modules.html#common-interop
So three cheers for 6to5.
I'm using distutils to build a Python extension module written in C++. The problem I have is that in order to compile the extension module, I need to link with a certain shared library. This requires setting an additional compiler flag. So, I searched through the Python docs and found out about the extra_compile_args property of the Extension object. So I tried the following:
from distutils.core import setup, Extension
module = Extension('test', sources = ['test.cpp'])
module.extra_compile_args = ['--std=c++0x', '-l mylib'];
setup(name = 'test', version = '1.0', ext_modules = [module])
This seems to compile, except that when I import my module in Python it throws an ImportError due to an undefined symbol. So apparently the library didn't link properly. I then wrote a throwaway C++ program that linked with the shared library, and it ran fine. Then I realized something really odd is going on with distutils, because if I add a compile argument that links to a bogus library name, distutils still compiles everything with no problem:
module.extra_compile_args = ['--std=c++0x', '-l some_fake_library'];
When I run setup.py build, the build runs with no errors!
So, what's going on here? How can I compile an extension module that requires linkage to a shared library?
There's actually a special option for that.
For example:
libraries=["rt"]
You leave off the -l option and the lib prefix.
One of the purposes of distutils is to make your code independent of any single compiler. Your "-l somelib" looks like it's meant for GCC (even though it should be "-lsomelib", with no space). This is why you use the libraries option of the Extension class: distutils will then pass the appropriate link command to whatever compiler it's using.
You can also look at the actual build commands distutils is using and execute them yourself to see what is going wrong.