I'm upgrading an existing project from OL3 to OL6.
That includes some customised extensions to OL which I need to forward-port.
What I can't seem to find is the process to simply rebuild the library from source, to produce the same set of files (with our customisations) as here: https://github.com/openlayers/openlayers/releases/download/v6.14.1/6.14.1-dist.zip
To build the "legacy" ol.js library in npm, use:
npm run-script build-legacy
(with the correct dependencies installed, e.g. npm and shx; I got the OpenLayers source by cloning the repository with git)
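Putting the whole thing together, a minimal sketch of the rebuild (the tag, script name and output location follow the description above and may differ between versions):
$ git clone https://github.com/openlayers/openlayers.git
$ cd openlayers
$ git checkout v6.14.1          # the tag matching the dist you want to reproduce
$ npm install                   # pulls in the build dependencies (shx, etc.)
$ npm run-script build-legacy   # emits the legacy ol.js/ol.css bundle; check the build output folder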
Let's say I have two projects. Lib and App.
App has Lib in its conanfile.txt. Normally, when a conan install of App's dependencies is performed, Conan downloads and compiles Lib into ~/.conan/data/.
Is it possible to link App against the Lib that is currently being worked on instead (e.g. /home/path/to/code/Lib/cmake-build-release/lib/)?
The reason I want to do this is to debug a bug in Lib whose only known way to reproduce is by using App. I want to be able to quickly add std::cout to certain places and incrementally recompile. Rebuilding the whole Conan package and doing conan install each time takes too long. I was thinking about some hack that would change the include path and linking path.
Yes, it is possible: Conan uses the same approach as Python pip, the "editable" packages. You can read more about it in this section of the docs: Editable packages. The basic idea is:
You cd libfolder and define it as a package in editable mode: conan editable add . lib/1.0
You can build the lib there, like conan install . + cmake .. or conan build .
You can go to the App folder, cd appfolder, and do a conan install . You should see that lib/1.0 is marked as "editable" and not taken from the Conan cache. You can build App, and it will link against lib from the local libfolder.
Every change you make in libfolder and build there locally (incremental builds) will be directly available to App, without needing to create or export-pkg anything to the Conan cache.
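Condensed into commands, the workflow looks roughly like this (paths and the lib/1.0 reference are illustrative; syntax shown for Conan 1.x):
$ cd /home/path/to/code/Lib
$ conan editable add . lib/1.0            # lib/1.0 now resolves to this folder
$ conan install . && cmake --build cmake-build-release   # incremental local build
$ cd /home/path/to/code/App
$ conan install .                         # lib/1.0 shows up as "editable"
$ conan editable remove lib/1.0           # when you are done debugging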
The editable feature relies on the correct definition in lib/1.0 of the project layout, in its layout() method. There are built-in layouts like cmake_layout(self).
If the different packages lib and app generate compatible projects, like Visual Studio, it is possible to join those projects in the IDE and have a convenient development and debugging experience of both packages together. There is a live demo in this C++ Italian Meetup presentation
I just tried vcpkg manifest mode on my CMake project and it is cool, with some exceptions however.
My project depends on opencv, and it takes a long time for vcpkg to install opencv. So I realized I don't want vcpkg downloading/installing opencv every time I clone the project into a different folder.
Is it possible to use a vcpkg manifest but make it install libraries system-wide instead of locally to the project?
Or at least not inside the build directory, so it will be possible to reuse them?
No, you can't install libraries system-wide in manifest mode.
But binaries are cached so that if you use a library in multiple projects, you don't have to build it from scratch.
https://github.com/microsoft/vcpkg/blob/master/docs/users/binarycaching.md
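For example, you can point the cache at one shared directory so every clone of the project reuses the same prebuilt opencv (paths are illustrative):
$ mkdir -p $HOME/.cache/vcpkg-archives
$ export VCPKG_DEFAULT_BINARY_CACHE=$HOME/.cache/vcpkg-archives
# or, equivalently, through the more general setting:
$ export VCPKG_BINARY_SOURCES="clear;files,$HOME/.cache/vcpkg-archives,readwrite"
After the first build, subsequent manifest-mode installs in other clones restore opencv from that cache instead of rebuilding it.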
I abused vcpkg's --x-install-root to achieve results similar to manifest mode.
--x-install-root= (Experimental) Specify the install root directory
Under your project folder, you can install this project's dependencies into a global directory by using this parameter, so that all projects can share the installed packages system-wide. For example, in my case, I installed all packages into the $VCPKG_ROOT/installed directory like this:
vcpkg install --x-install-root=$VCPKG_ROOT/installed
You can even use vcpkg list anywhere if you (ab)use it this way.
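If you consume that shared tree from a CMake project, the vcpkg toolchain can be pointed at it; VCPKG_INSTALLED_DIR is the relevant variable, though you should check the behaviour on your vcpkg version:
$ cmake -B build -S . -DCMAKE_TOOLCHAIN_FILE=$VCPKG_ROOT/scripts/buildsystems/vcpkg.cmake -DVCPKG_INSTALLED_DIR=$VCPKG_ROOT/installed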
I am using conan in an enterprise environment where the operating system is rather old and has an old version of glibc (2.11). As a result, a lot of the pre-built binaries on conan.io do not end up working in my environment. However, conan doesn't know that and will happily download and install them on my system, resulting in link-time errors.
I have found that if I build from source I can get the libraries to work.
My desired behavior would be as follows:
The first time I use conan install to install the library (i.e. it is not yet in my cache), conan builds it from source, places it in my cache, and then uses it.
On subsequent invocations of conan install, conan finds the cached library and uses that without having to rebuild from source.
I am invoking conan install as part of an automated build script, so I would like to not have to modify the invocation depending on if this is the first time the library is installed or not (but modifying a configuration file is fine).
I have had trouble obtaining this behavior in practice. Here are the challenges I have run into:
If I use conan install --build=thelibrary then conan will rebuild that library from source every time I invoke conan install --build=thelibrary, even if it is already present in my cache.
If I use conan install --build=missing, then I can trick conan into building the library by setting some build options that do not have a pre-built binary associated with them.
This is fragile, as it only works for projects with enough build options that it is not tractable to create pre-built binaries for all combinations.
It also doesn't work if all the build options I need correspond to a pre-built binary.
Here is what I am looking for (and I assume exists but am not able to find):
Some setting I can place in my conanfile.txt (or some other configuration file) that tells conan to ignore pre-built binaries for a given library or libraries and instead build from source, but use the cached version if it is available.
This ideally should work without me having to tinker with build options.
I don't necessarily want to build all libraries from source, just the ones that won't run on my ancient OS, but if I have to settle for "all-or-nothing" I will take "all".
Is this possible with conan?
The glibc version is an old headache for Conan, because it's not part of the settings and thus is not counted as part of the package ID. The Conan Docker images run Ubuntu; some of them are old, others are new. But there is a specific Docker image running CentOS 6, which was created because of glibc 2.12 and could help with package generation.
For your specific case, we have a few options:
Add glibc as part of the settings, so Conan won't replace your package because of its package ID. Since you presumably have coworkers, you can use the conan config command (conan config install) to distribute the settings.
# ~/.conan/settings.yml
glibc: [None, 2.11, ...]
After adding it, you can update your profile too, making glibc=2.11 a default setting.
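For example, assuming the settings.yml entry above, the default profile would simply gain:
# ~/.conan/profiles/default
[settings]
glibc=2.11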
Another alternative is the package revisions feature, where you can lock a specific binary package for usage, which means you want exactly that package. You just need to upload your package built against your glibc and use its binary package revision, e.g. lib/1.0@conan/stable#RREV:PACKAGE_ID#PREV
Also, answering your question:
Some setting I can place in my conanfile.txt (or some other configuration file) that tells conan to ignore pre-built binaries for a given library or libraries and instead build from source, but use the cached version if it is available.
Your cache is Conan's first option: it will look for a pre-built package there first, and if it's not available, it will look in your remotes, following their sorted order. Your request is not possible: first, because conanfile.txt doesn't support build policies, and second, because conanfile.py only supports building everything from source or building only what is missing.
My proposal is: install an Artifactory instance, build what you need, upload your custom packages, and make it your default remote.
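Something along these lines (the remote name and URL are placeholders for your own Artifactory instance):
$ conan remote add my_org_repo https://artifactory.example.com/api/conan/my-conan-local --insert 0   # query it first
$ conan upload "*" -r my_org_repo --all --confirm   # push your locally built packages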
I don't necessarily want to build all libraries from source, just the ones that won't run on my ancient OS, but if I have to settle for "all-or-nothing" I will take "all".
You can associate a package reference with a remote by running the conan remote command. Let's say you want to download zlib/1.2.11 built with glibc 2.11, and it's available only in your organization's remote:
$ conan remote add_ref zlib/1.2.11#org/stable my_org_repo
$ conan remote list_ref # only to validate, not mandatory
zlib/1.2.11#org/stable: my_org_repo
Now your specific package is associated with your organization's remote. Conan will still look for that package in your local cache first, but when it is not found there, it will try to find it in your Artifactory.
As you can see, your case could be solved more easily using a new setting, instead of trying to hack the build policies. As another alternative, you can replace the glibc setting with distro and its version.
Boost uses a build system I'm not otherwise familiar with, based on "Jam" files. Now, I've forked and cloned a specific Boost library (program_options), and I want to build it and perhaps also run the tests. I notice a build/Jamfile.v2 - what should I do with it?
I tried apt-get install jam on my distribution, but that did not get me very far:
$ jam -fbuild/Jamfile.v2
warning: unknown rule project
warning: unknown rule boost-lib
don't know how to make all
...found 2 target(s)...
...can't find 1 target(s)...
Also, do I have to get the absolute latest development version of all of Boost to build the cloned library against, or can I use a local boost release I already have?
Notes:
I'm on a recent GNU/Linux distribution (Mint 18.3 but this shouldn't matter).
What I've done, based on #SergeyA and others' advice, is:
Clone all of Boost, recursively (see this page; this will create a boost/ folder)
cd boost
in .git/modules/my_boost_lib/config, change the origin URL to your fork
in .gitmodules, under [submodule "my_boost_lib"], change the URL to your fork
execute git submodule update --init libs/my_boost_lib/ (perhaps after deleting that library; not sure if that's actually necessary)
cd libs/my_boost_lib/build
../../../b2
The latter works because b2 looks for a Jamfile.v2 in its current working directory, and that file exists and is intended to build just the library. The build results will be located outside of libs/my_boost_lib though.
Note: To build and run the library tests, build the same way but from libs/my_boost_lib/test.
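Put together, and assuming the fork-URL edits from the list above have already been made (my_boost_lib is a placeholder), the sequence is roughly:
$ git clone --recursive https://github.com/boostorg/boost.git && cd boost
$ ./bootstrap.sh                              # produces the b2 binary in the boost/ root
$ git submodule update --init libs/my_boost_lib/
$ cd libs/my_boost_lib/build && ../../../b2   # build just this library
$ cd ../test && ../../../b2                   # optionally build/run its tests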
Essentially the build steps are:
Run bootstrap to build the build tool b2
Build boost with b2 install or similar. You may want to provide options to it.
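For example, on Linux (the install prefix is just an illustration):
$ ./bootstrap.sh
$ ./b2 install --prefix=$HOME/local/boost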
Read more in the boost getting started document:
http://www.boost.org/doc/libs/1_66_0/more/getting_started/index.html
(hint: look at the lower right to go to the next page)
If you are on Windows / VS2017, using vcpkg to get Boost is very easy.
I am working at a company that builds a project split into components, which are developed separately by different developer teams. Everything is in C++.
They use a lot of libraries in common, and to manage them all they created a tool that relates the version of the project to the versions of the libraries.
The question is whether there is a tool on the market that already does this:
I mean, if I go to this tool, I can download, for example, version 4.0 of our project, which uses exactly version 4.5 of library 1 and version 3.4 of library 2. If I click "Download", I get the source code (or binaries) of the entire project plus the concrete version of each library.
For example, if I want to download another project from other developers in the company, using the same libraries in different versions or on different platforms, I only have to choose that, and it will download project 2 with library 1 version 5.0 and library 2 version 2.5, and so on.
Is there any tool on the market that allows me to create relations like that and, by the way, connects with our code repository (GitLab in our case)?
I checked Gradle, Conan, ... but they build; they don't manage "relations" between components.
Something like that:
CMake provides enough functionality to create these kinds of relationships within your build system. Unfortunately you have to use CMake to manage the builds for all of your projects for this to work well. CMake does target Visual Studio as well as GCC, Clang, and ICC. If this interests you keep reading.
Use CMake to construct build configurations for your dependent projects.
Use ExternalProject commands to express the dependencies of the parent projects.
ExternalProject supports Git as well as Mercurial, CVS, SVN, and direct tarball downloads.
You can specify the exact commit, tag, or branch in Git.
Supports Git authentication via SSL or Basic HTTP.
Run CMake against the parent project. All dependencies are downloaded and compiled automatically.
Example Dependency
include(ExternalProject)   # ExternalProject_Add lives in this module

ExternalProject_Add(
  Library1
  GIT_REPOSITORY https://git@gitlab.yourdomain.com/repo/library_1.git
  GIT_TAG        tag_S.33.91
  HTTP_USERNAME  ciserv
  HTTP_PASSWORD  Obfusc#t3M3
  CMAKE_ARGS
    -DBUILD_EXAMPLES:BOOL=OFF
    -DBUILD_TESTS:BOOL=OFF
    -DBUILD_DOCS:BOOL=OFF
)

# Consume the dependency from your own target
target_link_libraries(MyTarget PRIVATE Library1)
There are several other commands within the ExternalProject module that can be used to further customize the dependency if required.
To follow up @norman-b-lancaster's answer, consider looking into the Hunter package manager. It is based on CMake's ExternalProject feature and is heavily focused on reproducible builds. Each Hunter release provides the CMake script required to lock down the package versions of all dependencies.
Your question suggests that you're concerned with package management within your company (as opposed to publicly available packages from ex: GitHub). Hunter's maintainers are aware of this issue and seem open to supporting it if the demand is present.
I wrote cget, which provides a way to install a dependency just by pointing at the source tarball that hosting services like GitHub and GitLab provide. You can also provide a requirements.txt to install all the dependencies transitively as well.
It is CMake-based, so CMake packages install with very few changes, but it also supports non-CMake packages. You can also create a distro of recipes for handling packages that don't follow the standard configure, build, and install workflow. There are also a lot of recipes already available for many C++ projects here.
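Illustrative usage (the tarball URL is a placeholder; check the cget documentation for the exact flags):
$ cget install https://github.com/someorg/somelib/archive/v1.0.tar.gz
$ cget install -f requirements.txt   # install everything listed, transitively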