I come from the Java world and was looking for an Apache Maven alternative in the C++ world. I think I found the right project. I have a few questions and have not managed to find answers.
Is it possible to manage a local repository? Let's say I work on 5 similar but different projects, and these projects mostly share the same dependencies. Will each project have its own dependencies stored inside it, or is there a system-wide (per-user) local repository where dependencies are stored?
Is it possible to "publish" only to a local folder so other projects can "see" the dependent block, or does it have to go through the biicode internet repository?
Or am I wrong about how bii works?
Looks like a nice project. Keep up the good work.
Right now, projects act as virtualenvs: each project contains and builds its own dependencies. This is intended for fast-evolving libraries. Imagine you have 5 similar projects, all depending on the same library A, version 0. While working on one of those projects you can make a modification to A and publish a new version with an API-breaking change. The other 4 projects will continue depending on version 0 and will not break. When you move to those projects you can easily update their dependencies and fix the breakages.
You can share the same library among different projects directly with symlinks if you are working on Linux; this does not work on Windows for now.
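For example, on Linux, something along these lines (the paths and the deps folder name are hypothetical; use whatever folder biicode actually populates in your projects):

# Reuse project_a's dependency folder in project_b via a symlink
ln -s ~/projects/project_a/deps ~/projects/project_b/deps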
For very stable, large libraries that can be installed system-wide, it can be more convenient to depend on the installed version. CMake allows this very easily via FindXXX(). You can install the binaries system-wide with CMake install, or you can even use CMake scripts or biicode Python hooks to download and install those libraries system-wide automatically. Check, e.g., http://www.biicode.com/diego/opencvex, where OpenCV is managed with a biicode Python hook and installed system-wide.
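A minimal sketch of that pattern in a CMakeLists.txt (the target and file names are made up):

# Locate a system-wide OpenCV install
find_package(OpenCV REQUIRED)

add_executable(opencvex main.cpp)

# Use the include dirs and libraries the find script reports
target_include_directories(opencvex PRIVATE ${OpenCV_INCLUDE_DIRS})
target_link_libraries(opencvex ${OpenCV_LIBS})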
At this moment there is no "local" publication, so if you want to share that way among projects, yes, you have to go through the biicode cloud servers, simply with "bii publish".
However, we are transitioning to open source. We will probably release the client code first, and then a server that can be deployed in-house. Not implemented yet, but a planned feature is that this server could act as a proxy to the cloud one: you publish to the local instance but read from the cloud one. With a local installation of this server, you will be able to publish locally.
I'm in the process of creating a custom registry hosted in Azure DevOps.
The plan going forward will be to host some third party libraries as well as our own libraries in this custom registry.
Each project will then be using manifests in order to declare all dependencies and their required versions.
So far everything works as expected. I've already created a port out of one of our libraries and I'm currently distributing it via our custom registry.
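For reference, a minimal sketch of the setup (library names, URLs, and baseline SHAs below are placeholders, not our real values):

vcpkg-configuration.json:

{
  "default-registry": {
    "kind": "git",
    "repository": "https://github.com/microsoft/vcpkg",
    "baseline": "<commit-sha>"
  },
  "registries": [
    {
      "kind": "git",
      "repository": "https://dev.azure.com/our-org/our-project/_git/vcpkg-registry",
      "baseline": "<commit-sha>",
      "packages": [ "our-internal-lib" ]
    }
  ]
}

vcpkg.json:

{
  "name": "our-app",
  "version": "1.0.0",
  "dependencies": [ "our-internal-lib" ]
}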
Now the part I'm unsure how to handle.
At my company we do an "air-gapped" build, which means the source code is taken to a machine on a private network with no internet connection, where the build is performed.
This is of course problematic, as the air-gapped machine will not have access to the custom ports registry we're hosting on ADO, nor to the repos hosting the projects we're distributing via our custom registry.
I'm trying to figure out a solution to this issue.
My first thought was to tell the Air Gap team to first clone the required repos to a USB stick. Then we could configure Visual Studio to use overlay-ports, which would use the source that was cloned onto the USB stick plus a custom port file (see the sketch below). I have no idea if this would actually work.
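Roughly, I imagine the overlay mechanism being pointed at the copied ports directory like this (paths and package name are hypothetical):

rem Ports directory copied from the USB stick; each portfile would be
rem edited to reference the locally cloned sources instead of ADO.
vcpkg install our-internal-lib --overlay-ports=D:\airgap\ports

rem Or set it for every vcpkg invocation, including the VS integration:
set VCPKG_OVERLAY_PORTS=D:\airgap\ports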
I'm curious what other folks have done who might be in a similar situation?
Does anyone have any ideas on how I could handle this scenario using vcpkg?
We are updating our Sitecore installation to 8.2, and in the process I am trying to refine our source control and development workflow.
Goals
1. Have a single source of truth for support DLLs, configs, license files, etc.
2. Have everything in source control that is needed to recreate the entire site from dev to prod. (excluding packages).
In order to have all of the different configs needed for the various machines I have created gulp tasks that transform the configs on build (dev, staging, prod). Those transformed configs are placed in a folder in the project that is then used to replace the originals on the target machines. This folder publishes all of its contents and seems to be working well so far.
What I don't know is how to deal with all of the config files that do not change.
Is it best to include all of those .config files in the project so that they publish? If not, the config folders on the target machines will have to be either managed manually (seems like a bad idea) or kept up to date by a script (more customization, which by default is not a great idea).
The only downside (that I see) to including all of the configs in the project is the weight that it would add to file searches (and that doesn't seem like a very strong argument).
Am I not seeing something?
How are you other Sitecore humans handling this?
Gregory
As a general rule of thumb, do not check any default files into source control.
The main reasons are bloat, which makes syncing/downloading from your source control take much longer, and upgrades, the latter being the much more important reason.
If/when you upgrade in the future, and you do not have any Sitecore files checked into source control, you can simply deploy a new/clean instance of Sitecore, fix any conflicts in your own code, and then deploy on top. You don't have to try to figure out what has changed in the default install files between releases.
Any changes you need to make to Sitecore configs or settings should be made using patch files, and only those custom files should be added to your solution.
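A minimal patch file sketch (the setting name and value are just examples); dropped into App_Config/Include, it overrides the default without touching Sitecore's own files:

<!-- App_Config/Include/zzz.MyProject.Settings.config -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="Media.MaxSizeInDatabase">
        <patch:attribute name="value">40MB</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>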
How to handle this for deployments?
There are a few options. You could go down the scripted route, which would take a clean Sitecore install, unzip it, make whatever modifications you need, and then install/unzip the modules that you use in your solution one by one.
Another option may be to create a default install with all the modules and then zip it up; an install would then be a similar process to the above, but simpler: just unzipping a single file. You could use Sitecore SIM to install the instance and modules and then create the backup, or do this manually.
Yet another alternative may be to check everything into source control, either in a separate repository or a different project, to ensure that all default files and configs are kept separate. If you need to upgrade in the future, simply delete the repo/project and add the files back in again.
I would also do the same (a separate project) to keep all Support patches/DLLs separate, again to make it easy to identify which fixes have been applied and to remove them if a future version resolves the issue.
These may add an additional step to your deploy, but keeping this separation will make your life much, much easier when it comes to upgrade time.
I'm new to build tools and Gradle, and I'm currently developing for Android. I've found a library on GitHub that I would like to use in my app.
What is the best way to create this dependency? The library doesn't have Ant, Maven, or even Gradle support.
Some options that came to my mind:
1) Fork the repo and add Gradle support.
2) Clone the repo and add Maven support, then add it to the local Maven repo.
How do the experts think such dependencies should be handled?
If the library only publishes sources, not binaries, you don't have any option other than building it yourself.
Once you've done that, you can host the binary in a binary repository (like Artifactory) to share it with your colleagues, or even publish it on a distribution platform (like Bintray) for anyone who wants to use it. For the latter, you'll be able to pass the ownership (and the maintenance burden) to the original author at any moment.
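If you just need the dependency working locally first, one common route (the coordinates here are made up) is to install the jar you built into your local Maven repository and let Gradle resolve it from there:

mvn install:install-file -Dfile=somelib.jar -DgroupId=com.example -DartifactId=somelib -Dversion=1.0.0 -Dpackaging=jar

Then in build.gradle:

repositories {
    mavenLocal()
}

dependencies {
    compile 'com.example:somelib:1.0.0'
}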
We're working on a project that has some Clojure-Java interop. At this point we have a single class that has a variety of dependencies which we put into a user library in Eclipse for development, but of course that doesn't help when using Leiningen (2.x). Most of our dependencies are proprietary, so they aren't on a repository somewhere.
What is the easiest/right way to do this?
I've seen leiningen - how to add dependencies for local jars?, but it appears to be out of date?
Update: So I made a local Maven repository for my jar following these instructions and the lein deployment docs on GitHub, and edited my project.clj file like this:
:dependencies [[...]
[usc "0.1.0"]]
:repositories {"usc" "file://maven_repository"}
Where maven_repository is under the project directory (hence not using file:///). When I ran "lein deps", I got this message:
Retrieving usc/usc/0.1.0/usc-0.1.0.pom from usc
Could not transfer artifact usc:usc:pom:0.1.0 from/to usc (file://maven_repository): no supported algorithms found
This could be due to a typo in :dependencies or network issues.
Could not resolve dependencies
What is meant by "no supported algorithms found" and how do I fix it?
Update2: Found the last bit of the answer here.
Add them as a dependency to your Leiningen project. You can make up the names and versions.
Then run lein deps; the error message when it fails to find the jar will give you the exact command to run to install the jar into your local repo. Should you later decide to use a shared repo, you can use this same process to put your dependencies there.
@Arthur's answer is good, but I figured I'd flesh it out a bit more since it leaves some details lacking.
Always keep in mind repeatability. If you don't make it so that anyone who needs access to the artifacts can get them in a standard way, you're asking for support hell.
The documentation on deployment is a good place to find out everything you need to know about deploying your artifacts. Since you're in a polyglot environment, you probably can't have lein take care of deploying all your artifacts, but at least you can get your Clojure-specific jars up into S3, or even onto a file share if you like. The rest of your artifacts will have to use Maven or Ant directly to upload to the Maven repo on the file server or S3. At my current company we are using technomancy's excellent s3-wagon-private to great effect for hosting our closed-source artifacts, and Clojars for hosting anything that we can open-source.
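The project.clj wiring for the S3 route looks roughly like this (the bucket name and plugin version are placeholders; credentials are read from environment variables):

:plugins [[s3-wagon-private "1.1.2"]]
:repositories [["private" {:url "s3p://our-bucket/releases/"
                           :username :env/aws_access_key_id
                           :passphrase :env/aws_secret_access_key}]]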
What @Arthur is referring to is doing a lein install. All that does is install a copy of the current project into your local .m2 directory so that other projects on your box can reference it. Unless you have configured your Maven install to use a shared directory for your .m2 folder (maybe not a bad idea in your environment?), this will mean that anyone else who checks out your project will not be able to build it. If you want to go this route, you need to set the localRepository node in your $M2_HOME/conf/settings.xml to a shared location that the rest of your team has access to. See the docs for more information.
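Pointing Maven at a shared local repository is a one-line change (the path here is just an example):

<!-- $M2_HOME/conf/settings.xml (or ~/.m2/settings.xml) -->
<settings>
  <localRepository>/mnt/shared/m2-repository</localRepository>
</settings>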
YMMV, but I've found it best to use Maven rather than Leiningen when you are working with polyglot Clojure/Java projects.
It's mainly because the Java-based tools (Eclipse etc.) understand Maven projects but don't really understand Leiningen projects. It's slowly getting better with the excellent Counterclockwise Clojure plugin, but the integration still isn't good enough for an efficient IDE-based workflow.
On the repository side of things, I'd suggest setting up a private shared Maven repository. You're going to need it sooner or later if you plan to manage a complex set of dependencies within your team: might as well bite the bullet and get it done now.
I have started using the preview of Microsoft Team Foundation Service (TFS in the cloud, henceforth TFService) for a small project, and I'm currently setting up builds using the online build service included with TFService.
What I want to do is add an installer of some kind. I've previously worked with InstallShield Limited Edition, WiX, and Inno Setup, and would like to keep using one of those if possible.
I've previously integrated Inno Setup into a build process (TFS 2010). This involved installing Inno Setup on the build computer and adding a custom build task for running an Inno Setup script. The last part should be possible with TFService as well, because it's possible to create custom build process templates.
However, I realize that installing anything such as Inno Setup or InstallShield will not work with TFService, since it's not possible to install any 3rd party software on the build computer (it's just a cloud service running on some unknown virtual computer which I cannot access).
So my question is: is there a way to automatically create an installer as part of a build process running on TFService? For example, is the build service capable of building InstallShield projects out of the box (there's a license included with Visual Studio, after all)? Or are there other ways to do this?
I have some experience with this, from trying to get WiX and InstallShield to work with the Microsoft TFS Preview cloud service using its managed build agents. On these agents, you don't have administrator rights and you can't install software.
This currently rules out InstallShield, which must be installed.
It is however possible to check the WiX binaries into source control and pull them down as part of your build.
WiX uses .wixproj files (MSBuild) to define its project compile activities. These reference a targets file and other properties (resolved from registry values) that won't exist when you deploy this way. A small bit of hacking will get all of these properties to resolve to workable values.
The one problem you may still have, though (and I'm thinking of the TFS managed build environment), is that you may have to configure your projects to skip the MSI ICE validation suites. On the build machines I played on, the Windows Installer service was outright disabled, and this prevented those tests from running.
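Concretely, the hacks above boil down to overriding the registry-derived properties in the .wixproj so they resolve to the checked-in binaries, plus suppressing validation; a sketch, with a hypothetical tools path:

<!-- In the .wixproj: point WiX at binaries checked into source control -->
<PropertyGroup>
  <WixToolPath>$(SolutionDir)tools\wix\</WixToolPath>
  <WixTargetsPath>$(WixToolPath)wix.targets</WixTargetsPath>
  <WixTasksPath>$(WixToolPath)WixTasks.dll</WixTasksPath>
  <!-- Skip the MSI ICE validation suites (equivalent to light.exe -sval) -->
  <SuppressValidation>true</SuppressValidation>
</PropertyGroup>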