How to do automatic releases/nightlies of C++ software with GitHub?

What I'm looking for is something that builds C++ code every night or on every commit, and then, crucially, runs some commands to create a zip or a package which can then be added to a "Release" on GitHub.
I know there's Travis CI, which automatically compiles commits, and it can run, for example, a CMake INSTALL target and then CPack, which would create a zip or installer package. But it's not possible to upload these files to GitHub or display them somewhere.
I was thinking that maybe there was another service available for this that integrates with GitHub, but I couldn't find any Google hits whatsoever. Preferably this would be separate from Travis CI, since on Travis you would run debug-like builds (static analysers, etc.), while for a release you want to deploy, you'd use release flags, build documentation, etc.
This is for an open source project, so I'm looking for something that does this for free for open source projects, preferably without setting up my own server infrastructure.
There are a few related posts, like Travis-CI Auto-Tag Build for GitHub Release or a Travis section on deployment, but they haven't really answered my question.

You can use Travis CI for this; check out "build artifacts" in the documentation.
https://docs.travis-ci.com/user/deployment/releases/
At the time of writing it looks like this:
GitHub Releases Uploading
Travis CI can automatically upload assets from your $TRAVIS_BUILD_DIR to your git tags on your GitHub repository.
Please note that deploying GitHub Releases works only for tags, not for branches.
For a minimal configuration, add the following to your .travis.yml:
deploy:
  provider: releases
  api_key: "GITHUB OAUTH TOKEN"
  file: "FILE TO UPLOAD"
  skip_cleanup: true
  on:
    tags: true
Basically you would have to tag each commit that you want to get uploaded, so you could make a cron job that does that regularly, or do it manually, only on days when interesting work happened.
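For the cron-job approach, a minimal sketch of a nightly tagging script might look like this (the tag naming scheme is illustrative; pushing the tag is what triggers the tagged Travis build and its deploy step):
#!/bin/sh
# Tag today's HEAD and push the tag so Travis builds and deploys it.
TAG="nightly-$(date +%Y%m%d)"
git tag "$TAG"
git push origin "$TAG"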
Alternatively, you could make it upload all builds to a Google Cloud Storage account or an Amazon S3 account, and then you can cron-job it from there. See the docs, for instance here.
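For the S3 route, a minimal sketch of the corresponding .travis.yml deploy section (the bucket name and local path are illustrative):
deploy:
  provider: s3
  access_key_id: "AWS ACCESS KEY"
  secret_access_key: "AWS SECRET KEY"
  bucket: "my-nightly-builds"     # illustrative bucket name
  local_dir: build/packages       # illustrative path to the CPack output
  skip_cleanup: true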

Related

Issue downloading dependency from Amazon S3

I am currently trying to download a dependency from an Amazon S3 bucket for a Maven framework project, but IntelliJ is unable to download it when I compile. The .m2 repository shows the folder for the dependency; it just doesn't contain the required information. There is also a settings file in the .m2 folder providing a username and password for the S3 bucket. In the IntelliJ console all dependencies are underlined in red in the Maven window, but only the two dependencies relying on S3 are not being imported. Also, when I install them locally they are found and work fine.
Some of the actions I have taken:
Deletion of the repository
Deletion of the .m2 folder
Invalidate Caches / Restart
Reloading all projects
Downloading sources and documentation
Rebuilding
Installing locally (as mentioned above)
Reinstalling IntelliJ
Deleting the project and re-loading it from CodeCommit
If anyone has any ideas then I would be very grateful to try them out!
You can locate the proper Maven dependencies in the POM file in the AWS examples GitHub repository located here:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/master/javav2/example_code/s3
This POM file is valid within an IntelliJ project:
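The relevant fragment looks roughly like this (a sketch; the version number is illustrative, so take the current one from the linked POM):
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <!-- illustrative version; copy the one from the linked POM -->
    <version>2.17.0</version>
</dependency>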

what does AWS CodePipeline build provider do?

I am a beginner at using AWS, so I want to ask some questions about CodePipeline.
All I know about CodePipeline is that I can connect it to a GitHub repo to keep my app updated automatically. Then, while going through the setup steps, something called a build provider appeared (it's optional), and when I skip it, it says "Your pipeline will not include a build stage".
So why do I have to set up a build provider when my project compiles and builds successfully locally on my PC? I know it's an optional step, but can I know what it does exactly?
When you are automating your code deployment process, you cannot rely on building your code locally, as that is a manual process.
To automate the process, CodePipeline has a build provider which can build your code as and when required, so that you don't need to check out the code locally and build it before deploying.
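For example, with AWS CodeBuild as the build provider, the build stage runs the commands from a buildspec.yml in your repository; a minimal sketch (the build command and artifact paths are illustrative):
version: 0.2
phases:
  build:
    commands:
      - mvn package               # illustrative; whatever builds your app
artifacts:
  files:
    - target/*.jar                # illustrative output handed to the next stage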

Azure devops to reuse the artifacts already download in previous runs

Currently we are using NuGet packages as our Azure artifacts, and during the release process we download the artifact using the "Download Package" task. It is working perfectly. But we noticed that even though we have already downloaded the package, during the next run of the pipeline on the same agent we have to download it again. This is taking a lot of time. So we want to prevent the package from being downloaded if it is already present. Could you provide a way to reuse the already downloaded package?
In a release pipeline, System.DefaultWorkingDirectory (for example C:\agent\_work\r1\a; the same as Agent.ReleaseDirectory and System.ArtifactsDirectory) is the directory to which artifacts are downloaded during deployment of a release. The directory is cleared before every deployment if it requires artifacts to be downloaded to the agent. This is default behavior; unfortunately, we cannot change it.
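As a speculative workaround, if the pipeline can be expressed as a YAML pipeline (where the Cache task is available, unlike in classic release pipelines), pipeline caching might avoid repeated downloads; a sketch with an illustrative key and path:
steps:
- task: Cache@2
  inputs:
    key: 'nuget | "$(Agent.OS)" | packages.lock.json'    # illustrative cache key
    path: $(Pipeline.Workspace)/.nuget/packages          # illustrative cache path
  displayName: Cache downloaded packages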

Webpack: Should I build bundle on production server or build it locally and then upload?

I am deploying a React app on AWS Elastic Beanstalk. I bundle the app using webpack. However, I'm slightly confused about best practices for the production build process. Should I build the app locally (with NODE_ENV=production) using webpack and then just upload the resulting bundle.js file, along with all node_modules, to the Elastic Beanstalk instance? Or should I upload all the source files and run webpack on the actual AWS server during deployment?
You should never build for production locally (unless you're the only developer).
Ideally, you have a build process that gets triggered manually or automatically from a git commit which then builds your project for production for you.
By using a centralized build process, you can then be sure that all your builds are built the same way (e.g. same node version, same npm or yarn version).
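A minimal sketch of such a centralized build, using Travis CI as an example (the Node version and build command are illustrative):
language: node_js
node_js:
  - "12"
install:
  - npm ci                      # reproducible install from package-lock.json
script:
  - NODE_ENV=production npx webpack --mode production   # illustrative production build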
Both approaches are not really good, to be honest. Local building is not the best way to build anything you want to have in production. You might have packages locally that have an impact on what you're building. The same applies to the OS you're doing it on.
And, again, the same applies to building during deployment. As the name 'deployment' implies, it is just placing your application on the server so it may serve as it is supposed to.
That's where CI/CD comes in. Those kinds of solutions guarantee that each build is done with the same steps and on the same solution stack. No difference between builds is desired, because that allows you to assume that any bug or change compared to the design is because of the code, not the environment it was built within.
Assuming that you're the only developer here (because you're asking for such a thing), CI/CD might be definite overkill, so just create a shell script with the steps and use Docker as the environment for the build, so it stays the same between builds. That's the closest to the CI/CD option you can get without a hassle.
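A minimal sketch of that approach (the image tag and build command are illustrative):
#!/bin/sh
# build.sh -- run the production build inside a fixed Docker environment,
# so the Node version and OS stay identical between builds.
docker run --rm -v "$(pwd)":/app -w /app node:12 \
  sh -c "npm ci && NODE_ENV=production npx webpack --mode production"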

WebPack on VSTS Hosted Build

We're using the hosted build agent on VSTS to build and release our ASP.NET Core code to Azure App service.
My question is: can we run WebPack to handle front-end tasks on this hosted build on VSTS or do we have to do it manually before checking the code into our repository?
Update:
I'm utilizing the new ASP.NET Core Build (Preview) template that's available on VSTS, with its out-of-the-box steps.
For VSTS we're working on an extension; it's currently in beta, and you can ask for a share.
Check the VSTS Marketplace.
Check this GitHub repo.
Webpack is definitely not a first-class citizen for VS2015 and VSTS. Streamlining webpack for CI/CD has been a real headache in my case, especially as webpack was introduced hastily to solve dreadful performance issues in a large monolithic SPA (ASP.NET 4.6, Kendo, 15,000 files, 2,000 folders). To cut it short, after trying many scenarios to make sure that freshly rebuilt bundles would end up in IIS and the Azure web app, I did a 2-pass build. The sequence of VSTS tasks is as follows: npm install global, npm install local, npm webpack install local, npm webpack install global, build pass 1, webpack, build pass 2, etc. This works with hosted and private agents, provided you supply the proper path for webpack, as webpack is installed in a different location on hosted and private agents (I did not find a way to choose the webpack install location for consistency). I scorch everything before starting the build. You also need to do two things in the VS2015 solution: (1) unload the "built" folder, and (2) add Content Include="Built\**" in the project file (see the sketch below). The "built" folder contains the bundles and should appear greyed out; otherwise there are more bad surprises and instabilities to deal with...
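The project-file edit mentioned above looks roughly like this (a sketch of the .csproj fragment):
<ItemGroup>
  <!-- include the webpack output folder in the published package -->
  <Content Include="Built\**" />
</ItemGroup>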
The Build-Pass #2 task in the VSTS build collects the fresh bundles generated by Build-Pass #1 and includes them automatically in the package to be published.
Without a second build pass, collecting the bundles and merging them into the zip package is a nightmare, especially when you have 15,000 files to unzip and then rezip (300 ms per file!). I did not find a file-merging capability that I could readily use in VSTS.
I have my ears to the ground, listening for someone coming up with a more efficient CI/CD scheme for webpack. In the meanwhile, my 2-pass-build workaround is working flawlessly, but it is slow indeed.
I anticipate that advances in ASP.NET Core, Angular 2 and webpack will solve this elegantly.