C++ include directories setup in Azure DevOps

I am trying to set up a build pipeline in Azure DevOps. I have a C++ project. It needs the FrameMaker FDK to build. I have the FDK files on my local storage as well as on TFS. I want to reference them in my pipeline so that they can be used while building and do not throw file-not-found errors.

Since the FrameMaker FDK is installed on local machines, you should create a self-hosted agent on the same machine where the FDK is installed.
Then configure your pipeline to run on this self-hosted agent by pointing the pipeline's agent pool at your local agent pool. That way your pipeline will be able to access the FDK installed on the local machine while running on the self-hosted agent.
See the detailed steps here to create a self-hosted agent.
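For example, assuming the agent was registered into the Default pool, the FDK lives under C:\Adobe\FrameMakerFDK, and your .vcxproj adds a hypothetical $(FdkIncludeDir) property to its AdditionalIncludeDirectories (all assumptions, adjust to your setup), a minimal azure-pipelines.yml sketch could look like this:

```yaml
# azure-pipelines.yml -- a minimal sketch; the pool name, FDK path and
# FdkIncludeDir property are assumptions, not fixed names
pool:
  name: Default                        # your self-hosted agent pool

variables:
  FDK_HOME: 'C:\Adobe\FrameMakerFDK'   # hypothetical FDK install path

steps:
  # Fail fast if the FDK headers are missing on this agent
  - script: if not exist "$(FDK_HOME)\include" exit /b 1
    displayName: Verify FDK is present

  - task: MSBuild@1
    inputs:
      solution: '**/*.vcxproj'
      # The project file is assumed to reference $(FdkIncludeDir) in its
      # AdditionalIncludeDirectories
      msbuildArguments: '/p:FdkIncludeDir="$(FDK_HOME)\include"'
```

Since the job runs on the machine that already has the FDK, the headers and libraries resolve from local disk and no file-not-found errors occur.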

Related

How to deploy a container to multiple GCP projects and host with Cloud Run?

We have a requirement to deploy our application in multiple GCP projects, with new projects being provisioned by Terraform. We have Cloud Build configured in each project via Terraform, but we run into issues when Cloud Build attempts to access the source repo in our centralized project.
We would prefer not to clone the repo, but rather instruct Cloud Build to consume and deploy from the central repo. It is also important that Cloud Build updates each project as new code is deployed.
You should use a central project to run a single Cloud Build trigger that builds the container image, pushes it within that project, and deploys it to the Cloud Run services in the other projects.
For the Cloud Build trigger to be allowed to deploy to Cloud Run in other projects, follow these instructions to grant the Cloud Build service agent the appropriate permissions on those projects.
For Cloud Run to be able to import images from the central project, make sure you follow these instructions for the service agent of each project.
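As a sketch of what the central trigger's build config could look like (central-project, target-project, my-app, and the region are all placeholders, and the deploy step only succeeds once the IAM grants above are in place):

```yaml
# cloudbuild.yaml -- a sketch; project IDs, image name, service name
# and region are placeholder assumptions
steps:
  # Build and push the image inside the central project
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/central-project/my-app:$COMMIT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/central-project/my-app:$COMMIT_SHA']

  # Deploy the same image to Cloud Run in another project
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - run
      - deploy
      - my-app
      - --image=gcr.io/central-project/my-app:$COMMIT_SHA
      - --project=target-project
      - --region=us-central1
      - --platform=managed
```

Repeating the deploy step once per target project keeps every project updated from the single central trigger as new code lands.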

Can I use local and hosted agents for the same build

I have a Hosted agent in the Hosted queue, and I also installed a local agent in the Default queue. Can I use them for the same build configuration to have builds running in parallel (for different branches/versions/etc.)?
The same build can't run on both a local and a hosted agent. You can queue multiple builds with different build agents, and another build can be triggered during the current build by calling the Queue a build REST API.
A build definition can only be configured against one agent queue, so it is currently not possible to use both the Hosted and the Default agent queue for a single build definition.
You can instead buy more Hosted Pipelines, which will allow you to run more builds on Hosted agents in parallel, or you can configure more local agents.
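As a sketch (shown for a Linux/bash agent; the organization, project, and definition id 42 are placeholders), a step in one build can queue a second build via that REST API, and the second build is then free to land on a different agent queue:

```yaml
steps:
  - script: >
      curl -X POST
      -H "Authorization: Bearer $(System.AccessToken)"
      -H "Content-Type: application/json"
      -d '{"definition": {"id": 42}}'
      "https://dev.azure.com/ORGANIZATION/PROJECT/_apis/build/builds?api-version=5.1"
    displayName: Queue a second build via the REST API
```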

How to build a NetBeans project on AWS

I created a Java web project in NetBeans. It is working fine on my local machine. I want to build the same Java project on an AWS instance without NetBeans and also deploy the WAR file there.
Can anyone tell me the process for building the Java project on an AWS instance?
All the files are uploaded to the AWS instance using Subversion.

Deploy from Codeship to DigitalOcean Droplet

I've got a project set up in Codeship which hooks into my private GitHub repo, tests, and builds. I want to use a custom script to deploy to my DigitalOcean droplet (VM).
I thought of adding a git remote and using a simple push to the machine over SSH, based on the following resource. This fails because I would be pushing to a non-bare repo on the target machine.
Is anyone aware of resources or ideas on how to deploy to the VM?
I found in Codeship's documentation that you can use a custom script to run commands over SSH. Instructions are here.
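Codeship's custom-script deployment is essentially a list of shell commands, so a minimal sketch could look like this (the user, host, and paths are placeholder assumptions, and the droplet is assumed to already serve the app from /var/www/my-app):

```bash
# Custom deployment script in Codeship -- placeholders throughout
# Copy the built artifacts to the droplet...
scp -r ./build deploy@your-droplet-ip:/var/www/my-app
# ...then restart the app over SSH
ssh deploy@your-droplet-ip "sudo systemctl restart my-app"
```

This sidesteps the non-bare-repo problem entirely, since nothing is pushed to a git checkout on the target machine.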

Rare scenario in DevOps - using Jenkins

I am new to AWS and Jenkins. I have a scenario as below.
We have an AWS AMI which has Jenkins installed on it. The AMI is a Linux platform. We already have a few jobs set up on it for our code bases (PHP and Python) for the Development and QA environments.
We now have a new framework in .NET which is again part of the same project done in PHP. These are Windows services written in .NET.
Right now the deployments are performed manually: we pull the code and build it on the same machine, so we stop/start the services manually during this process on the Windows AMI dedicated to this testing. We would like to create a job (build and deploy) as we do for Python and PHP.
The challenge is that we want to build the code on the Windows AMI while Jenkins is running on the Linux AMI.
Is there a way to establish a connection between AMIs running different operating systems in AWS?
Should we install PowerShell on Windows to have SSH access? In that case we could establish a connection from the Linux AMI to the Windows AMI and then execute a .bat file to do the rest of the activities.
** We have been specifically asked not to install another Jenkins on the Windows system, since we want to maintain all the jobs in a single place on a single server.
It's not actually a very rare scenario: it's quite common to have Jenkins running on Linux while also needing to build and deploy Windows applications with it.
Luckily for you, Jenkins handles this quite easily through its master/slave architecture. In your case the master node is your primary Jenkins install running on Linux, and you set up one or more 'slave' instances running Windows plus the Jenkins agent that allows the two to coordinate.
It's all explained here:
https://wiki.jenkins-ci.org/display/JENKINS/Distributed+builds
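Once a Windows agent is connected and given a label (say windows, an assumed label), jobs can be pinned to it. In pipeline syntax, a sketch might look like the following, where the label, the solution path, and the service name are all placeholder assumptions:

```groovy
// Jenkinsfile -- a sketch; 'windows', MyService.sln and MyService
// are placeholder names, not from the question
pipeline {
    agent { label 'windows' }   // run on the Windows slave, not the Linux master
    stages {
        stage('Build') {
            steps {
                bat 'msbuild MyService.sln /p:Configuration=Release'
            }
        }
        stage('Deploy') {
            steps {
                // Stop the service, swap in the new binaries, restart
                bat 'sc stop MyService'
                bat 'copy /Y bin\\Release\\*.* C:\\services\\MyService\\'
                bat 'sc start MyService'
            }
        }
    }
}
```

This keeps every job defined on the single Linux master, as required, while the actual build and service restarts happen on the Windows machine.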