Cloud Build with GitLab at module level - google-cloud-platform

I was working with GitHub and GCP (Cloud Build for deployments) and it was working well. Below are the steps:
Created multiple Cloud Functions and used the same GitHub repository.
Created a separate Cloud Build trigger for each Cloud Function, with a separate cloudbuild.yml in each Cloud Function's folder in the repository.
Each trigger runs when there are changes in the respective Cloud Function's scripts.
Now I need to integrate Cloud Build with GitLab.
I have gone through the documentation and found that a webhook is the only option, and that the trigger fires on changes to the whole repository. That would require a separate repository for each Cloud Function or Cloud Run service; there is no option to select the repository itself.
Can experts guide me on how I can do this integration? We are planning to have one repository with multiple services/applications stored in it, and we want CI to run in the GCP environment itself.

Personally, I found GitLab to be the worst of the three (compared to GitHub and Bitbucket) in terms of integration with GCP Cloud Build (to run the deployment within GCP).
I don't know of an ideal solution, but I have two ideas. Neither of them is good from my point of view.
1/ Mirror the GitLab repository into a GCP repository as described here: Mirroring GitLab repositories to Cloud Source Repositories. One of the biggest drawbacks from my point of view: the integration is based on personal credentials, so a person is needed to keep it working -
Mirroring stops working if the Google Account is closed or loses access rights to the Git repository in Cloud Source Repositories
Once mirroring is in place, you can probably work with the GCP-based repository in the ordinary way and trigger Cloud Build jobs as usual. A separate question is how to provide deployment logs to those who initiated the deployment...
2/ Use webhooks. That does not depend on any personal accounts, but it is not very granular - as you mentioned, the push is at the whole repository level. To overcome that limitation, there might be a very tricky (inline) yaml file executed by a Cloud Build trigger. In that yaml file, we should not only fetch the code but also parse all changes (all commits) in the push to find out which subdirectories (and thus which separate components - Cloud Functions) were potentially modified. Then, for each affected (modified) subdirectory, we can trigger (asynchronously) another Cloud Build job (with its own yaml file located inside that subdirectory); a rough sketch of such a dispatcher follows below.
An obvious drawback - it is not clear who should get the logs from all those deployments (and how), especially if something goes wrong, and developing (and maintaining) such a deployment process might be time- and effort-consuming.
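A minimal sketch of that dispatcher idea, assuming the webhook trigger binds the repository URL and the before/after commit SHAs to substitutions (_GITLAB_REPO_URL, _BEFORE_SHA and _COMMIT_SHA are hypothetical names) and that repository authentication is already handled:

```yaml
# Hypothetical inline build config for a GitLab webhook trigger: find the
# top-level directories touched by the push and start a separate build per
# component, using the cloudbuild.yaml stored inside that directory.
steps:
  - name: 'gcr.io/cloud-builders/git'
    args: ['clone', '${_GITLAB_REPO_URL}', 'repo']   # auth omitted here
  - name: 'gcr.io/cloud-builders/git'
    dir: 'repo'
    args: ['checkout', '${_COMMIT_SHA}']
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    dir: 'repo'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        # top-level directories changed between the two commits of the push
        changed_dirs=$(git diff --name-only ${_BEFORE_SHA} ${_COMMIT_SHA} | cut -d/ -f1 | sort -u)
        for d in $changed_dirs; do
          if [ -f "$d/cloudbuild.yaml" ]; then
            # fire-and-forget build for the affected component
            gcloud builds submit "$d" --config="$d/cloudbuild.yaml" --async
          fi
        done
```

Each fanned-out build gets its own Cloud Build log, which is exactly the log-collection problem mentioned above.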

Related

What is the difference between Cloud Build and Cloud Deploy?

They both seem to be recommended CI/CD tools within Google Cloud, but with similar functionality. Would I use one over the other? Maybe together?
Cloud Build seems to be the de facto tool, while Cloud Deploy says that it can do "pipeline and promotion management."
Both of them are serverless, meaning you don't have to manage the underlying infrastructure of your builds, and both define their delivery pipelines in a YAML configuration file. However, Cloud Deploy also needs a Skaffold configuration, which it uses to perform render and deploy operations.
And according to this documentation,
Google Cloud Deploy is a service that automates delivery of your applications to a series of target environments in a defined sequence.
Cloud Deploy is an opinionated, continuous delivery system currently supporting Kubernetes clusters and Anthos. It picks up after the CI process has completed (i.e. the artifact/images are built) and is responsible for delivering the software to production via a progression sequence defined in a delivery pipeline.
While Google Cloud Build is a service that executes your builds on Google Cloud.
Cloud Build (GCB) is Google's cloud Continuous Integration/Continuous Delivery (CI/CD) solution. It takes user code stored in Cloud Source Repositories, GitHub, Bitbucket, or other solutions; builds it; runs tests; and saves the results to an artifact repository like Google Container Registry, Artifactory, or a Google Cloud Storage bucket. It also supports complex builds with multiple steps, for example testing and deployments. If you want to extend your CI pipeline, it's as easy as adding an additional step to it. Take your artifacts, either built or stored locally or at your destination, and easily deploy them to many services with a deployment strategy of your choice.
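As a rough illustration of such a multi-step build (not taken from the documentation; the service name, region and test command are assumptions), a cloudbuild.yaml might look like this:

```yaml
# Hypothetical cloudbuild.yaml: test, build and push an image, then deploy it.
steps:
  - name: 'python:3.11'                                # run unit tests
    entrypoint: 'bash'
    args: ['-c', 'pip install -r requirements.txt && pytest']
  - name: 'gcr.io/cloud-builders/docker'               # build the container image
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$SHORT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'               # push it to the registry
    args: ['push', 'gcr.io/$PROJECT_ID/my-service:$SHORT_SHA']
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'   # deploy to Cloud Run
    args: ['gcloud', 'run', 'deploy', 'my-service',
           '--image=gcr.io/$PROJECT_ID/my-service:$SHORT_SHA',
           '--region=us-central1']
images:
  - 'gcr.io/$PROJECT_ID/my-service:$SHORT_SHA'
```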
You would need to provide more details in order to choose between the two services, and it will still depend on your use case. However, their stated objectives might make it easier for you to choose between them.
Cloud Build's mission is to help GCP users build better software faster, more securely by providing a CI/CD workflow automation product for developer teams and other GCP services.
Cloud Deploy's mission is to make it easier to set up and run continuous software delivery to a Google Kubernetes Engine environment.
In addition, refer to this documentation for pricing information: Cloud Build pricing and Cloud Deploy pricing.
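To make the Cloud Deploy side more concrete, a delivery pipeline is itself declared in YAML; a hypothetical sketch (the pipeline, target and cluster names are assumptions) could look like this:

```yaml
# Hypothetical clouddeploy.yaml: a pipeline that promotes releases from a dev
# GKE cluster to a prod GKE cluster; Skaffold handles the rendering step.
apiVersion: deploy.cloud.google.com/v1
kind: DeliveryPipeline
metadata:
  name: my-app-pipeline
description: promote my-app from dev to prod
serialPipeline:
  stages:
    - targetId: dev
    - targetId: prod
---
apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: dev
gke:
  cluster: projects/my-project/locations/us-central1/clusters/dev-cluster
---
apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: prod
gke:
  cluster: projects/my-project/locations/us-central1/clusters/prod-cluster
```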

Best approach for migrating Maven development projects into AWS Code Pipeline

We are trying to migrate several of our Java/Maven projects into AWS CodePipeline and we could not find a good and reasonable migration approach (our current architecture is to use AWS for production). Specifically, we are interested in several things:
How to cache Maven dependencies so that build tasks do not download the same packages over and over again.
There are several possible approaches, for example (a caching sketch follows this list):
a) Use CodeArtifact, but then the Maven projects will be tied to a specific AWS account.
b) Use S3 buckets, but then third-party modules (Maven Wagons) will need to be used.
c) Use an EC2 instance for building.
d) Use a Docker container created specifically for build purposes.
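For comparison, it may help that CodeBuild has built-in caching (local or S3) that can hold the local Maven repository between builds; a hypothetical buildspec.yml sketch (the repository path and artifact pattern are assumptions):

```yaml
# Hypothetical buildspec.yml: keep the local Maven repository in CodeBuild's
# cache so dependencies are not downloaded again on every build.
version: 0.2
phases:
  build:
    commands:
      - mvn -B -Dmaven.repo.local=/root/.m2/repository clean verify
artifacts:
  files:
    - target/*.jar
cache:
  paths:
    - '/root/.m2/**/*'
```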
It is not really clear whether Jenkins or CodePipeline is the recommended CI/CD product in AWS. We have seen some examples where CodePipeline is used together with Jenkins. What is the purpose of such a setup?
Thank you,

How to set up a remote backend bucket with Terraform before creating the rest of my infrastructure? (GCP)

How would I go about initializing Terraform's backend state bucket on GCP first with GitLab's pipeline, and then the rest of my infrastructure? I found this, but I'm not sure what it implies for GitLab's pipeline.
This is always a difficult question. My post won't answer your question directly, but it gives my view on the subject (too long to be a comment).
It's a bit like asking to manage the server that hosts your CI tools with those same CI tools (for example: the GitLab server managing itself).
If you use GitLab CI (with Terraform) to create that bucket, you won't be able to keep the state, as you would have no remote state to store it in for this specific task. This would mean you have an inconsistent resource: defined in a .tf file but with no state.
If you want to integrate it with your CI, I would recommend using the gcloud CLI inside your CI: check whether the GCS bucket exists, and create it if not.
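A minimal sketch of that idea as a GitLab CI job, assuming a service-account key in a CI variable and a bucket name of your choosing (the variable name, project and bucket here are assumptions):

```yaml
# Hypothetical .gitlab-ci.yml job: create the Terraform state bucket only if it
# does not exist yet, then enable versioning to keep old state revisions.
bootstrap-state-bucket:
  image: google/cloud-sdk:slim
  script:
    - echo "$GCP_SERVICE_ACCOUNT_KEY" > /tmp/key.json
    - gcloud auth activate-service-account --key-file=/tmp/key.json
    - gcloud config set project my-gcp-project
    # `gsutil ls -b` fails when the bucket is missing, so create it in that case
    - gsutil ls -b gs://my-terraform-state || gsutil mb -l europe-west1 gs://my-terraform-state
    - gsutil versioning set on gs://my-terraform-state
```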
If you really want to use Terraform, maybe use the free tier of Terraform Cloud as a remote backend only for this specific resource. That way you have all resources managed by Terraform, and all of them with a tfstate.
You now have another option, which does not involve GCP, with GitLab 13.0 (May 2020)
GitLab HTTP Terraform state backend
Users of Terraform know the pain of setting up their state file (a map of your configuration to real-world resources that also keeps track of additional metadata).
The process includes starting a new Terraform project and setting up a third party backend to store the state file that is reliable, secure, and outside of your git repo.
Many users wanted a simpler way to set up their state file storage without involving additional services or setups.
Starting with GitLab 13.0, GitLab can be used as an HTTP backend for Terraform, eliminating the need to set up state storage separately for every new project.
The GitLab HTTP Terraform state backend allows for a seamless experience with minimal configuration, and the ability to store your state files in a location controlled by the GitLab instance.
They can be accessed using Terraform’s HTTP backend, leveraging GitLab for authentication.
Users can migrate to the GitLab HTTP Terraform backend easily, while also accessing it from their local terminals.
The GitLab HTTP Terraform state backend supports:
Multiple named state files per project
Locking
Object storage
Encryption at rest
It is available both for GitLab Self-Managed installations and on GitLab.com.
See documentation and issue.
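In practice, the Terraform configuration declares an empty http backend and the CI job passes the backend settings at terraform init time; a hypothetical .gitlab-ci.yml snippet (the job name, image tag and state name are assumptions):

```yaml
# Hypothetical .gitlab-ci.yml job using GitLab as the Terraform HTTP state
# backend. Assumes the Terraform code contains `backend "http" {}`.
plan:
  image:
    name: hashicorp/terraform:1.5
    entrypoint: [""]   # the image's default entrypoint is `terraform` itself
  variables:
    TF_ADDRESS: ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/terraform/state/production
  script:
    - >
      terraform init
      -backend-config="address=${TF_ADDRESS}"
      -backend-config="lock_address=${TF_ADDRESS}/lock"
      -backend-config="unlock_address=${TF_ADDRESS}/lock"
      -backend-config="username=gitlab-ci-token"
      -backend-config="password=${CI_JOB_TOKEN}"
      -backend-config="lock_method=POST"
      -backend-config="unlock_method=DELETE"
      -backend-config="retry_wait_min=5"
    - terraform plan
```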
Furthermore, this provider will be supported for the foreseeable future, with GitLab 13.4 (September 2020):
Taking ownership of the GitLab Terraform provider
We’ve recently received maintainer rights to the GitLab Terraform provider and plan to enhance it in upcoming releases.
In the past month we’ve merged 21 pull requests and closed 31 issues, including some long outstanding bugs and missing features, like supporting instance clusters.
You can read more about the GitLab Terraform provider in the Terraform documentation.
See Documentation and Issue.

CodeDeploy to S3

I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific Git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner performs the npm start build command to build the site, and then aws s3 sync to synchronize it to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I was able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) to an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; it seems like the process is not made for that. I need it to be in the root folder because of how web access is configured.
For the deployment process, I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, with a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same thing I was doing on GitLab), and the actual deployment step in CodePipeline is disabled.
I was looking for a solution that uses AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like it does what CodePipeline/CodeBuild does: store artifacts (not deploy them to the root folder of the bucket). Plus, I'm looking for an option that doesn't require me to learn or deploy new configuration files (outside of AWS config files).
Is this possible with the current state of AWS features?
Today AWS announced a new feature: the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the stage, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and, if so, provide an optional path for the extraction. This should allow you to deploy your content directly to the root of a bucket by omitting the path.
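If the pipeline is defined in CloudFormation, the deploy stage with the S3 provider might look roughly like this (the bucket and artifact names are assumptions):

```yaml
# Hypothetical fragment of a CloudFormation-defined CodePipeline: a deploy
# stage that extracts the build artifact into the root of the website bucket.
- Name: Deploy
  Actions:
    - Name: DeployToS3
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: S3
        Version: '1'
      InputArtifacts:
        - Name: BuildOutput            # the zip produced by CodeBuild
      Configuration:
        BucketName: my-website-bucket
        Extract: 'true'                # unzip; no ObjectKey means bucket root
```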
I was dealing with a similar issue and, as far as I was able to find out, there is no service suitable for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code that runs as a server.
My solution was to use CodePipeline with three stages:
A Source stage, which takes the source code from AWS CodeCommit
A Build stage with AWS CodeBuild
A custom Lambda function, which after a successful build takes the artifact from the S3 artifact store, unzips it, and copies the files to my S3 website host.
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made; I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
I also had to change the ACLs in the website.ts file.
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.

Deploying an Angular 2 app built with webpack using Bitbucket

I have searched high and low for an answer to this question but have been unable to find one.
I am building an Angular 2 app that I would like hosted in an S3 bucket. There will (possibly) be an EC2 backend, but that's another story. Ideally, I would like to be able to check my code into Bitbucket, and by some magic that eludes me I would like S3, or EC2, or whatever to notice, via a hook for instance, that the source has changed. Of course the source would have to be built using webpack and the distributables deployed correctly.
Now this seems like a pretty straightforward request, but I can find no solution except something pertaining to WebDeploy, which I shall investigate right now.
Any ideas anyone?
Good news: AWS Lambda was created for you.
You need to create the following scenario and code to achieve your requirement.
1- Create a Lambda function; this function should do the following steps:
1-1- Clone your latest code from GitHub or Bitbucket.
1-2- Install grunt or another builder for your Angular app.
1-3- Install the node modules.
1-4- Build your Angular app.
1-5- Copy the new build to your S3 bucket.
1-6- Finish.
2- Create an AWS API Gateway with one resource and one method pointing to your Lambda function.
3- Go to your GitHub or Bitbucket settings and add a webhook pointing to your API Gateway.
4- Enjoy life with AWS.
;)
Benefits:
1- You are only charged when you have a new build.
2- No need for any machine or server (EC2).
3- You only maintain one function on Lambda.
For more info:
https://aws.amazon.com/lambda/
https://aws.amazon.com/api-gateway/
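For the wiring between the webhook and the function, a hypothetical AWS SAM template sketch (the runtime, handler, bucket name and timeout are all assumptions; the clone/build/copy logic itself would live in the handler code):

```yaml
# Hypothetical SAM template: an API Gateway endpoint that receives the Git
# webhook and invokes the Lambda that builds the app and copies it to S3.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  BuildAndDeployFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: nodejs18.x
      Handler: index.handler          # clones, builds, and syncs to S3
      CodeUri: ./build-function/
      Timeout: 300
      MemorySize: 1024
      Policies:
        - S3CrudPolicy:
            BucketName: my-angular-site
      Events:
        Webhook:
          Type: Api                   # POST /webhook is called by the Git host
          Properties:
            Path: /webhook
            Method: post
```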
S3 isn't going to listen for Git hooks and fetch, build, and deploy your code. Bitbucket isn't going to build and deploy your code to S3. What you need is a service that sits in between Bitbucket and S3, is triggered by a Git hook, fetches from Git, builds, and then deploys your code to S3. You need to search for Continuous Integration/Continuous Deployment services, which are designed to do exactly this sort of thing.
AWS has CodePipeline. You could set up your own Jenkins or TeamCity server. Or you could look into a service like CodeShip. Those are just a few of the many services out there that could accomplish this task. I think any of these services will require a bit of scripting on your part in order to get them to perform the actual webpack build and copy to S3; a sketch of that scripting follows below.
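As one illustration of that scripting inside a CodeBuild step (the bucket name, build command, and output directory are assumptions):

```yaml
# Hypothetical buildspec.yml for CodePipeline/CodeBuild: build the Angular app
# with webpack and sync the output to the S3 bucket that hosts the site.
version: 0.2
phases:
  install:
    commands:
      - npm ci
  build:
    commands:
      - npx webpack --mode production   # assumed to emit the site into dist/
  post_build:
    commands:
      - aws s3 sync dist/ s3://my-angular-site --delete
```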