GCloud - Where to place project when working with multiple users? - google-cloud-platform

First of all, I'm new to the GCloud platform and to everything cloud related in general.
I want to work with multiple users (with different Google Accounts) on one project in the GCloud.
I have already granted the users all the necessary rights to access my project.
I plan on running a Kubernetes cluster. I followed this tutorial and everything worked fine. But now I have figured out that other users can't access my project folder, because it is in /home/USERNAME.
Also, when I save a dummy file to /tmp, the other users can't see it, and I read that Cloud Shell is per user, not per project.
My question is: where on the GCloud platform can I clone my git project so that other users can git pull when there are code changes? Or should I set up my project differently? They would also need access to the Dockerfile in order to build a new image for Kubernetes.
Do I have to use a CI/CD solution? As I'm currently working on a school project, there is no need for CI/CD.

GitHub, GitLab, Bitbucket, or any other SCM should do. That way each user can have their own local clone of the code you are working on.
CI/CD is not obligatory and you can deploy your applications without it; however, it can make your life easier when you are working with large codebases and deploying often.
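A minimal sketch of that workflow, simulated locally with a bare repository standing in for the hosted remote (all paths and user names here are made up for the demonstration):

```shell
# Simulate a shared remote with a local bare repo standing in for
# GitHub/GitLab/Bitbucket, then show two users collaborating via push/pull.
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/project.git"

# User A clones, adds the Dockerfile, and pushes.
git clone -q "$tmp/project.git" "$tmp/alice"
cd "$tmp/alice"
echo "FROM nginx:alpine" > Dockerfile
git add Dockerfile
git -c user.name=Alice -c user.email=a@example.com commit -q -m "Add Dockerfile"
git push -q origin HEAD

# User B clones independently and gets the same files, including the
# Dockerfile needed to build the Kubernetes image.
git clone -q "$tmp/project.git" "$tmp/bob"
cat "$tmp/bob/Dockerfile"
# Later, User B picks up new commits with:  git -C "$tmp/bob" pull
```

Each user runs the clone in their own Cloud Shell home directory, so nothing needs to live in a shared folder.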

Related

Best way of deployment automation

I've made a pretty simple web application with a REST API backend service written in Python/Django and a frontend (FE) service written in JS/React. Both parts are containerized and can be launched locally via docker-compose. They live in separate GitHub repositories, and every time a new tag is pushed to the corresponding repo, an image is built and pushed to the corresponding ECR repo via GitHub Actions. Up to that point everything works smoothly, but the problem is that I don't know how to properly automate the deployment process to the test and production environments. The goal is to have those environments as similar as possible.
My current solution for the test env is to simply upload the docker-compose file to the EC2 instance via GitHub Actions, then manually run the docker-compose command, which pulls images from ECR.
Even though it's a simple solution, it's not very scalable or automated, and it requires some work to update the application. The desired flow is to have a manual GitHub Action in each repository that would deploy either the BE or the FE to the test or prod environment, without any need to SSH into the server or do any other manual manipulation of Docker.
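A manually triggered workflow along those lines might look like this sketch (the host, user, path, and secret names are placeholders for whatever the actual setup uses; appleboy/ssh-action is one commonly used community action for the SSH step):

```yaml
# .github/workflows/deploy.yml (hypothetical sketch)
name: deploy
on:
  workflow_dispatch:
    inputs:
      environment:
        description: "test or prod"
        required: true
        default: "test"

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Pull images and restart containers over SSH
        uses: appleboy/ssh-action@v1.0.0
        with:
          # Pick per-environment host/key secrets based on the input as needed.
          host: ${{ secrets.EC2_HOST }}
          username: ec2-user
          key: ${{ secrets.SSH_KEY }}
          script: |
            cd /opt/app
            docker-compose pull
            docker-compose up -d
```

This keeps the "press a button to deploy" flow without hand-run SSH sessions, while still using the existing docker-compose setup on the instance.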
I was looking at ECS, but it seems to be a solution for bigger apps that need several or more instances to run. I want my app to be used by many users, but I'm not sure when that will happen, so maybe I should stick with something less complicated than ECS. Are there any simpler solutions that I am missing, like Elastic Beanstalk or something from another provider?
I will be happy to receive feedback on anything I wrote in this post, thanks!

How to do multi-project, multi-environment deployment using Google Deployment Manager and Google Cloud Build

Currently we have a dev environment in a GCP project. We're using GDM templates and other resources, along with a repo in Bitbucket. Whenever we push any changes to Bitbucket, it builds and deploys to this dev environment. Now we've decided to add a new GCP project as a test environment, and we want to deploy to it automatically, just like the dev environment. Our preference is to deploy to this environment from the Cloud Build execution in the dev environment. Can you suggest any guidelines that would help us set things up in one place so that it automatically deploys to multiple projects as multiple environments?
You can use Terraform to achieve this.
There's a lot of information on how to start here.
However, I would suggest having projects in separate deployments. This way you limit the blast radius and protect production from errors occurring in other environments.
You need separate calls for separate projects. Like almost all Google API resources, deploymentmanager/deployments lives inside a project (https://www.googleapis.com/deploymentmanager/v2/projects/[PROJECT]/global/deployments), so you cannot deploy to multiple projects in one call.
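For example, a Cloud Build step running in the dev project can target the test project explicitly, provided the dev project's Cloud Build service account has been granted the necessary roles in the test project (deployment, config path, and project ID below are placeholders):

```yaml
# cloudbuild.yaml (sketch): deploy the same templates to another project.
steps:
  - name: gcr.io/cloud-builders/gcloud
    args:
      - deployment-manager
      - deployments
      - create            # use "update" on subsequent runs
      - my-deployment
      - --config=templates/config.yaml
      - --project=test-project-id   # explicit target project per call
```

Each additional environment is then just another step (or another trigger) with a different `--project` value.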

How do I put AWS Amplify project into CodeCommit?

I am just starting to use AWS Amplify but can't figure out how you are supposed to commit the project to a source code repository so that others can work on the same project.
I created a React serverless project, 'web_app', have created a few APIs and a simple frontend application, and now want to commit this to CodeCommit so it can be accessed by others.
Things get a bit confusing now, because for CI/CD it seems one should create a repository for the frontend application - usually the source files are in the 'web_app/src' folder.
But Amplify seems to have already created a git repository at the 'web_app' folder level, so am I supposed to create a CodeCommit repository, push the 'web_app' local repo to that remote repository, and then separately create another repository for the frontend in order to be able to use the CI/CD functions in AWS?
For some reason, if I try to push anything to AWS CodeCommit, I always get a 403 error.
OK - I'll answer this myself.
You just commit the entire project to a repo in CodeCommit. The project folder contains both the backend and the frontend code. The frontend code is usually in the /src folder and the backend code (CloudFormation files) is usually in the amplify folder.
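The push itself can be sketched like this (the region, repo name, and branch are placeholders for your setup); the 403 mentioned in the question is usually a missing git credential helper, which the AWS CLI can provide:

```shell
# Sketch: push the whole 'web_app' project to a single CodeCommit repo.
# Assumes the AWS CLI is configured with credentials allowed to use CodeCommit.
aws codecommit create-repository --repository-name web_app

# Let git authenticate through the AWS CLI (avoids HTTP 403 on push).
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

cd web_app
git remote add origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/web_app
git push -u origin main
```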
Once you have the CodeCommit repo setup you can use the Amplify Console or the amplify-cli to create a new backend or frontend environment. Amplify is smart enough to know where to find the backend and frontend code.
Bear in mind that the backend amplify-cli code creates a bunch of files that are placed in the frontend folder (/src), including the GraphQL mutations and queries that will be used in the frontend code.
If you have set up CI/CD, then any 'git push' will result in a new build for the environment you are in. You can modify the build script to include or exclude rebuilding the backend - I think by default it will rebuild the backend if there are changes.
You can also manually rebuild the backend by using the amplify-cli 'amplify push' command.
Take care, because things can get out of sync, and it seems old files can be left lying around that cause problems. Fortunately it doesn't take long to delete and rebuild an entire environment. Of course, you may have to back up and reload your data first. Having some scripts to automatically load any seed data for development or testing is useful.
There is a lot of documentation out there but a lot of it seems to be quite confusing.

Deploy only changed files on AWS Elastic Beanstalk website

I have successfully deployed my website on AWS Elastic Beanstalk. Now I want to change the code in one of my files.
If I do eb deploy, it will deploy a completely new version of my code, which I don't want. I already have an updated DB on Elastic Beanstalk; if I deploy the whole codebase again, it will overwrite my DB file.
How can I successfully deploy only the changed file?
This may not be the answer you're looking for, but I would highly recommend deleting this file from your code repository. Hopefully you're using a version control system like Git; if you want to keep the original file for historical purposes, I would create an entirely different repository and put it in there.
Why? Even if you did come up with a solution to deploy only changed files... would you really want to trust it? If there's any problem with that solution, you would entirely erase or overwrite your production database. Not good.
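If the file has to stay in the repository for some reason, the EB CLI also honors an `.ebignore` file at the project root, which excludes the listed paths from the deployment bundle (the filenames below are guesses at the asker's setup):

```
# .ebignore (sketch) - paths listed here are not uploaded by `eb deploy`
db.sqlite3
*.log
```

Note that when `.ebignore` exists, the EB CLI uses it instead of `.gitignore` to decide what to upload.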
In addition, if you want to build a really robust system that can recreate your app from scratch in AWS, take a look at CloudFormation. It takes some learning and work, but you can build a script - and maintain it in version control - that will scaffold your entire cloud infrastructure.

Create a new GCP project from existing

I created a Project on GCP. It has a postgres database, a node Appengine web app, and some other stuff. Now I am developing the app, and when everything is set up and running nicely I'd like to clone this project somehow and create a staging and a production environment/project.
So my project now is called dev-awesomeapp. Can I somehow make a staging-awesomeapp for staging and a awesomeapp for production from my existing dev-awesomeapp?
Edit: there is another question from 2017 that asks the same thing, but maybe it's possible now, 2.5 years later?
You can't, but if you don't want to configure everything from the beginning each time, you can use "infrastructure as code" with tools like Deployment Manager or Terraform.
This can help you replicate your infrastructure; moreover, it can be really helpful for automating any architectural changes if you use it in a CI/CD pipeline, making your release phase quicker and more reliable :)
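As an illustration, with Terraform the same module can be instantiated once per project, so staging and production stay identical by construction (the module path is hypothetical; the project IDs follow the question's naming):

```hcl
# Sketch: one reusable module, instantiated once per environment/project.
module "staging" {
  source     = "./modules/awesomeapp"  # hypothetical module holding the app's resources
  project_id = "staging-awesomeapp"
}

module "production" {
  source     = "./modules/awesomeapp"
  project_id = "awesomeapp"
}
```

Any change to the module then applies to every environment the next time you run `terraform apply`.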