I'm new to AWS and I need some assistance with a quick project.
I'm trying to export some of my code from my Ubuntu server to a CodeCommit repository. The code lives in multiple Docker containers.
Would anyone be able to provide a guide on doing so?
Much appreciated!
CodeCommit == Git.
So if you want to "copy" from a CodeCommit repository, all you should need to do is a git clone.
EXAMPLE:
https://docs.aws.amazon.com/codecommit/latest/userguide/how-to-connect.html
cd /my/local/directory
git clone https://git-codecommit.us-east-2.amazonaws.com/v1/repos/MyDemoRepo my-demo-repo
This will create a copy of your CodeCommit repo in the local repository "/my/local/directory/my-demo-repo".
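Going the other direction (the original question: getting existing code from the server into CodeCommit), a minimal sketch, assuming the same repo URL as above and that the code isn't under Git yet:
cd /path/to/your/code
git init
git add .
git commit -m "Initial commit"
git remote add origin https://git-codecommit.us-east-2.amazonaws.com/v1/repos/MyDemoRepo
git push -u origin master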
I had a few questions about automatic git pulls on a remote server. I am aware there are several questions like this, but I wasn't sure what steps to take exactly, and I don't want to mess up my current setup with a mistake :/
To wit, the environment is on a Google Cloud VM. I am running a Flask-based website that renders each page with the render_template() function.
The website resides inside its git folder, i.e. I never set up a bare repo and copied stuff. When I set it up a couple years ago, I just did git clone repo-url, then inside the repo directory, did flask run. Then I set up nginx to connect to the site's socket created with uwsgi inside the repo directory.
--
It has been working fine. I make changes locally to the content, push to GitHub, then log in to the VM, and perform a git pull.
I want to do this automatically. I tried adding a cron job to do this, where the job basically ran a script, and the script did the git pull. Script content was:
cd /repo
git pull
Running the script in the server worked, but cron never managed to do the pull.
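For what it's worth, cron runs with a minimal environment (no PATH, no SSH agent), which may explain the failure. A sketch of a more cron-safe script, assuming the repo lives at /repo and a deploy key at /home/me/.ssh/id_deploy:
#!/bin/sh
# cron provides almost no environment, so pin the key and use absolute paths
export GIT_SSH_COMMAND="ssh -i /home/me/.ssh/id_deploy -o IdentitiesOnly=yes"
cd /repo || exit 1
/usr/bin/git pull >> /tmp/git-pull.log 2>&1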
--
I have been reading about web hooks, and there is a bunch of stuff about post-receive hooks, post-update hooks, and making bare repos. At this point, I am embarrassed to say I have no idea what I should be doing.
Any help is greatly appreciated.
Another option would be to consider a GitHub Action, which, from GitHub, could interact with your Google Cloud VM.
For example, actions-hub/gcloud.
- uses: actions-hub/gcloud@master
  env:
    PROJECT_ID: test
    APPLICATION_CREDENTIALS: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
  with:
    args: cp your-file.txt gs://your-bucket/
    cli: gsutil
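Since the goal here is a git pull on the VM rather than a bucket copy, the same action could drive gcloud compute ssh instead; a sketch, where the instance name, zone, and repo path are assumptions:
- uses: actions-hub/gcloud@master
  env:
    PROJECT_ID: your-project
    APPLICATION_CREDENTIALS: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
  with:
    # run the pull on the VM over SSH; adjust instance, zone, and path
    args: compute ssh your-vm --zone=us-central1-a --command="cd /repo && git pull"
    cli: gcloud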
I have a Google Source Repository that mirrors a private Github repo of mine that contains code for a Python package. I also have a Cloud Run instance where I'd like to install this private Python library. How can I go about doing that in the cloudbuild.yaml or Dockerfile?
A Cloud Source Repository is a Git repository. So, if you want to get your sources (your private library) from there, simply clone the repo (the head, or a specific tag/commit SHA).
You can't use the pip install command directly, because that would require a private PyPI server. If you have one, get the credentials and point pip at it!
That's the principle. Now, how do you achieve this in Cloud Build or in the Docker build?
IMO, the easiest way is in the Cloud Build pipeline: load a Git-capable image (for instance gcr.io/cloud-builders/git) and clone your Cloud Source repo. I recommend this step because you can use the Cloud Build authentication context to log in to the Cloud Source repository.
Then, in your Dockerfile, copy your whole environment (your code and the venv that contains the downloaded library). The additional public libraries can be downloaded inside the Dockerfile or in Cloud Build, as you wish.
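A minimal cloudbuild.yaml sketch along those lines, where the project, repo, and image names are assumptions:
steps:
# Clone the private library; Cloud Build's service account authenticates to Cloud Source Repositories
- name: 'gcr.io/cloud-builders/git'
  args: ['clone', 'https://source.developers.google.com/p/your-project/r/your-private-lib']
# Build the image; the Dockerfile can COPY the cloned sources from the workspace
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/your-project/your-service', '.']
images:
- 'gcr.io/your-project/your-service'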
Is there a way to add multiple Git repositories to the same Google Cloud project?
You currently cannot do this. We know this is a useful feature, and we're working hard on it. Stay tuned!
Update: we've added the ability to have multiple Cloud Source Repositories for every cloud project. You can read about how to add a new repo to your project here: https://cloud.google.com/source-repositories/docs/setting-up-repositories
There is no way of doing this as of today. Every project can only have one remote repository.
Git submodules should do the trick: add the other Git repositories as submodules (see the sketch after the links below).
See
https://git-scm.com/docs/git-submodule
https://git-scm.com/book/en/v2/Git-Tools-Submodules
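A minimal sketch, with a hypothetical repository URL:
git submodule add https://github.com/example/other-repo.git other-repo
git commit -m "Add other-repo as a submodule"
Collaborators then pick up the submodule contents with:
git clone --recurse-submodules <your-repo-url>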
No, there isn't, but you can use Git subtree merges to add multiple "subrepositories" as folders in your main repository, which will do the trick.
See details here https://help.github.com/articles/about-git-subtree-merges/
(There are also submodules, as @Shishir stated, but as I understand it a submodule only records a reference to the other repository, and its contents won't appear in checkouts/clones done by others unless they initialize the submodules, so I think submodules won't work).
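A minimal sketch of the subtree merge approach, where the remote name, URL, and branch are assumptions:
git remote add -f subrepo https://github.com/example/subrepo.git
git merge -s ours --no-commit --allow-unrelated-histories subrepo/master
git read-tree --prefix=subrepo/ -u subrepo/master
git commit -m "Merge subrepo as a subdirectory"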
Every Google Cloud project can only have one remote repository.
However, it's definitely possible to have multiple local repositories that correspond with the same remote Google Cloud repository.
The official documentation describes the following procedure for using a Cloud Source Repository as a remote for a local Git repository:
Create a local Git repository
Now, create a repository in your environment using the Git command line tool and pull the source files for a sample application into the repository. If you have real-world application files, you can use these instead.
$ cd $HOME
$ git init my-project
$ cd my-project
$ git pull https://github.com/GoogleCloudPlatform/appengine-helloworld-python
Add the Cloud Source Repository as a remote
Authenticate with Google Cloud Platform and add the Cloud Source Repository as a Git remote.
On Linux or Mac OS X:
$ gcloud auth login
$ git config credential.helper gcloud.sh
$ git remote add google https://source.developers.google.com/p/<project-id>/
On Windows:
$ gcloud auth login
$ git config credential.helper gcloud.cmd
$ git remote add google https://source.developers.google.com/p/<project-id>/
The credential helper scripts provide the information needed by Git to connect securely to the Cloud Source Repository using your Google account credentials. You don't need to perform any additional configuration steps (for example, uploading SSH keys) to establish this secure connection.
Note that the gcloud command must be in your $PATH for the credential helper scripts to work.
It also explains how to create a local Git repository by cloning a Cloud Source Repository:
Clone a Cloud Source Repository
Alternatively, you can create a new local Git repository by cloning the contents of an existing Cloud Source Repository:
$ gcloud init
$ gcloud source repos clone default <local-directory>
$ cd <local-directory>
The gcloud source repos clone command adds the Cloud Source Repository as a remote named origin and clones it into a local Git repository located in <local-directory>.
I have set up a django project on EC2 following https://shirtdev.wordpress.com/2011/05/04/setting-up-a-git-repository-on-an-amazon-ec2-instance/. Now I want to clone it so that I can develop locally. The instructions in the tut say:
git clone username@hostname.com:the_project.git
I know this can't be right because I would need to use a private key and the path to the project is:
/home/ubuntu/tproxy/testproject
How can I clone this repo?
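For reference, with an EC2 key pair and the path above, the clone would look something like this (the key path and SSH user are assumptions, and GIT_SSH_COMMAND needs Git 2.3+):
GIT_SSH_COMMAND="ssh -i /path/to/your-key.pem" git clone ubuntu@<ec2-public-dns>:/home/ubuntu/tproxy/testproject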
I am using separate servers for dev, test and production environments. I have git installed on my local machine and have separate branches for each environment. How can I deploy the code from each branch to its respective server using GIT?
FYI: Git is not installed on the server.
Thanks in advance.
Simrat
You will need Git installed on the server.
Use the following command to push your code to your remote repository:
git push remote_repository_url branch_name
Use the following commands to fetch or deploy the code from the remote repository to the corresponding machine:
git pull remote_repository_url branch_name
or
git fetch remote_repository_url branch_name
Use the fetch command if you want to avoid an automatic merge.
You can also use the git clone command to clone a specific branch from your remote repository (e.g. Assembla) onto the corresponding server:
git clone --branch branch_name remote_repository_url
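Putting it together, per environment it might look like this (the URL, paths, and branch names are assumptions):
# on the dev server
git clone --branch dev https://example.com/your/repo.git /var/www/app
# on the test server
git clone --branch test https://example.com/your/repo.git /var/www/app
# on the production server
git clone --branch master https://example.com/your/repo.git /var/www/app
# subsequent deploys on each server are then just:
git pull origin branch_name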