How do I upload a project folder to an AWS CodeCommit repository? I can see that I can only upload a single file from the CodeCommit console. Is it possible to upload all the items in a folder (the entire folder) at once?
The CodeCommit console is primarily for managing repositories, their branches, approval rules, and so on, not for interacting with the content of the git-based repositories themselves. For committing files or directories, the usual git commands (git add, git commit, git push) are more practical than the AWS console, CLI, or API.
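As a minimal sketch (the repository name, region, branch, and project path below are placeholders, and this assumes your git credentials or the git-remote-codecommit helper are already set up):

    # Clone the (empty) CodeCommit repo, copy the project in, and push it.
    git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyRepo
    cd MyRepo
    cp -r /path/to/your/project/. .
    git add .
    git commit -m "Add project files"
    git push origin main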
However, if this is primarily about initializing a repository from a blueprint or project template, you can also upload the content to S3 and create a new CodeCommit repository from it using CloudFormation, see here.
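A rough sketch of that approach, assuming the bucket, key, and repository names below are placeholders of your own choosing: zip the project, stage it in S3, and point an AWS::CodeCommit::Repository resource at it.

    # Zip the project and stage it in S3 (names are examples only).
    zip -r project.zip .
    aws s3 cp project.zip s3://my-bootstrap-bucket/project.zip

    # Minimal CloudFormation template that seeds the new repo from that zip.
    cat > repo.yaml <<'EOF'
    Resources:
      ProjectRepo:
        Type: AWS::CodeCommit::Repository
        Properties:
          RepositoryName: my-project-repo
          Code:
            BranchName: main
            S3:
              Bucket: my-bootstrap-bucket
              Key: project.zip
    EOF

    aws cloudformation deploy --template-file repo.yaml --stack-name my-project-repo-stack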
Related
I am trying to connect the content of a static website in an S3 bucket to a CodeCommit repo via CodeDeploy.
However, when I set up a repo via CodeCommit and a CodeDeploy pipeline and push changes to my HTML file that land in my S3 bucket, the static HTML page doesn't load; instead, my browser screen either briefly flashes or it downloads the HTML file.
I know I have the S3 bucket configured correctly because when I test my .html file via its public URL, it loads as expected.
Additionally, when I download my HTML file from my S3 bucket BEFORE I push commit changes, the same file downloads fine. However, when I download the newly committed HTML file from S3, it's corrupted, which makes me think it's an issue in how I've configured CodeDeploy, but I can't figure it out.
I believe I have the header information configured correctly.
The S3 bucket policy allows for reading of objects. CodePipeline successfully pushes my repo changes to my S3 bucket. But for some reason, even though S3 still sees the file type as HTML, it's not served as such after a push from CodeDeploy. Additionally, when I try to download the newly pushed HTML file and open it, the HTML code is all jumbled.
Any help or insights are appreciated.
Eureka! I found the solution (by accident).
Just in case others run into this problem: when configuring a deployment pipeline in CodePipeline, if you don't select "Extract file before deploy" in your deployment configuration step, CodePipeline will instead deploy any CodeCommit HTML files (and I assume other file types as well) as octet-streams. Enabling "Extract file before deploy" fixed this problem.
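For reference, that console checkbox corresponds (as far as I know) to the Extract parameter on the S3 deploy action if you manage the pipeline definition as code; the bucket and artifact names below are placeholders, not the actual values from this setup:

    {
      "name": "DeployToS3",
      "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "S3", "version": "1" },
      "configuration": {
        "BucketName": "my-static-site-bucket",
        "Extract": "true"
      },
      "inputArtifacts": [{ "name": "BuildOutput" }]
    }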
Now I will be able to finally sleep tonight!
I am looking to integrate an enterprise Bitbucket Server with the AWS CI/CD pipeline features.
I have tried creating a project within AWS CodeBuild but do not see any option for Bitbucket Enterprise.
If this is not possible, what is the long route using API Gateway / webhooks etc.?
AWS CodeBuild only supports Bitbucket Cloud. To integrate with a self-hosted Bitbucket Server, you will need to create an API Gateway endpoint backed by a Lambda, and then add the gateway address as a webhook in the Bitbucket repo. The Lambda is then responsible for processing the incoming events from the Bitbucket server. There are two routes from here.
One way is to download the zip for the particular commit and upload it to an S3 bucket, then add S3 as a source trigger for the build project. You lose the ability to run any git-specific commands in that case, though, as the source is just a zip file containing the specific version of the files.
The second option is to invoke CodeBuild directly from the Lambda, passing the relevant details (commit ID, event type (PR or push), branch, etc.) as environment variables. Based on this info, run a git clone in CodeBuild before the other build steps; this way you have access to git-specific commands (see the sketch below).
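As a rough sketch of that second route, the Lambda would parse the webhook payload and make a StartBuild call with environment overrides; the equivalent AWS CLI invocation (project name and values are placeholders) looks like:

    aws codebuild start-build \
      --project-name bitbucket-server-build \
      --environment-variables-override \
          name=COMMIT_ID,value=abc1234,type=PLAINTEXT \
          name=BRANCH,value=feature/my-change,type=PLAINTEXT \
          name=EVENT_TYPE,value=push,type=PLAINTEXT
    # The buildspec can then start with something like:
    #   git clone --branch "$BRANCH" "$REPO_URL" . && git checkout "$COMMIT_ID"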
Here is an example workflow from AWS (it is for CodePipeline, but you can modify it suitably for CodeBuild).
I want to create a Cloud Build trigger linked to a Cloud Source Repository in another project.
But when I'm on the step where I am supposed to choose a repository, the list is empty.
I tried a few different permissions, but without luck.
Could someone tell me whether such a configuration is possible and how to do it?
A Cloud Build trigger can only see repositories that are in the same project.
We ran into the same issue with Bitbucket repos that we are mirroring into Cloud Source Repositories in our projects.
What we discovered was that we needed to mirror the repo into BOTH projects so that the Cloud Build trigger could see the repository. I am not sure how this would work with a repo that only lives in Cloud Source Repositories.
When you have project A with a trigger that builds a container and places it in a repository owned by project B, you must add an IAM permission on project B that allows image creation by a service account from project A. When you use triggers, a service account is created in project A named A_NUMBER@cloudbuild.gserviceaccount.com. On project B, you must then use IAM to give this service account permission to create containers, for example by adding the "Cloud Build Editor" role.
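A sketch of that grant with gcloud, where PROJECT_B and the project-A build account number are placeholders for your own values:

    # Allow project A's Cloud Build service account to act in project B.
    gcloud projects add-iam-policy-binding PROJECT_B \
      --member="serviceAccount:A_NUMBER@cloudbuild.gserviceaccount.com" \
      --role="roles/cloudbuild.builds.editor"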
This appears to be quite well documented in the following Cloud Build docs:
Configuring access control
Setting service account permissions
I want to deploy certain files pushed into my CodeCommit repo into an S3 bucket. I'm attempting to do this with a Lambda trigger on the repo. However, I cannot get a list of files changed in a commit nor request a specific file from CodeCommit using the AWS CodeCommit API.
Any suggestions would be greatly appreciated!
Yep, that capability is not in the CodeCommit API (yet... I assume/hope someone at AWS is working on it).
My suggestion would be some sort of CI job, such as a Jenkins job, using an IAM role configured for CodeCommit and S3 access, that regularly polls your repo(s), picks up the changes, and uses a language of your choice to handle the commit and push the changes to S3. Is this a roundabout way of doing it? Yes, but I am unable to come up with an AWS-native way to do it at the moment. I would love to see someone suggest a better way.
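As a rough sketch, the core of such a job could be as small as a clone followed by a sync; the repo URL, region, and bucket below are placeholders:

    # Clone the CodeCommit repo and mirror its working tree to S3.
    git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyRepo
    cd MyRepo
    aws s3 sync . s3://my-deploy-bucket --exclude ".git/*" --delete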
My Jenkins Setup works as follows:
Takes a checkout of the code from GitHub
Publishes the code successfully to my S3 bucket.
The IAM user I have configured for it has full access permission to S3.
But the problem occurs when I delete a file/directory: the job updates all the files in my S3 bucket but doesn't remove the deleted files/directories. Is deleting files/directories not possible with the Jenkins S3 plugin?
The S3 plugin removes files only on the onDelete event, which Jenkins fires when it removes a build from the history (due to build rotation or something like that). Uploading works only as uploading - it adds and overwrites objects, but does not mirror deletions.