Deploy files from CodeCommit to S3 - amazon-web-services

I want to deploy certain files pushed into my CodeCommit repo into an S3 bucket. I'm attempting to do this with a Lambda trigger on the repo. However, I cannot get a list of files changed in a commit nor request a specific file from CodeCommit using the AWS CodeCommit API.
Any suggestions would be greatly appreciated!

Yep, that capability is not in the CodeCommit API (yet... I assume/hope someone at AWS is working on it).
My suggestion would be some sort of CI job, such as a Jenkins job, using an IAM role configured for CodeCommit and S3 access, that regularly polls your repo(s), picks up the changes, and uses a language of your choice to handle the commit and push the changes to S3. Is this a roundabout way of doing it? Yes, but I can't come up with an AWS-native way to do it at the moment. I would love to see someone suggest a better way.
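For the "pick up the changes and push to S3" part, a minimal sketch of what such a job could run after pulling the latest commit might look like this, assuming boto3 can find the job's IAM credentials; the bucket name is a placeholder, and files deleted by the commit are not handled:

```python
import subprocess

import boto3

BUCKET = "my-deploy-bucket"  # placeholder: the target S3 bucket

s3 = boto3.client("s3")

def files_changed_in_head():
    """Return the paths touched by the most recent commit in the local clone."""
    out = subprocess.check_output(
        ["git", "diff-tree", "--no-commit-id", "--name-only", "-r", "HEAD"],
        text=True,
    )
    return [line for line in out.splitlines() if line]

def deploy():
    for path in files_changed_in_head():
        # Mirror the repo layout in the bucket. Files deleted by the commit
        # would need separate delete_object handling, omitted here.
        s3.upload_file(path, BUCKET, path)
        print(f"deployed {path} -> s3://{BUCKET}/{path}")

if __name__ == "__main__":
    deploy()
```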

Related

Can we migrate a code repository from one AWS account to another AWS account using the CLI? If not, is there any other way to do that?

I want to migrate code from one AWS account to another AWS account using the CodeCommit CLI.
Does anyone have any ideas or documents on how to do this?
Can we also clone a repo to another AWS account?
Thanks in advance
Since CodeCommit is just a regular Git repository, you can create a new repository in the target account, clone the source repository locally, and push it to the newly created repository.
There is no native way to move, clone or transfer ownership of a CodeCommit repository.
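As a rough sketch of that process, assuming HTTPS Git credentials (or the AWS CLI credential helper) are configured for both accounts; the repository names, region, and CLI profile below are placeholders:

```python
import subprocess

import boto3

# Placeholders: adjust the URLs, region, and profile name for your accounts.
SOURCE_URL = "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/source-repo"
TARGET_REPO = "migrated-repo"
TARGET_REGION = "us-east-1"

# Create an empty repository in the target account.
session = boto3.Session(profile_name="target-account", region_name=TARGET_REGION)
session.client("codecommit").create_repository(repositoryName=TARGET_REPO)
target_url = (
    f"https://git-codecommit.{TARGET_REGION}.amazonaws.com/v1/repos/{TARGET_REPO}"
)

# Mirror-clone the source repo and push all branches and tags to the target.
subprocess.run(["git", "clone", "--mirror", SOURCE_URL, "repo.git"], check=True)
subprocess.run(["git", "push", "--mirror", target_url], cwd="repo.git", check=True)
```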

Is Bitbucket Enterprise Server supported by AWS CodeBuild?

I am looking to integrate an enterprise Bitbucket server with AWS CI/CD pipeline features.
I have tried creating a project within AWS CodeBuild but do not see any option for Bitbucket Enterprise.
If this is not possible, what is the long route using API Gateway, webhooks, etc.?
AWS CodeBuild only supports Bitbucket Cloud. To integrate with a self-hosted Bitbucket solution, you will need to create an API Gateway endpoint backed by a Lambda function, and then add the gateway address as a webhook in the Bitbucket repo. The Lambda is then responsible for processing the incoming events from the Bitbucket server. There are two routes from here.
One way could be to download the zip for the particular commit and upload it to an S3 bucket, then add S3 as a source trigger for the build project. In that case, though, you lose the ability to run any Git-specific commands, since the source is just a zip file containing the specific version of the files.
The second option could be to pass the relevant info to CodeBuild by invoking it directly from Lambda, passing details like the commit ID, event type (PR or push), and branch as environment variables. Based on this info, run a git clone in CodeBuild before running the other build steps. This way you retain access to Git-specific commands.
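Here is a minimal sketch of such a Lambda, assuming an API Gateway proxy integration in front of it; the payload field names and the project name are illustrative, since the actual event shape depends on your Bitbucket Server version and event type.

```python
import json

import boto3

codebuild = boto3.client("codebuild")

def handler(event, context):
    # Parse the webhook payload forwarded by API Gateway (proxy integration).
    # The field names below are illustrative; the actual payload shape depends
    # on the Bitbucket Server version and the event type.
    body = json.loads(event["body"])
    commit_id = body["changes"][0]["toHash"]
    branch = body["changes"][0]["ref"]["displayId"]

    # Start the build, passing commit details as environment variables so the
    # buildspec can run its own `git clone` / `git checkout` first.
    codebuild.start_build(
        projectName="my-bitbucket-project",  # placeholder
        environmentVariablesOverride=[
            {"name": "COMMIT_ID", "value": commit_id, "type": "PLAINTEXT"},
            {"name": "BRANCH", "value": branch, "type": "PLAINTEXT"},
        ],
    )
    return {"statusCode": 200, "body": "build started"}
```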
Here is an example workflow from AWS (it is for CodePipeline, but you can adapt it for CodeBuild).

Creating the Amazon Kinesis Data Generator stack in a different region

I'm trying to create a CloudFormation stack provided by AWS here. When I click the Create a Cognito User with CloudFormation button, it directs me to the AWS console's CloudFormation page in us-west-2 (Oregon); from there it's pretty much self-explanatory. The problem is that the company I work for only allows work in us-west-1 (N. California). I have looked over the CloudFormation template itself and I can't find any region mentioned. I have also asked this question in the AWS developer forum but no one has responded, so I'm wondering if anyone here knows how to create that particular stack in any region other than us-west-2 (Oregon)? Thanks!
I found a workaround for that. I faced the same problem: my company policy did not allow us-west-2, so I couldn't use the CloudFormation JSON script provided by Amazon Kinesis Data Generator as-is.
What I did was:
1. Download the CloudFormation JSON script for the Amazon Kinesis Data Generator to your local machine. The download link can be found on the Amazon Kinesis Data Generator Help page.
2. Download the source code. The source code download link can also be found on the Amazon Kinesis Data Generator Help page.
3. In your AWS account, go to S3 and create a bucket in the region you are allowed to use. Name it whatever you want.
4. Upload the source code downloaded in step 2 to the bucket created in step 3.
5. Edit the CloudFormation JSON script downloaded in step 1: change the bucket name inside the Lambda function definition to the name of the bucket you created in step 3.
6. Go to CloudFormation and create the stack by uploading your edited script.
One thing to keep in mind with this workaround: if AWSLabs changes the source code, or a newer version is released, you will have to manually check for it and update the copy in your bucket.
I hope that was clear.
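For steps 2 and 4, a small script along these lines could fetch the source archive and push it to your bucket; the zip URL (which you would take from the Help page) and the bucket name are placeholders.

```python
import urllib.request

import boto3

# Placeholders: the zip URL comes from the Kinesis Data Generator Help page,
# and the bucket is the one you created in your allowed region (step 3).
SOURCE_ZIP_URL = "https://..."  # source-code link from the KDG Help page
BUCKET = "my-kdg-bucket"
KEY = "kinesis-data-generator-source.zip"

# Download the source archive, then upload it to your own bucket.
local_path = "/tmp/kdg-source.zip"
urllib.request.urlretrieve(SOURCE_ZIP_URL, local_path)
boto3.client("s3").upload_file(local_path, BUCKET, KEY)
print(f"uploaded to s3://{BUCKET}/{KEY}; point the edited template at this bucket")
```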
I have created a JMeter plugin to publish data records to a Kinesis Data Stream.
https://github.com/JoseLuisSR/awsmeter
It works very well, and you don't need to use any additional AWS services to publish events to Kinesis the way Kinesis Data Generator does; with KDG you could pay additional charges for services like Cognito, CloudFormation, and Lambda that are needed to build and deploy it.
You just need an AWS IAM user with programmatic access; then download JMeter and install the awsmeter plugin.
If you have questions or comments let me know.
Thanks.

How can I FTP (SSL) into a private AWS S3 bucket?

I have an AWS S3 bucket holding a development website. I would like to FTP (SSL) into the S3 bucket, and also be able to create username and password credentials for others. Is this possible, and how can I do this?
Thanks!
Before giving up on S3, remember that sometimes frustration with a new product or technology comes from a lack of knowledge and experience. The Amazon Cloud platform has some amazing services to work with.
FTP is an old technology that is not as popular today; the modern style is to use REST interfaces, and S3 supports REST. You can also easily copy files to and from S3 using command-line tools. Look into the AWS Command Line Interface (CLI); link below.
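For example, here is a minimal boto3 sketch of copying a file to and from a bucket, the programmatic equivalent of "aws s3 cp"; the bucket and file names are placeholders:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "my-dev-site-bucket"  # placeholder

# Upload a local file (the equivalent of `aws s3 cp index.html s3://...`).
s3.upload_file("index.html", BUCKET, "index.html")

# Download it back.
s3.download_file(BUCKET, "index.html", "index-copy.html")
```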
If your goal is to use S3 as your source repository, look into AWS CodeCommit, which hosts standard Git repositories. There are also CodePipeline, CodeBuild, and CodeDeploy. Combine these tools with other Amazon services such as CloudFormation and you have real developer power.
AWS Command Line Interface
AWS Code Services
AWS CloudFormation

Is it possible to combine AWS CodeBuild and CodePipeline to build the described CI workflow?

What I'm trying to do is create the following CI flow with standard AWS tools: run a build of a commit when a pull request in GitHub is created or updated, or run a build of any branch on command. This is very similar to what Codeship, Travis, and many other CI services offer.
Is it possible with CodeBuild + CodePipeline? I noticed that I have to specify an exact branch in CodePipeline and, unfortunately, could not find a way to integrate GitHub pull requests into it. Maybe I overlooked something?
CodeBuild now directly supports building GitHub pull requests (without a Lambda intermediate step), if you're looking to simply run a build as part of the PR. For running more steps with CodePipeline as part of a PR, you'll still need to set up some scaffolding, as the other answers suggest.
https://aws.amazon.com/about-aws/whats-new/2017/09/aws-codebuild-now-supports-building-github-pull-requests/
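If you want to turn this on programmatically rather than through the console, a minimal boto3 sketch might look like the following, assuming the CodeBuild project already uses GitHub as its source with OAuth access authorized; the project name and branch filter are placeholders.

```python
import boto3

codebuild = boto3.client("codebuild")

# Enable webhooks on an existing project whose source is GitHub. After this,
# pushes and pull-request updates matching the branch filter trigger builds.
response = codebuild.create_webhook(
    projectName="my-github-project",  # placeholder
    branchFilter="master",            # optional regex on branch names
)
print(response["webhook"]["url"])
```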
CodePipeline does support basic, fully-managed integrations with both GitHub and CodeBuild, as listed in Product and Service Integrations with AWS CodePipeline. With these integrations, it is possible to use CodeBuild with CodePipeline to run a build of a commit when a commit is pushed to a branch on GitHub. See Use AWS CodePipeline with AWS CodeBuild to Run Builds for details on integrating CodeBuild with CodePipeline as a Build action provider, and see the Four-Stage Pipeline Tutorial for details on integrating Github with CodePipeline as a Source action provider.
Currently, the pull request feature in GitHub is not supported in the official CodePipeline integration; you did not overlook anything. For an interesting AWS-ecosystem open source project (not yet v1.0) that does support GitHub pull request integration (though not yet CodePipeline), you might want to check out LambCI.
It looks like this can be done somewhat manually by using Lambda and S3 - https://aws.amazon.com/blogs/devops/integrating-git-with-aws-codepipeline/
Webhooks notify a remote service by issuing an HTTP POST when a commit is pushed to the repository. AWS Lambda receives the HTTP POST through Amazon API Gateway, and then downloads a copy of the repository. It places a zipped copy of the repository into a versioned S3 bucket. AWS CodePipeline can then use the zip file in S3 as a source; the pipeline will be triggered whenever the Git repository is updated.
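Here is a condensed sketch of what that Lambda could look like, assuming a GitHub push payload delivered through an API Gateway proxy integration and a public repository; the bucket name is a placeholder, and a real setup should also verify the webhook signature.

```python
import json
import urllib.request

import boto3

BUCKET = "my-pipeline-source-bucket"  # placeholder: versioning must be enabled

def handler(event, context):
    # Parse the GitHub push payload forwarded by API Gateway.
    body = json.loads(event["body"])
    full_name = body["repository"]["full_name"]  # e.g. "owner/repo"
    sha = body["after"]                          # the commit that was pushed

    # Fetch a zip of the repository at that commit from GitHub's archive URL.
    # This works unauthenticated for public repos; private repos need a token.
    archive_url = f"https://github.com/{full_name}/archive/{sha}.zip"
    local_path = "/tmp/source.zip"
    urllib.request.urlretrieve(archive_url, local_path)

    # Overwrite a fixed key; S3 versioning keeps the history, and CodePipeline
    # watches this object as its source.
    boto3.client("s3").upload_file(local_path, BUCKET, "source.zip")
    return {"statusCode": 200}
```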
You could try https://www.deploytoproduction.com for GitHub pull request build status integration with AWS CodeBuild. It is free for a single GitHub repository, with a subscription plan available for multiple repositories.
The service doesn't currently integrate with CodePipeline, but that is coming soon.
If you wanted to build something yourself, you could make a new integration on GitHub that uses the webhook functionality to trigger a Lambda function, which in turn triggers your CodeBuild jobs or pushes an artifact to S3 to start a CodePipeline.
Full disclosure: I am the author of this service.