We're having issues connecting our Bitbucket repository to CodePipeline.
I can set up the connection, but I don't get any repos listed.
Even if I type in the repository name, it cannot find it.
After typing the name manually I can save and run, but the pipeline fails with an error telling me to make sure the repo exists.
If I set up the connection in CodeBuild, it works.
I tested in eu-central-1 and eu-west-1.
Anybody with a similar issue?
Best regards,
Kai
I am trying to create a CI pipeline with GitHub, AWS CodeBuild, CodePipeline, and CodeDeploy. I continually get the error shown below.
I have my S3 bucket that holds the artifacts I want pushed set to an "allow all" policy for troubleshooting purposes, and I have full permissions to the GitHub repo I am pulling from. The "Release change" on the pipeline always fails at the deploy phase, shown by image 1 below. It also fails relatively quickly, if that helps. For context, I am trying to set up CI to just one EC2 instance at the moment, and that instance has the CodeDeploy agent running on it and working. Thank you all for your help!
Is it possible to use a remote ECR Repository as a source in CodePipeline?
I get the following error:
The repository with name '12345.dkr.ecr.eu-central-1.amazonaws.com/ecrrepo' does not exist in the registry with id '67890'
(Account IDs have been intentionally changed)
However the remote repository definitely exists.
Whole picture: I have two accounts, dev and test. Now that I have a pipeline built and running in the dev account, I would like to do the same deployment in the test account, but using the same ECR repository.
Just additional info: I am able to deploy to the ECS cluster of the test account manually using the dev account's repository.
CodeBuild definitely supports cross-account ECR image access; doesn't CodePipeline?
Any hints for a solution or workaround? (I can think of Lambda.)
At the moment, when ECR is selected in the CodePipeline source stage, you only have the option to provide an ECR repository from the current AWS account.
A workaround would be to have a CodeBuild stage in the pipeline which can retrieve the cross-account ECR source:
https://aws.amazon.com/blogs/devops/how-to-use-cross-account-ecr-images-in-aws-codebuild-for-your-build-environment/
Your pipeline can still be started by CloudWatch Events when the ECR source changes in the other account:
CW Event Bus: https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/CloudWatchEvents-CrossAccountEventDelivery.html
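As a rough sketch of that cross-account event wiring (the account IDs, rule name, and event pattern below are placeholders, not values from your setup), the test account can allow the dev account to put events on its default event bus, and the dev account can forward ECR push events to it:

# In the test (pipeline) account: allow the dev account to send events to the default event bus.
aws events put-permission --action events:PutEvents --principal 111111111111 --statement-id AllowDevAccountEvents

# In the dev (ECR) account: forward image push events for the repository to the test account's event bus.
aws events put-rule --name ecr-push-to-test --event-pattern '{"source":["aws.ecr"],"detail-type":["ECR Image Action"],"detail":{"action-type":["PUSH"],"repository-name":["ecrrepo"]}}'
aws events put-targets --rule ecr-push-to-test --targets 'Id=test-default-bus,Arn=arn:aws:events:eu-central-1:222222222222:event-bus/default'

In the test account you would then have a rule that matches the forwarded event and starts the pipeline, as described in the cross-account event delivery link above.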
I have an AWS CodeCommit repository where developers check in the code. Since this is for a PoC, I don't want to create a CI/CD pipeline; instead I would like to copy the code from CodeCommit to my AWS EC2 instance. I would then run my code on the EC2 instance to view the results. Does anyone know how to copy the code from CodeCommit to an EC2 instance? I know how to use scp to copy code from my laptop to EC2, but since we collaborate on CodeCommit I think it would be nice to get the latest code from the repository and then run it on the instance. Any help appreciated. Thanks
Thank you and Regards,
Santosh
Install git
Configure with AWS credentials
Do a git clone on the CodeCommit repository
This will provide a local copy of the code checked into the repository.
See: Setting Up for AWS CodeCommit - AWS CodeCommit
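A minimal sketch of those three steps on an Amazon Linux EC2 instance (the region, repository name, and the use of an instance profile or configured credentials are assumptions):

# Install git.
sudo yum install -y git

# Let git obtain CodeCommit credentials from the AWS CLI (instance profile or aws configure).
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# Clone the repository; run "git pull" later to fetch the latest code before each run.
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyRepo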
I'm trying to trigger multiple builds with CodePipeline (AWS), and when the pipeline triggers a CodeBuild project, the CodeBuild run fails with the following error:
[Container] 2018/02/07 19:30:20 Waiting for DOWNLOAD_SOURCE
Message: Access Denied
Extra information:
The source is coming from GitHub.
If I start the CodeBuild project manually, it works perfectly.
I just discovered this the other day. I'm not sure if it's documented anywhere, but it's definitely not clear in the CodePipeline UI.
Any CodeBuild project that CodePipeline initiates must have been created through the CodePipeline UI. It cannot be a "standalone" CodeBuild project.
When you create a CodeBuild project from the CodePipeline UI, the "Source Provider" setting is "AWS CodePipeline", which is not an available choice when you create the CodeBuild project yourself.
CodePipeline retrieves its own source code from GitHub. It then passes that source code to your CodeBuild project. If your project is getting its own source code from GitHub, that seems to cause the issue you describe:
[Container] 2018/02/06 14:58:37 Waiting for agent ping
[Container] 2018/02/06 14:58:37 Waiting for DOWNLOAD_SOURCE
To resolve this issue, you must edit your CodePipeline "build" stage, and choose "Create a new build project" under "AWS CodeBuild, Configure Your Project". You can copy most settings from your existing project and reuse the buildspec.yml file in your source code.
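For reference, a project created this way ends up with CodePipeline as both its source and artifact provider. A hedged CLI sketch of an equivalent definition (the project name, image, and service role below are placeholders):

# CodeBuild project whose source and artifacts are supplied by CodePipeline;
# the buildspec.yml is read from the source artifact the pipeline hands over.
aws codebuild create-project \
  --name my-pipeline-build \
  --source type=CODEPIPELINE \
  --artifacts type=CODEPIPELINE \
  --environment type=LINUX_CONTAINER,image=aws/codebuild/standard:5.0,computeType=BUILD_GENERAL1_SMALL \
  --service-role arn:aws:iam::123456789012:role/my-codebuild-service-role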
I had the exact same error. CodeBuild worked fine when I ran it alone, but in order to make it work in CodePipeline I had to update my CodePipeline role to allow access to the S3 bucket.
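In case it helps, a hedged sketch of such a policy update from the CLI (the role name, policy name, and bucket are placeholders for your own values):

# Give the pipeline's service role read/write access to the artifact bucket.
aws iam put-role-policy \
  --role-name AWS-CodePipeline-Service \
  --policy-name artifact-bucket-access \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:GetObjectVersion", "s3:PutObject", "s3:GetBucketVersioning"],
      "Resource": ["arn:aws:s3:::my-artifact-bucket", "arn:aws:s3:::my-artifact-bucket/*"]
    }]
  }'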
The way I resolved this issue was to create the CodeBuild project through the CodePipeline creation wizard.
That way, the wizard gives the CodeBuild project the necessary privileges.
For a month or so I've been studying AWS services, and now I have to accomplish some basic tasks on AWS Elastic Beanstalk via the command line. As far as I understand, both the aws elasticbeanstalk [command] and the eb [command] CLIs are installed on the build instance.
When I run eb status inside the application folder, I get a response of the form:
Environment details for: app-name
Application name: app-name
Region: us-east-1
Deployed Version: app-version
Environment ID: env-name
Platform: 64bit Amazon Linux ........
Tier: WebServer-Standard
CNAME: app-name.elasticbeanstalk.com
Updated: 2016-07-14 .......
Status: Ready
Health: Green
That tells me eb init has been run for the application.
On the other hand if I run:
aws elasticbeanstalk describe-application-versions --application-name app-name --region us-east-1
I get the error:
Unable to locate credentials. You can configure credentials by running "aws configure".
In the home folder of the current user there is a .aws directory with a credentials file containing a [profile] line plus aws_access_key_id and aws_secret_access_key lines, all set up.
Besides the obvious problem with the credentials, what I really lack is an understanding of the two CLIs. Why is the EB CLI not asking for credentials while the AWS CLI is? When do I use one or the other? Can I use only the AWS CLI? Any clarification on the matter will be highly appreciated.
EDIT:
For anyone ending up here with the same "Unable to locate credentials" problem: adding the --profile profile-name option solved it for me. profile-name can be found in the ~/.aws/config (or credentials) file on the [profile profile-name] line.
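For example (profile-name being whatever appears in your own config):

# Point the AWS CLI at the named profile explicitly.
aws elasticbeanstalk describe-application-versions --application-name app-name --region us-east-1 --profile profile-name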
To verify that the AWS CLI is configured on your system, run aws configure and provide all the details it requires. That should fix your credentials problem, and checking what changed in the configuration will help you understand what's wrong with your current setup.
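The prompts look roughly like this (the values are placeholders for your own keys and region):

$ aws configure
AWS Access Key ID [None]: <your-access-key-id>
AWS Secret Access Key [None]: <your-secret-access-key>
Default region name [None]: us-east-1
Default output format [None]: json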
The EB CLI and the AWS CLI have very similar capabilities, and I too am a bit confused as to why they both exist. From my experience, the main difference is that the AWS CLI is used to interact with your AWS account through simple requests, while the EB CLI maintains a connection between you and the EB environments and so allows finer control over them.
For instance, I've just developed a CI/CD pipeline for our Beanstalk apps. When I use the EB CLI I can monitor the deployment of our apps and notify the developers when it's finished. The AWS CLI does not offer that functionality, and the only way to achieve it is to repeatedly query the service until you receive the desired result.
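A hedged sketch of such a polling loop with the plain AWS CLI (the environment name is made up):

# Poll the environment until the deployment has finished and it is back to Ready.
while [ "$(aws elasticbeanstalk describe-environments --environment-names app-env --query 'Environments[0].Status' --output text)" != "Ready" ]; do
  sleep 15
done
echo "Deployment finished"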
The AWS CLI is a general tool that works on all AWS resources. It's not tied to a specific software project, the type of machine you're on, the directory you're in, or anything like that. It only needs credentials, whether they've been put there manually if it's your own machine, or generated by AWS if it's an EC2 instance.
The EB CLI is a high-level tool to wrangle your software project into place. It's tied to the directory you're in, it assumes that the stuff in your directory is your project, and it has short commands that do a lot of background work to magically put everything in the right place.
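To make the contrast concrete (the application name, environment, and paths below are made up):

# EB CLI: run inside the project directory; it picks up the app from .elasticbeanstalk/config.yml.
cd ~/projects/app-name
eb status
eb deploy

# AWS CLI: works from any directory, but everything has to be named explicitly.
aws elasticbeanstalk describe-environments --application-name app-name --region us-east-1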