AWS JIRA CloudFormation

I am trying to deploy JIRA on AWS, but am having a hard time setting it up. I couldn't find any document on how to troubleshoot the following errors.
First one is:
S3 error: Access Denied For more information check
I made the S3 bucket public and was able to bypass this error, but I don't want the bucket to be public. Since I am creating a whole new stack, there is no existing instance whose permissions I could adjust to allow access to the S3 bucket.
Is there any way to resolve this error without making the bucket public?
After bypassing the previous error, I was getting this error:
S3 error: The specified key does not exist.
I couldn't find anything on how to troubleshoot this issue. What needs to be done to fix this error?

The Access Denied error indicates that you do not have permission to access the content in Amazon S3. The normal way of providing these permissions is:
Create an IAM Role
Assign permission to the role sufficient to access the S3 bucket
Assign the Role to the Amazon EC2 instance running the software
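The steps above can be sketched as an IAM policy attached to that role. This is a minimal example assuming the instance only needs to read objects; the bucket name my-jira-artifacts is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadInstallerBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::my-jira-artifacts",
        "arn:aws:s3:::my-jira-artifacts/*"
      ]
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN while s3:GetObject applies to the objects inside it, which is why both Resource entries are needed.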
The specified key does not exist error basically means File Not Found: the object key (path) you requested is not present in the bucket.
If you would like any further troubleshooting tips, you'll need to provide details of what you are doing (e.g. the commands used) and the specific errors you are receiving.
You may also wish to read:
Getting started with JIRA Data Center on AWS - Atlassian Documentation
JIRA on AWS - Quick Start

Related

Getting error while configuring AWS profile in vscode

"In UBUNTU-22.04"
I am trying to look for my resources in the Vscode editor, but I am getting an error AWS "Failed to load resources." under the Resource option.
I just tried to load resources so that AWS resources can get synced
The error message "AWS multiple items have the key" typically occurs when you try to access an AWS resource, such as an S3 bucket or an EC2 instance, for which multiple items share the same key. This can happen if you have multiple copies of the same resource, or if multiple resources have the same name.
To resolve this issue, you can try the following steps:
1. Check if you have multiple copies of the same resource. If you do, delete the extra copies.
2. Check if there are multiple resources with the same name. If there are, rename the resources to make them unique.
3. Make sure that your AWS CLI and SDK credentials are set up correctly. Go to the AWS Management Console and check if you are logged in with the right credentials.
4. Check if you have the correct permissions to access the resource. Make sure that your IAM role or user has the necessary permissions to access the resource.
5. Try to access the resource using the AWS CLI or SDK. If you are still getting the error, try to access the resource using the AWS Management Console to see if the issue is with your code or with the resource itself.
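For the credentials step, a quick offline check of whether a credentials file exists at all can be scripted. This is just a sketch, and check_aws_credentials is a helper name invented for the example:

```shell
# check_aws_credentials DIR: report whether an AWS credentials file exists in DIR
check_aws_credentials() {
  if [ -f "$1/credentials" ]; then
    echo "credentials file found"
  else
    echo "no credentials file - run: aws configure"
  fi
}

# The AWS CLI and the VS Code AWS Toolkit read ~/.aws/credentials by default
check_aws_credentials "$HOME/.aws"
```

If the file exists but you are unsure which identity it resolves to, aws sts get-caller-identity reports the account and ARN the CLI is actually using.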

Copy an on premise Windows folder to S3 bucket

I have an old archive folder on an on-premises Windows server that I need to put into an S3 bucket, but I'm having issues; it's more my knowledge of AWS, to be honest, but I'm trying.
I created the S3 bucket and was able to attach it to the server using net share (AWS gives you the command via the AWS gateway), and I gave it a drive letter. I then tried to use robocopy to copy the data, but it didn't like the drive letter for some reason.
I then read that I can use the AWS CLI, so I tried something like:
aws s3 sync z: s3://archives-folder1
I get: fatal error: Unable to locate credentials
I guess I need to put some credentials in somewhere (.aws), but after reading too many documents I'm not sure what to do at this point. Could someone advise?
Maybe there is a better way.
Thanks
You do not need to 'attach' the S3 bucket to your system. You can simply use the AWS CLI command to communicate directly with Amazon S3.
First, however, you need to provide the AWS CLI with a set of AWS credentials that can be used to access the bucket. You can do this with:
aws configure
It will ask for an Access Key and Secret Key. You can obtain these from the Security Credentials tab when viewing your IAM User in the IAM management console.
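For reference, aws configure stores what you enter in plain-text profile files, so you can confirm the result afterwards. A sketch of the file it writes, using the placeholder keys from the AWS documentation rather than real credentials:

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```

Once this is in place, the aws s3 sync command from the question should authenticate without the bucket being attached as a drive.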

AWS Workspaces - Unable to provide Console Access to IAM user

I want an IAM user to have read/list access and start/stop access to AWS WorkSpaces, so I created a simple IAM policy which grants all read and list actions.
But this was not enough. I was shown the error message: An Error Has Occurred. There was an error retrieving information about your WorkSpaces. Upon investigating CloudTrail, I found that the user needs read/list permissions for KMS and AWS Directory Service, so I granted those too, but when I log in again I still see the same error. I even tried attaching EC2 full access, but still the same error. Is this a potential bug?
The same issue has been discussed in the AWS forum too, but with no resolution there:
https://forums.aws.amazon.com/thread.jspa?threadID=236408
[KMS policy, Directory Service policy, and error screenshot: attachments not reproduced here]
I've found the solution for this. AWS has a bizarre limitation: if you want to access WorkSpaces via the console, you need to grant full access (workspaces:*). The documentation linked below states this. I'm highly disappointed with AWS regarding this limitation.
https://docs.aws.amazon.com/workspaces/latest/adminguide/workspaces-access-control.html
Have you tried a policy similar to the one in the documentation? It includes some services in addition to the ones you have already tried.
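For reference, the policy on that documentation page is roughly the following shape: full WorkSpaces access plus supporting read actions on KMS and Directory Service. The action list here is abbreviated and may be out of date, so check the linked page for the current version:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "workspaces:*",
        "ds:*",
        "kms:ListAliases",
        "kms:ListKeys"
      ],
      "Resource": "*"
    }
  ]
}
```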

Elastic Map Reduce and amazon s3: Error regarding access keys

I am new to Amazon EMR and Hadoop in general. I am currently trying to set up a Pig job on an EMR cluster and to import and export data from S3. I have set up a bucket in S3 with my data, named "datastackexchange". In an attempt to start copying the data into Pig, I used the following command:
ls s3://datastackexchange
And I am met with the following error message:
AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).
I presume I am missing some critical steps (presumably involving setting up the access keys). As I am very new to EMR, could someone please explain what I need to do to get rid of this error and allow me to use my S3 data in EMR?
Any help is greatly appreciated - thank you.
As you correctly observed, your EMR instances do not have the privileges to access the S3 data. There are many ways to specify AWS credentials for accessing S3 data, but the correct way is to create IAM role(s) that grant access to it.
Configure IAM Roles for Amazon EMR explains the steps involved.

Trying to load Redshift samples, Access Denied when COPYing from S3

I'm running through the Redshift tutorials on the AWS site, and I can't access their sample data buckets with the COPY command. I know I'm using the right Key and Secret Key, and have even generated new ones to try, without success.
The error from S3 is S3ServiceException:Access Denied,Status 403,Error AccessDenied. Amazon says this is related to permissions for a bucket, but they don't specify credentials to use for accessing their sample buckets, so I assume they're open to the public?
Anyone got a fix for this or am I misinterpreting the error?
I was misinterpreting the error. The buckets are publicly accessible and you just have to give your IAM user access to the S3 service.
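For anyone following the same tutorial, the working COPY ends up looking roughly like this; the table, bucket, and region come from the TICKIT sample walkthrough, and the key values are placeholders for your IAM user's credentials:

```sql
copy users
from 's3://awssampledbuswest2/tickit/allusers_pipe.txt'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
delimiter '|' region 'us-west-2';
```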