I am trying to download multiple files from S3 using the AWS CLI in Python. I installed the AWS CLI with pip and was able to successfully pass credentials. But when I try to download multiple files, I get the following error:
fatal error: invalid literal for int() with base 10: 'us-east-1.amazonaws.com'
The command I use to download the files looks like this:
aws s3 cp "s3://buckets/testtelligence/saurav_shekhar/test/" "C:/HD/Profile data/Downloads" --recursive
Also, my C:\Users\USERNAME\.aws\config is
[default]
region = Default region name [None]:us-east-1
output = Default output format [None]: table
I am not sure what that error means and how to resolve this.
The contents of your .aws/config file should look like:
[default]
region = us-east-1
output = table
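The original error is most likely the CLI choking on that malformed region value: the stray colon makes the computed endpoint look like host:port, so 'us-east-1.amazonaws.com' gets parsed as a port number, hence the int() complaint. The cleanest fix is to rerun the interactive setup and type only the value at each prompt, for example:
aws configure
AWS Access Key ID [None]: <your access key id>
AWS Secret Access Key [None]: <your secret key>
Default region name [None]: us-east-1
Default output format [None]: table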
I am using AWS CLI version 2 on CentOS with Nginx and PHP 7.1. The following command works fine when I run it directly on the command line:
aws s3 cp files/abc.pdf s3://bucketname/
but when I run the same command from index.php using the following code
echo exec("aws s3 cp files/abc.pdf s3://bucketname/ 2>&1");
then it gives this error:
upload failed: Unable to locate credentials
@Jass Add your credentials in ~/.aws/credentials or ~/.aws/config and put them under [default], or use a named profile in case you have multiple accounts.
Also note that if you export your keys as environment variables, they only apply to that terminal session. So either execute the PHP from the same terminal where you exported the keys, or add them to ~/.aws/credentials.
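For reference, exporting the standard variables looks like this (values elided); they are visible only to that shell session and anything launched from it:
export AWS_ACCESS_KEY_ID=<your access key id>
export AWS_SECRET_ACCESS_KEY=<your secret key>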
I tried this and it worked for me, and I believe it should work for you as well. In your PHP code (index.php), try exporting the credentials file location as below:
echo exec("export AWS_SHARED_CREDENTIALS_FILE=/<path_to_aws_folder>/.credentials; aws s3 cp files/abc.pdf s3://bucketname/ 2>&1");
When you run from your command line, the AWS CLI picks up the credentials from your home directory, i.e. ~/.aws/credentials (the default location). When index.php is executed, it looks for that file in its own home directory, which is evidently not the same as yours, so it cannot find the credentials. With the above change you are explicitly pointing it at your AWS credentials.
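One way to confirm the mismatch is to run the CLI as the web server's user; the user name here (nginx) is an assumption, yours may be www-data, apache, or the php-fpm user:
sudo -u nginx -H aws sts get-caller-identity
If that fails with the same "Unable to locate credentials" error, the web server's user simply has no ~/.aws/credentials of its own.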
I want to upload a file to Amazon S3 from the CLI, but it's not working.
When I upload manually (through the console), it works.
I'm using the command below:
aws s3 cp /localfolderlocation awss3foldername --recursive --include "filename"
When I try to list buckets I get the same error:
aws s3 ls
The issue was that, when running the aws configure CLI command, the OP entered the name of the region as seen in the console.
In the AWS CLI the region identifier should be the code, not the full display name.
The full list of region codes is available here.
This applies to any programmatic interaction with AWS, including the SDKs.
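For example, in ~/.aws/config:
region = US East (N. Virginia)   <- wrong: console display name
region = us-east-1               <- correct: region code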
Check whether you have added the region to your settings.py file, e.g.
AWS_S3_REGION_NAME = 'ca-central-1'
and make sure it is in lowercase.
For my Spring application, after the release of v1.3.1 I had to replace a call to the AWS SDK's Regions.US_EAST_1.getString() with getName(). It no longer liked getting US_EAST_1 as part of the request, though it had worked before.
@Bean("amazonS3")
@ConditionalOnProperty(name = "localstack", havingValue = "true")
@Profile("!test")
public static AmazonS3 amazonS3LocalStackClient(
        @Value("${s3.endpoint}") String localEndpoint) {
    return AmazonS3ClientBuilder.standard()
            .withCredentials(new DefaultAWSCredentialsProviderChain())
            .withEndpointConfiguration(
                    // getName() yields the lowercase region code "us-east-1",
                    // which is what EndpointConfiguration expects
                    new EndpointConfiguration(localEndpoint, Regions.US_EAST_1.getName()))
            .withPathStyleAccessEnabled(Boolean.TRUE)
            .build();
}
In my case I tried all the options:
Deleted the environment variables
Deleted the .aws folder
Nothing worked. Then I just updated the AWS CLI and rebooted the machine, and that did it. Try the below:
For Windows: msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi
For Linux and macOS, see this link.
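After reinstalling, confirm the new version is the one on your PATH:
aws --version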
I have AWS_CONFIG_FILE set to C:\Users\myname\.aws
When I run the command aws configure and pass in the correct details, I get the error below:
[Errno 13] Permission denied: 'C:\Users\myname\.aws'
And a credentials file has appeared in folder C:\Users\myname\.aws
But, when I run the command aws configure again, the AWS Access Key ID and AWS Secret Access Key are already set, but the Default Region Name and Default Output Format are [None].
Question 1: Why hasn't the config file been created?
Then, when I run the command aws s3 ls, I get this error:
An error occurred (SignatureDoesNotMatch) when calling the ListBuckets operation: The request signature we calculated does not match the signature you provided. Check your key and signing method.
Question 2: Why am I getting this error and how do I fix it?
It worked for me after deleting the credentials inside .aws and trying again, as per the recommendation here.
I tried the other solutions to no avail. The only way to get it to work was to manually create the two required files: config, with content like this:
[default]
region=ap-southeast-2
output=json
and credentials like this:
[default]
aws_access_key_id=YOURKEY
aws_secret_access_key=YOURSECRETKEY
On Windows the files go into C:\Users\Username\.aws, and neither file has an extension. Afterwards you can confirm the setup with
aws configure list
on the command line. Hope this helps someone!
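If the files are being picked up, the output should look roughly like this (values masked; the exact layout varies by CLI version):
      Name                    Value             Type    Location
      ----                    -----             ----    --------
   profile                <not set>             None    None
access_key     ****************ABCD shared-credentials-file
secret_key     ****************WXYZ shared-credentials-file
    region           ap-southeast-2      config-file    ~/.aws/config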
I am trying to install the Amazon Web Services Python SDK, but I cannot find the ~/.aws/credentials file on my machine.
This is the page I am using for reference: https://aws.amazon.com/developers/getting-started/python/
It says the location on Windows should be like: C:\Users\USER_NAME\.aws\credentials
I've done pip installs for boto, boto3, and awscli. Is there something else I need to install to get a credentials file?
According to the link you posted...
Create your credentials file at ~/.aws/credentials (C:\Users\USER_NAME\.aws\credentials for Windows users) and save the following lines after replacing the underlined values with your own.
With the contents
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
There's nothing you need to install, just create the file, and put that in. (obviously swapping out YOUR_ACCESS_KEY_ID for your actual access key, and YOUR_SECRET_ACCESS_KEY for your actual secret key ;) )
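If you want to double-check that the SDK resolves the file, here is a minimal boto3 sketch (it only inspects the resolved credentials and makes no API call):
import boto3

# boto3 looks in environment variables first, then ~/.aws/credentials, then ~/.aws/config
creds = boto3.Session().get_credentials()
print("access key in use:", creds.access_key if creds else "none found")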
I have a gzipped file on a local machine and want to load it to Redshift.
My command looks like this:
\COPY tablename FROM 's3://redshift.manifests/copy_from_yb01_urlinfo.txt' REGION 'us-east-1' CREDENTIALS 'aws_access_key_id=...;aws_secret_access_key=...' SSH GZIP;
But I get a message "s3:/redshift.manifests/copy_from_yb01_urlinfo.txt: No such file or directory".
But this file is even public: https://s3.amazonaws.com/redshift.manifests/copy_from_yb01_urlinfo.txt.
Moreover, the user whose credentials I use has full access to S3 and Redshift: http://c2n.me/iEnI5l.png
Even weirder, I can access that file just fine with the same credentials from the AWS CLI:
> aws s3 ls redshift.manifests
2014-08-01 19:32:13 137 copy_from_yb01_urlinfo.txt
How can I diagnose this further?
Just in case: I connect to my Redshift cluster via psql (the PostgreSQL CLI):
PAGER=more LANG=C psql -h ....us-east-1.redshift.amazonaws.com -p 5439 -U ... -d ...
Edit:
Uploaded the file to S3; same error on COPY...
I uploaded it again and ran COPY with the same credentials:
\COPY url_info FROM 's3://redshift-datafiles/url_info_1.copy.gz' CREDENTIALS 'aws_access_key_id=...;aws_secret_access_key=...' GZIP;
I am starting to despair...
Since you are trying to copy to Redshift using a manifest file, you need to add the MANIFEST keyword at the end, like:
\COPY tablename FROM 's3://redshift.manifests/copy_from_yb01_urlinfo.txt' REGION 'us-east-1' CREDENTIALS 'aws_access_key_id=...;aws_secret_access_key=...' SSH GZIP MANIFEST;
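For reference, the manifest itself is a JSON file listing the objects to load; a minimal sketch using the data file named later in the question:
{
  "entries": [
    {"url": "s3://redshift-datafiles/url_info_1.copy.gz", "mandatory": true}
  ]
}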
Oh.
The fix was to remove the backslash at the beginning of the command. \copy is a psql meta-command that reads the source as a local file on the client, so the s3:// URL was being treated as a local path; only the server-side COPY understands S3 URLs.
I can't remember why I started writing it that way... I had already begun using it when I exported data from my local PostgreSQL installation.
This is so stupid :) One small rubber duck could have saved me a day or two.
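That is, the working command is the same statement without the leading backslash, so it runs as a server-side COPY (credentials elided as in the question):
COPY url_info FROM 's3://redshift-datafiles/url_info_1.copy.gz' CREDENTIALS 'aws_access_key_id=...;aws_secret_access_key=...' GZIP;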