I am trying to set up email sending with the AWS SDK for PHP, so far unsuccessfully. I have done the following:
1. Verified my domain and recipient email addresses.
2. Generated my AWS access key.
3. Installed PHP version 7.0.25.
4. Installed the AWS SDK for PHP version 3.
5. Created a shared credentials file.
The documentation instructs me to save this file at "~/.aws/credentials" (I am using a Mac). I did not have a ".aws" folder, so I created one (a hidden .aws directory) in my home directory and saved my "credentials" file there (containing my access key credentials). I am getting the following error:
PHP Fatal error: Uncaught Aws\Exception\CredentialsException: Error retrieving credentials from the instance profile metadata server. (Client error: `GET http://169.254.169.254/latest/meta-data/iam/security-credentials/` resulted in a `404 Not Found` response:
<?xml version="1.0" encoding="iso-8859-1"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"htt (truncated...)
) in /home/ec2-user/vendor/aws/aws-sdk-php/src/Credentials/InstanceProfileProvider.php:79
Stack trace:
#0 /home/ec2-user/vendor/guzzlehttp/promises/src/Promise.php(203): Aws\Credentials\InstanceProfileProvider->Aws\Credentials\{closure}(Object(GuzzleHttp\Exception\ClientException))
#1 /home/ec2-user/vendor/guzzlehttp/promises/src/Promise.php(156): GuzzleHttp\Promise\Promise::callHandler(2, Array, Array)
#2 /home/ec2-user/vendor/guzzlehttp/promises/src/TaskQueue.php(47): GuzzleHttp\Promise\Promise::GuzzleHttp\Promise\{closure}()
#3 /home/ec2-user/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php(96): GuzzleHttp\Promise\TaskQueue- in /home/ec2-user/vendor/aws/aws-sdk-php/src/Credentials/InstanceProfileProvider.php on line 79
The PHP script executed in the terminal (as described by AWS: http://docs.aws.amazon.com/ses/latest/DeveloperGuide/send-using-sdk-php.html):
<?php
// Replace path_to_sdk_inclusion with the path to the SDK as described in
// http://docs.aws.amazon.com/aws-sdk-php/v3/guide/getting-started/basic-usage.html
define('REQUIRED_FILE', 'path_to_sdk_inclusion');

// Replace sender@example.com with your "From" address.
// This address must be verified with Amazon SES.
define('SENDER', 'sender@example.com');

// Replace recipient@example.com with a "To" address. If your account
// is still in the sandbox, this address must be verified.
define('RECIPIENT', 'recipient@example.com');

// Specify a configuration set. If you do not want to use a configuration
// set, comment out the following variable and the
// 'ConfigurationSetName' => CONFIGSET argument below.
define('CONFIGSET', 'ConfigSet');

// Replace us-west-2 with the AWS Region you're using for Amazon SES.
define('REGION', 'us-west-2');

define('SUBJECT', 'Amazon SES test (AWS SDK for PHP)');
define('HTMLBODY', '<h1>AWS Amazon Simple Email Service Test Email</h1>'.
    '<p>This email was sent with <a href="https://aws.amazon.com/ses/">'.
    'Amazon SES</a> using the <a href="https://aws.amazon.com/sdk-for-php/">'.
    'AWS SDK for PHP</a>.</p>');
define('TEXTBODY', 'This email was sent with Amazon SES using the AWS SDK for PHP.');
define('CHARSET', 'UTF-8');

require REQUIRED_FILE;

use Aws\Ses\SesClient;
use Aws\Ses\Exception\SesException;

$client = SesClient::factory(array(
    'version' => 'latest',
    'region'  => REGION
));

try {
    $result = $client->sendEmail([
        'Destination' => [
            'ToAddresses' => [
                RECIPIENT,
            ],
        ],
        'Message' => [
            'Body' => [
                'Html' => [
                    'Charset' => CHARSET,
                    'Data' => HTMLBODY,
                ],
                'Text' => [
                    'Charset' => CHARSET,
                    'Data' => TEXTBODY,
                ],
            ],
            'Subject' => [
                'Charset' => CHARSET,
                'Data' => SUBJECT,
            ],
        ],
        'Source' => SENDER,
        // If you are not using a configuration set, comment out or delete
        // the following line
        'ConfigurationSetName' => CONFIGSET,
    ]);
    $messageId = $result->get('MessageId');
    echo("Email sent! Message ID: $messageId"."\n");
} catch (SesException $error) {
    echo("The email was not sent. Error message: ".$error->getAwsErrorMessage()."\n");
}
?>
First, you do not manually create ~/.aws/credentials. Let the AWS CLI do this for you.
Install the CLI: pip install awscli --upgrade
Delete the credentials file that you created.
Configure the CLI and credentials: aws configure
Your credentials are now set up correctly.
The error message you received is caused by the PHP SDK trying to get credentials from the EC2 instance metadata. This failed because you have not attached an IAM role to the instance, and it also means that the credentials are not being read correctly from your profile (~/.aws/credentials).
Note: The better method is to NOT store credentials on your EC2 instances. Instead, create an IAM role and assign that role to the EC2 instance. All AWS SDKs know how to extract credentials from the instance metadata.
IAM Roles for Amazon EC2
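A minimal sketch of what the client construction can look like once the credentials are in place (this is not from the AWS tutorial above): with a [default] profile in ~/.aws/credentials, or an IAM role attached to the instance, the SDK resolves credentials on its own and nothing is hard-coded.
<?php
// Assumes a Composer install; adjust the autoloader path to your setup.
require 'vendor/autoload.php';

use Aws\Ses\SesClient;

// No 'key'/'secret' here: the SDK falls back to the shared credentials
// file or to the instance profile automatically.
$client = new SesClient([
    'version' => 'latest',
    'region'  => 'us-west-2',
]);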
I am using AWS SNS to send SMS to mobile phones.
I can see there is an option to upload the SMS delivery status to S3, but for that I would have to run a batch job daily, so the report would always be delayed.
The second option I can see is CloudWatch.
Is it possible to publish an SQS event from CloudWatch with the log details, so that I can write an SQS subscriber to fetch the message delivery status? Or, if there is any other option available which I have not explored so far, please let me know.
I am using Java and the aws-java-sdk to fetch the details.
Please note I have gone through the documentation but was unable to find anything useful.
Any suggestion is welcome.
To get the SMS delivery status, you have to do the following:
1. Enable 'Delivery status logging' to AWS CloudWatch. More Info
2. Set permission for the user (used in the script) to access the AWS CloudWatch logs.
3. Read the log entries from CloudWatch with the AWS SDK.
See the PHP code below, which I am using in my project; you can refer to it to create the Java code.
require 'inc/awsSdkForPhp/aws-autoloader.php';

$params = array(
    'credentials' => array(
        'key'    => '<YOUR KEY>',
        'secret' => '<YOUR SECRET>',
    ),
    'region'  => 'us-east-1', // the AWS region of your SNS logs
    'version' => 'latest'
);

$cwClient = new \Aws\CloudWatchLogs\CloudWatchLogsClient($params);

$queryRes = $cwClient->startQuery([
    'endTime'      => 1621231180, // END UNIX TIMESTAMP
    'logGroupName' => 'sns/us-east-1/***/DirectPublishToPhoneNumber', // YOUR LOG GROUP NAME
    'queryString'  => 'fields @timestamp, status, @message
        | filter notification.messageId="5a419afc-c4b3-55b3-85f9-c3e7676b2dd2"', // YOUR MESSAGE ID
    'startTime'    => 1620954551 // START UNIX TIMESTAMP
]);
$qryID = $queryRes->get('queryId');

sleep(3); // Wait for the query execution to complete.

$resultObj = $cwClient->getQueryResults(array(
    'queryId' => $qryID, // REQUIRED
));
//echo "<pre>";print_r($resultObj);echo "</pre>";

$result = $resultObj->get('results');
$jsnRs = json_decode($result[0][2]['value']); // To get the delivery array
echo "<br>status: ".$jsnRs->status;
echo "<br>phone carrier: ".$jsnRs->delivery->phoneCarrier;
echo "<br>provider response: ".$jsnRs->delivery->providerResponse;
echo "<pre>";print_r($jsnRs);echo "</pre>";
I believe this will help someone.
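Not part of the original answer, but one refinement worth sketching: instead of relying on the fixed sleep(3), the status returned by getQueryResults can be polled until the query reaches a terminal state.
// Poll instead of the fixed sleep(3) above; status values follow the
// CloudWatch Logs GetQueryResults API (Scheduled, Running, Complete,
// Failed, Cancelled, Timeout).
do {
    sleep(1);
    $resultObj = $cwClient->getQueryResults(array('queryId' => $qryID));
    $status = $resultObj->get('status');
} while (in_array($status, array('Scheduled', 'Running'), true));
$result = $resultObj->get('results');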
You can also use a Lambda function to read the logs from CloudWatch by watching the CloudWatch events.
I am searching the internet for how to get the AWS S3 bucket region with an API call, or directly in PHP using their library, but have had no luck finding the info.
I have the following info available:
Account credentials, bucket name, and access key + secret. This is for multiple buckets that I have access to, and I need to get the region programmatically, so logging in to the AWS console and checking is not an option.
Assuming you have an instance of the AWS PHP Client in $client, you should be able to find the location with $client->getBucketLocation().
Here is some example code:
<?php
$result = $client->getBucketLocation([
'Bucket' => 'yourBucket',
]);
The result will look like this
[
'LocationConstraint' => 'the-region-of-your-bucket',
]
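One follow-up note, not from the original answer: buckets that live in us-east-1 come back with an empty LocationConstraint, so a fallback is usually needed when reading the value.
// Hypothetical follow-up to the getBucketLocation() call above:
// us-east-1 buckets return an empty LocationConstraint, so default to it.
$region = $result['LocationConstraint'] ?: 'us-east-1';
echo "Bucket region: $region\n";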
When you create an S3 client, you can use any of the available regions in AWS, even if it's not one that you use.
$s3Client = new Aws\S3\S3MultiRegionClient([
'version' => 'latest',
'region' => 'us-east-1',
'credentials' => [
'key' => $accessKey,
'secret' => $secretKey,
],
]);
$region = $s3Client->determineBucketRegion($bucketname);
I have hit a roadblock using the loopback-component-storage with Amazon S3.
As a test, I am trying to upload a file to S3 from my browser app, which is calling my loopback API on the backend.
My server config for datasources.json looks like:
"s3storage": {
"name": "s3storage",
"connector": "loopback-component-storage",
"provider": "amazon",
"key": “blahblah”,
"keyId": “blahblah”
},
My API endpoint is:
'/api/Storage'
The error response I am getting from the API is as follows:
error: {name: "MissingRequiredParameter", status: 500, message: "Missing required key 'Bucket' in params", …}
  code: "MissingRequiredParameter"
  message: "Missing required key 'Bucket' in params"
  name: "MissingRequiredParameter"
  stack: "MissingRequiredParameter: Missing required key 'Bucket' in params …"
  status: 500
  time: "2015-03-18T01:54:48.267Z"
How do I pass the {"params": {"Bucket": "bucket-name"}} parameter to my loopback REST API?
Please advise. Thanks much!
AFAIK Buckets are known as Containers in the loopback-component-storage or pkgcloud world.
You can specify a container in your URL params. If your target is /api/Storage then you'll specify your container in that path with something like /api/Storage/container1/upload as the format is PATH/:DATASOURCE/:CONTAINER/:ACTION.
Take a look at the tests here for more examples:
https://github.com/strongloop/loopback-component-storage/blob/4e4a8f44be01e4bc1c30019303997e61491141d4/test/upload-download.test.js#L157
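To make the path format concrete, here is a hypothetical upload call; the host, the /api/Storage mount point, and the container name container1 are assumptions carried over from the example above, not something from the original answer.
<?php
// Hypothetical client-side upload against loopback-component-storage:
// the container (the S3 bucket) goes in the URL path, not in the body.
$ch = curl_init('http://localhost:3000/api/Storage/container1/upload');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'file' => new CURLFile('/path/to/local/file.txt'),
));
echo curl_exec($ch);
curl_close($ch);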
Bummer. "container" basically translates to "bucket" for S3. I was trying to pass the params object via POST but the devil was in the details i.e. the HTTP POST path for upload was looking for the bucket/container in the path itself. /api/Storage/abc/upload meant 'abc' was the bucket.
I created an EC2 Ubuntu instance.
The following is working using the AWS 2.6 SDK for PHP:
$client = DynamoDbClient::factory(array(
'key' => 'xxx',
'secret' => 'xxx',
'region' => 'eu-west-1'
));
I created a credentials file in ~/.aws/credentials.
I put this in /home/ubuntu/.aws/credentials
[default]
aws_access_key_id=xxx
aws_secret_access_key=xxx
Trying the following does not work and gives an InstanceProfileCredentialsException :
$client = DynamoDbClient::factory(array(
'profile' => 'default',
'region' => 'eu-west-1'
));
There is a user www-data and a user ubuntu.
In what folder should I put the credentials file?
One solution to set the credentials is:
sudo nano /etc/apache2/envvars
add environment variables:
export AWS_ACCESS_KEY_ID="xxx"
export AWS_SECRET_ACCESS_KEY="xxx"
sudo service apache2 restart
After that the following works:
$client = DynamoDbClient::factory(array(
'region' => 'eu-west-1'
));
If you are calling the API from an EC2 instance, you should use IAM roles.
Using IAM roles is the preferred technique for providing credentials
to applications running on Amazon EC2. IAM roles remove the need to
worry about credential management from your application. They allow an
instance to "assume" a role by retrieving temporary credentials from
the EC2 instance's metadata server. These temporary credentials, often
referred to as instance profile credentials, allow access to the
actions and resources that the role's policy allows.
This is way too late, but the solution I found for shared servers where you can't actually use environment vars is to define a custom ini file location, like this:
require (__DIR__.'/AWSSDK/aws-autoloader.php');
use Aws\Credentials\CredentialProvider;
use Aws\S3\S3Client;
$profile = 'default';
$path = '/path/to/credentials';
$provider = CredentialProvider::ini($profile, $path);
$provider = CredentialProvider::memoize($provider);
$client = new \Aws\S3\S3Client([
'version' => 'latest',
'region' => 'us-west-2',
'credentials' => $provider
]);
Note that you could even define different profiles with this method.
Documentation HERE
I have a non-EC2 server that accesses SQS and needs credentials. I can't use envvars because there are various people with differing rights who run on this server, and envvars are global. For the same reason, I don't think I can use an AWS credentials file stored under a user's home directory (although I also couldn't figure out how to make that work for the user www-data).
What I have done is set up a small file AWS_Creds.php
<?php
define ("AWS_KEY", "MY KEY HERE");
define ("AWS_SECRET", "MY SECRET");
?>
The file is stored outside of the webroot and pulled in with include('ABSOLUTEPATH/AWS_Creds.php'); the constants are then passed straight into the client factory (a sketch of that wiring follows below).
Elegant? No. Done? Yes.
EDIT
I forgot to mention: I gitignore AWS_Creds.php so that it doesn't go into our repo.
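A minimal sketch of that wiring, with the client, paths, and region as assumptions rather than anything from the original answer (it mirrors the SDK v2 factory style used elsewhere in this thread):
<?php
// Hypothetical wiring for the AWS_Creds.php approach described above.
require '/ABSOLUTEPATH/AWS_Creds.php';   // defines AWS_KEY and AWS_SECRET
require 'vendor/autoload.php';           // SDK autoloader (path assumed)

use Aws\Sqs\SqsClient;

$client = SqsClient::factory(array(
    'key'    => AWS_KEY,
    'secret' => AWS_SECRET,
    'region' => 'us-east-1',             // assumed region
));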
Basically, you can use it like this:
$client = DynamoDbClient::factory(array(
'key' => 'aws_key',
'secret' => 'aws_secret',
'region' => 'us-east-1'
));
But in the documentation:
Starting with the AWS SDK for PHP version 2.6.2, you can use an AWS credentials file to specify your credentials. This is a special, INI-formatted file stored under your HOME directory, and is a good way to manage credentials for your development environment. The file should be placed at ~/.aws/credentials, where ~ represents your HOME directory.
And the usage:
$dynamoDbClient = DynamoDbClient::factory(array(
'profile' => 'project1',
'region' => 'us-west-2',
));
more info : http://docs.aws.amazon.com/aws-sdk-php/guide/latest/credentials.html
After looking at the source code of Credentials.php in aws/aws-sdk-php/src: PHP cannot directly access the /root folder by default. You can write $_SERVER['HOME'] = [your new home path] in your PHP script and put the credentials file in [your new home path]/.aws/credentials.
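A short sketch of that workaround; the home path is a placeholder and the client mirrors the SDK v2 style used in this thread, so treat it as an assumption rather than a verified recipe.
<?php
// Point HOME at a directory PHP can read; the credentials file is then
// expected at /var/www/.aws/credentials (the path is an assumption).
$_SERVER['HOME'] = '/var/www';

require 'vendor/autoload.php';

use Aws\DynamoDb\DynamoDbClient;

$client = DynamoDbClient::factory(array(
    'profile' => 'default',
    'region'  => 'eu-west-1'
));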
require('vendor/autoload.php');
use Aws\Ec2\Ec2Client;
$credentials = new Aws\Credentials\Credentials('Your Access Key', 'Your Secret Key'); // place both keys here
$ec2Client = new Aws\Ec2\Ec2Client([
'version' => 'latest',
'region' => 'ap-south-1',
'credentials' => $credentials
]);
$result = $ec2Client->describeKeyPairs();
echo '<pre>';
print_r($result);
Reference site : https://docs.aws.amazon.com/aws-sdk-php/v2/guide/credentials.html#passing-credentials-into-a-client-factory-method
I've got a user with all permissions.
{
"Statement": [
{
"Effect": "Allow",
"Action": "*",
"Resource": "*"
}
]
}
I'm using aws-sdk-php-2 to put and copy objects in a bucket.
http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.S3Client.html
The put code works perfectly:
$client->putObject(array(
'Bucket' => 'kiosk',
'Key' => 'test/orders/test.csv',
'SourceFile' => $sourcePath,
));
After checking that the object was created on S3 via https://console.aws.amazon.com/s3, I execute the next script.
$result = $client->copyObject(array(
'Bucket' => 'kiosk',
'CopySource' => 'test/orders/test.csv',
'Key' => 'test/test.csv',
));
And I'm getting a fatal error:
Fatal error: Uncaught Aws\S3\Exception\S3Exception: AWS Error Code: AllAccessDisabled, Status Code: 403, AWS Request ID: XXX, AWS Error Type: client, AWS Error Message: All access to this object has been disabled, User-Agent: aws-sdk-php2/2.2.1 Guzzle/3.3.1 curl/7.19.7 PHP/5.4.13 thrown in phar:///usr/share/pear/AWSSDKforPHP/aws.phar/src/Aws/Common/Exception/NamespaceExceptionFactory.php on line 89
After uploading the file manually via console.aws.amazon.com/s3, I see a different error when trying to copy:
Fatal error: Uncaught Aws\S3\Exception\AccessDeniedException: AWS Error Code: AccessDenied, Status Code: 403, AWS Request ID: XXX, AWS Error Type: client, AWS Error Message: Access Denied, User-Agent: aws-sdk-php2/2.2.1 Guzzle/3.3.1 curl/7.19.7 PHP/5.4.13 thrown in phar:///usr/share/pear/AWSSDKforPHP/aws.phar/src/Aws/Common/Exception/NamespaceExceptionFactory.php on line 89
I also tried to set permissions on the file and folder via console.aws.amazon.com/s3:
Grantee: Everyone, Open/Download and View Permission and Edit Permission
But I still get the same error.
I know this is an old question, but I ran into the same issue recently while doing work on a legacy project.
$this->client->copyObject([
'Bucket' => $this->bucket,
'CopySource' => $file,
'Key' => str_replace($source, $destination, $file),
]);
All of my other S3 calls worked, but copyObject continued to throw an Access Denied error. After some digging, I finally figured out why.
I was passing just the key and making the assumption that the bucket being passed was what both the source and destination would use. Turns out that is an incorrect assumption. The source must have the bucket name prefixed.
Here was my solution:
$this->client->copyObject([
'Bucket' => $this->bucket,
// Added the bucket name to the copy source
'CopySource' => $this->bucket.'/'.$file,
'Key' => str_replace($source, $destination, $file),
]);
It says "Access Denied" because it thinks the first part of your key/folder is actually the name of the bucket which either doesn't exist or you really don't have access to.
Found out what the issue is here; being an AWS newbie, I struggled for a bit until I realized that each policy for the users you set needs to explicitly allow the service you're using.
In this case I hadn't allowed the user into S3.
Go to IAM, then go to Users, and click on the particular user whose credentials you're using. From there, go to the Permissions tab, then click on Attach User Policy and find the S3 policy under the policy templates. This should fix your problem.
Hope that helps!
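For illustration only (not from the original answer), the broad S3 policy such a template attaches looks roughly like the statement below, in the same shape as the policy quoted in the question; scope the Action and Resource down for production use.
{
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
        }
    ]
}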
The popular answer was on point, but I still had issues; I had to include the ACL option.
$this->client->copyObject([
'Bucket' => $this->bucket,
// Added the bucket name to the copy source
'CopySource' => $this->bucket.'/'.$file,
'Key' => str_replace($source, $destination, $file),
'ACL' => 'public-read'
]);
The ACL can be one of these values: 'private', 'public-read', 'public-read-write', 'authenticated-read', 'aws-exec-read', 'bucket-owner-read', or 'bucket-owner-full-control'.