I'm trying to generate a CloudFormation stack provided by AWS here. When I click the Create a Cognito User with CloudFormation button, it directs me to the AWS CloudFormation console in us-west-2 (Oregon), and from there it's pretty much self-explanatory. The problem is that the company I'm working for only allows work in us-west-1 (N. California). I have looked over the CloudFormation template itself and I can't find any region mentioned. I have also asked this question in the AWS developer forum, but no one has responded. Does anyone here know how to generate that particular stack in a region other than us-west-2 (Oregon)? Thanks!
I found a workaround for that. I used to face the same problem: my company's policy did not allow us-west-2, so I couldn't use the CloudFormation JSON script provided by the Amazon Kinesis Data Generator.
What I did was:
1. Download the CloudFormation JSON script provided by the Amazon Kinesis Data Generator to your local machine. The download link can be found on the Amazon Kinesis Data Generator Help page.
2. Download the source code. The source code download link can also be found on the Amazon Kinesis Data Generator Help page.
3. In your AWS account, go to S3 and create an S3 bucket in a region you are allowed to use. Name it whatever you want.
4. Upload the source code downloaded in step 2 to the bucket created in step 3.
5. Edit the CloudFormation JSON script downloaded in step 1: inside the script, change the bucket name referenced by the Lambda function to the name of the bucket you created in step 3 (see the template fragment sketched after these steps).
6. Go to CloudFormation in your allowed region and create the stack by uploading your edited script.
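For step 5, the part of the template you are changing is the S3 location the Lambda function's code is pulled from. A rough sketch of what that fragment can look like is below; the logical resource name, property path, and key are illustrative and should be matched against the actual template you downloaded:

```json
"SetupLambdaFunction": {
  "Type": "AWS::Lambda::Function",
  "Properties": {
    "Code": {
      "S3Bucket": "your-bucket-from-step-3",
      "S3Key": "path/to/source-code.zip"
    }
  }
}
```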
One thing to keep in mind with this workaround: if AWSLabs changes the source code, or a newer version is released, you will have to check for it manually and upload the update to your bucket yourself.
I hope this is clear.
I have created a JMeter plugin to publish data records to a Kinesis Data Stream.
https://github.com/JoseLuisSR/awsmeter
It works very well, and you don't need to use any additional AWS services to publish events to Kinesis, unlike the Kinesis Data Generator, which can incur additional charges for the services (Cognito, CloudFormation, Lambda) needed to build and deploy KDG.
You just need an AWS IAM user with programmatic access; then download JMeter and install the awsmeter plugin.
If you have questions or comments let me know.
Thanks.
I'm trying to trigger an AWS Step Functions state machine whenever a new file is uploaded to an S3 bucket. I'm using CloudWatch Events rules to do this, but I'm getting this warning
I tried to follow the AWS documentation link "https://docs.aws.amazon.com/step-functions/latest/dg/tutorial-cloudwatch-events-s3.html#tutorial-cloudwatch-events-s3-step-1", but the state machine was not invoked.
Can anyone tell me what exactly I'm doing wrong?
EDIT
I created this trail, and the region is Ohio.
I found the issue: we need to enable data events as well in order to capture object-level API calls for S3. This was not mentioned in the AWS document above.
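For reference, object-level (data event) logging can also be turned on for an existing trail programmatically; a minimal sketch with the AWS SDK for Java is below, where the trail name and bucket ARN are placeholders. Enabling it in the CloudTrail console works just as well.

```java
import com.amazonaws.services.cloudtrail.AWSCloudTrail;
import com.amazonaws.services.cloudtrail.AWSCloudTrailClientBuilder;
import com.amazonaws.services.cloudtrail.model.DataResource;
import com.amazonaws.services.cloudtrail.model.EventSelector;
import com.amazonaws.services.cloudtrail.model.PutEventSelectorsRequest;

public class EnableS3DataEvents {
    public static void main(String[] args) {
        AWSCloudTrail cloudTrail = AWSCloudTrailClientBuilder.defaultClient();

        // Object-level S3 API calls are not captured by default; add a data
        // resource of type AWS::S3::Object for the bucket (the trailing slash
        // means "all objects in this bucket").
        DataResource s3Objects = new DataResource()
                .withType("AWS::S3::Object")
                .withValues("arn:aws:s3:::my-upload-bucket/"); // placeholder bucket

        EventSelector selector = new EventSelector()
                .withReadWriteType("All")
                .withIncludeManagementEvents(true)
                .withDataResources(s3Objects);

        cloudTrail.putEventSelectors(new PutEventSelectorsRequest()
                .withTrailName("my-trail") // placeholder trail name
                .withEventSelectors(selector));
    }
}
```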
How do I save a customer's voice message and store it in an S3 bucket using Amazon Connect? I made a contact flow, but I don't understand how to save the voice message to an S3 bucket.
We've tried many ways to build a voicemail solution, including many of the things you might have found on the web. After much iteration we realized that we had a product that would be useful to others.
For voicemail in Amazon Connect, take a look at https://amazonconnectvoicemail.com as a simple, no-code integration that can be customized to meet the needs of your customers and organization!
As soon as you enable call recording, all recordings are placed automatically in the bucket you defined at the very beginning, when you set up your Amazon Connect instance. Just check your S3 bucket to see whether you can spot the recordings.
"By default, AWS creates a new Amazon S3 bucket during the configuration process, with built-in encryption. You can also use existing S3 buckets. There are separate buckets for call recordings and exported reports, and they are configured independently." (https://docs.aws.amazon.com/connect/latest/adminguide/what-is-amazon-connect.html)
Recording to S3 only starts when an agent takes the call. Currently, there is no direct voicemail feature in Amazon Connect. You can forward the call to a service that supports it, such as Twilio.
I am new to AWS CloudTrail. I have gone through a number of AWS docs and am unable to figure out how to read CloudTrail's last 7 days of logs programmatically, without configuring a trail and without getting charged.
I want to write a Java program that reads the audit logs from AWS and processes them. I know we can create a trail and read the logs from an S3 bucket programmatically, but I don't know how to read the last 7 days of logs through the AWS SDK API, the same way we get them in the AWS console (where the last 7 days of audit logs can be read free of charge).
This could be done with the cloudtrail-processing-library, but the properties/configuration file for this library requires an SQS URL as an argument, which I don't have, or rather don't know.
Please assist me so that I can write the Java program.
Regards,
Sachin
You can use the lookupEvents API in CloudTrail to get the list of events (any create/update/delete operations) without setting up a trail.
http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/cloudtrail/AWSCloudTrail.html#lookupEvents-com.amazonaws.services.cloudtrail.model.LookupEventsRequest
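A minimal sketch of using it from the AWS SDK for Java (v1), assuming default credentials and region are configured; it pages through the last 7 days of events without needing a trail, S3 bucket, or SQS queue:

```java
import com.amazonaws.services.cloudtrail.AWSCloudTrail;
import com.amazonaws.services.cloudtrail.AWSCloudTrailClientBuilder;
import com.amazonaws.services.cloudtrail.model.Event;
import com.amazonaws.services.cloudtrail.model.LookupEventsRequest;
import com.amazonaws.services.cloudtrail.model.LookupEventsResult;

import java.util.Date;

public class RecentCloudTrailEvents {
    public static void main(String[] args) {
        AWSCloudTrail cloudTrail = AWSCloudTrailClientBuilder.defaultClient();

        // Look up events from the last 7 days.
        Date end = new Date();
        Date start = new Date(end.getTime() - 7L * 24 * 60 * 60 * 1000);

        String nextToken = null;
        do {
            LookupEventsResult result = cloudTrail.lookupEvents(new LookupEventsRequest()
                    .withStartTime(start)
                    .withEndTime(end)
                    .withMaxResults(50)
                    .withNextToken(nextToken));

            for (Event event : result.getEvents()) {
                System.out.println(event.getEventTime() + " " + event.getEventName()
                        + " by " + event.getUsername());
                // event.getCloudTrailEvent() returns the full event record as JSON
            }
            nextToken = result.getNextToken(); // null when there are no more pages
        } while (nextToken != null);
    }
}
```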
The logs are stored in an S3 bucket, and you can use AWS Athena to process and query the logs if you want, so you don't have to write a Java program. If you do, then that program will need IAM privileges to read from the S3 bucket that stores the logs.
AWS Athena
How to find your Cloud trail logs
Java code examples on S3 bucket objects
Java Cloudtrail SDK reference
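If you do end up creating a trail and reading the delivered log files instead, a rough sketch of listing and decompressing them from the S3 bucket with the SDK is below; the bucket name and prefix are placeholders for your trail's actual delivery location:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ListObjectsV2Result;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectSummary;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.zip.GZIPInputStream;

public class ReadTrailLogsFromS3 {
    public static void main(String[] args) throws Exception {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Placeholder bucket/prefix: CloudTrail delivers log files under
        // AWSLogs/<account-id>/CloudTrail/<region>/<year>/<month>/<day>/
        String bucket = "my-cloudtrail-bucket";
        String prefix = "AWSLogs/111122223333/CloudTrail/us-west-1/";

        ListObjectsV2Result listing = s3.listObjectsV2(bucket, prefix);
        for (S3ObjectSummary summary : listing.getObjectSummaries()) {
            try (S3Object object = s3.getObject(bucket, summary.getKey());
                 BufferedReader reader = new BufferedReader(new InputStreamReader(
                         new GZIPInputStream(object.getObjectContent())))) {
                // Each log file is a gzipped JSON document with a "Records" array.
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}
```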
I have an AWS S3 bucket holding a development website. I would like to FTP (over SSL) into the S3 bucket, and also be able to create username and password credentials for others. Is this possible, and how can I do this?
Thanks!
Before giving up on S3, remember that frustration with a new product or technology sometimes comes from a lack of knowledge and experience. The Amazon Cloud platform has some amazing services to work with.
FTP is an old technology that is not as popular today. The new style is to use REST interfaces, and S3 supports REST. You can also easily copy files to and from S3 using command-line tools; look into the AWS Command Line Interface (CLI), linked below.
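For example, copying a file into the bucket is a single call with the AWS SDK for Java (the CLI equivalent is "aws s3 cp"); the bucket name, key, and local path below are placeholders:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import java.io.File;

public class UploadToS3 {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Equivalent of an FTP "put": copy a local file into the bucket.
        s3.putObject("my-dev-website-bucket",          // placeholder bucket name
                     "index.html",                     // object key in the bucket
                     new File("/path/to/index.html")); // local file to upload
    }
}
```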
If your goal is to use S3 as your source repository, look into AWS CodeCommit; it is very similar to Git. There are also CodePipeline, CodeBuild, and CodeDeploy. Combine these tools with other Amazon services such as CloudFormation and you have real developer power.
AWS Command Line Interface
AWS Code Services
AWS CloudFormation
Is there a way to list the Amazon Resource Name (ARN) of an S3 Bucket from the web GUI?
I know I can piece it together myself, but that just seems unnecessary. Ideally, I could go to the bucket's page in the S3 console and copy and paste the ARN. I've looked on the Properties page of the bucket, but I'm not seeing anything that looks useful there.
No, the current S3 console does not expose bucket ARNs. You could probably add a feature to the S3 console page to yield ARNs with a simple GreaseMonkey script.
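That said, piecing it together is trivial, since bucket ARNs don't include a region or account ID; for example (the bucket name is a placeholder):

```java
// S3 bucket ARNs have a fixed, region-less format: arn:aws:s3:::<bucket-name>
String bucketName = "my-dev-bucket";             // placeholder bucket name
String bucketArn = "arn:aws:s3:::" + bucketName; // -> arn:aws:s3:::my-dev-bucket
```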