I have checked the AWS documentation for Athena data sources (https://docs.aws.amazon.com/athena/latest/ug/data-sources-managing.html) and gone through all available AWS CLI commands, but I was not able to find any API that would let me set up an Athena data source programmatically.
I was able to automate the deployment of the SAR app for the connectors, but I have not been able to automate the configuration of the Athena data source at all.
Is there any public API for doing that?
After more searching: the resource is called a 'data catalog', so the appropriate action is to call create-data-catalog with the proper attributes.
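For reference, the same operation is exposed in boto3 as create_data_catalog, so the registration can be scripted. A minimal sketch, assuming the connector Lambda from the SAR app is already deployed (the catalog name and function ARN are placeholders):

    import boto3

    athena = boto3.client("athena")

    # Register a SAR-deployed connector Lambda as an Athena data catalog.
    athena.create_data_catalog(
        Name="my_dynamodb_catalog",  # placeholder catalog name
        Type="LAMBDA",
        Description="Connector registered programmatically",
        Parameters={
            # A single Lambda function that serves both metadata and records.
            "function": "arn:aws:lambda:us-east-1:123456789012:function:my-connector"
        },
    )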
Is there an API/way to programmatically query AWS documentation for a specific service? For instance, I want to know the encryption algorithm used by a service for protecting data at rest. Can I write a script that will automatically query AWS documentation for that service and give me this information?
There is no API for AWS Documentation.
However, the AWS CLI is open source, and it includes data files that detail every API call and its parameters.
Those files would not, however, contain the encryption algorithms. That information is internal to Amazon S3 and is not shared publicly.
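That said, you can introspect those API definitions yourself. A minimal sketch using botocore (the library the CLI is built on) to list a service's operations and the parameters of one call:

    import botocore.session

    # botocore ships the same JSON data files that describe every API
    # operation and its parameters.
    session = botocore.session.get_session()
    model = session.get_service_model("s3")

    print(model.operation_names)  # every S3 API operation

    put_object = model.operation_model("PutObject")
    print(list(put_object.input_shape.members))  # PutObject's parameters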
In my AWS project, I created web services using API Gateway, Lambda, DynamoDB, and S3 that are called by an Android app.
Now I want to log specific actions in my web services (in my Lambda functions) so that I can download those logs from the Android app.
Here is what I was thinking of:
append my logs to a text file (or several text files) in S3, but then I have to download the file, append the logs, and upload it again each time I need to add a log entry (which doesn't sound very efficient)
store my logs in a DynamoDB table, but that doesn't feel like a clean solution and might get pricey
use CloudWatch Logs to log everything I want, but then I need to extract only the logs I need, which seems quite complex, and I'm not sure it's the best solution either
So what is the most suitable solution for logging actions in Lambda functions so that I can then download the logs from an app?
Thanks.
I think you can use an Amazon Kinesis data stream if you want to analyze your logs on the fly, or Kinesis Data Firehose if you just want to aggregate your logs and store them in one place.
Kinesis Data Firehose can receive logs from multiple sources, aggregate them, and save them in S3. Once the logs are in S3, you can use Amazon Athena to run queries against the log files. To let an Android device download the logs, you can build an API that talks to Athena.
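On the Lambda side, writing a log entry is a single put_record call. A minimal sketch, assuming a Firehose delivery stream that delivers to S3 (the stream name is a placeholder):

    import json
    import boto3

    firehose = boto3.client("firehose")

    def log_action(device_id, action):
        # Send one newline-delimited JSON record to the delivery stream;
        # Firehose buffers the records and writes them to S3 in batches.
        record = {"device_id": device_id, "action": action}
        firehose.put_record(
            DeliveryStreamName="app-logs",  # placeholder stream name
            Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
        )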
And if you want to personalize the view for each Android device, just make sure to include a unique device ID in each log entry and filter on that ID in Athena.
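A rough sketch of running such a filtered query through boto3 (the table, database, and output location are placeholders):

    import boto3

    athena = boto3.client("athena")

    # Start a query that returns only one device's log entries.
    response = athena.start_query_execution(
        QueryString="SELECT * FROM app_logs WHERE device_id = 'abc-123'",
        QueryExecutionContext={"Database": "logs_db"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )

    # Poll get_query_execution until the state is SUCCEEDED, then page
    # through the rows with get_query_results.
    print(response["QueryExecutionId"])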
I am working on a multi-cloud (AWS and GCP) pet project that uses a serverless architecture.
Files generated by the business logic within GCP (using Cloud Functions and Pub/Sub) are stored in Google Cloud Storage, and I want to ingest these files into an AWS S3 bucket dynamically as they arrive.
One possible way is to use the gsutil tool (see 'Exporting data from Google Cloud Storage to Amazon S3'), but that would require a compute instance and running the gsutil commands manually, which I want to avoid.
In answering this I'm reminded a bit of a Rube Goldberg-type setup, but I don't think this one is too bad.
From the Google side, you would create a Cloud Function that is notified when a new file is created, using the Object Finalize event. This function would collect the information about the file and then call an AWS Lambda function fronted by AWS API Gateway.
The GCP function would pass the bucket and file information to the AWS Lambda. On the AWS side, the Lambda would use your GCP credentials and the GCP client API to download the file and upload it to S3.
Something like:
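A minimal sketch of the GCP side: a background Cloud Function triggered by the Object Finalize event that forwards the bucket and object name to a (placeholder) API Gateway endpoint:

    import requests  # add to the Cloud Function's requirements.txt

    # Placeholder URL for the API Gateway endpoint fronting the Lambda.
    API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/ingest"

    def on_object_finalize(event, context):
        # Triggered by google.storage.object.finalize; `event` carries
        # the bucket and object name of the newly created file.
        payload = {"bucket": event["bucket"], "name": event["name"]}
        resp = requests.post(API_URL, json=payload, timeout=30)
        resp.raise_for_status()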
All serverless on both GCP and AWS. Testing isn't bad, as you can keep the two sides separate: make sure that GCP is sending what you want, and make sure that AWS is parsing it and doing the correct thing. There is likely some authentication that needs to happen from the GCP Cloud Function to API Gateway. Additionally, API Gateway can be eliminated if you're OK pulling the AWS client libraries into the GCP function; since you already have to pull the GCP libraries into the AWS Lambda, this shouldn't be much of a problem.
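And a matching sketch of the AWS side: a Lambda behind API Gateway that uses bundled GCP credentials to pull the object and drop it into S3 (the bucket name, credentials setup, and payload shape are assumptions):

    import boto3
    from google.cloud import storage  # package with the Lambda deployment

    s3 = boto3.client("s3")
    # Assumes GCP service-account credentials are available to the Lambda,
    # e.g. GOOGLE_APPLICATION_CREDENTIALS pointing at a bundled key file.
    gcs = storage.Client()

    DEST_BUCKET = "my-aws-destination-bucket"  # placeholder

    def handler(event, context):
        # Assumes API Gateway passes the JSON body straight through:
        # {"bucket": "...", "name": "..."}
        blob = gcs.bucket(event["bucket"]).blob(event["name"])
        s3.put_object(Bucket=DEST_BUCKET, Key=event["name"],
                      Body=blob.download_as_bytes())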
AWS Pinpoint Analytics appears to have replaced Amazon Mobile Analytics. In Mobile Analytics, you were able to create custom dashboards.
I'm struggling to find the feature in AWS Pinpoint. I'm assuming it's in there somewhere, but alas, I haven't found it yet.
@D.Patrick, you can create custom dashboards with Pinpoint data, but not directly within the Pinpoint console; that is, you would first need to export your Pinpoint event data to persistent storage (e.g., S3 or Redshift) using Amazon Kinesis. Once the data is in S3, you can use analytics tools to further analyze or visualize it. AWS offers such tools in Amazon QuickSight and Amazon Athena; non-AWS options include Splunk.
Check out the blog post by AWS on this topic:
https://aws.amazon.com/blogs/messaging-and-targeting/creating-custom-pinpoint-dashboards-using-amazon-quicksight-part-1/
The three parts of this series describe in detail how to use Python 3 with AWS Lambda to create the custom dashboards.
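If you also want to script the export step those posts describe, attaching the Kinesis event stream is a single boto3 call. A sketch, with the application ID and ARNs as placeholders:

    import boto3

    pinpoint = boto3.client("pinpoint")

    # Stream Pinpoint events to Kinesis so they can land in S3 for
    # Athena / QuickSight. All IDs and ARNs below are placeholders.
    pinpoint.put_event_stream(
        ApplicationId="your-pinpoint-app-id",
        WriteEventStream={
            "DestinationStreamArn": "arn:aws:kinesis:us-east-1:123456789012:stream/pinpoint-events",
            "RoleArn": "arn:aws:iam::123456789012:role/pinpoint-to-kinesis",
        },
    )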
You can change the data source of an AppSync service in the AWS console, but I am not sure whether that change survives after I run the command
amplify push api
And I haven't found a way of changing the data source with aws-amplify.
There isn't a direct way, no, as AppSync doesn't have a native Postgres data source. You could theoretically do anything with a Lambda data source, though, very much including Postgres.
An AppSync dev created a sample app that shows a way to do this, via what Amplify calls custom resolvers. You can find that repo here: https://github.com/mikeparisstuff/amplify-cli-nested-api-sample
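For a sense of what such a Lambda data source might look like, here is a minimal sketch in Python, assuming psycopg2 is packaged with the function and that the resolver's request mapping forwards the field name and arguments (the table, env var, and payload shape are all hypothetical):

    import os
    import psycopg2  # bundle in the deployment package or a Lambda layer

    def handler(event, context):
        # Assumes the AppSync request mapping template forwards
        # {"field": "...", "arguments": {...}} to the Lambda.
        conn = psycopg2.connect(os.environ["POSTGRES_DSN"])  # hypothetical
        try:
            with conn.cursor() as cur:
                if event["field"] == "getPost":
                    cur.execute(
                        "SELECT id, title FROM posts WHERE id = %s",
                        (event["arguments"]["id"],),
                    )
                    row = cur.fetchone()
                    return {"id": row[0], "title": row[1]} if row else None
        finally:
            conn.close()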