I am trying to use the AWS Amplify Analytics library to stream analytics data. I ran into issues with Pinpoint not sending custom attributes and decided to switch to AWS Kinesis. Now I am having issues configuring a custom Kinesis backend for the Amplify Analytics library. The docs only cover configuring a custom Pinpoint backend. How do I configure a custom Kinesis backend?
I need to do a POC to check whether it is feasible to use AWS Glue to READ data from AWS and WRITE it into Salesforce via a JDBC custom connector.
I checked the docs, such as the AWS custom Glue connectors blog post:
https://aws.amazon.com/blogs/big-data/developing-testing-and-deploying-custom-connectors-for-your-data-stores-with-aws-glue/
and the available marketplace options, such as the CData AWS Glue Salesforce Connector.
It seems that it is only possible to READ from Salesforce and WRITE into AWS data stores. My use case is the inverse: updating Salesforce objects using AWS Glue.
Does anybody have experience with that or knows if that is feasible with custom connectors?
Many thanks in advance.
For the AWS Marketplace integration, we have checked the sample code for the serverless integration for SaaS products. The samples provided are based on AWS Lambda functions for registering new subscribers, subscribing to SQS, etc.
Can we implement all of this functionality with our own database, functions, etc., instead of using AWS Lambda and DynamoDB?
Also, the examples include Lambda functions such as stream-handler, entitlement-SQS, subscription-SQS-handler, grant-revoke-access-to-product, etc. How can we trigger these if we use our own APIs?
Yes, the AWS Marketplace SaaS integration documentation states:
When a customer subscribes to your product, they are redirected to your registration URL which is an HTTP POST request with a temporary x-amzn-marketplace-token token.
All you need is an endpoint that can receive that POST in order to integrate. If you want to trigger any additional AWS API calls, your endpoint can make use of an AWS SDK. There are also requirements regarding what your POST endpoint must do in order to be approved by the AWS Marketplace team. I suggest reviewing the documentation above or the AWS Marketplace SaaS Listing Process & Integration Guide.
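As a rough sketch of such an endpoint in Python with boto3 (the function name and the injectable client are my own illustration, not from the AWS samples), the token exchange goes through the Marketplace Metering Service's ResolveCustomer call:

```python
def handle_registration(form_fields, metering_client=None):
    """Exchange the temporary x-amzn-marketplace-token for customer details.

    form_fields is the parsed body of the HTTP POST that AWS Marketplace
    sends to your registration URL. metering_client can be any object with
    a resolve_customer method (a boto3 "meteringmarketplace" client in
    production, a stub in tests).
    """
    token = form_fields.get("x-amzn-marketplace-token")
    if token is None:
        raise ValueError("missing x-amzn-marketplace-token field")
    if metering_client is None:
        import boto3  # deferred so the function is testable without AWS
        metering_client = boto3.client("meteringmarketplace")
    # ResolveCustomer returns CustomerIdentifier and ProductCode, which you
    # can persist in your own database -- no Lambda or DynamoDB required.
    return metering_client.resolve_customer(RegistrationToken=token)
```

The point is that nothing here is Lambda-specific: any web framework that can receive the POST and call the SDK will do, and the downstream pieces (entitlements, granting or revoking access) can likewise be driven from your own APIs.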
I am planning to export logging from MuleSoft CloudHub to AWS CloudWatch.
I saw there is one AWS CloudWatch Connector in GitHub:
https://github.com/mulesoft-labs/mule-amazon-cloudwatch-connector/tree/master/mule-cloudwatch-connector
Are there any examples of how to implement this?
And which AWS CloudWatch features are supported by this connector?
I found examples for other log export methods, but not for AWS CloudWatch:
https://help.mulesoft.com/s/question/0D52T00004mXUALSA4/export-log-to-external-system
Thanks.
You could try to create a Mule application that reads logs from CloudHub using CloudHub's API and pushes them to CloudWatch. The README for the connector doesn't seem to mention an operation to put logs into CloudWatch, but if CloudWatch has a REST API for it (every AWS service does), you could use it in your application.
This KB article shows how to get the logs from CloudHub: https://help.mulesoft.com/s/article/How-to-get-whole-Application-logs-from-Cloudhub-through-API
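For the CloudWatch half of that idea, here is a minimal sketch in Python with boto3 rather than a Mule component (the log group and stream names are placeholders, and the CloudHub-reading half is left out):

```python
import time

def push_logs(messages, logs_client=None,
              group="cloudhub-logs", stream="my-app"):
    """Push a batch of log lines to CloudWatch Logs via PutLogEvents.

    messages is a list of strings (e.g. lines fetched from the CloudHub
    logs API). logs_client can be any object with a put_log_events method
    (a boto3 "logs" client in production, a stub in tests). The log group
    and stream are assumed to exist already.
    """
    events = [{"timestamp": int(time.time() * 1000), "message": m}
              for m in messages]
    if logs_client is None:
        import boto3  # deferred so the function is testable without AWS
        logs_client = boto3.client("logs")
    return logs_client.put_log_events(
        logGroupName=group, logStreamName=stream, logEvents=events)
```

A scheduler (a Mule poll scope, or cron) calling the CloudHub logs API and handing each batch to something like this would cover the export end to end.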
We are in the process of migrating from ActiveMQ to Amazon MQ on AWS. ActiveMQ and Amazon MQ internally use KahaDB as their data store.
Earlier, we were able to see the KahaDB log files while running ActiveMQ in our data center. Is there a similar way to see the KahaDB log files on AWS while using Amazon MQ?
We tried enabling CloudWatch logs, but they only contain the general and audit logs of Amazon MQ.
I checked with the AWS technical team; they don't allow access to the KahaDB logs.
Is there an Amazon SNS mock?
That is, some non-hosted version that I can use for testing or for offline cases, preferably with the same API?
Like ElasticMQ is for SQS?
Thank you
LocalStack - A fully functional local AWS cloud stack
LocalStack provides an easy-to-use test/mocking framework for developing cloud applications.
Currently, the focus is primarily on supporting the AWS cloud stack.
LocalStack spins up the following core cloud APIs on your local machine:
API Gateway at http://localhost:4567
Kinesis at http://localhost:4568
DynamoDB at http://localhost:4569
DynamoDB Streams at http://localhost:4570
Elasticsearch at http://localhost:4571
S3 at http://localhost:4572
Firehose at http://localhost:4573
Lambda at http://localhost:4574
SNS at http://localhost:4575
SQS at http://localhost:4576
Redshift at http://localhost:4577
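As a minimal sketch, assuming a Python client with boto3 and the default LocalStack ports from the list above (LocalStack accepts dummy credentials, so the values below are placeholders):

```python
# Default LocalStack ports for the two services relevant here.
LOCALSTACK_ENDPOINTS = {
    "sns": "http://localhost:4575",
    "sqs": "http://localhost:4576",
}

def localstack_client(service):
    """Build a boto3 client that talks to LocalStack instead of real AWS."""
    import boto3  # deferred so the endpoint mapping is usable without boto3
    return boto3.client(
        service,
        endpoint_url=LOCALSTACK_ENDPOINTS[service],
        region_name="us-east-1",
        aws_access_key_id="test",      # dummy; LocalStack doesn't validate
        aws_secret_access_key="test",  # dummy; LocalStack doesn't validate
    )

# Example usage (requires LocalStack to be running):
# sns = localstack_client("sns")
# topic = sns.create_topic(Name="my-topic")
# sns.publish(TopicArn=topic["TopicArn"], Message="hello")
```

Because only the `endpoint_url` differs from a real client, the same code path can run against production SNS by dropping that parameter.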
If you really need a standalone SNS service, you can try https://github.com/s12v/sns
Another option would be mocking. SNS is an HTTP service, so you can mock its responses in your application.