How do you create custom dashboards in AWS Pinpoint?

AWS Pinpoint Analytics appears to have replaced Amazon Mobile Analytics. In Mobile Analytics, you were able to create custom dashboards.
I'm struggling to find the feature in AWS Pinpoint. I'm assuming it's in there somewhere, but alas, I haven't found it yet.

@D.Patrick, you can create custom dashboards with Pinpoint data, but not directly within the Pinpoint console. You would first need to export your Pinpoint event data to persistent storage (e.g., S3 or Redshift) using Amazon Kinesis. Once the data is in S3, you can use analytics tools to further analyze or visualize it. AWS tools for this include Amazon QuickSight and Amazon Athena; non-AWS tools include Splunk.
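If you want to script that first step, here is a minimal boto3 sketch of pointing a Pinpoint application's event stream at Kinesis. The application ID, stream ARN, and role ARN are placeholders, and it assumes the IAM role already allows Pinpoint to write to the stream.

import boto3

# Sketch: stream Pinpoint events to Kinesis for later delivery to S3.
# The application ID and both ARNs are placeholders, substitute your own.
pinpoint = boto3.client("pinpoint")

pinpoint.put_event_stream(
    ApplicationId="your-pinpoint-app-id",
    WriteEventStream={
        # Kinesis data stream (or Firehose delivery stream) receiving the events
        "DestinationStreamArn": "arn:aws:kinesis:us-east-1:123456789012:stream/pinpoint-events",
        # IAM role that grants Pinpoint permission to write to the stream
        "RoleArn": "arn:aws:iam::123456789012:role/pinpoint-kinesis-role",
    },
)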

Check out the blog by AWS on this topic:
https://aws.amazon.com/blogs/messaging-and-targeting/creating-custom-pinpoint-dashboards-using-amazon-quicksight-part-1/
The three parts of this series describe in detail how to use Python 3 with AWS Lambda to create the custom dashboards.

Related

AWS Athena Data Sources API

I have checked the AWS documentation for Athena Data Sources https://docs.aws.amazon.com/athena/latest/ug/data-sources-managing.html and I have also checked the AWS CLI for all available commands, but I was not able to find any API that would let me set up an AWS Athena Data Source programmatically.
I was able to automate deployment of the SAR app for the connectors, but I have been unable to automate configuration of the Athena Data Source at all.
Is there any public API for doing that?
After more searching: it's named 'data-catalog', so the appropriate action is to call create-data-catalog with the proper attributes.
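For example, a minimal boto3 sketch of registering a Lambda-backed data source; the catalog name and function ARN are hypothetical placeholders for a connector deployed via SAR, and it assumes a single composite Lambda handles both metadata and record requests.

import boto3

# Sketch: register a federated data source (data catalog) with Athena.
# The name and Lambda ARN are placeholders for your SAR-deployed connector.
athena = boto3.client("athena")

athena.create_data_catalog(
    Name="my_connector_catalog",
    Type="LAMBDA",
    Description="Athena federated connector deployed via SAR",
    Parameters={
        # A single Lambda that handles both metadata and record requests
        "function": "arn:aws:lambda:us-east-1:123456789012:function:my-connector",
    },
)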

Does GCP have an equivalent of AWS's custom Glue connector for Snowflake access?

We've got some data in Snowflake that we'd like to pull into our GCP environment, on a periodic basis. One of our engineers has done the equivalent setup on AWS on a previous project, using the documentation here. I find this setup to be a lot simpler than setting up a push data flow, which requires creating an integration and service account from the Snowflake side, then granting the service account some IAM permissions in GCP.
Can anyone tell me if GCP offers a similar pull-based connector API/setup for Snowflake?

Azure to AWS migration (Synapse question)

We want to decommission the Azure lake and maintain just AWS moving forward. The issue is that several of my customers use Synapse, which sources from the Azure lake. We are hoping to give them something close to the Synapse tool, but with the data coming from the S3 bucket. Any ideas?
These clients do not have AWS accounts, so I'm wondering: if we enable something like Redshift, would we have to give them access to our account?

Is there a way to export and import Amazon Connect contact flows?

I checked: there is a way to import and export the contact flow JSON in Amazon Connect via the UI.
It works as expected. Is there an API (AWS SDK) available for importing? We want to automate this process! Could someone tell me how to achieve this?
Any help is appreciated.
Thanks,
Harry
Disclaimer: I am not with the Amazon Connect product team.
Looking through the AWS CLI and boto3 docs, there is currently no API (i.e., as of June 3rd, 2020) to support importing and exporting Amazon Connect contact flows programmatically,
most likely because this feature is still in beta status, as per the Amazon Connect Administrator Guide.
As of September 17, 2020, AWS now has an API for programmatically creating and managing contact flows: https://aws.amazon.com/blogs/contact-center/programmatically-manage-contact-flows-with-new-amazon-connect-apis/
There isn’t much documentation yet. https://github.com/aws-samples/ac-contactflowapi exists, but most of the API functionality is buried inside JavaScript.
https://docs.aws.amazon.com/connect/latest/APIReference/API_CreateContactFlow.html probably describes it best. The Content attribute is the flow definition as JSON, serialized to a string and embedded again in the request's JSON attributes. I’m not yet certain where to find documentation for the flow JSON itself, but you can build it interactively using Amazon Connect as shown in the question above.
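To make that concrete, here is a minimal boto3 sketch. The instance ID is a placeholder, and the flow body (a single disconnect action) is illustrative rather than a documented reference; export a flow you built in the console to see the real structure.

import json
import boto3

# Sketch: create a contact flow programmatically.
# The instance ID is a placeholder; the flow below is an illustrative
# minimal flow with a single disconnect action.
connect = boto3.client("connect")

flow = {
    "Version": "2019-10-30",
    "StartAction": "disconnect",
    "Actions": [
        {
            "Identifier": "disconnect",
            "Type": "DisconnectParticipant",
            "Parameters": {},
            "Transitions": {},
        }
    ],
}

connect.create_contact_flow(
    InstanceId="your-connect-instance-id",
    Name="minimal-flow",
    Type="CONTACT_FLOW",
    # The flow definition travels as a JSON document serialized to a string
    Content=json.dumps(flow),
)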
You can also use this library I made: https://github.com/sethkor/connect-backup
It uses the newly released Amazon Connect API; you can find details here: https://docs.aws.amazon.com/connect/latest/APIReference/Welcome.html
It allows you to back up and restore contact flows plus some other Amazon Connect components.
To back up everything it can, including contact flows, type:
connect-backup --profile your-aws-profile backup --instance your-connect-instance-id --file path-to-write-backup
There are also options to write to S3.
It also comes with a Lambda function and an AWS SAM template to set up periodic backups.
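If you would rather script a one-off export yourself, here is a minimal boto3 sketch along the same lines; the instance ID is a placeholder.

import json
import boto3

# Sketch: back up every contact flow on an instance to local JSON files.
# The instance ID is a placeholder.
connect = boto3.client("connect")

paginator = connect.get_paginator("list_contact_flows")
for page in paginator.paginate(InstanceId="your-connect-instance-id"):
    for summary in page["ContactFlowSummaryList"]:
        flow = connect.describe_contact_flow(
            InstanceId="your-connect-instance-id",
            ContactFlowId=summary["Id"],
        )["ContactFlow"]
        # One file per flow, named after the flow itself
        with open(f"{summary['Name']}.json", "w") as f:
            json.dump(flow, f, indent=2)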

Accessing AWS S3 from within GCP

We do most of our cloud processing on AWS. However, we now also have some credits on GCP, would like to use them, and want to explore interoperability between the two cloud providers.
In particular, I was wondering if it is possible to use AWS S3 from within GCP. I am not talking about migrating the data, but whether there is some API which will allow AWS S3 to work seamlessly from within GCP. We have a lot of data and databases hosted on AWS S3 and would prefer to keep everything there, as AWS still does the bulk of our compute.
I guess one way would be to transfer the AWS keys to the GCP VM and then use the boto3 library to download content from AWS S3, but I was wondering if GCP, by itself, provides some other tools for this.
From an AWS perspective, an application running on GCP should appear logically as an on-premises computing environment. This means that you should be able to leverage the services of AWS that can be invoked from an on-premises solution. The GCP environment will have Internet connectivity to AWS which should be a pretty decent link.
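In practice that means the boto3 approach you describe works as-is. Here is a minimal sketch, with placeholder bucket and object names, that would run unchanged on a GCP VM:

import boto3

# Sketch: access S3 from a GCP VM exactly as from any machine outside AWS.
# boto3 reads credentials from environment variables or ~/.aws/credentials;
# the bucket and key names are placeholders.
s3 = boto3.client("s3", region_name="us-east-1")

# Download a single object
s3.download_file("your-bucket", "path/to/object.csv", "/tmp/object.csv")

# Or list the objects under a prefix
response = s3.list_objects_v2(Bucket="your-bucket", Prefix="data/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])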
In addition, there is a migration service (Storage Transfer Service) which will move S3 storage to Google Cloud Storage (GCS) ... but this is distinct from what you were asking.
See also:
Getting started with Amazon S3
Storage Transfer Service Overview