Why does my AWS AppFlow setup not show any subobject for selection? - amazon-web-services

I have just been given admin access to a Google Analytics portal that tracks our corporate website's activity. The tracked data are to be moved to Amazon S3 via AppFlow.
I followed the official AWS documentation on how to set up the connection between Google Analytics and AWS. The connection was created successfully, but I ran into an issue I can't find an answer to:
The Subobject field is empty. There are already ~4 months' worth of data, so I don't think it's an empty-data problem. This blocks me from creating the flow, since Subobject is a required field. Any thoughts?
Note: the client and the team are new to AWS, so we are setting it up as we go, learning along the way. Thank you for the help!

Found the answer! The Google Analytics account must have a Universal Analytics property available. Here are a few links:
https://docs.aws.amazon.com/appflow/latest/userguide/google-analytics.html
https://support.google.com/analytics/answer/6370521?hl=en
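If you want to sanity-check which kind of property you have before digging into the connector, here is a minimal sketch. The ID formats follow Google's documented conventions (Universal Analytics tracking IDs look like "UA-12345-1", GA4 measurement IDs look like "G-XXXXXXXXXX"); the helper function itself is hypothetical, not part of any AWS or Google API:

```python
import re

# Hypothetical helper: classify a Google Analytics property ID.
# AppFlow's Google Analytics connector works against Universal Analytics
# ("UA-..." tracking IDs); a GA4-only account ("G-..." measurement IDs)
# will not surface any subobjects.
UA_PATTERN = re.compile(r"^UA-\d+-\d+$")
GA4_PATTERN = re.compile(r"^G-[A-Z0-9]+$")

def property_kind(property_id: str) -> str:
    """Return 'universal-analytics', 'ga4', or 'unknown'."""
    if UA_PATTERN.match(property_id):
        return "universal-analytics"
    if GA4_PATTERN.match(property_id):
        return "ga4"
    return "unknown"

print(property_kind("UA-12345-1"))   # universal-analytics
print(property_kind("G-ABC123XYZ"))  # ga4
```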

Related

how to connect dataflow with webAPI using Dataflow PowerBI Service

Actually, I'm a beginner with the Power BI service. I created a dataflow and filled in the parameters needed to consume a web API, as you can see in this picture:
(Screenshot: configuration)
My problem is that I'm getting data from Toggl, so it's not an on-premises data source and should work without an on-premises gateway. But in my case, if I delete the on-premises gateway and select "none", it won't work and raises this error: Invalid credentials. (Session ID: e6a5c147-0f28-4c59-b707-e69851c19, region Europe).
Could you please help me figure out whether the on-premises gateway is required, given that this is not an on-premises data source? If it is not, any reference or steps on how to get the job done without the gateway would be perfect, as I've been struggling for days to find an answer or documentation about this.
Any help would be much appreciated!
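Since the error is "Invalid credentials", it is worth checking how Toggl expects authentication before blaming the gateway: to my knowledge, the Toggl API uses HTTP Basic auth with your API token as the username and the literal string api_token as the password (confirm this against Toggl's current API docs). A minimal sketch of building that header; the token value is a placeholder:

```python
import base64

def toggl_auth_header(api_token: str) -> str:
    """Build the HTTP Basic auth header Toggl expects:
    username = your API token, password = the literal string 'api_token'."""
    raw = f"{api_token}:api_token".encode("ascii")
    return "Basic " + base64.b64encode(raw).decode("ascii")

# Placeholder token for illustration only.
header = toggl_auth_header("my-secret-token")
print(header.startswith("Basic "))  # True
```

If Power BI's web-API credential prompt is sending a plain username/password pair instead of this scheme, that alone can produce "Invalid credentials" regardless of the gateway setting.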

Connecting on prem MQ to Google Cloud Platform

This is more of a conceptual question, as there is no relevant documentation available. We have an on-prem IBM MQ from which we need to transfer data to our cloud storage bucket (GCP/AWS). What are the possible solutions in this case? Any help or direction would be appreciated. Thank you!
I'm assuming you can reach your goal once the MQ data has been converted to a format supported by BigQuery.
You can refer to this Google documentation for a full guide on loading data from local files. You can upload a file via the GCP Console or using a client library in whichever programming language matches your on-prem stack. There is also a variety of upload options to choose from, depending on the data file. Make sure you also have the right permissions to use BigQuery.
If you require authentication, check this BigQuery Authentication Guide.
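As a concrete example of the conversion step, BigQuery load jobs accept newline-delimited JSON (one JSON object per line). A small sketch that turns a batch of messages into an NDJSON file ready for loading; the message structure is hypothetical, and in practice you would read the messages from your MQ client library rather than a hard-coded list:

```python
import json

def messages_to_ndjson(messages, path):
    """Write one JSON object per line (NDJSON), a format
    BigQuery load jobs accept for JSON data."""
    with open(path, "w", encoding="utf-8") as f:
        for msg in messages:
            f.write(json.dumps(msg) + "\n")

# Hypothetical messages, as if already parsed from the queue.
batch = [
    {"id": 1, "payload": "order-created"},
    {"id": 2, "payload": "order-shipped"},
]
messages_to_ndjson(batch, "mq_batch.json")
```

The resulting file can then be uploaded to Cloud Storage or loaded directly, as described in the loading guide linked above.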

BigQuery API Listed Twice in APIs & Services Dashboard

Does anyone happen to know why the BigQuery API would be listed twice in the APIs & Services dashboard in Google Cloud Platform?
BigQuery seems to be functioning properly; I just thought it was strange that this is the only API listed twice. I don't think it could be enabled twice, as both links lead to the same overview page and all the metrics are the same.
(Screenshot: duplicate BigQuery API listed in the dashboard)
This behavior is apparently caused by the fact that bigquery-json.googleapis.com is an alias for bigquery.googleapis.com.
The BigQuery engineering team is aware of this issue and is working on resolving it. All further updates should occur on this public report.
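To illustrate why the dashboard shows two entries for one API, the two service names resolve to the same thing. A small sketch that collapses a service list using that alias; the alias mapping comes from the answer above, while the dedupe helper itself is hypothetical:

```python
# Known alias from the answer above; the dedupe helper is hypothetical.
ALIASES = {"bigquery-json.googleapis.com": "bigquery.googleapis.com"}

def dedupe_services(services):
    """Collapse aliased service names to their canonical form, keeping order."""
    seen, result = set(), []
    for name in services:
        canonical = ALIASES.get(name, name)
        if canonical not in seen:
            seen.add(canonical)
            result.append(canonical)
    return result

print(dedupe_services([
    "bigquery.googleapis.com",
    "bigquery-json.googleapis.com",
    "storage.googleapis.com",
]))  # ['bigquery.googleapis.com', 'storage.googleapis.com']
```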

building an amazon store with drupal 7

I've been playing around with an idea for an Amazon store with Drupal 7. I do a lot of product reviews, and I typically link to Amazon pages already (without referrer IDs, since I wanted to avoid any questions of integrity altogether), but I'm toying with the idea of having a separate storefront link.
I'm using Drupal 7, and I installed the Amazon API and Amazon Store modules. They use an Amazon AWS account and an Amazon Associates ID, and basically create a light storefront that does all the heavy lifting through Amazon itself. It only uses Amazon items, which is fine (what isn't on Amazon?), and only gives you a referral payout.
Well, what I'd love is stronger control over the items in the store. The Amazon Store module only gives you the option to control the basic items that are visible on load.
What I'd like to do: Create a store where categories match the contents of my site, and disable the search options. Is this possible with these modules? Does anyone have advice on creating something like this?
Please see the module below; I hope it will be handy:
https://drupal.org/project/amazon_store

Amazon Mechanical Turk task retrieval API

I'm writing an app whose premise is that people who can't directly fund a charity out of their own pocket could instead work on Amazon Mechanical Turk HITs to bring clean water to the third world. I knew there was an Amazon API, but it's very unclear how to retrieve a HIT to be worked on, rather than just consume data about tasks I'm trying to get completed.
Is there any way to retrieve a HIT to be worked on, through the AWS API or otherwise?
Thanks ahead of time.
As far as I know, the only way to work on a HIT is through the mTurk website, i.e., not via an API.
There is a site that is trying to do something very similar to what you have described. http://www.sparked.com/