I'm a beginner with the Power BI service. I created a dataflow and then filled in the parameters needed to consume a Web API, as you can see in this picture.
(configuration screenshot)
My problem is that I'm getting data from Toggl, so it's not an on-premises data source and it's supposed to work without an on-premises gateway. But in my case, if I delete the on-premises gateway and select None, it won't work and raises this error: Invalid credentials. (Session ID: e6a5c147-0f28-4c59-b707-e69851c19, region Europe).
Could you please help me find out whether the on-premises gateway is required, given that this is not an on-premises data source? If it isn't, any reference or steps on how to get the job done without the on-premises gateway would be perfect, as I've been struggling for days to find an answer or documentation about this.
Any help would be much appreciated!
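For anyone trying to reproduce this, here is a minimal sketch of calling the Toggl API directly outside Power BI to check whether the token itself is valid. It assumes Toggl Track's v9 "me" endpoint and its Basic-auth convention (the API token as the username, the literal string "api_token" as the password); the token value is a placeholder.

```python
# Minimal sketch: call the Toggl Track API directly to verify the API token.
# Assumption: the v9 "me" endpoint and Toggl's Basic-auth convention of
# token-as-username with the literal password "api_token".
import requests

API_TOKEN = "<your-toggl-api-token>"  # placeholder

resp = requests.get(
    "https://api.track.toggl.com/api/v9/me",
    auth=(API_TOKEN, "api_token"),  # Toggl-style Basic auth
    timeout=30,
)
print(resp.status_code)  # 200 means the token works; 403 suggests bad credentials
print(resp.json() if resp.ok else resp.text)
```

If this call succeeds but the dataflow still reports Invalid credentials, the problem is likely in how the credentials are entered in the dataflow's connection settings rather than in the token itself.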
I am new to Amazon Neptune, with no previous experience with it. My team wants to start using graph databases, so they asked me to set up a Neptune DB.
Following the AWS-provided guides step by step, I managed to get the Neptune cluster created (its status shows Available), but the reader and writer endpoints have been stuck at status 'Creating' for almost two days now.
Is it normal for this to take so long?
I tried this in two separate Regions as well, thinking it might not be working correctly in the Cape Town Region (af-south-1), but I get the same result in the Ireland Region.
Any help or pointers would be greatly appreciated!
We would need to look at your cluster specifically to know what is going on, and that would mean sharing your cluster and account details with us. The best path forward would be to open a support case, or to post via re:Post, where official representatives from the team can get in touch with you to collect your cluster details.
If you are unable to get help, please do report back. We'll find alternate ways to exchange your cluster info securely so someone can take a look.
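In the meantime, one thing you can check from your side is what the API itself reports for the cluster and its endpoints, in case the console view is stale. A minimal sketch with boto3 (the cluster identifier is a placeholder):

```python
# Minimal sketch: ask the Neptune API directly for the cluster's
# endpoint status instead of relying on the console view.
import boto3

client = boto3.client("neptune", region_name="af-south-1")

# The reader/writer endpoints live on the cluster itself.
cluster = client.describe_db_clusters(
    DBClusterIdentifier="my-neptune-cluster"  # placeholder identifier
)["DBClusters"][0]
print("Status:", cluster["Status"])
print("Writer endpoint:", cluster.get("Endpoint"))
print("Reader endpoint:", cluster.get("ReaderEndpoint"))

# Custom endpoints, if any, report their own status here.
for ep in client.describe_db_cluster_endpoints(
    DBClusterIdentifier="my-neptune-cluster"
)["DBClusterEndpoints"]:
    print(ep["DBClusterEndpointIdentifier"], ep["Status"])
```

If the API returns the endpoints as available while the console still shows 'Creating', that points to a display issue rather than a provisioning problem.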
I hope everyone is doing great.
I am working on the Amazon Connect reporting APIs. The reason for this question is that I want to get agent performance and agent status reports through historical metrics using the APIs. I am trying to find an API that will give me agent status from midnight, which is only possible through historical metrics.
I don't want to use the Streams APIs. If anyone has a solution, kindly respond; it would be very helpful to me. Thanks.
The current reporting API only provides one function for retrieving historical data, GetMetricData, and that function can only return queue or channel statistics. Agent-specific data is only available via the agent event stream and console-based reports at this time. So, unfortunately, there is no way to do what you're describing with an API in Amazon Connect right now.
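To make the limitation concrete, here is a minimal sketch of a GetMetricData call with boto3; notice that the filters and groupings are queue/channel-based, with no agent option (the instance and queue IDs are placeholders):

```python
# Minimal sketch: GetMetricData returns historical queue/channel
# statistics; there is no agent-level filter or grouping.
from datetime import datetime, timedelta, timezone

import boto3

connect = boto3.client("connect")

# Time window must fall within the retention period; hour boundaries
# satisfy the requirement that times align to 5-minute intervals.
end = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
start = end - timedelta(hours=1)

resp = connect.get_metric_data(
    InstanceId="your-instance-id",          # placeholder
    StartTime=start,
    EndTime=end,
    Filters={"Queues": ["your-queue-id"]},  # queues/channels only, no agents
    Groupings=["QUEUE"],
    HistoricalMetrics=[
        {"Name": "CONTACTS_HANDLED", "Statistic": "SUM", "Unit": "COUNT"},
    ],
)
for result in resp["MetricResults"]:
    print(result)
```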
Our client would like us to use dataflows for data reuse and other reasons. We will be connecting to a Snowflake database from PBI service. However, they also want to be able to use SSO (Single Sign-On). So, when a user creates a dataset referencing a dataflow, they want the credentials from the currently logged in user to be picked up via SSO and passed along to Snowflake when the dataflow retrieves data from Snowflake. I don't think this can be done but I wanted to verify.
BTW, I know that SSO can be used with PBI Desktop. Just curious if dataflows can use it.
Yes, it seems it is possible to use dataflows with SSO for Snowflake. I am coming to this conclusion from the following reference: https://learn.microsoft.com/en-us/power-query/connectors/snowflake
which lists Dataflows under Summary - Products.
I have just been given admin access to a Google Analytics portal that tracks the corporate website's activity. The tracked data are to be moved to Amazon S3 via AppFlow.
I followed the official AWS documentation on how to set up the connection between GA and AWS. We created the connection successfully, but I came across an issue I can't find an answer to:
the Subobject field is empty. There are already about four months' worth of data, so I don't think it's a case of there being no data. This issue doesn't let me proceed with creating the flow, since it is a required field. Any thoughts?
Note: the client and the team are new to AWS, so we are setting it up as we go and learning along the way. Thank you for the help!
Found the answer! The Google Analytics account should have a Universal Analytics property available. Here are a few links:
https://docs.aws.amazon.com/appflow/latest/userguide/google-analytics.html
https://support.google.com/analytics/answer/6370521?hl=en
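As a side note, a way to check this from code is to ask AppFlow which entities it can discover for the connection; an empty result here would be consistent with the account having no Universal Analytics property. A minimal sketch with boto3 (the connector profile name is a placeholder):

```python
# Minimal sketch: list the entities AppFlow can discover for a
# Google Analytics connection profile.
import boto3

appflow = boto3.client("appflow")

resp = appflow.list_connector_entities(
    connectorProfileName="my-ga-connection",  # placeholder profile name
    connectorType="Googleanalytics",
)
# An empty map suggests AppFlow sees no usable properties/objects.
for group, entities in resp["connectorEntityMap"].items():
    for entity in entities:
        print(group, "->", entity["name"])
```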
Recently, I started to learn AWS and am also trying to work with it, but I have one question about fetching data from AWS into our app.
I followed all the steps mentioned in AWS's documentation but still got an 'access denied' error, so if anyone knows about this, please help me.
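For anyone debugging the same thing, here is a minimal sketch that surfaces the exact error code and the calling identity, assuming the data lives in S3 (the bucket and key names are placeholders):

```python
# Minimal sketch: print the exact error code and the identity the app
# is calling AWS as, assuming the data lives in S3 (placeholder names).
import boto3
from botocore.exceptions import ClientError

# Confirm which IAM identity is actually making the calls.
print(boto3.client("sts").get_caller_identity()["Arn"])

s3 = boto3.client("s3")
try:
    obj = s3.get_object(Bucket="my-app-bucket", Key="data/example.json")
    print(obj["Body"].read()[:200])
except ClientError as err:
    # "AccessDenied" means the identity above lacks s3:GetObject on this
    # bucket/key; check both the IAM policy and the bucket policy.
    print(err.response["Error"]["Code"], err.response["Error"]["Message"])
```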