I am attempting to use the new Automatic Aggregations feature in the PBI service. However, I do not understand what the following is trying to tell me.
What exactly is this telling me and what does the "target seconds" field accomplish?
I am trying to create a metric to measure the amount of time that an item has been in an ElastiCache cache. There does not seem to be any built-in metric for this in CloudWatch, and I have struggled to write a Logs Insights query to obtain this information.
I have tried running a query in Logs Insights to create this metric, but it requires matching on an ID, and the query language used in AWS does not seem to support these kinds of conditional queries. So I am unsure of how to solve this problem.
I am trying to use pre-aggregations over Cloud SQL on Google Cloud Platform, but the database is denying access with the error "Statement violates GTID consistency".
Any help is appreciated.
Cube.js builds pre-aggregations with CREATE TABLE ... SELECT, but you are using MySQL on top of Google Cloud SQL with --enforce-gtid-consistency, which has limitations.
Since only transactionally safe statements can be logged, there is a limitation on CREATE TABLE ... SELECT (and some other SQL), because that statement is actually logged as two separate events.
There are two ways to solve this issue:
1. Use pre-aggregations with an external database (the recommended way):
https://cube.dev/docs/pre-aggregations/#read-only-data-source-pre-aggregations
2. Use the undocumented flag loadPreAggregationWithoutMetaLock.
Attention: this flag is experimental and can be removed or changed in the future.
Take a look at the source code
You can pass it directly to the driver constructor. This will produce two separate SQL statements to get around the limitation:
CREATE TABLE
INSERT INTO
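For illustration, here is a minimal sketch of how this could look in the cube.js configuration file, assuming the @cubejs-backend/mysql-driver package and that the driver constructor accepts the flag as a plain option (connection values are placeholders):

// cube.js configuration file
const MySqlDriver = require('@cubejs-backend/mysql-driver');

module.exports = {
  driverFactory: () =>
    new MySqlDriver({
      host: process.env.CUBEJS_DB_HOST,
      database: process.env.CUBEJS_DB_NAME,
      user: process.env.CUBEJS_DB_USER,
      password: process.env.CUBEJS_DB_PASS,
      // Experimental, undocumented flag (see above): build pre-aggregations as
      // CREATE TABLE followed by INSERT INTO instead of CREATE TABLE ... SELECT
      loadPreAggregationWithoutMetaLock: true,
    }),
};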
Thanks
I have a client that has a large number of customers, and I have reports that can accept parameters and pass them to a REST-based Web API to pull, for example, customer-specific records. This, of course, is easy using Power BI.
The challenge is, there could literally be 500,000 records out there, so filters and passing filter values are not really an option. What I need to do is pass a value via Power BI Embedded to the report that will update the parameter of the Web API call dynamically.
Such as https://services.server.com/api/customers/{customerId}
I have read and experimented with about every technique possible, and yet I still can't seem to pull this simple (and common) scenario off. To confirm, this would work fine if I allowed a user to filter these values manually, but the goal is to have the Web.Contents call be dynamic (i.e. driven by a parameter) and then have that parameter (like CustomerId) fed to the report externally, such as through a Power BI Embedded parameter.
Again, this can't be a filter. I just want to do what you used to be able to do with SSRS or Crystal Reports: send something like {parameter} = (or eq) '{some value}' and have the report use that in the JSON feed it uses as its data source.
Any thoughts on this frustrating situation?
You can do this with RLS:
https://learn.microsoft.com/en-us/power-bi/developer/embedded-row-level-security
Bring all 500,000 records into your PBIX.
Define a role which will filter based on a username.
When embedding, pass the role and the desired username to the embed token.
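As an illustration, a minimal sketch of generating such an embed token through the REST API (the workspace/report/dataset IDs, the role name CustomerRole, and the username value are placeholders; aadToken is an AAD access token for the Power BI API):

// Generate an embed token whose RLS identity carries the CustomerId as username
const response = await fetch(
  `https://api.powerbi.com/v1.0/myorg/groups/${groupId}/reports/${reportId}/GenerateToken`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${aadToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      accessLevel: "View",
      identities: [
        {
          username: "12345",       // the CustomerId you want the role to filter on
          roles: ["CustomerRole"], // the RLS role defined in the PBIX
          datasets: [datasetId],
        },
      ],
    }),
  }
);
const { token } = await response.json(); // pass this token to the embed config

Inside the PBIX, the role's DAX filter would then compare the table's customer key to USERNAME().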
I am trying to query a Salesforce Datasource using the Salesforce API in Power BI.
The issue I am having is that the Salesforce API requires the SOQL for the specific query being performed to contain a WHERE clause, either matching an explicit value with "=" or using an IN operator.
If I use something like the following
each ([Column] = "value")
It works. However, if I try to pass a list like the following:
each List.Contains(myList, [Column])
It fails with an error, because the filtering clearly happens AFTER all the data has been returned (i.e. it is filtered in Power BI after retrieval, not at the source), while the Web API is expecting an IN operator with a list of values, e.g. WHERE (MyField IN ('value1', 'value2')).
With regard to the first query: it works because the SOQL generated in the background is WHERE (MyField = 'someValue')
Does anyone have any ways around this? Can Power BI pass a collection into a where clause some way?
If not, I thought about querying the records one value at a time and looping through the list, but that would have performance issues and is not the best way of doing such a query!
I have also tried joining the tables on the field I need, but it looks like the joins are likewise performed after the data is retrieved rather than before (in SOQL).
I cannot modify the Web API to support other queries.
Thank you!
I am using Power BI API.
I've got a dataset with some tables and rows.
From the Power BI API Console I don't have any issue when retrieving datasets or tables.
However, the PUT verb on a table resource to update its schema always returns a 504 - Proxy request timed out.
It's the first time I've used Apiary.io, so it might be Apiary's problem rather than the Power BI update, but that leads me to some questions:
Is there any workaround to test Power BI with, for example, Fiddler? I can type the URL and body, but I will need an Authorization header with the OAuth2 token if I'm not mistaken. How can I get that? Apiary.io seems to hide it.
As per Update Schema Documentation the URL with the resource is https://api.powerbi.com/v1.0/myorg/datasets/{myDatasetId}/tables/{myTableName}
and the verb is a PUT. What is then the meaning of the "name": "???" parameter that goes in the JSON body? Is it the table's name or something else? I am assuming it's the table name but it seems redundant as I am already accessing the resource {myTableName} as per the given URL.
And my last related question: how do I rename a specific table's column without modifying its data? This is what I'm trying to achieve by updating the schema, but I don't understand how Power BI knows which column I am trying to rename.
Thank you!
Sorry that you're having trouble. You can get a token in two ways. The right way is to create an app in AAD (here's how). The wrong way ;) is to open the PowerBI.com service in a browser, open Fiddler, then press F5 to reload; you should be able to see the access token in various requests. If you register an app, you can plug your app's information into one of the samples we have at https://powerbi.microsoft.com/developers (see the client app or web app samples).
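As an illustration of the registered-app route, a minimal sketch that requests a token using the AAD OAuth2 password grant (the client ID and credentials are placeholders for your own app registration and account):

// Request an AAD access token for the Power BI API (OAuth2 password grant)
const params = new URLSearchParams({
  grant_type: "password",
  resource: "https://analysis.windows.net/powerbi/api",
  client_id: "<your-aad-app-client-id>",
  username: "<user@yourtenant.onmicrosoft.com>",
  password: "<password>",
});
const res = await fetch("https://login.microsoftonline.com/common/oauth2/token", {
  method: "POST",
  body: params, // sent as application/x-www-form-urlencoded
});
const { access_token } = await res.json();
// In Fiddler, send it as: Authorization: Bearer <access_token>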
The name you provide in the table is the friendly, human-readable name that appears in the UI when you're building a report. Without it the system would be unusable by humans :).
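For example, a minimal sketch of the PUT call itself, reusing the access token from above (the friendly name, column names, and data types are illustrative, not your actual schema):

// Update the table schema; datasetId and tableName identify the resource in the URL
await fetch(
  `https://api.powerbi.com/v1.0/myorg/datasets/${datasetId}/tables/${tableName}`,
  {
    method: "PUT",
    headers: {
      Authorization: `Bearer ${access_token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      name: "Sales", // the friendly, human-readable display name discussed above
      columns: [
        { name: "CustomerId", dataType: "Int64" },
        { name: "Amount", dataType: "Double" },
      ],
    }),
  }
);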
Let me get back to you on #3.
Calling PUT table will attempt to upgrade the table without losing any data (unless you removed columns). If it can't, it will return a conflict error. If you still want to update the table schema, you would have to delete the rows and call PUT table again. There is currently no direct way to rename a column: PUT table would treat it as a delete and an add for that column, so you would lose the data in that column but not the whole table.
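A sketch of that fallback flow, under the same assumptions as the previous snippet:

// Fallback when PUT table returns a conflict: clear the rows, then PUT again
const base = `https://api.powerbi.com/v1.0/myorg/datasets/${datasetId}/tables/${tableName}`;
const headers = { Authorization: `Bearer ${access_token}` };

// Delete all rows in the table (this data is lost, so export it first if needed)
await fetch(`${base}/rows`, { method: "DELETE", headers });

// Re-apply the new schema with PUT table, as in the earlier sketch
await fetch(base, {
  method: "PUT",
  headers: { ...headers, "Content-Type": "application/json" },
  body: JSON.stringify(newSchema), // newSchema: the { name, columns } body shown above
});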