I can't read specific tables of Dynamics BC with Power BI

I am having trouble synchronising certain tables of our ERP (Dynamics Business Central) with Power BI. The steps I am taking are listed below:
Get Data
Search Dynamics 365 Business central
Search for the relevant tables
This is where Power BI doesn't let me preview the information within the table called 'salesCreditMemoLines', unlike other tables, which I can preview without trouble.
I appreciate your help with this issue.

This is an expected error. Document line collections in the Business Central API require the respective document ID to be present in the request; otherwise the request fails.
This is the piece of code from the API page that raises this error:
IdFilter := GetFilter(SystemId);
DocumentIdFilter := GetFilter("Document Id");
if (IdFilter = '') and (DocumentIdFilter = '') then
    Error(IDOrDocumentIdShouldBeSpecifiedForLinesErr);
There are two ways to send the document ID. My examples below are querying sales orders, but the same applies to all document collections.
The first is to request the lines along with the document header using the $expand syntax:
https://api.businesscentral.dynamics.com/v2.0/{{tenantid}}/{{environmentname}}/api/v2.0/companies({companyId})/salesOrders({orderId})?$expand=salesOrderLines
The other option is to query the document lines collection directly, adding a $filter parameter on the document ID:
https://api.businesscentral.dynamics.com/v2.0/{{tenantid}}/{{environmentname}}/api/v2.0/companies({companyId})/salesOrderLines?$filter=documentId eq {salesOrderId}
Filters can include ranges, so this way it's possible to request a collection of lines from multiple documents.
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/webservices/use-filter-expressions-in-odata-uris
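For illustration, here is a minimal sketch of both options from a plain REST client (Python); every ID and the OAuth2 token below are placeholders, not values from this question:

import requests

# All IDs and the token are placeholders; replace them with real values.
tenant_id = "<tenantId>"
environment = "<environmentName>"
company_id = "<companyId>"
order_id = "<salesOrderId>"
headers = {"Authorization": "Bearer <oauth2AccessToken>"}

base = f"https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0"

# Option 1: the document header with its lines embedded via $expand
header_resp = requests.get(
    f"{base}/companies({company_id})/salesOrders({order_id})",
    params={"$expand": "salesOrderLines"},
    headers=headers,
)

# Option 2: the lines collection alone, filtered by the document ID
lines_resp = requests.get(
    f"{base}/companies({company_id})/salesOrderLines",
    params={"$filter": f"documentId eq {order_id}"},
    headers=headers,
)
lines_resp.raise_for_status()
for line in lines_resp.json()["value"]:
    print(line.get("description"), line.get("quantity"))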
Neither of these methods is going to work in the Power BI data source selection, though. An alternative is to use the salesDocumentLines entity under the Web Services (legacy) option. Yes, it shows as legacy, but so far Microsoft has not announced any plans to remove the OData web services.
https://learn.microsoft.com/en-us/dynamics365-release-plan/2021wave1/smb/dynamics365-business-central/enable-power-bi-connector-work-business-central-apis-instead-web-services-only

Related

Connecting Reports to Web APIs with Parameters

I have a client that has a large number of customers, and I have reports that can accept parameters and pass them to a REST-based Web API to pull, for example, customer-specific records. This, of course, is easy using Power BI.
The challenge is that there could literally be 500,000 records out there, so passing filters is not really an option. What I need to do is pass a value via Power BI Embedded to the report that will update the parameter of the Web API dynamically.
Such as https://services.server.com/api/customers/{customerId}.
I have read and experimented with about every technique possible, and yet I still can't seem to pull this simple (and common) scenario off. To confirm, this would work fine if I allowed a user to filter these values manually, but the goal is to have the Web.Contents value be dynamic (like via a parameter) and then have the parameter (like CustomerId) fed to the report externally, like in a Power BI Embedded parameter to the report.
Again, this can't be a filter, I just want to do what you used to be able to do with SSRS or Crystal Reports and send something like {parameter} = (or eq) '{some value}' and have the report use that as the datasource JSON feed.
Any thoughts on this frustrating situation?
You can do this with RLS:
https://learn.microsoft.com/en-us/power-bi/developer/embedded-row-level-security
Bring all the 500,000 records to your pbix.
Define a role which will filter based on a username.
When embedding, pass the role and the desired username to the embed token.
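As a rough sketch of that last step, here is what generating the embed token could look like through the REST API, assuming an AAD access token has already been acquired and the pbix defines a role (named CustomerFilter here purely for illustration) that filters on USERNAME(); all IDs are placeholders:

import requests

# Placeholders; the workspace, report and dataset IDs come from your tenant,
# and CustomerFilter is a hypothetical role name defined in the pbix.
group_id = "<workspaceId>"
report_id = "<reportId>"
dataset_id = "<datasetId>"

body = {
    "accessLevel": "View",
    "identities": [{
        "username": "customer-42",      # the value the RLS role reads via USERNAME()
        "roles": ["CustomerFilter"],
        "datasets": [dataset_id],
    }],
}

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/reports/{report_id}/GenerateToken",
    json=body,
    headers={"Authorization": "Bearer <aadAccessToken>"},
)
resp.raise_for_status()
embed_token = resp.json()["token"]   # pass this token to the embed configuration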

Redmine Custom Query to return only specific columns

How to filter Redmine issues to only include specified columns, when using the REST API xml/json format?
I have a custom query to obtain specific columns of issues:
Filter set to "status(open)" and "Due Date(any)"
Options / Selected Columns set to "Due Date" (To only return the ID and the Due Date in the response)
These are two ways of accessing it:
A /issues?query_id=myqueryid
B /issues.xml?query_id=myqueryid
When I run A in my browser, the correct response is given, containing only the id and the due date. In contrast, with B every field (i.e. id, description, start date, ...) is included in the response.
I also tried to add a "fields" value as some other APIs suggest, but to no avail (i.e. /issues.xml?issue_id=1337&fields=due_date,etc).
Redmine's REST API (i.e. the JSON and XML APIs) always returns all base fields of filtered issues. You can optionally include additional fields like watchers, journals, issue relations, ... by using the include mechanism described in the API docs.
When using the API, the client is supposed to fetch any fields it desires from the response on its own. The feature to show a tailored HTML table with selected columns, as done in the web UI, is considered not very useful for the API use case right now.
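In practice that just means picking the fields you need out of the response yourself; a small sketch against a hypothetical Redmine instance (the URL, query id and API key are made up):

import requests

# Hypothetical server, query id and API key.
resp = requests.get(
    "https://redmine.example.com/issues.json",
    params={"query_id": 42},
    headers={"X-Redmine-API-Key": "<apiKey>"},
)
resp.raise_for_status()

# The API returns all base fields; keep only the ones you care about client-side.
slim = [{"id": i["id"], "due_date": i.get("due_date")} for i in resp.json()["issues"]]
print(slim)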

How can I use a parameter in a MS Power BI web data source string?

I have a URL that returns a JSON object with everything I need for my Power BI Embedded report. I get the data for the report by adding a new web data source and pasting the URL in. A few transformations later and tada! Sexy report. The report shows lots of charts and graphs etc. However, I need to be able to change the data source URL depending on who is looking at it.
The report shows data for a single organization. You can only look at it if you're in that organization. How can I pass an organization's ID when embedding the report so that the data source will show different data?
For example, if my data source is defined in the originating pbix as
Json.Document(Web.Contents("http://www.testdata.com/api/json?orgId=1"))
how can I change it to
Json.Document(Web.Contents("http://www.testdata.com/api/json?orgId=2"))
when I pull the report to embed on a page?
I know you can filter data, but that means I have to make the data source URL pull ALL the data, which would be huge and intensive just to have Power BI filter most of it out.
In short, I'm embedding a report on a website and that report's only way to get data is via a JSON endpoint. That endpoint requires the org ID of the user, so how do I pass it to Power BI, which in turn uses it in the data source URL?
Your only option for this scenario is to pull all the required data into your dataset. Then you can use either Row-Level Security (RLS) or the new JS API to filter the data for each user.
You should probably look at an Azure SQL data source as a more efficient, flexible and scalable back-end for PBI Embedded.

Web query with Power BI desktop returns empty table

I'm trying to import data from a web page to Power BI Desktop. The URL is the following:
https://casa.sapo.pt/Venda/Apartamentos/Maia/?sa=13
It contains data about housing prices, characteristics, etc.
The query returns an empty table, although if I browse to the page with any browser and without authentication or login, I see it contains the data I'm looking to analyse, so I guess the publisher has somehow disabled the kind of query I'm trying to make.
Does anyone know a workaround?
Power BI / Power Query gives quite disappointing results against that fairly typical page structure. It only seems to work well against "Web 1.0" style HTML tables. There are so few of those around (outside of Wikipedia) that it is almost pointless ...
You could try Chris Webb's ExpandAll function (link below) that will expand out all the data available.
http://blog.crossjoin.co.uk/2014/05/21/expanding-all-columns-in-a-table-in-power-query/
Then you can exclude all the non-".Text" columns and merge them together to get all the text from the page in one column, but it's hard to make sense of it. You also lose a lot of the HTML content, e.g. links and image URLs.

Unable to update a Power BI table schema through the API with or without ApiaryIO

I am using Power BI API.
I've got a dataset with some tables and rows.
From Power BI API Console I don't have any issue when retrieving datasets or tables.
However the PUT verb on a table resource to update its schema always returns a 504 - Proxy request timed out
It's the first time I've used ApiaryIO, so it might be its problem and not Power BI's, but that leads me to some questions:
Is there any workaround to test Power BI with, for example, Fiddler? I can type the URL and body, but I will need an Authorization header with the OAuth2 token if I'm not mistaken. How can I get that? ApiaryIO seems to hide it.
As per Update Schema Documentation the URL with the resource is https://api.powerbi.com/v1.0/myorg/datasets/{myDatasetId}/tables/{myTableName}
and the verb is a PUT. What, then, is the meaning of the "name": "???" parameter that goes in the JSON body? Is it the table's name or something else? I am assuming it's the table name, but that seems redundant as I am already accessing the resource {myTableName} as per the given URL.
And my last related question is how to rename a specific table's column without modifying its data. This is what I'm trying to achieve by updating the schema, but I don't understand how Power BI knows which column I am trying to rename.
Thank you!
Sorry that you're having trouble. You can get a token in two ways. The right way is to create an app in AAD (here's how). The wrong way ;) is to open the PowerBI.com service in a browser, then open Fiddler, then press F5 to reload. You should be able to see the access token in various requests. If you register an app, you can plug your app's information into one of the samples we have at https://powerbi.microsoft.com/developers (see client app or web app).
The name you provide in the table is the friendly, human-readable name that appears in the UI when you're building a report. Without it the system is unusable by humans :).
Let me get back to you on #3.
Calling PUT table will attempt to upgrade the table without losing any data (unless you removed columns). If it can't, it will return a conflict error. If you still want to update the table schema, you would have to delete the rows and call PUT table again. There is currently no direct way to rename a column. PUT table would treat it like a delete and add for that column. You would lose the data in that column, but not the whole table.
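For reference, a hedged sketch of that PUT table call and of the "name" field in the body (the dataset ID, table name and columns below are placeholders, not values from this question):

import requests

# Placeholders; the token comes from the AAD app registration mentioned above.
dataset_id = "<datasetId>"
table_name = "Sales"

new_schema = {
    "name": "Sales",  # the friendly, human-readable table name shown when building reports
    "columns": [
        {"name": "OrderId", "dataType": "Int64"},
        {"name": "Region", "dataType": "String"},  # a renamed column is treated as delete + add
    ],
}

resp = requests.put(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/tables/{table_name}",
    json=new_schema,
    headers={"Authorization": "Bearer <accessToken>"},
)
resp.raise_for_status()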