I am working with Scopus data via its API in Power BI Desktop to retrieve a search over a list of author IDs, but only 25 papers (the first page) out of more than 800 documents were retrieved in Desktop. This is the Scopus API link: https://api.elsevier.com/content/search/scopus?query=AU-ID(%2256973530700%22)%20OR%20AU-ID(%2255911063200%22)%20OR%20AU-ID(%2236482638800%22)%20OR%20AU-ID(%22%C2%A016318706200%22)%20OR%20AU-ID(%2220433698300%22)%20OR%20AU-ID(%22%C2%A07003336314%22)%20OR%20AU-ID(%2256343679600%22)%20OR%20AU-ID(%227402141385%22)%20OR%20AU-ID(%229743731700%22)%20OR%20AU-ID(%2236166690800%22)%20OR%20AU-ID(%22%C2%A07006406404%22)%20OR%20AU-ID(%2225521994200%22)%20OR%20AU-ID(%227005832568%22)%20OR%20AU-ID(%227003993398%22)%20OR%20AU-ID(%2257194492913%22)%20OR%20AU-ID(%2224279923300%22)%20OR%20AU-ID(%227005533422%22)%20OR%20AU-ID(%226603806362%22)%20OR%20AU-ID(%227003737536%22)%20OR%20AU-ID(%2256458722000%22)%20OR%20AU-ID(%227004890330%22)&apiKey=7f59af901d2d86f78a1fd60c1bf9426a
I hope someone can help me :)
The responses from the API are, by default, limited to 25 results per page.
The top of the response body contains details of the results you are receiving; the pertinent fields are:
"opensearch:totalResults": How many articles your search returned
"opensearch:startIndex": What page of the results you are currently on
"opensearch:itemsPerPage": How many results per page there are.
"#searchTerms": Your query
"#startPage": What page of the results your query started on (default is 0)
In order to receive more than 25 results you will need to include pagination in your requests, increasing the start parameter (the index of the first result to return) by the page size on each call until you reach opensearch:totalResults.
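Not part of the original answer, but as a rough sketch of that pagination loop (Python with the requests library; the query is shortened, the API key is a placeholder, and the start/count parameter names follow Elsevier's Scopus Search API documentation):

import requests

BASE = "https://api.elsevier.com/content/search/scopus"
headers = {"Accept": "application/json"}                       # ask for JSON rather than XML
params = {
    "query": 'AU-ID("56973530700") OR AU-ID("55911063200")',   # shortened example query
    "apiKey": "YOUR_API_KEY",                                  # placeholder
    "count": 25,                                               # results per page
    "start": 0,                                                # index of the first result to return
}

entries = []
while True:
    results = requests.get(BASE, params=params, headers=headers).json()["search-results"]
    entries.extend(results.get("entry", []))
    total = int(results["opensearch:totalResults"])
    params["start"] += params["count"]                         # move to the next page
    if params["start"] >= total:
        break

print(len(entries), "documents retrieved")

The same idea carries over to Power Query in Power BI Desktop: generate one request per start offset (0, 25, 50, ...) and append the resulting pages until opensearch:totalResults is reached.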
I am having trouble synchronising certain tables of our ERP (Dynamics 365 Business Central) with Power BI. The steps that I am following are explained below:
Get Data
Search for Dynamics 365 Business Central
Search for the relevant tables
This is where Power BI doesn't let me preview the information within the table called 'salesCreditMemoLines', contrary to other tables that I can view without trouble.
I appreciate your help with this issue.
This is an expected error. Document lines collections in the Business Central API require the respective document ID to be present in the request; otherwise the request fails.
This is the piece of code from the API page that throws this error.
// If neither the line SystemId nor the parent Document Id is filtered, the request is rejected.
IdFilter := GetFilter(SystemId);
DocumentIdFilter := GetFilter("Document Id");
if (IdFilter = '') and (DocumentIdFilter = '') then
    Error(IDOrDocumentIdShouldBeSpecifiedForLinesErr);
There are two ways to send the document ID. My examples below are querying sales orders, but the same applies to all document collections.
The first is to request the lines along with the document header using the $expand syntax:
https://api.businesscentral.dynamics.com/v2.0/{{tenantid}}/{{environmentname}}/api/v2.0/companies({companyId})/salesOrders({orderId})?$expand=salesOrderLines
Another option is to query the document lines adding the $filter parameter:
https://api.businesscentral.dynamics.com/v2.0/{{tenantid}}/{{environmentname}}/api/v2.0/companies({companyId})/salesOrderLines?$filter=documentId eq {salesOrderId}
Filters can include ranges, so this way it's possible to request a collection of lines from multiple documents.
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/webservices/use-filter-expressions-in-odata-uris
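For illustration only (not from the original answer), here is what the filtered request could look like as a plain HTTP call in Python; the tenant, environment, company and document GUIDs and the bearer token are placeholders:

import requests

# Placeholder IDs and token; substitute your own values.
base = ("https://api.businesscentral.dynamics.com/v2.0/"
        "{tenant}/{environment}/api/v2.0/companies({companyId})/salesOrderLines"
        .format(tenant="YOUR_TENANT_ID",
                environment="production",
                companyId="11111111-1111-1111-1111-111111111111"))

# OData filter combining several document IDs (GUIDs are unquoted in OData v4).
params = {"$filter": "documentId eq 22222222-2222-2222-2222-222222222222 "
                     "or documentId eq 33333333-3333-3333-3333-333333333333"}
headers = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}

lines = requests.get(base, params=params, headers=headers).json()["value"]
print(len(lines), "sales order lines returned")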
Neither of these methods is going to work in the Power BI data source selection, though. An alternative is to use the salesDocumentLines entity under the Web Services (legacy) option. Yes, it shows as legacy, but so far Microsoft has not announced any plans to remove the OData web services.
https://learn.microsoft.com/en-us/dynamics365-release-plan/2021wave1/smb/dynamics365-business-central/enable-power-bi-connector-work-business-central-apis-instead-web-services-only
I have problems connecting to the Facebook API in Power BI.
The problem occurs when I try to aggregate the number of likes from a table (Count of id). It works perfectly with pages with a low number of likes, like a local blog and my personal page, but when I try to do that count for a page with a lot of likes, the aggregation process never finishes. Once I let it run for 20 minutes. I tried to limit the number of posts to a very low number, such as 40 and 10, but it still could not finish the aggregation.
Please help me find a solution!
Places where I looked for the answer:
https://www.linkedin.com/pulse/20140616113222-14085524-using-power-query-to-tell-your-story-form-your-facebook-data
I would try adding a Transform / Count Rows step, instead of your Aggregation.
I'm trying to import data from a web page to Power BI Desktop. The URL is the following:
https://casa.sapo.pt/Venda/Apartamentos/Maia/?sa=13
It contains data about housing prices, characteristics, etc.
The query returns an empty table, although if I browse to the page with any browser, without authentication or login, I can see it contains the data I'm looking to analyse. So I guess the publisher has somehow disabled the kind of query I'm trying to make.
Does anyone know a workaround?
Power BI / Power Query gives quite disappointing results against that fairly typical page structure. It only seems to work well against "Web 1.0" style HTML tables. There are so few of those around (outside of Wikipedia) that it is almost pointless ...
You could try Chris Webb's ExpandAll function (link below), which will expand out all the data available.
http://blog.crossjoin.co.uk/2014/05/21/expanding-all-columns-in-a-table-in-power-query/
Then you can exclude all the non-".Text" columns and merge the rest together to get all the text from the page in one column, but it's hard to make sense of it. You also lose a lot of the HTML content, e.g. links and image URLs.
I am working on a real-time dashboard, and I'd like to use the Power BI REST API.
My question is how the updating of rows works. I have 1300 records to load once, and then I need to update 2 columns of each row every 20 seconds.
The only REST call I see is AddRows, but it's not clear whether, or how, it handles updating rows.
You have two patterns you can choose from:
You can send data in batches: upload 1300 rows, then call DELETE on the rows, then call upload with the next payload of rows.
Here's the DELETE method you need to call. We're adopting REST standards for our APIs, so the 'methods' are the REST verbs :). https://msdn.microsoft.com/en-us/library/mt238041.aspx
Alternatively, you can incrementally update the data: you'd add a 'timestamp' column to your dataset. Then in your query (like in Q&A) you'd ask to "show data for the last 20 seconds". If you do this, set the FIFO retention policy when you create the dataset so you don't run out of space.
In either case, double-check that the number of rows you're pushing fits within the limits we spell out. https://msdn.microsoft.com/en-US/library/dn950053.aspx
HTH,
-Lukasz
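To make the first (batch) pattern concrete, here is a rough Python sketch of the delete-then-re-add cycle against the push dataset rows endpoint; the group ID, dataset ID, table name, bearer token, and the build_rows helper are placeholders, not something from the answer above:

import time
import requests

# Placeholder workspace/dataset IDs, table name, and token.
ROWS_URL = ("https://api.powerbi.com/v1.0/myorg/groups/{group}/datasets/{dataset}"
            "/tables/Readings/rows").format(group="YOUR_GROUP_ID", dataset="YOUR_DATASET_ID")
headers = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}

def build_rows():
    # Placeholder: return the current 1300 records as a list of dicts whose keys
    # match the column names of the push dataset table.
    return [{"id": i, "value": 0.0, "status": "ok"} for i in range(1300)]

while True:
    requests.delete(ROWS_URL, headers=headers)                             # Delete Rows: clear the table
    requests.post(ROWS_URL, headers=headers, json={"rows": build_rows()})  # Add Rows: push the new batch
    time.sleep(20)                                                         # refresh every 20 seconds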
I was searching for something in the Power BI docs that could help me create a report with the REST APIs. I couldn't find an exact match, but I made a workaround.
Firstly, I created a push dataset schema in Power BI with the help of the Post Dataset REST API.
https://learn.microsoft.com/en-us/rest/api/power-bi/push-datasets/datasets-post-dataset-in-group
Then I pushed rows/records/data into my dataset with the Post Rows (push dataset) API.
https://learn.microsoft.com/en-us/rest/api/power-bi/push-datasets/datasets-post-rows-in-group
Then I went to the Power BI service and created a visual report manually there.
After this I embedded that report in my React app.
Finally, my report was live.
Now, whenever I wanted to update my report in real time, I called the Delete Rows (push dataset) API to delete the existing rows/records from my dataset.
https://learn.microsoft.com/en-us/rest/api/power-bi/push-datasets/datasets-delete-rows-in-group
Then I called the Post Rows (push dataset) API again with the new, updated data (repeated step 2).
Finally, I refreshed my website page, and now I see the updated visual report on my website.
It took me a lot of time, so I understand if you are struggling with the Power BI REST API; it's not straightforward. Feel free to ask anything down below, I'll be happy to help.
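As a hedged sketch of the first two steps above (creating the push dataset schema, then pushing rows), assuming the endpoints from the Microsoft docs linked in this answer; the group ID, token, dataset name, and columns are placeholders:

import requests

GROUP = "https://api.powerbi.com/v1.0/myorg/groups/YOUR_GROUP_ID"   # placeholder workspace
headers = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}             # placeholder AAD token

# Step 1: Post Dataset In Group - define the push dataset schema.
schema = {
    "name": "LiveSales",
    "defaultMode": "Push",
    "tables": [{
        "name": "Sales",
        "columns": [
            {"name": "id", "dataType": "Int64"},
            {"name": "amount", "dataType": "Double"},
            {"name": "updatedAt", "dataType": "DateTime"},
        ],
    }],
}
dataset = requests.post(GROUP + "/datasets", headers=headers, json=schema).json()

# Step 2: Post Rows In Group - push rows into the table.
rows_url = GROUP + "/datasets/{0}/tables/Sales/rows".format(dataset["id"])
requests.post(rows_url, headers=headers,
              json={"rows": [{"id": 1, "amount": 99.5, "updatedAt": "2024-01-01T00:00:00Z"}]})

The later delete-and-repush step reuses the same .../rows URL, first with the DELETE verb and then with POST.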
I am using Amazon's Product Advertising API. When I search products by keyword with an item search operation I get only 10 results; is there any way to get all results in one single call?
No - Paging Through Results explains some of the details:
It is possible to create a request that returns many thousands of items in a response. This is problematic for several reasons. Returning all of the item attributes for those items would dramatically impact the performance of Product Advertising API in a negative way. Also, posting a thousand responses on a web page is impractical.
...
This example shows that 9729 items matched the search criteria. Also, it shows that those results are on 973 (~9729/10) pages. You might try putting in an ItemPage value over 10. If you do, Product Advertising API returns the following error.
...
So, how do you get that 973rd page? You cannot. A better approach is to submit a new request that is more targeted and yields fewer items in the response.
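For what it's worth, a rough sketch of paging within the documented limit; search_page below is a hypothetical callable standing in for a single signed ItemSearch request (request signing is omitted entirely), so this only illustrates the ItemPage loop, nothing more:

def fetch_first_ten_pages(search_page):
    # search_page(page) is a hypothetical helper that performs one signed
    # ItemSearch request with ItemPage=page and returns that page's items.
    items = []
    for page in range(1, 11):             # ItemPage = 1 .. 10; higher values are rejected
        page_items = search_page(page)
        items.extend(page_items)
        if len(page_items) < 10:          # a short page means there are no further results
            break
    return items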