Connecting with the Facebook API in Power BI

I am having problems connecting to the Facebook API in Power BI.
The problem occurs when I try to aggregate the number of likes from a table (Count of id). It works perfectly with pages that have a low number of likes, like a local blog and my personal page, but when I try the same count with a page that has a lot of likes, the aggregation never finishes. Once I let it run for 20 minutes. I tried limiting the number of posts to a very low number, such as 40 or even 10, but it still could not finish the aggregation.
Please help me find a solution!
Places where I looked for the answer:
https://www.linkedin.com/pulse/20140616113222-14085524-using-power-query-to-tell-your-story-form-your-facebook-data

I would try adding a Transform / Count Rows step, instead of your Aggregation.

Related

Connect and retrieve data from SCOPUS database to Power BI Desktop

I am working with SCOPUS data via the API in Power BI Desktop to retrieve a search for a list of author IDs, but only 25 papers (the first page) out of more than 800 documents were retrieved in Desktop. This is the Scopus API link: https://api.elsevier.com/content/search/scopus?query=AU-ID(%2256973530700%22)%20OR%20AU-ID(%2255911063200%22)%20OR%20AU-ID(%2236482638800%22)%20OR%20AU-ID(%22%C2%A016318706200%22)%20OR%20AU-ID(%2220433698300%22)%20OR%20AU-ID(%22%C2%A07003336314%22)%20OR%20AU-ID(%2256343679600%22)%20OR%20AU-ID(%227402141385%22)%20OR%20AU-ID(%229743731700%22)%20OR%20AU-ID(%2236166690800%22)%20OR%20AU-ID(%22%C2%A07006406404%22)%20OR%20AU-ID(%2225521994200%22)%20OR%20AU-ID(%227005832568%22)%20OR%20AU-ID(%227003993398%22)%20OR%20AU-ID(%2257194492913%22)%20OR%20AU-ID(%2224279923300%22)%20OR%20AU-ID(%227005533422%22)%20OR%20AU-ID(%226603806362%22)%20OR%20AU-ID(%227003737536%22)%20OR%20AU-ID(%2256458722000%22)%20OR%20AU-ID(%227004890330%22)&apiKey=7f59af901d2d86f78a1fd60c1bf9426a
I hope someone can help me :)
The responses from the API are, by default, limited to 25 results per page.
The top of the Response Body contains details of the results you are receiving, the pertinent details are:
"opensearch:totalResults": How many articles your search returned
"opensearch:startIndex": What page of the results you are currently on
"opensearch:itemsPerPage": How many results per page there are.
"#searchTerms": Your query
"#startPage": What page of the results your query started on (default is 0)
To receive more than 25 results you will need to include pagination in your requests, by incrementally increasing the startPage parameter (default 0).
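The pagination loop above can be sketched in Python. The "start"/"count" parameter names below are an assumption to verify against your API version (the answer above calls the page offset "startPage"):

```python
# Sketch of paging through a Scopus search result set.
def page_starts(total_results, items_per_page):
    """Start index of each page needed to cover total_results."""
    return list(range(0, total_results, items_per_page))

def build_page_url(base_url, start, count):
    """Append pagination parameters to an existing query URL."""
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}start={start}&count={count}"

# 800 documents at 25 per page -> 32 requests, starting at 0, 25, 50, ...
starts = page_starts(800, 25)
print(len(starts), starts[:3])  # 32 [0, 25, 50]
```

Each URL returned by build_page_url can then be fetched in turn, concatenating the entry lists from every response.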

PowerBI Auto Refresh

Good Day
A client I am working with wants to use a PowerBI dashboard to display in their call centre with stats pulled from an Azure SQL Database.
Their specific requirement is that the dashboard automatically refreshes every minute during their operating hours (8am - 5pm).
I have been researching this a bit but can't find a definitive answer.
Is it possible for Power BI to automatically refresh every minute?
Is it dependent on the type of license and/or the type of connection (DirectQuery vs Import)?
You can set a report to refresh on a direct query source, using the automatic report refresh feature.
https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-automatic-page-refresh
This will allow you to refresh the report every minute, or at another defined interval. This applies to reports only, not dashboards, as it is configured in Power BI Desktop.
When publishing to the service you will be limited to a minimum refresh interval of 30 minutes unless you have dedicated capacity. You could add an A1 Power BI Embedded SKU and turn it on and off around business hours to reduce the cost, which would work out to around £200 per month.
Another option for imported data would be to set up a Logic App or Power Automate task that refreshes the dataset using an API call, at a lower frequency, say every 5 minutes. It would be best to optimise your query to return a small amount of pre-aggregated data to the dataset.
You can use Power Automate to schedule refreshes of your dataset more than 48 times a day; it looks like you can refresh it every minute this way, and other tools may allow an even higher frequency.
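A Logic App or Power Automate step that triggers a refresh ultimately calls the Power BI REST API's refreshes endpoint. A minimal Python sketch of that call (the group and dataset IDs are placeholders, and a real call also needs an Azure AD access token):

```python
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id, dataset_id):
    """Endpoint for POST .../groups/{id}/datasets/{id}/refreshes."""
    return f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(group_id, dataset_id, access_token):
    """POST an empty body to start a refresh; returns the HTTP status."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        method="POST",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:  # 202 Accepted on success
        return resp.status

print(refresh_url("my-group-id", "my-dataset-id"))
```

Scheduling this call once a minute (and suppressing it outside 8am - 5pm) is then the automation tool's job, not Power BI's.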
Refreshing the data at a 1-minute frequency is not possible with scheduled refresh in Power BI. If you are not using Power BI Premium you can schedule up to 8 refreshes a day, with a minimum gap of 15 minutes. If you are using Power BI Premium you are allowed 48 slots.
If you cannot work within those restrictions, it might be worth looking into Power BI reports on streaming datasets. But then again there are some cons to those as well, such as working only with DirectQuery.

Dynamic filters with power bi reports

I have a very big database with a lot of tables. Let's imagine it is Amazon's customer database.
I have several users who want to run queries on this database. For example: "give me all customers who live in this area, are between 20 and 30 years old, spent between $200 and $300 in 2017, have not made any order in the last 6 months, and visited product pages about video games in the last 3 days".
Queries can concern every table and every field. I do not know at development time which criteria users will need; it can change every day.
What I need is to provide a universal query tool.
I have deployed Power BI and I need to add dynamic filters inside the report, but I do not know how...
My problem is that I want to embed the filters INSIDE the report.
Thanks

Power BI Adobe Analytics connector is limiting dimensions

I am using the Adobe Analytics connector for one of my reports. One of my 'Dimensions' is called 'Websites' and the data is a URL (www.myurl.com), we look at over 100 websites and I am looking at the traffic for each one. When I use the Websites dimension it only displays 10 of the sites out of the total number. Is there a way I can add a line of code in the advanced query editor to remove this limit? Or is this a limitation with the Power BI connector?
Using the 'Top' parameter (which returns up to 50,000 dimension items per call) fixes my problem and breaks my URL column down by the correct number of dimensions instead of 10. Add a step like this in the advanced query editor (here "evar123" is the dimension's ID):
{Cube.ApplyParameter, "Top", {10000, "evar123"}},

Analytics django app with noSQL db and GA

I've started a Django project that will include an analytics app. I want that app to use either CouchDB or MongoDB for storing data.
The initial idea was (since the client is already using Google Analytics) to grab data from GA once a day/week/month and store it locally as values in the database. This would ultimately build a database of entries, one entry per user per month, with summed values like
{"date": "11.2011", "clicks": 21, "pageviews": 40, "n": n},
For premium users there could be one entry per user per week, or even per day.
The question would be:
grab analytics from GA and store summed entries for clicks, visits, etc.
or
store clicks and other values locally and compute the sums once a month for display?
Lukasz, unless Google Analytics has really relaxed their privacy levels, you're not going to be able to access user-level records (but check out the answer here: Django saving the whole request for statistics, whats available?)
Right, old question but I've just finished the project so I'll just write what I did.
Since I didn't need concurrency and wanted a faster approach, I found that MongoDB was better for this.
The final document schema that I've used is
{'date': '11.2009', 'pageviews': 40, 'clicks': 13, 'otherdata': 'that i can use as filters'}
The scope of my local analytics is monthly, so I create one entry in MongoDB per user per month and update it each day, storing only summaries and averages.
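That per-user, per-month document, updated daily, can be sketched in plain Python. This is a hedged stand-in for what a MongoDB upsert with $inc would do, using the document schema shown above:

```python
# One summary document per (user, month), folded together from daily counters.
def update_monthly(summaries, user_id, month, day_stats):
    """Add one day's counters into the user's monthly summary document."""
    key = (user_id, month)
    doc = summaries.setdefault(key, {"date": month, "pageviews": 0, "clicks": 0})
    for field, value in day_stats.items():
        doc[field] = doc.get(field, 0) + value
    return doc

store = {}
update_monthly(store, "u1", "11.2009", {"pageviews": 40, "clicks": 13})
update_monthly(store, "u1", "11.2009", {"pageviews": 5, "clicks": 2})
print(store[("u1", "11.2009")])
# {'date': '11.2009', 'pageviews': 45, 'clicks': 15}
```

With pymongo the same step would be a single update_one(..., {"$inc": day_stats}, upsert=True) call keyed on the user and month.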
What else? Re: Jamie's answer: the system is using GA events, so I have access to all the data I need.
I hope someone finds it interesting.
Cheers, and thanks for the ideas!