Sometimes users have authentication issues with a SharePoint list.
I would like Power Query to try to get data from that list, but if that fails, pull from an Excel file on the network instead (which I would just update manually once per week).
How would I do this?
Power Query's M/PQL language has a try expression for error handling:
https://msdn.microsoft.com/en-us/library/mt186368.aspx
I would try to edit the Query script to have 2 Source statements (SPO & Excel) with try expressions, then choose between them in a later step, e.g.
... SourceSPList = try SharePoint.Tables( ...
... SourceExcel = try Excel.Workbook( ...
... if SourceSPList[HasError] then SourceExcel[Value] else SourceSPList[Value]
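Fleshed out, the whole query might look something like the sketch below. The site URL and the UNC path to the workbook are placeholders, and the downstream steps would still need to navigate to the specific list or sheet:
let
    // wrap both sources in try so a failure doesn't abort the query
    SourceSPList = try SharePoint.Tables("https://contoso.sharepoint.com/sites/MySite"),
    SourceExcel = try Excel.Workbook(File.Contents("\\server\share\WeeklyBackup.xlsx")),
    // prefer the SharePoint list; fall back to the Excel copy on error
    Source = if SourceSPList[HasError] then SourceExcel[Value] else SourceSPList[Value]
in
    Source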
Good luck with this - raw M/PQL coding is about the most miserable coding experience around ...
Related
I am trying to change my database name in my Advanced Editor query in Power BI. I know I can create parameters within the Power BI Desktop app and pass a different database through them. I have done this and it works fine.
But what I am trying to do is this: when I give a user a link, for example
https://app.powerbi.com/groups/me/reports/DataSource="PowerBi_1"
how do I get the data source name, which is "PowerBi_1", and pass it into my Advanced Editor query, which looks as follows:
let
    Source = Sql.Database(".", "PowerBi_2", [Query="select *#(lf)from Customer"])
in
    Source
So I want to replace PowerBi_2 with PowerBi_1.
Is this possible?
I tried searching, and the only thing I could find was adding parameters from "Manage parameters", which I can already do. But I need the value to be passed from the URL and to change the data source automatically, instead of changing it manually via "Edit parameters".
I know you can use a filter in your URL, as in https://app.powerbi.com/groups/me/reports/12345678-6418-4b47-ac7c-f8ac7791a0a7?filter=Customer/PostalCode eq '15012'
but that only works on datasets. I'm not sure how to do this for a database change in a query.
The only thing you could try, if you have a DirectQuery source, is the new feature of binding query parameters (dynamic M query parameters):
https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-dynamic-m-query-parameters
Then you could set a filter via the URL to point to the database you need. I'm not sure how well it would work - I haven't tried it myself.
To expand on the idea a bit: you would need a table with the database names in it. You would then bind the database-name column of that table to your query parameter and, finally, use your URL to set the appropriate filter on this new table.
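For illustration, the parameterised query might look something like this, where DatabaseName is a hypothetical text parameter that would then be bound to the database-name column in the Model view:
let
    Source = Sql.Database(".", DatabaseName, [Query="select *#(lf)from Customer"])
in
    Source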
EDIT:
Scratch that - the article I linked to says that T-SQL-based DirectQuery sources are not supported. But if they were ;)...
I'm using Pentaho PDI 7.1. I'm trying to convert data from MySQL to MySQL while changing the structure of the data.
I'm reading the source table (customers), and for each row I have to run another query to calculate the balance.
I was trying to use a Database lookup step to accomplish this, but maybe it's not the best way.
I have to run a query like this to get the balance:
SELECT SUM(CASE WHEN direzione = 'ENTRATA' THEN -importo ELSE importo END)
FROM Movimento
WHERE contoFidelizzato_id = ?
I need to set the parameter using a value from the previous step. Any advice?
The Database lookup step may be a good idea, especially if you are used to reasoning in database terms, but it fires one query per incoming row, which may not be the most efficient approach.
A more PDI-ish style would be to make the query like:
SELECT contoFidelizzato_id,
       SUM(CASE WHEN direzione = 'ENTRATA' THEN -importo ELSE importo END) AS balance
FROM Movimento
GROUP BY contoFidelizzato_id
and use it as the info source of a Stream Lookup step, matching on contoFidelizzato_id.
An even more PDI-ish style would be to split the source table (customer) into two flows: one in which you keep the source rows, and one which you group by contoFidelizzato_id. Of course, you need a Formula step, a JavaScript step, or a formula in the SQL of the Table input to change the sign when needed, as sketched below.
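For the sign flip, the formula in the Table input SQL could be as simple as the following sketch (the importo_signed alias is made up); the Group by step then only has to SUM that column:
SELECT contoFidelizzato_id,
       CASE WHEN direzione = 'ENTRATA' THEN -importo ELSE importo END AS importo_signed
FROM Movimento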
Test to see which strategy is better in your case. You'll soon discover that PDI is very good at handling large volumes of data.
I went over the tutorial for Scrapy, and I was able to understand how to scrape the site included in the tutorial. But I'm having a little trouble with some of the more complicated sites (complicated to me, at least).
I'm attempting to scrape the rows and columns of the insider transactions from this webpage:
http://finviz.com/insidertrading.ashx
I'm using command-prompt commands with Scrapy to test whether I'm able to scrape the necessary information; the following is what I've written in the command prompt:
scrapy shell "http://finviz.com/insidertrading.ashx"
I then used Firebug in Firefox to look at the HTML code of the page.
I'm able to get some of the information (Stock Name, Name of the Insider and Date) into a list via this code:
response.css('td a.tab-link::text').extract()
However, the rest of the info is missing.
I'm able to get some (maybe most) of the missing info (Cost, Shares, Value, etc.) via this code:
response.css('td::text').extract()
I can't figure out how to cleanly get all info together in one scrape.
Thanks.
EDIT: The other option would be to collect the data iteratively, one row at a time, so I can separate it as I like. I'm brooding over this as well.
Since the data is tabular, the position of table rows and columns is predictable and stable. You can simply extract all text in the row and unpack it into variables:
for row in response.xpath("//tr[@class='insider-option-row']"):
    items = row.xpath('td/a/text() | td/text()').extract()
    ticker, owner, relationship, date, transaction, cost, shares, value, shares_total, sec_form_4 = items
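Wrapped into a minimal spider, it might look like the sketch below. The row class and column order are taken from the snippet above, so this assumes the page markup hasn't changed:
import scrapy

class InsiderSpider(scrapy.Spider):
    name = 'insider'
    start_urls = ['http://finviz.com/insidertrading.ashx']

    def parse(self, response):
        for row in response.xpath("//tr[@class='insider-option-row']"):
            items = row.xpath('td/a/text() | td/text()').extract()
            if len(items) != 10:
                continue  # skip header or malformed rows
            (ticker, owner, relationship, date, transaction,
             cost, shares, value, shares_total, sec_form_4) = items
            yield {
                'ticker': ticker, 'owner': owner, 'relationship': relationship,
                'date': date, 'transaction': transaction, 'cost': cost,
                'shares': shares, 'value': value,
                'shares_total': shares_total, 'sec_form_4': sec_form_4,
            }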
I am writing simple SELECT queries which involve parsing a date out of a string.
The dates are typed in by users manually in a web application and are recorded as strings in the database.
I have a CASE statement to handle various date formats and use the correct format specifier in the TO_DATE function accordingly.
However, sometimes users enter something that's not a valid date (e.g. 13-31-2013) by mistake, and then the entire query fails. Is there any way to handle such rogue records and replace them with some default date in the query, so that the entire query does not fail because of a single invalid date record?
I have already tried regular expressions, but they are not quite reliable when it comes to handling leap years and 30/31-day months, AFAIK.
I don't have privileges to create stored procedures or anything like that. It's just a plain SELECT query executed from my application.
This is a client-side task.
The DB will give you an error for an invalid date (the DB does not have a "TO_DATE_AND_FIX_IF_NOT_CORRECT" function).
If you got this error, it means you already tried to cast an invalid string to a date.
I recommend doing the conversion to a date on your application server and, in case of an exception in your code, sending a default date to the DB (a sketch follows the list below).
Also, that way you send the DB a date-typed object such as DbDate, and not a string.
That way you achieve two goals:
1. The dates will always be what you want them to be (from the client).
2. You close the door for SQL Injection attacks.
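For illustration, the client-side conversion might look something like this Java sketch (the class and helper names are made up):
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateFallback {
    // Hypothetical helper: parse the user's string, fall back to the default date
    static java.sql.Date toSqlDateOrDefault(String userInput) {
        SimpleDateFormat fmt = new SimpleDateFormat("MM-dd-yyyy");
        fmt.setLenient(false); // reject 13-31-2013 instead of rolling it over
        try {
            return new java.sql.Date(fmt.parse(userInput).getTime());
        } catch (ParseException e) {
            return java.sql.Date.valueOf("2000-01-01"); // agreed default
        }
    }
}

// usage with a PreparedStatement: bind a DATE, never a raw string:
//     ps.setDate(1, DateFallback.toSqlDateOrDefault(userInput));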
It sounds like in your case you should write the function I mentioned...
It should look something like this:
Create or replace function TO_DATE_SPECIAL(in_date in varchar2) return DATE is
  ret_val date;
begin
  ret_val := to_date(in_date, 'MM-DD-YYYY');
  return ret_val;
exception
  when others then
    return to_date('01-01-2000', 'MM-DD-YYYY');
end;
Within the query, instead of using to_date, use the new function. That way, instead of failing, it will give you back a default date.
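For illustration, using it in a query might look like this (the table and column names are made up):
SELECT t.customer_id,
       TO_DATE_SPECIAL(t.entered_date) AS entered_dt
FROM orders t;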
There is no built-in IsDate function, so you'll have to create something like this yourself...
I hope you've got the idea and how to use it, if not - let me know.
I ended up using a crazy regex that checks for leap years and 30/31-day months as well.
Here it is:
((^(0?[13578]|1[02])[\/.-]?(0?[1-9]|[12][0-9]|3[01])[\/.-]?(18|19|20){0,1}[0-9]{2}$)|(^(0?[469]|11)[\/.-]?(0?[1-9]|[12][0-9]|30)[\/.-]?(18|19|20){0,1}[0-9]{2}$)|(^([0]?2)[\/.-]?(0?[1-9]|1[0-9]|2[0-8])[\/.-]?(18|19|20){0,1}[0-9]{2}$)|(^([0]?2)[\/.-]?29[\/.-]?(((18|19|20){0,1}(04|08|[2468][048]|[13579][26]))|2000|00)$))
It is a modified version of the answer by McKay here.
Not the most efficient but it works. I'll wait to see if I get a better alternative.
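To show how it fits into a plain SELECT (no stored procedure needed), the check can go inside a CASE, which short-circuits so TO_DATE only runs on rows that pass. The table and column names below are made up, and it assumes month-day-year ordering with four-digit years:
SELECT CASE
         WHEN REGEXP_LIKE(entered_date, '((^(0?[13578]|1[02])[\/.-]?(0?[1-9]|[12][0-9]|3[01])[\/.-]?(18|19|20){0,1}[0-9]{2}$)|(^(0?[469]|11)[\/.-]?(0?[1-9]|[12][0-9]|30)[\/.-]?(18|19|20){0,1}[0-9]{2}$)|(^([0]?2)[\/.-]?(0?[1-9]|1[0-9]|2[0-8])[\/.-]?(18|19|20){0,1}[0-9]{2}$)|(^([0]?2)[\/.-]?29[\/.-]?(((18|19|20){0,1}(04|08|[2468][048]|[13579][26]))|2000|00)$))')
         THEN TO_DATE(entered_date, 'MM-DD-YYYY')
         ELSE TO_DATE('01-01-2000', 'MM-DD-YYYY')
       END AS parsed_date
FROM user_dates;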
I'm using enterprise guide 4.3.
When you run a data step, the resulting output opens in a spreadsheet-like table.
Then when you run a PROC TABULATE or similar, the spreadsheet-like view of the data disappears and the table comes up in SAS Report or HTML form, etc.
You can then run further commands on that dataset that was created in the data step.
Q. How can you get that spreadsheet-like view of the dataset back? (Assuming it's possible.)
I know you can run the data step again and it will display it but that seems really inefficient, especially if the data step had lots of computations involved. The data is obviously 'sitting there' given you can still interact with it (with proc tabulate etc). I was really surprised to see that it drops off from the process flow view.
Apologies if I've named things poorly above; I'm an R user beginning to dabble in SAS.
If I understood you correctly: you run some code and the result comes up; then you run another piece of code from the same code node, and the initial result gets removed from the process flow.
You can always find your dataset in the Server List. You can enable it by clicking View -> Server List.
There is also a trick that you can do. When you run your code and the dataset node is created in the process flow, you can do a simple query on it. Just do Right click -> Filter and query and make it do something simple that won't take too long.
Now, when you run your next piece of code, this node will not be replaced (at least this is what happens in EG 4.1).
If you mean viewing the resulting data set from a DATA step, choose View -> Process Flow and double-click on the data set you want to view. Also, within your program, log, data, or result view, there should be tabs across the top that allow you to bring up the other items of the process flow.