PowerBI Embedded: Datasource has no credentials, unable to Patch the gateway

I wanted to test out PowerBI Embedded, so I downloaded the sample app that can publish a .pbix file and embed it.
So I created the simplest PowerBI file one can make, with Azure SQL as the underlying data source, using the DirectQuery option.
I successfully imported the PowerBI file into my workspace collection.
I successfully changed the connection string of my PowerBI file.
After that, the code that patches the gateway with the username and password credentials fails.
When I then tried to view the embedded report, I got this error.
I believe the connection string is in the correct format because it was updated successfully. I also tried pointing it at another SQL database, and the error message then shows the other SQL database.
1) I thought this could be because the gateway does not receive the credentials that I gave it. Is that correct?
2) Does someone know how I can fix this?
Thanks in advance!

As @Cuong Le stated, this was a Microsoft issue at first.
When the problem was fixed, I still received a BadRequest exception. After trying to update the credentials with the PowerBI-CLI, the problem became clearer: I needed to grant Azure IP addresses access to the relevant SQL database. Once I did that, I was able to update the credentials. Unfortunately, the PowerBI API SDK's exception messages are not as informative as the PowerBI-CLI messages. I also tried it with the PowerBI API SDK, and it worked as well.
The exception message I got was the following:
[ powerbi ] {"error":{"code":"DM_GWPipeline_Gateway_DataSourceAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_DataSourceAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2146232060"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Cannot open server 'engiep-dev-weeu-sql' requested by the login. Client with IP address 'xx.xx.xx.213' is not allowed to access the server. To enable access, use the Windows Azure Management Portal or run sp_set_firewall_rule on the master database to create a firewall rule for this IP address or address range. It may take up to five minutes for this change to take effect."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2146232060"}},{"code":"DM_ErrorDetailNameCode_UnderlyingNativeErrorCode","detail":{"type":1,"value":"40615"}}]}}}
The correct connection string format to use is:
Data Source=yourDataSource;Initial Catalog=yourDataBase;User ID=yourUser;Password=yourPass;
(Don't use quotes anywhere.)
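For completeness, the PowerBI-CLI call I used to update the credentials looked roughly like the following (a sketch; option names may vary by CLI version, and all bracketed values are placeholders):
powerbi update-connection -c [workspace collection] -k [access key] -w [workspace id] -d [dataset id] -u [sql username] -p [sql password]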

I was experiencing the same issue; it is also an open issue on GitHub.
To solve this, I used the PowerBI CLI 1.0.4 from npm and ran the update-connection operation (remember to add -d):
powerbi update-connection -c [workspace name] -k [access key] -w [workspace id] -d [dataset id] -s "Data Source=xxx.database.windows.net;Initial Catalog=xxx;User ID=xxx;Password=xxx"
If it fails, run the update-connection operation again.
The issue happens because the data source credentials are sometimes not carried over to the workspace.
For reports that use DirectQuery, credentials are never carried over with the .pbix, since an import is done and all private information is stripped out.
Hope this helps!
Thanks

Related

PBI service: Odbc.DataSource dynamic DSN name

In the PBI Desktop file there are no errors; the error appears only in the PBI service on refresh.
ERROR:
Query contains unsupported function. Function name: Odbc.DataSource
Parameter1 is a text parameter with the value Mydsn.
Used as literal text (not dynamic) - no error:
= Odbc.DataSource("dsn=Mydsn", [HierarchicalNavigation=true])
Used as a text parameter (not dynamic) - no error:
= Odbc.DataSource("dsn=" & Parameter1, [HierarchicalNavigation=true])
Odbc_dsn is a query that reads the DSN name from a CSV:
= settings[Column2]{0}
Used from the CSV query - "Query contains unsupported function. Function name: Odbc.DataSource":
= Odbc.DataSource("dsn=" & Odbc_dsn, [HierarchicalNavigation=true])
Used directly from the CSV table - same error:
= Odbc.DataSource("dsn=" & settings[Column2]{0}, [HierarchicalNavigation=true])
Changing the privacy settings does not help; I tried all available options (none, private, organizational, public, disabling privacy settings, etc.).
How can I use an ODBC DSN name read from a CSV file?
(Answer to be expanded with additional info provided - see comments on original question)
While I have never imported a DSN name through a CSV, your saying that it works on your local machine makes me accept that this is at least possible, so we'll instead focus on issues with the gateway.
My first impression as to why this might not be working is simply permissions and visibility.
Having worked with a number of PowerBI Service setups, I find that an unrecognized ODBC DSN usually comes down to one of the following issues:
Is the DSN set up as a system DSN?
Is the gateway service set up to run under the Local System account or a dedicated gateway host account?
Does the user the gateway runs under actually have permissions to the directory containing the data source (or custom connector) that the connection depends on?
So:
Fairly straightforward: all gateway-accessible ODBC sources need to be set up on the gateway host as system DSNs, not user DSNs. You can check this in the ODBC Data Source Administrator.
Confirm the on-premises gateway's "Log On" user on the gateway's host machine. Generally I recommend going to Windows Services and making sure it uses the "Local System account" (to inherit permissions), but keep this in mind during the next step of checking local permissions.
This applies to anything that is self-hosted on the machine acting as the gateway host: whichever account is hosting the PowerBI gateway service must also be given explicit permissions to the local resources it needs. For example, if you add a custom connector to the Documents directory on the gateway host under your own user account, make sure the gateway's service account has access to that directory and file (file properties -> Security -> user permissions, etc.).
In my experience, 9 times out of 10 one of these things isn't set up right.
Additional note: every time you upgrade or reinstall a PowerBI gateway host, you will have to reset the service login account and double-check all permissions. I don't know why, but the installer overwrites that setting by default, disabling all refreshes until it is restored.
Edit:
After further thinking, I believe you will eventually run into a roadblock regardless: the PowerBI service's gateway data source mappings are 1-to-1, and after upload the dataset settings require that the data source has been defined in the PowerBI service's settings.
I don't believe it is currently possible to make that definition a variably composed string per user request. The DSN name can only be static, and only a string.

Azure Text Analytics API to Power BI - error: Web.Contents failed to get contents

My goal is to get a new column in Power BI with key phrases based on a column of text data. I am trying to connect the Azure Text Analytics API to Power BI using this tutorial:
https://learn.microsoft.com/nl-nl/azure/cognitive-services/text-analytics/tutorials/tutorial-power-bi-key-phrases
After I invoke the custom function and set the authentication and privacy to "anonymous" and "public", the KeyPhrases column I get contains only the value "Error", with the following description:
An error occurred in the ‘’ query. DataSource.Error: Web.Contents failed to get contents from 'https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases' (404): Resource Not Found
Details:
DataSourceKind=Web
DataSourcePath=https://*******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Url=https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Also, I am not sure whether it is related to my issue, but I see the following warning on my Azure account in the Networking menu:
"VNet setting is not supported for current API type or resource location."
I checked all the steps in the tutorial and re-entered the authentication and privacy settings. I also tried the same for the sentiment analysis function. Finally, I tried everything on a different and very simple dataset.
I am not sure what the cause of my issue is or how to solve it.
Any suggestions would be much appreciated.
Best, Rosanne
Look at your error message:
'https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases' (404): Resource Not Found Details: DataSourceKind=Web DataSourcePath=https://*******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases Url=https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
It throws a 404, so you are pointing to the wrong URL.
As you can see at the beginning of your URL:
https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com << here you have ".cognitiveservices.azure.com" twice, so your URL setup is wrong.
I don't know exactly how it is set up on your side, but you may have provided a region or an endpoint during setup, and that is where you put the wrong value.
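In other words, the endpoint host should appear only once; the corrected request URL should look like this (with your own resource name in place of the asterisks):
https://******.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases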

Apache Zeppelin with Athena handling session token using jdbc Interpreter

I am trying to connect Athena with Apache Zeppelin. I need to handle a secret key, an access key, and a session token, and I am finding it hard to establish the connection with the Zeppelin JDBC interpreter.
I am following the steps mentioned in this blog post.
If anyone can help me establish the connection using the AWS session token approach, that would be helpful.
Thank You
The main docs for this are here:
https://docs.aws.amazon.com/athena/latest/ug/connect-with-jdbc.html
I found there are two driver versions, 1.1.0 and 1.0.1. I could only get Zeppelin working with 1.1.0, and the links on that page don't go to that file; the only way to get it was using the aws s3 cp command,
e.g.
aws s3 cp s3://athena-downloads/drivers/AthenaJDBC41-1.1.0.jar .
although I've given feedback on that page so it should be fixed soon.
Regarding the parameters: use default.user and enter the access key, and default.password and enter the secret key. The default.driver should be com.amazonaws.athena.jdbc.AthenaDriver.
The default.s3_staging_dir is the bucket where CSV results are written, so it needs to match your Athena settings.
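Putting that together, a minimal set of interpreter properties might look like the following (a sketch; the region and the bracketed values are placeholders):
default.driver          com.amazonaws.athena.jdbc.AthenaDriver
default.url             jdbc:awsathena://athena.us-east-1.amazonaws.com:443
default.user            [your access key]
default.password        [your secret key]
default.s3_staging_dir  s3://[your-athena-query-results-bucket]/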
There is no mention of where you might put a session token; however, you could always try putting it on the JDBC connection string (which goes in the default.url parameter value),
e.g.
jdbc:awsathena://athena.{REGION}.amazonaws.com:443?SessionToken=blahblahsomethingrealsessiontokengoeshere
but of course replace {REGION} with the actual AWS region and use your real session token.

APEX_ADMINISTRATOR_ROLE in AWS RDS Oracle Instance

I am trying to install APEX on my AWS Oracle 12 RDS instance. To achieve this, I am following these instructions: http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.APEX.HTML
However, I got stuck at step 7:
Step 7:
You must set a password for the APEX admin user. To do this, use
SQL*Plus to connect to your DB instance as the master user, and then
issue the following commands:
grant APEX_ADMINISTRATOR_ROLE to master;
@/home/apexuser/apex/apxchpwd.sql
Replace master with your master user name. When the apxchpwd.sql
script prompts you, type a new admin password.
When I log into my RDS instance with my master user and execute this:
grant APEX_ADMINISTRATOR_ROLE to [mymasteruser];
I received this error:
ERROR at line 1:
ORA-01924: role 'APEX_ADMINISTRATOR_ROLE' not granted or does not exist
Can you please help me to solve this?
Edit 12/09/2017:
Using this post/answer:
https://serverfault.com/questions/276541/how-do-you-recover-you-rds-master-user-username
I understand which user is my master user. As far as I know, in an RDS instance I have no access to the sys or system users, so this is the only user I can use.
Many thanks.
Edit 20/09/2017:
I applied Alex's solution, and it works! However, there are some issues worth noting:
The tutorial was changed; in fact, the URL changed and is now
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.APEX.html (the final "html" was in uppercase before),
but it is not reliable now; there are some points that should be fixed. For example, it now says that RDS supports Oracle APEX version 5.1.2; I tried that version and got an error.
Also, some directories don't match the previous steps.
So I used the version the tutorial originally specified: Oracle APEX version 4.2.6.v1.
I had to execute both statements:
EXEC rdsadmin.rdsadmin_util.grant_apex_admin_role;
grant APEX_ADMINISTRATOR_ROLE to [master];
Then I could execute the apxchpwd.sql script successfully.
Unfortunately, when I accessed my APEX home page and tried to create a new workspace "ws_prueba" with my APEX admin user, I received an error.
Any ideas?
Use
EXEC rdsadmin.rdsadmin_util.grant_apex_admin_role;
instead. I have a case open on this with AWS and just asked them to update the documentation page.

Error while deploying SharePoint 2013 timer job: The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'MSSQL', schema 'dbo'

While trying to create a custom SharePoint timer job at feature activation I got the following error from the log files:
System.Data.SqlClient.SqlException (0x80131904): The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'MSSQL', schema 'dbo'. at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady) at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString) at System.Data.SqlClient.Sql... 5c6d109c-dbc6-e02e-7ae4-010d7f559e0b
To make it work, I located the stored procedure proc_putObjectTVP and granted execute permission to the site app pool user ID. It worked as desired.
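For reference, the grant was equivalent to the following T-SQL (a sketch; the database name comes from the error message, and the app pool login is a placeholder):
-- Sketch: database name as reported in the error; login is a placeholder.
USE MSSQL;
GRANT EXECUTE ON dbo.proc_putObjectTVP TO [DOMAIN\AppPoolAccount];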
My question is:
Is this a bug in SharePoint 2013?
Is this the proper way to do it? (In a production environment I may not be allowed by the server administrator to perform such operations.)
I had a similar error in the event log for the account used for SharePoint 2013 services:
Insufficient SQL database permissions for user 'Name:
XXXXX\SP_Services SID: xxxxxxxxxxxxxxx ImpersonationLevel: None' in
database 'XXXX_Config' on SQL Server instance 'XXXXXXXXX'. Additional
error information from SQL Server is included below.
The EXECUTE permission was denied on the object 'proc_putObjectTVP',
database 'XXXX_Config', schema 'dbo'.
Googling around, lots of blog posts recommend the same approach of applying the required permission to the stored proc. Personally, I didn't like this approach; however, I eventually found a TechNet post which grants the required permissions by adding the stored proc to the securables of the WSS_Content_Application_Pools role.
Using SQL Server Management Studio, do the following:
Expand Databases, then expand the SharePoint_Config database.
Expand Security -> Roles -> Database Roles.
Find the WSS_Content_Application_Pools role, right-click it, and select Properties.
Click Securables and click Search.
Next, click Specific objects and click OK.
Click Object Types, select Stored Procedures, and click OK.
Add the stored procedure 'proc_putObjectTVP' and click OK. (If it does not automatically grant execute permission, check the "Execute" checkbox and save.)
Using this method, any new accounts added to the WSS_Content_Application_Pools role will have the correct rights, preventing the problem from cropping up again.
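If you prefer T-SQL over the Management Studio UI, the same securables change can be expressed as a direct grant to the role (a sketch, assuming the SharePoint_Config database from the steps above):
-- Sketch: grants execute on the stored proc to the role itself.
USE SharePoint_Config;
GRANT EXECUTE ON dbo.proc_putObjectTVP TO WSS_Content_Application_Pools;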
The SPDataAccess role in SharePoint_Config was already configured to execute proc_putObjectTVP in my install of SharePoint 2013 (which has been a trial by fire in getting used to SQL Server 2012). Making sure my SharePoint users had that role set seems to have done the trick (and, of course, brought up more errors to debug now that more things are successfully starting...).
SPDataAccess (also written as SP_DATA_ACCESS) has been a useful role to Google for, bringing up tons of good resources and tips for fixing one problem or another. I'll be reading blogs all night. I suspect configuring databases is old hat for quite a few SharePoint admins and devs, but it's not as well explained, particularly as the wizard does so much (and so little) for you.
I signed up for Safari Books just to access http://my.safaribooksonline.com/book/programming/microsoft-sharepoint/9781118655047 and books like it. It's useful to help me "think like SharePoint", though Google has been just as much help. (More, really.)