I'm fairly new to PowerShell and brand new (as in, today) to web services and SOAP. A vendor gave us documentation on their web service API that allows the creation of user accounts. I'm trying to use PowerShell to pull our users from SQL Server and send the data to their service. We will need to add users on an ongoing basis.
Below is a pared-down version of what I came up with and it actually seems to work; the vendor told me to include a dry_run parameter while testing and I'm getting a dry_run_success from the response_type.
My question is: Is this even close to being the appropriate way to do it with PowerShell?
# Open ADO.NET Connection to database
$dbConn = New-Object Data.SqlClient.SqlConnection;
$dbConn.ConnectionString = "Data Source=mydbserver;User ID=someuserid;Password=mypassword;Initial Catalog=mydatabase";
$dbConn.Open();
$sql = "select * from mytable";
$dbSqlCmd = New-Object Data.SqlClient.SqlCommand $sql, $dbConn;
$dbRd = $dbSqlCmd.ExecuteReader();
# Create a Web Service Proxy
$proxy = New-WebServiceProxy -Uri https://somedomain.com/service/wsdl
$namespace = $proxy.GetType().NameSpace
$param = New-Object($namespace + ".somemethod")
# Loop through records from SQL and invoke the web service
While ($dbRd.Read())
{
    $param.user_id = $dbRd.GetString(0)
    $param.password = $dbRd.GetString(1)
    $param.display_name = $dbRd.GetString(2)
    $request = $proxy.TheMethod($param)
    if ($request.response_type -eq 'error')
    {
        $request.error.extended_error_text
    }
}
# Clean up
$dbRd.Close();
$dbSqlCmd.Dispose();
$dbConn.Close();
A couple things you could improve:
Don't use select * in your SQL queries. Always specify the fields you need, in the order you need them. As written, if someone restructured the table so that the user ID was no longer the first column, you'd have a mess on your hands, because you're accessing the fields by ordinal position.
You're apparently storing those passwords in plaintext in your database. Anyone with access to your database knows the credentials for every one of your users. This is a very bad thing, and resolving it properly could be a very big discussion; see the hashing sketch after the revised code below for a starting point.
Your code keeps the database connection open until the script completes. Given the scope here, it's probably not going to cause a major problem, but your database access strategy should be to get in, get your data, get out & disconnect as quickly as possible.
$sql = "select user_id, password, display_name from mytable";
$QueryCmd = $dbConn();
$QueryCmd.CommandText = $sql;
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$QueryCmd.Connection = $dbConn;
$SqlAdapter.SelectCommand = $QueryCmd;
$DataSet = New-Object System.Data.DataSet;
$SqlAdapter.Fill($DataSet)
$dbConn.Close();
$dbConn.Dispose();
$MyResults = $DataSet.Tables[0];
$MyResults | foreach-object {
$param.user_id = $_.user_id;
$param.password = $_.password;
$param.display_name = $_.display_name;
$request = $proxy.TheMethod($param);
if ($request.response_type -eq 'error')
{
$request.error.extended_error_text;
}
}
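On the password point: if you control the storage side, a salted one-way hash is the usual starting point. Here is a minimal sketch using PBKDF2 via .NET's Rfc2898DeriveBytes; it assumes you can change the schema to store a salt and hash instead of the raw password (whether that is viable depends on what the vendor's API actually needs):
# Hypothetical sketch; adjust iteration count and lengths to your own policy
$plainPassword = 'p@ssw0rd'
$salt = New-Object byte[] 16
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($salt)
$pbkdf2 = New-Object System.Security.Cryptography.Rfc2898DeriveBytes($plainPassword, $salt, 100000)
$hash = $pbkdf2.GetBytes(32)
# Persist these two values instead of the plaintext password
$saltB64 = [Convert]::ToBase64String($salt)
$hashB64 = [Convert]::ToBase64String($hash)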
I recently subscribed to the SimilarWeb API on the AWS Data Exchange Marketplace but have been unable to access the datasets. The reason: the send_api_asset call requires an AssetId parameter, just as used in this sample notebook; however, it is not available on the SimilarWeb dataset page. The only available details are license, productID, OfferID, RevisionID, and DatasetID. Does anyone have experience interacting with products on the AWS Data Exchange for APIs Marketplace? If so, how can one connect to the endpoints or access the datasets via API call without using an AssetId?
I have attempted:
import json
import boto3

CLIENT = boto3.client('dataexchange')

DATA_SET_ID = '****'
REVISION_ID = '*******'
# ASSET_ID = ''  # not provided
BODY = json.dumps({'body_param': 'body_param_value'})
METHOD = 'POST'
PATH = '/'
QUERY_STRING_PARAMETERS = {'param1': 'value1', 'param2': 'value2'}

response = CLIENT.send_api_asset(
    DataSetId=DATA_SET_ID,
    RevisionId=REVISION_ID,
    # AssetId='',
    Method=METHOD,
    Path=PATH,
    Body=BODY,
    QueryStringParameters=QUERY_STRING_PARAMETERS
)
but it throws a ParamValidationError due to the missing AssetID. Any help or guidance will be appreciated.
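Edit: one thing I plan to try, based on the generic AWS Data Exchange API docs rather than anything SimilarWeb-specific, is listing the revision's assets to discover an AssetId, since API assets are supposed to live inside a revision. A sketch using the AWS CLI from PowerShell (assumes credentials are already configured; IDs redacted as above):
# List the assets inside the revision; each returned Id should be usable as the AssetId
$assets = aws dataexchange list-revision-assets --data-set-id '****' --revision-id '*******' | ConvertFrom-Json
$assets.Assets | Select-Object Id, Name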
I am trying to programmatically deploy a Power BI report and dataset from one workspace to another, using a mix of PowerShell and the Power BI REST API. In the new workspace, I am updating the dataset's parameters to point to a new DB name.
The dataset is pointed to an Azure SQL DB, and in my DEV workspace (the source for the clone), the dataset passes the accessing user's credential through to the DB.
I am authenticating with a Service Principal that I created and then added to the dataset as an Administrator.
This is the PowerShell code that I wrote to do this:
$config = gc .\EnvConfig.json -raw | ConvertFrom-Json
$envSettings = $config.Dev
$toEnvSettings = $config.QA
# Convert to SecureString
[securestring]$secStringPassword = ConvertTo-SecureString $config.ServicePrincipalSecret -AsPlainText -Force
$userId = "$($config.ServicePrincipalId)@$($config.ServicePrincipalTenant)"
[pscredential]$credObject = New-Object System.Management.Automation.PSCredential ($userId, $secStringPassword)
Connect-PowerBIServiceAccount -Tenant $config.ServicePrincipalTenantName -ServicePrincipal -Credential $credObject
Get-PowerBIReport -WorkspaceId $envSettings.PBIWorkspaceId | ForEach-Object {
    $filename = "c:\temp\$($_.Name).pbix"
    Remove-Item $filename -ErrorAction SilentlyContinue
    Invoke-PowerBIRestMethod -Method GET `
        -Url "https://api.powerbi.com/v1.0/myorg/groups/$($envSettings.PBIWorkspaceId)/reports/$($_.Id)/Export" `
        -ContentType "application/zip" -OutFile $filename
    New-PowerBIReport -WorkspaceId $toEnvSettings.PBIWorkspaceId -ConflictAction CreateOrOverwrite -Path $filename
}
$datasets = Get-PowerBIDataset -WorkspaceId $toEnvSettings.PBIWorkspaceId
$datasetId = $datasets[0].Id
$updateDBParam = "{`"updateDetails`": [ { `"name`": `"DBName`", `"newValue`": `"$($toEnvSettings.DBName)`" }]}"
$updateUri = "https://api.powerbi.com/v1.0/myorg/groups/$($toEnvSettings.PBIWorkspaceId)/datasets/$datasetId/Default.UpdateParameters"
Invoke-PowerBIRestMethod -Method POST -Url $updateUri -Body $updateDBParam
After cloning the report and dataset, when I open the report in the new workspace I see an error that the dataset does not have credentials.
If I take over this dataset with my personal login, then the report loads. This is not sufficient; I want to set the credential to pass through the user's id programmatically.
I found this discussion on the PowerBI site, where they say you can use the dataset ID and gateway ID from the dataset, and send a PATCH request to https://api.powerbi.com/v1.0/myorg/gateways/[gateway id]/datasources/[datasource id]
I suspect that is only relevant to "My Workspace" datasets, not datasets in a workspace.
When I try to send that PATCH request with a gateway and datasource ID that I got from performing a GET on https://api.powerbi.com/v1.0/myorg/groups/[workspace id]/datasets/[dataset id]/datasources, I get a 401 error. I have tried posting with my own Power BI Tenant Admin login, as well as with an Admin app I created through the Power BI app registration tool, and I also added a tenant-level Power BI Read/Write permission in the AAD portal for my service principal. Nothing works; I keep getting a 401.
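In case it helps, this is how I am pulling those gateway and datasource IDs (a sketch; $toEnvSettings and $datasetId come from the script above):
$dsUrl = "https://api.powerbi.com/v1.0/myorg/groups/$($toEnvSettings.PBIWorkspaceId)/datasets/$datasetId/datasources"
$datasources = (Invoke-PowerBIRestMethod -Method GET -Url $dsUrl | ConvertFrom-Json).value
# Each entry carries the datasourceId and gatewayId needed for the PATCH
$datasources | Select-Object datasourceId, gatewayId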
Two questions:
Can I set the credentials on a dataset in a workspace?
If not, how can I clone the dataset between workspaces so that it has the credential passthrough to start with?
@Joon: I wanted to leave a comment but am not allowed. I'm in the same boat with the 401 errors, but I'm not following your resolution: did you change any logic, or did you change the user account being used? We're using the PBI AAD account that is the Admin of the workspace where the dataset resides. Here's the code I'm using, which is based on this: https://martinschoombee.com/2020/10/20/automating-power-bi-deployments-change-data-source-credentials/
$ApiRequestBody = @"
{
    "credentialDetails": {
        "credentialType": "Basic",
        "credentials": "{\"credentialData\":[{\"name\":\"username\", \"value\":\"$FormattedDataSourceUser\"},{\"name\":\"password\", \"value\":\"$FormattedDataSourcePassword\"}]}",
        "encryptedConnection": "Encrypted",
        "encryptionAlgorithm": "None",
        "privacyLevel": "None"
    }
}
"@
# ... (tried other values for "privacyLevel")
# Update username & password
Invoke-PowerBIRestMethod -Url $ApiUrl -Method Patch -Body $ApiRequestBody
Solved the problem.
The 401 error was originating from the credential I was posting, not from me lacking permission to post. I was using the OAuth credential method, and the token I was passing was invalid. The response from the Power BI API is just a bare 401; nothing tells you that the API tried to validate the OAuth token and that the validation failed.
I tested with an invalid basic credential, and in that case you get a 400 Bad Request error, which makes more sense.
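For anyone hitting the same wall: the OAuth2 body has the same shape as the Basic one in the comment above, but carries an access token instead. A minimal sketch, assuming $gatewayId, $datasourceId and a valid AAD access token in $accessToken are already in hand (those variable names are mine, not from the API):
$apiUrl = "https://api.powerbi.com/v1.0/myorg/gateways/$gatewayId/datasources/$datasourceId"
$body = @"
{
    "credentialDetails": {
        "credentialType": "OAuth2",
        "credentials": "{\"credentialData\":[{\"name\":\"accessToken\", \"value\":\"$accessToken\"}]}",
        "encryptedConnection": "Encrypted",
        "encryptionAlgorithm": "None",
        "privacyLevel": "None"
    }
}
"@
# PATCH the datasource with the OAuth2 credential
Invoke-PowerBIRestMethod -Url $apiUrl -Method Patch -Body $body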
I am using a PowerShell script to generate an embed token for a Power BI dashboard:
Login-PowerBI
$url = "https://api.powerbi.com/v1.0/myorg/groups/395ce617-f2b9-xyz/dashboards/084c9cc4-xyz/GenerateToken"
$body = "{ 'accessLevel': 'View' }"
$response = Invoke-PowerBIRestMethod -Url $url -Body $body -Method Post -ErrorAction "Stop"
$response
$json = $response | ConvertFrom-Json
$json.token
This works; however, I was hoping to make the dashboard editable by changing the accessLevel like this:
$body = "{ 'accessLevel': 'Edit' }"
Instead of generating a token, an error is thrown indicating Bad Request, but with no other detail. How can I determine how the request should be created? Are dashboards even editable the way reports are? (I can generate edit tokens for reports with no issue.) I can't find a code sample for that, and I note the online sample doesn't allow you to edit dashboards the way you can with reports: https://microsoft.github.io/PowerBI-JavaScript/demo/v2-demo/index.html
You got the Bad Request error because accessLevel: Edit is not supported for dashboards.
The only accessLevel supported by Generate EmbedToken for a dashboard in a group is View.
The Create and Edit access levels are available only for reports.
Refer to this link: https://learn.microsoft.com/en-us/rest/api/power-bi/embedtoken/dashboards_generatetokeningroup#tokenaccesslevel
You can use the Try it feature there to see how the REST API calls are made.
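For comparison, the same call against a report does accept Edit. A minimal sketch based on the question's own code (the report ID is a placeholder):
$url = "https://api.powerbi.com/v1.0/myorg/groups/395ce617-f2b9-xyz/reports/<reportId>/GenerateToken"
$body = "{ 'accessLevel': 'Edit' }"
$response = Invoke-PowerBIRestMethod -Url $url -Body $body -Method Post
($response | ConvertFrom-Json).token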
I use 'Import' connectivity mode in Power BI to get data from SQL Server.
On the one hand, I can refresh the data for existing time periods.
But on the other hand, once the data on the server is extended and new time periods are added, the new data doesn't appear in my queries.
Should I use 'Live connection' only, or is there another way to handle it?
You can always set a scheduled refresh in Power BI to accommodate different times of SQL DB updates.
You can also use the Power BI REST APIs to do a 'Refresh Now' using
POST https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes
You can use this Powershell snippet:
# Building Rest API header with authorization token
$authHeader = @{
    'Content-Type'  = 'application/json'
    'Authorization' = $token.CreateAuthorizationHeader()
}
# Properly format the groups path
$groupsPath = ""
if ($groupID -eq "me") {
    $groupsPath = "myorg"
} else {
    $groupsPath = "myorg/groups/$groupID"
}
# Refresh the dataset
$uri = "https://api.powerbi.com/v1.0/$groupsPath/datasets/$datasetID/refreshes"
Invoke-RestMethod -Uri $uri -Headers $authHeader -Method POST -Verbose
For more info, see the Power BI docs: https://powerbi.microsoft.com/en-us/blog/announcing-data-refresh-apis-in-the-power-bi-service/
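If you want to confirm a refresh actually completed, the same endpoint supports GET for refresh history. A small sketch reusing $uri and $authHeader from the snippet above:
# Inspect the most recent refresh attempts
$history = Invoke-RestMethod -Uri $uri -Headers $authHeader -Method GET
$history.value | Select-Object status, startTime, endTime -First 5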
My server would communicate with S3. There are two possibilities as far as I understand:
1) Load the file to my server and send it to the user, keeping S3 access only to my server's IP
2) Redirect to S3 while handling authentication on my server
I've understood (I think) how to do #1 from:
Does Amazon S3 support HTTP request with basic authentication
But is there any way to accomplish #2? I want to avoid the latency of first loading the file to my server and then sending it to the user.
I'm not sure how to keep the S3 url protected from public access in #2. Someone might go through my authentication, get a download link, but that link will be publicly accessible.
I'm new to S3 in general, so bear with me if I've misunderstood anything.
Edit: I've looked into signed links with expiration times, but they can still be accessed by others. I would also prefer to use my own authentication so I can allow access to a link only while a user is signed in.
You should try the code below, in which your server produces a URL that expires in, say, 60 seconds, so users can download the file directly from the S3 server.
First, download HMAC.php (the Crypt_HMAC package) from here:
http://pear.php.net/package/Crypt_HMAC/redirected
<?php
require_once('Crypt/HMAC.php');

echo getS3Redirect("/test.jpg") . "\n";

function getS3Redirect($objectName)
{
    $S3_URL = "http://s3.amazonaws.com";
    $keyId = "your key";
    $secretKey = "your secret";
    $expires = time() + 60;
    $bucketName = "/your bucket";
    $stringToSign = "GET\n\n\n$expires\n$bucketName$objectName";
    $hasher = new Crypt_HMAC($secretKey, "sha1");
    $sig = urlencode(hex2b64($hasher->hash($stringToSign)));
    return "$S3_URL$bucketName$objectName?AWSAccessKeyId=$keyId&Expires=$expires&Signature=$sig";
}

function hex2b64($str)
{
    $raw = "";
    for ($i = 0; $i < strlen($str); $i += 2)
    {
        $raw .= chr(hexdec(substr($str, $i, 2)));
    }
    return base64_encode($raw);
}
?>
Give it a try.
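If you'd rather not hand-roll the signature (the snippet above uses the old SigV2 scheme), the AWS SDKs can generate the same kind of expiring link for you. A sketch with the AWS Tools for PowerShell, assuming the module is installed and credentials are configured (bucket and key names are placeholders):
# Pre-signed GET URL valid for 60 seconds
$url = Get-S3PreSignedURL -BucketName 'your-bucket' -Key 'test.jpg' -Expire (Get-Date).AddSeconds(60) -Verb GET
$url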