Streaming Dataflow Power BI

I am trying to do a very simple streaming dataflow as described here:
https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-streaming
I was able to get it to work briefly with some manually typed-in data, but now it isn't populating any more. I am dumping data into a blob directory, but the data is not showing up in the preview, and I am getting notifications that the refresh is failing:
Streaming model definition is syntactically or semantically incorrect.
I keep dumping data into the directory, and nothing shows up in the preview. I have tried turning the dataflow off and on again, but it makes no difference. Nothing shows up in Power BI, in the data preview, in the input box, or in the output box.
The data is of the form:
[
  { "id": "1",
    "amount": "3" },
  { "id": "2",
    "amount": "4" }
]
Although it also fails with data of the form
{
  "id": "1",
  "amount": "3"
}
What would cause such an error message?

Related

Unrecognized Verify Auth Challenge Lambda response C#

Hi, I'm implementing a custom auth flow on a Cognito User Pool. I managed to handle the DefineAuthChallenge and CreateAuthChallenge triggers, but not VerifyAuthChallenge.
I am using this documentation as a guide: Verify Auth Challenge Response Lambda Trigger
I take the verify lambda's input and add answerCorrect = true to the response, as described in the documentation. The DefineAuthChallenge and CreateAuthChallenge parts work as expected with the given information. When verifying the challenge answers, however, I get InvalidLambdaResponseException: Unrecognizable lambda output as a response. The verify lambda exits successfully, returning this object:
{
  "version": 1,
  "triggerSource": "VerifyAuthChallengeResponse_Authentication",
  "region": "eu-central-1",
  "userPoolId": "eu-central-1_XXXXXXXXX",
  "callerContext": {
    "awsSdkVersion": "aws-sdk-dotnet-coreclr-3.3.12.7",
    "clientId": "2490gqsa3gXXXXXXXXXXXXXXXX"
  },
  "request": {
    "challengeAnswer": "{\"DeviceSub\":\"TestSub\"}",
    "privateChallengeParameters": {
      "CUSTOM_CHALLENGE": "SessionService_SendDevice"
    },
    "userAttributes": {
      "sub": "8624237e-0be8-425e-a2cb-XXXXXXXXXXXX",
      "email_verified": "true",
      "cognito:user_status": "CONFIRMED",
      "email": "X.XXXXXXXX@XXXXXXXXXX.de"
    }
  },
  "response": {
    "answerCorrect": true
  },
  "userName": "8624237e-0be8-425e-a2cb-XXXXXXXXXXXX"
}
Earlier, I ran into the problem that the "challengeAnswer" part is described as a Dictionary in the documentation, but it is actually just a string containing the dictionary as JSON. Sadly, I cannot find any information anywhere on why the returned object isn't accepted by Cognito.
Apparently someone had the same problem as me, using JavaScript: GitHub link
Can anyone tell me what the response object should look like so that it is accepted by Cognito? Thank you.
Well, my mistake was not considering the custom authentication flow. I found a different piece of documentation, which, by the way, is the one you should definitely use:
Customizing your user pool authentication flow
I ran into two wrong parts in the documentation here (the trigger sub-pages) and one error on my part.
Wrong part 1:
The DefineAuthChallenge and CreateAuthChallenge inputs define the session as a list of challenge results. This is all fine, but the challenge result object displays the challenge metadata field incorrectly as "ChallengeMetaData", when it should be "ChallengeMetadata", with a lowercase "d" in "data". This caused the "Unrecognizable lambda output" error: the backend was looking for "ChallengeMetadata", which wasn't present when I returned "ChallengeMetaData". The first time you enter the DefineAuthChallenge lambda the error doesn't show up, because the session doesn't contain any challenge answers yet. The moment you verify a challenge, though, the session gets filled and the uppercase "D" gives you trouble.
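The effect of the wrong casing can be illustrated with a tiny sketch (hypothetical object literals, not the actual AWS SDK types): a key lookup is case-sensitive, so the field the backend expects simply comes back undefined.

```javascript
// Hypothetical illustration: Cognito matches session keys case-sensitively,
// so a challenge result serialized with the wrong casing is not found.
const expectedKey = "ChallengeMetadata"; // what the backend looks for, per this answer

const wrongEntry = {
  ChallengeName: "CUSTOM_CHALLENGE",
  ChallengeResult: true,
  ChallengeMetaData: "SessionService_SendDevice" // uppercase "D", as in the trigger sub-pages
};

console.log(wrongEntry[expectedKey]); // undefined -> "Unrecognizable lambda output"
```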
Wrong part 2:
As described in my question, the VerifyAuthChallenge input for the "challengeAnswer" is a string, not a Dictionary.
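A minimal sketch of handling this (Node.js flavor; the DeviceSub check is a hypothetical stand-in for whatever your challenge actually verifies):

```javascript
// Sketch of a VerifyAuthChallengeResponse handler, assuming the challengeAnswer
// arrives as a JSON *string* as described above, not as a dictionary.
function verifyAuthChallenge(event) {
  // Parse the string into an object before inspecting it:
  const answer = JSON.parse(event.request.challengeAnswer);
  // Hypothetical verification logic:
  event.response = { answerCorrect: answer.DeviceSub === "TestSub" };
  return event;
}
```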
All these wrong parts are correctly displayed on the first documentation page I linked here. So I would recommend using that instead of the other documentation.
Error on my side:
I didn't really check what happens after you verify a custom challenge via the VerifyAuthChallenge trigger. In the linked page, the image above the heading 'DefineAuthChallenge: The challenges (state machine) Lambda trigger' clearly states that after verifying the response, the DefineAuthChallenge trigger is invoked again, which I hadn't considered.
I hope this saves someone the time it took me to figure it out :-)

Dataflow Datastore to GCS Template, javascriptTextTransformFunctionName Error

I am using the Cloud Datastore to Cloud Storage Text template from Cloud Dataflow.
My python code correctly submits the request and uses javascriptTextTransformFunctionName to run the correct function in my Google Cloud Storage bucket.
Here is a minimized part of the code that is running:
function format(inJson) {
  var output = {};
  output.administrator = inJson.properties.administrator.keyValue.path[0].id;
  return output;
}
And here is the JSON I am looking to format, cut down to just the "administrator" child of "properties":
{
  "properties": {
    "administrator": {
      "keyValue": {
        "path": [
          {
            "kind": "Kind",
            "id": "5706504271298560"
          }
        ]
      }
    }
  }
}
And I am getting this exception:
java.lang.RuntimeException:
org.apache.beam.sdk.util.UserCodeException:
javax.script.ScriptException: TypeError: Cannot read
property "keyValue" from undefined in <eval> at line number 5
I understand what the error is saying, but I don't know why it's happening. If you take the format function and that JSON and run them through your browser console, you can easily verify that it pulls out and returns an object with "administrator" equal to "5706504271298560".
I have not found the solution to your problem, but I hope this is of some help:
I found this post and this one with the same issue. The first one was fixed by installing the NodeJS library, the second one by changing the kind of quotes used with Java.type().
The official Nashorn docs say to call Java.type with a fully qualified Java class name, and then to call the returned function to instantiate a class from JavaScript.
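One more thing worth checking, as an assumption rather than something confirmed in the thread: the Google-provided templates generally hand the javascriptTextTransformFunctionName UDF each record serialized as a JSON string, not as an object, which would make inJson.properties undefined until the string is parsed. A sketch under that assumption:

```javascript
// Sketch: parse the incoming record before accessing its fields.
// Assumes the UDF receives the Datastore entity as a JSON string and
// must return a string.
function format(inJsonString) {
  var inJson = JSON.parse(inJsonString);
  var output = {};
  output.administrator = inJson.properties.administrator.keyValue.path[0].id;
  return JSON.stringify(output);
}
```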

How can I install the sample AdventureWorksDW database on SQL DW using an ARM script

I can create a SQL DW using ARM with no problem. However, the portal supports an option of also installing a sample database, e.g. AdventureWorksDW. How can I do the equivalent using an ARM script?
BTW, I clicked on "automation options" in the portal and it shows an ARM script with an extension that is probably the piece that installs the sample database, but it asks for some parameters (e.g. storageKey, storageUri) that I don't know.
Here's what I think is the relevant portion of the ARM JSON:
"name": "PolybaseImport",
"type": "extensions",
"apiVersion": "2014-04-01-preview",
"dependsOn": [
  "[concat('Microsoft.Sql/servers/', parameters('serverName'), '/databases/', parameters('databaseName'))]"
],
"properties": {
  "storageKeyType": "[parameters('storageKeyType')]",
  "storageKey": "[parameters('storageKey')]",
  "storageUri": "[parameters('storageUri')]",
  "administratorLogin": "[parameters('administratorLogin')]",
  "administratorLoginPassword": "[parameters('administratorLoginPassword')]",
  "operationMode": "PolybaseImport"
}
More specifically, looking at the ARM deploy script generated from the portal, here are the key elements that I need to know in order to auto deploy using my own ARM script:
…
"storageKey": {
  "value": null  <- without knowing this, I can't deploy.
},
"storageKeyType": {
  "value": "SharedAccessKey"
},
"storageUri": {
  "value": "https://sqldwsamplesdefault.blob.core.windows.net/adventureworksdw/AdventureWorksDWPolybaseImport/Manifest.xml"  <- this is not a public blob, so I can't look at it
},
…
AFAIK that's currently not possible. The portal kicks off a workflow that provisions the new DW resources, generates the sample DW schema, then loads the data. The sample is stored in a non-public blob, so you won't be able to access it.
I don't think it would be hard to make it available publicly, but it does take some work, so perhaps you should add a suggestion here: https://feedback.azure.com/forums/307516-sql-data-warehouse

"ReportProcessingStatus": "_CANCELLED_" while trying to get Orders Report in Amazon MWS

I want to get the details of all orders, no matter when they were placed.
So I am trying to generate an order report from Amazon MWS via the Reports API, sending the enumeration "_GET_FLAT_FILE_ORDERS_DATA_" (or other report enums), but when I hit the RequestReport API it gives this response:
"ReportRequestInfo": {
  "ReportType": "_GET_FLAT_FILE_ORDERS_DATA_",
  "ReportProcessingStatus": "_SUBMITTED_",
  "EndDate": "2016-06-28T07:44:54+00:00",
  "Scheduled": "false",
  "ReportRequestId": "50692016981",
  "SubmittedDate": "2016-06-28T07:44:54+00:00",
  "StartDate": "2016-06-28T07:44:54+00:00"
}
But when I hit the GetReportRequestList API, the response status shows cancelled:
"ReportRequestInfo": [
  {
    "ReportType": "_GET_FLAT_FILE_ORDERS_DATA_",
    "ReportProcessingStatus": "_CANCELLED_",
    "EndDate": "2016-06-28T07:44:54+00:00",
    "Scheduled": "false",
    "ReportRequestId": "50692016981",
    "StartedProcessingDate": "2016-06-28T07:44:58+00:00",
    "SubmittedDate": "2016-06-28T07:44:54+00:00",
    "StartDate": "2016-06-28T07:44:54+00:00",
    "CompletedDate": "2016-06-28T07:45:04+00:00"
  },
  {
    "ReportType": "_GET_CONVERGED_FLAT_FILE_ORDER_REPORT_DATA_",
    "ReportProcessingStatus": "_CANCELLED_",
    "EndDate": "2016-06-28T07:38:44+00:00",
    "Scheduled": "false",
    "ReportRequestId": "50691016981",
    "StartedProcessingDate": "2016-06-28T07:38:49+00:00",
    "SubmittedDate": "2016-06-28T07:38:44+00:00",
    "StartDate": "2016-06-28T07:38:44+00:00",
    "CompletedDate": "2016-06-28T07:38:56+00:00"
  },
  {
    "ReportType": "_GET_CONVERGED_FLAT_FILE_ORDER_REPORT_DATA_",
    "ReportProcessingStatus": "_CANCELLED_",
    "EndDate": "2016-06-28T07:33:09+00:00",
    "Scheduled": "false",
    "ReportRequestId": "50690016981",
    "StartedProcessingDate": "2016-06-28T07:33:14+00:00",
    "SubmittedDate": "2016-06-28T07:33:09+00:00",
    "StartDate": "2016-06-28T07:33:09+00:00",
    "CompletedDate": "2016-06-28T07:33:21+00:00"
  },
So as you can see, the status is always cancelled. I read in the documentation that if you request a report more than once, the previous request is cancelled, but as you can see here, all of the requests have a cancelled status.
Please let me know what I am doing wrong, or whether there is another way to access the orders report.
Also, if anyone knows how to get the details of all orders older than a year, how could one get those?
Any help would be appreciated.
Thanks!
The MWS documentation clearly says:
You can only schedule one _GET_FLAT_FILE_ORDERS_DATA_ or _GET_CONVERGED_FLAT_FILE_ORDER_REPORT_DATA_ report at a time. If you have one of these reports scheduled and you schedule a new report, the existing report will be canceled.
Check your scheduled reports to verify if you already have this report scheduled.
In addition, reports that aren't real-time reports can also be cancelled if repeatedly requested, as outlined by Amazon's support tech Jim on the MWS forum:
As you may be aware, many of our reports are not real time and update periodically. In order to keep your report requests running as efficiently as possible and to always provide you with the most updated data, we are now limiting duplicate requests for downloadable reports.
Since our reports update periodically, in most cases, duplicate report requests in quick succession do not display any new information. From now on, when you submit your initial report request, you will be able to generate it once. However, you will not be able to generate a new version of the same report to download again until 30 minutes or 4 hours have passed depending on the report. In the meantime, you can still download the most recently generated report as many times as you want.
NOTE: Near real time reports are capped at once per 30 minutes, while the daily reports are capped at 4 hours.
These new limits are designed to help your report requests run more efficiently and to ensure that we are providing you with the most up-to-date report.
You also have to provide a StartDate in such API calls. Since in your response the EndDate and StartDate are the same, the report is cancelled because there is no data for it. Add a valid StartDate when you request such a report.
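The point above can be sketched as the query parameters for a RequestReport call with an explicit, earlier StartDate. Request signing and the HTTP call itself are omitted; the parameter names follow the MWS Reports API as shown in the responses above:

```javascript
// Sketch: RequestReport parameters spanning the last 30 days instead of
// a zero-length StartDate/EndDate window.
const now = Date.now();
const params = {
  Action: "RequestReport",
  ReportType: "_GET_FLAT_FILE_ORDERS_DATA_",
  StartDate: new Date(now - 30 * 24 * 60 * 60 * 1000).toISOString(), // 30 days back
  EndDate: new Date(now).toISOString()
};
```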

View attachments in threads

I'm currently working on an alternative way to view threads and messages, but I am having problems figuring out how to display the images attached to a message.
I make a GET request to this URL: https://graph.facebook.com/t_id.T_ID/messages?access_token=ACCESS_TOKEN. And the response includes:
"attachments": {
  "data": [
    {
      "id": "df732cf372bf07f29030b5d44313038c",
      "mime_type": "image/jpeg",
      "name": "image.jpg",
      "size": 76321
    }
  ]
}
but I can't find any way to access the image.
Thanks
Support for this hasn't yet been added to the Graph API, and as with many of the other messaging APIs, it's currently only available for testing (i.e. you must be a developer of the app to use it presently).
There's an undocumented REST API endpoint for this, which should work for any app (that you're a developer of, as above).
To use the REST method to get the attachment data, it's
https://api.facebook.com/method/messaging.getattachment
With parameters:
access_token=YOUR_ACCESS_TOKEN
mid=MESSAGE_ID
aid=ATTACHMENT_ID
format=json //(it defaults to XML otherwise)
The response looks like this:
{
  "content_type": "image\/png",
  "filename": "Screen Shot 2012-02-08 at 11.35.35.png",
  "file_size": 42257,
  "data": <FILE CONTENTS>
}
I've just tested this and it worked OK for me: taking the <FILE CONTENTS> and base64-decoding them gave me back the original image correctly.
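The call and the decoding step above can be sketched like this (Node.js; the token, message ID, and attachment ID are placeholders, and the assumption that the "data" field is base64-encoded follows from the test described above):

```javascript
// Sketch: the undocumented REST call described above, plus a helper that
// decodes the base64 "data" field back into file bytes.
const url = "https://api.facebook.com/method/messaging.getattachment"
  + "?access_token=YOUR_ACCESS_TOKEN"
  + "&mid=MESSAGE_ID"
  + "&aid=ATTACHMENT_ID"
  + "&format=json";

function decodeAttachment(response) {
  // response.data is assumed to hold the base64-encoded file contents
  return Buffer.from(response.data, "base64");
}
```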