I am copying a workspace using the sample script “Copy a Workspace”.
While GET workspace returns 200, the POST workspace request is returning this error:
"TypeError: Cannot read properties of undefined (reading 'id')"
When I look at the response body in the console, I see this error:
{"error":{"name":"invalidParamError","message":"body.visibilityStatus is invalid"}}
The request body is programmatically populated by the sample with my first name, the date, and the workspace I am trying to copy in this format:
"workspace":{"name":"[MY FIRST NAME] 2/16 - [WORKSPACE]"
I have tried manually changing the request body to a simple string containing just my name under the 'Variables' section, and I have also tried manually defining a "name" variable in the same section. I don't think either of these gets at the problem, but I am unsure what the endpoint is actually expecting and why the request isn't processing.
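For comparison, the documented POST /workspaces body is a small JSON object; a minimal sketch is below. The type value ("personal" or "team") and the description field are taken from the public Postman API docs, but verify the exact field set there. If the sample script copies extra fields (such as visibilityStatus) from the GET response into the POST body, that would explain the invalidParamError.

```json
{
  "workspace": {
    "name": "My copied workspace",
    "type": "personal",
    "description": "Copy created via the Postman API"
  }
}
```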
Please help. Thank you!
I was playing with Postman Flows, trying to learn by using the Trello API. All requests work on their own when executed manually. I've also debugged values using a terminal block to understand where the problem lies. First, here's a summary of what I'm doing:
Get all boards for a given Trello workspace.
For each board, delete that board.
The complete flow looks like this:
I've checked, using a terminal block and a string block, that on the last Send Request block the looped value of /variable/id outputs the proper board id. I started suspecting that Postman fails to understand that the variable I'm trying to use is a path variable rather than a query parameter. So I tried passing a static value to the Send Request, and it 404'd as well (technical aside: in theory, for n ids it should give me one 200 and n-1 404s, since the variable is static and the board cannot be deleted multiple times).
My suspicion comes from the fact that when configuring the block for this request:
You do not get prompted to add the board variable. I've tried to type it in anyway, and even to use combinations like :board, to no avail. In fact, as I said above, if I use these variables with static values, it still 404s.
(Ignore the parsing message on the right-hand side.)
As you can see, board doesn't show up. Did I hit a bug, or is this user error? One thing I don't know how to do, but which would help confirm that a null value is being passed to the DELETE, is to output the request itself; in a terminal block I can only see the response.
Thanks in advance.
UPDATE:
After checking the Postman console in the app, I've noticed that the path variable being used is in fact whatever is set on the collection request. It's as if it takes the URL as a static field and disregards the path variables. Any thoughts?
Path variables won't be available in your Send Request block. Instead, define your path variable with an environment/collection/global variable (i.e. {{board}}) in the value of the path variable on the saved request. It will then show up in the relevant block of your flow.
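For example, the saved collection request for deleting a Trello board could template the variable into the URL like this (the key/token variable names here are illustrative):

```text
DELETE https://api.trello.com/1/boards/{{board}}?key={{trelloKey}}&token={{trelloToken}}
```

With {{board}} in the URL, the Send Request block in Flows exposes a board field you can wire the looped id into.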
Here is a snippet from a DAG that I am working on
from airflow.contrib.operators import bigquery_operator

create_ext_table = bigquery_operator.BigQueryCreateExternalTableOperator(
    task_id='create_ext_table',
    bucket='bucket-a',
    source_objects=['path/*'],  # the operator expects a list of object paths
    schema_object='bucket-b/data/schema.json',
    destination_project_dataset_table='sandbox.write_to_BQ',
    source_format='CSV',
    field_delimiter=';')

create_ext_table
When I run the code, I get the following error on Composer 1.10.10+composer:
404 GET https://storage.googleapis.com/download/storage/v1/b/bucket-a/o/bucket-b%2Fdata%2Fschema.json?alt=media: (u'Request failed with status code', 404, u'Expected one of', 200, 206)
As seen in the error, Airflow concatenates the bucket param with the schema_object param. Is there any workaround for this? I cannot store the table schema and the table files in the same bucket.
Thanks
This is expected: as you can see in the source code for the operator here, it uses the bucket argument to fetch the schema_object, so the operator assumes you have both in the same bucket.
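A quick sketch of why the request 404s (the helper below is mine, not Airflow's; it just mirrors how the download URL in the error message was formed once the operator prefixed its own bucket argument):

```python
from urllib.parse import quote

def gcs_download_url(bucket: str, schema_object: str) -> str:
    # The operator fetches the schema from the *same* bucket it was given,
    # so a schema_object that already contains a bucket name is treated as
    # an object path inside `bucket` and URL-encoded wholesale.
    return ("https://storage.googleapis.com/download/storage/v1/b/"
            f"{bucket}/o/{quote(schema_object, safe='')}?alt=media")

# Reproduces the /b/bucket-a/o/bucket-b%2Fdata%2Fschema.json URL from the 404 above.
print(gcs_download_url('bucket-a', 'bucket-b/data/schema.json'))
```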
As you mentioned you cannot store them together, there are a few workarounds you can try; I'll describe them at a high level:
You can extend the operator and override the execute method, retrieving the schema from the bucket you care about.
You can add an upstream task to copy the schema object to bucket-a using GoogleCloudStorageToGoogleCloudStorageOperator. This requires handling schema_object differently from the way the source code handles it, namely parsing it for the bucket name and object path and then retrieving it. Alternatively, you can create your own argument (something like schema_bucket) and use it in a similar manner.
You can also delete this object using GoogleCloudStorageDeleteOperator as a downstream task after creating the external table, so it does not have to be persisted in bucket-a.
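The parsing step mentioned in the second workaround could be sketched like this (a plain helper of my own, not an Airflow API):

```python
def split_gcs_path(path: str) -> tuple[str, str]:
    # "bucket-b/data/schema.json" -> ("bucket-b", "data/schema.json")
    bucket, _, obj = path.partition('/')
    if not bucket or not obj:
        raise ValueError(f"expected '<bucket>/<object path>', got {path!r}")
    return bucket, obj

bucket, obj = split_gcs_path('bucket-b/data/schema.json')
print(bucket, obj)
```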
A final note on the schema_object argument: it is meant to be the object path within GCS (the operator supplies the bucket), so if you use the operator as-is it should be schema_object='data/schema.json'.
When I add environment variables I can use them in my POST body with {{varName}}, but this does not work for collection variables (Collection > Edit > Variables tab).
With the settings shown above, if I add {{firstName}} to my body it does not work. How can I access these collection variables in my POSTs?
Currently, if I try to send the request, Postman just hangs for a while and then gives this error:
Error: Script execution timed out.
    at ContextifyScript.Script.runInContext (vm.js:53:29)
If I use an environment variable or just type in a value it works fine.
Also, make sure to save the request to the collection it belongs to before you can use it!
It turns out {{varName}} does work. The problem was in my pre-request script. The API I was connecting to requires a checksum on the body, so the script pre-processes the variables in the body, but it was not set up to handle collection variables. This was causing Postman to fail. User error.
I have created a global variable, set from the Tests script as an environment variable; the corresponding quote id is stored in CreateGLVar.
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

var jsonData = pm.response.json();
pm.environment.set("CreateGLVar", jsonData.result.quoteID);
(script for storing the value in an environment variable)
May I know how I can use the value stored in CreateGLVar in the script below? How can I take the quote id from the first request and insert it dynamically into the second request (shown below)?
get quote id
Postman uses double curly braces to insert variables, which can also be used in raw request bodies.
In your specific case you can use:
"quoteID": "{{quoteIdVariable}}"
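Applied to the variable name from the question, the raw JSON body of the second request would contain (assuming the first request's test script stored the id in CreateGLVar):

```json
{ "quoteID": "{{CreateGLVar}}" }
```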
I am using the Postman Chrome extension Version 5.3.1, and this works for me.
Edit: Now that the Chrome extension has been deprecated, this still works with the Postman Desktop app.
Thanks Aaron.
It worked when I used "quoteID": "{{quoteIdVariable}}" in my Bind API. My two APIs work fine when executed individually.
But I get an issue when I execute the APIs as a collection (Quote and Bind API). What am I missing when executing them as a collection?
Failed
In SSIS, I already have a Web Service Task using a WSDL for sending SMS. I am indeed able to send SMS using this task.
I want to supply values to this task from the database, such as the mobile number, message body, user ID, etc.
How can I create a complex type user variable that can be passed as input to a Web Service task?
It looks like the only answer is to change the web service to accept only simple types as parameters. I have scoured the web, and there seems to be no way to dynamically create complex types for consumption as input values in the Web Service task.
The easier way is to use the Script Component to pass variables to a web service. Check http://amolpandey.com/2016/09/26/ssis-script-task-to-obtain-geo-cordinates-from-address-text-via-google-api/ & http://www.sqlmusings.com/2011/03/25/geocode-locations-using-google-maps-v3-api-and-ssis/.
Tested and working. Using this approach you can pass the SSIS variables/parameters.
Example: get ID, address, zipcode, city and country from a table with an Execute SQL Task (General tab - ResultSet: Full result set; Result Set tab - Result Name: 0, Variable Name: User::YourObject). The next task is a Foreach Loop Container (Collection tab - enumerator: Foreach ADO Enumerator, ADO object source variable: User::YourObject, enumeration mode: Rows in the first table; Variable Mappings tab - Variable: User::Id, Index: 0 | address, 1, etc.). Inside the Foreach Loop Container, add a Data Flow Task whose source is a Script Component. If you can be more specific about your logic, we may be able to assist you more.
Okay so I came across the same problem. I needed to pass one parameter as complex type.
Create a Web Service task in your package.
Fill in all the needed properties on the General tab: HttpConnection and WSDLFile.
Fill properties in Input tab: Service, Method
Below, click on Value and manually enter the value you need (mine is 2021-11-15).
Deploy and execute the package to be sure everything is OK.
After these easy steps, go into the folder where the package is located. Right-click the package file (Package.dtsx) and select Open with > Notepad. Using Notepad's Find function, search for the value you manually entered.
The part we are looking for looks like this in my case:
<WSTask:ComplexValue>
<WSTask:ComplexProperty
WSTask:Name="date"
WSTask:Datatype="dateTime"
WSTask:ParamType="Primitive">
<WSTask:PrimitiveValue>2021-11-15</WSTask:PrimitiveValue>
</WSTask:ComplexProperty>
</WSTask:ComplexValue>
Finally I found what I was looking for. For the second part, I needed that parameter to change to the current date whenever the package is executed. In PowerShell I wrote code that replaces the date in the string <WSTask:PrimitiveValue>2021-11-15</WSTask:PrimitiveValue> with the current date every time the package runs. The code looks like this:
$Now = Get-Date -Format "yyyy-MM-dd"
$Yesterday = (Get-Date).AddDays(-1).ToString("yyyy-MM-dd")
$file = ((Get-Content -path "C:\Package.dtsx" -Raw) -replace "<WSTask:PrimitiveValue>$Yesterday</WSTask:PrimitiveValue>", "<WSTask:PrimitiveValue>$Now</WSTask:PrimitiveValue>")
[System.IO.File]::WriteAllText("C:\Package.dtsx",$file)
# This part will execute the package #
dtexec.exe /f "C:\Package.dtsx"
After all this, I scheduled the script in Task Scheduler and it works.
In my case changing the type of request from complex to simple wasn't an option and all I needed was just one parameter to pass.
Hope this helps somebody.