Google Cloud Compute Python API Isn't Accepting My Startup Script - google-cloud-platform

Here is my request body:
server = {
    'name': name_gen.haikunate(),
    'machineType': f"zones/{zone}/machineTypes/n1-standard-1",
    'disks': [
        {
            'boot': True,
            'autoDelete': True,
            'initializeParams': {
                'sourceImage': 'projects/ubuntu-os-cloud/global/images/ubuntu-1604-xenial-v20191204'
            }
        }
    ],
    'networkInterfaces': [
        {
            'network': '/global/networks/default',
            'accessConfigs': [
                {'type': 'ONE_TO_ONE_NAT', 'name': 'external nat'}
            ]
        }
    ],
    'metadata': {
        'items': [
            {
                'keys': 'startup-script',
                'value': startup_script
            }
        ]
    }
}
When I use this request body with the compute client to create a VM, it gives me this error:
googleapiclient.errors.HttpError:
<HttpError 400 when requesting https://compute.googleapis.com/compute/v1/projects/focal-maker-240918/zones/us-east4-c/instances?alt=json
returned "Invalid value for field 'resource.metadata':
'{
"item": [
{
"value": "#!/bin/bash\n\napt-get update\n\nsleep 15\n\nclear\n\napt-get install squ...'. Metadata invalid keys:">'
How can I fix this error?

If we look at the documentation for setting metadata on a Compute Engine instance, found here, we see that the structure is:
"items": [
{
"key": string,
"value": string
}
],
If we compare that to the structure described in your post, we see that you have written keys instead of key as the field name. That is almost certainly the issue. To resolve the problem, change keys to key.
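As a minimal sketch (the startup script content here is a placeholder; substitute whatever script string you already have), the corrected metadata block would look like this:

```python
# Hypothetical startup script; substitute your own.
startup_script = "#!/bin/bash\napt-get update\n"

# Corrected metadata block: the field name must be 'key', not 'keys'.
metadata = {
    'items': [
        {
            'key': 'startup-script',
            'value': startup_script
        }
    ]
}
```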

Related

Agora cloud recording not saving to S3 and returning 404

I am trying to save my audio feed to AWS S3.
The acquire and start calls give proper responses as described in the documentation, but when I try to stop the recording it throws a 404 error code. The recording is also not found in the AWS S3 bucket.
Below are the request and response for each of the calls:
/acquire
#Request body
body = {"cname": cname, "uid": uid, "clientRequest": {"resourceExpiredHour": 24}}
#Response
"Code": 200,
"Body":
{
"resourceId": "IqCWKgW2CD0KqnZm0lcCzQisVFotYiClVu2jIxWs5Rpidc9y5HhK1HEHAd77Fy1-AK9piRDWUYNlU-AC7dnZfo6QVukbSB_eh3WqTv9_ULLK-EXxt93zdO8yAzY-3SGMPVJ5x4Rx3DsHgvBfnzJWhOvjMFEcEU9X4WMmtdXJxqjV3hhpsx74tefhzfPA2A7J2UDlmF4RRuINeP4C9uMRzPmrHlHB3BrQcogcBfdgb9DAx_ySNMUXGMQX3iGFuWBtjNRB4OLA2HS04VkSRulx3IyC5zkambri3ROG6vFV04jsPkeWb3hKAdOaozYyH4Sq42Buu7dM2ndVxCMgoiPDCi-0JCBL77RkuOijiOGQtOU-w9QKoPlTXRNeTur1MSfouE0A-4eDgu79FxK5abX7dckwcv9R3AExvs47U-uhmBh8vE6NXx4dQrXsu9Krx7Ao"
}
/start
#Request body
body = {
    "uid": uid,
    "cname": cname,
    "clientRequest": {
        "recordingConfig": {
            "maxIdleTime": 30,
            "streamTypes": 0,
            "channelType": 0,
        },
        "recordingFileConfig": {"avFileType": ["hls"]},
        "storageConfig": {
            "accessKey": ACCESS_ID,
            "region": 8,
            "bucket": BUCKET_NAME,
            "secretKey": ACCESS_SECRET,
            "vendor": 1,
            "fileNamePrefix": [cname, TODAY_DATE.strftime("%d%m%Y")],
        },
    },
}
#Response
"Code": 200,
"Body":
{
"sid": "fd987833cb49dc9ba98ceb8498ac23c4",
"resourceId": "IqCWKgW2CD0KqnZm0lcCzQisVFotYiClVu2jIxWs5Rpidc9y5HhK1HEHAd77Fy1-AK9piRDWUYNlU-AC7dnZfo6QVukbSB_eh3WqTv9_ULLK-EXxt93zdO8yAzY-3SGMPVJ5x4Rx3DsHgvBfnzJWhOvjMFEcEU9X4WMmtdXJxqjV3hhpsx74tefhzfPA2A7J2UDlmF4RRuINeP4C9uMRzPmrHlHB3BrQcogcBfdgb9DAx_ySNMUXGMQX3iGFuWBtjNRB4OLA2HS04VkSRulx3IyC5zkambri3ROG6vFV04jsPkeWb3hKAdOaozYyH4Sq42Buu7dM2ndVxCMgoiPDCi-0JCBL77RkuOijiOGQtOU-w9QKoPlTXRNeTur1MSfouE0A-4eDgu79FxK5abX7dckwcv9R3AExvs47U-uhmBh8vE6NXx4dQrXsu9Krx7Ao"
}
/stop
#Request body
body = {"cname": cname, "uid": uid, "clientRequest": {}}
#Response
{
"resourceId": "IqCWKgW2CD0KqnZm0lcCzQisVFotYiClVu2jIxWs5Rpidc9y5HhK1HEHAd77Fy1-AK9piRDWUYNlU-AC7dnZfo6QVukbSB_eh3WqTv9_ULLK-EXxt93zdO8yAzY-3SGMPVJ5x4Rx3DsHgvBfnzJWhOvjMFEcEU9X4WMmtdXJxqjV3hhpsx74tefhzfPA2A7J2UDlmF4RRuINeP4C9uMRzPmrHlHB3BrQcogcBfdgb9DAx_ySNMUXGMQX3iGFuWBtjNRB4OLA2HS04VkSRulx3IyC5zkambri3ROG6vFV04jsPkeWb3hKAdOaozYyH4Sq42Buu7dM2ndVxCMgoiPDCi-0JCBL77RkuOijiOGQtOU-w9QKoPlTXRNeTur1MSfouE0A-4eDgu79FxK5abX7dckwcv9R3AExvs47U-uhmBh8vE6NXx4dQrXsu9Krx7Ao",
"sid": "fd987833cb49dc9ba98ceb8498ac23c4",
"code": 404,
"serverResponse": {
"command": "StopCloudRecorder",
"payload": {
"message": "Failed to find worker."
},
"subscribeModeBitmask": 1,
"vid": "431306"
}
}
My AWS bucket CORS policy is as follows:
[
  {
    "AllowedHeaders": [
      "Authorization",
      "*"
    ],
    "AllowedMethods": [
      "HEAD",
      "POST"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": [
      "ETag",
      "x-amz-meta-custom-header",
      "x-amz-storage-class"
    ],
    "MaxAgeSeconds": 5000
  }
]
I was facing the same issue.
I wanted to record a session even with one user, but it seems that this is not possible; there must be two users with different uids. I'm not sure why, but at least the following worked for me.
Try:
Start your stream with one user, e.g. uid: 1
Connect with another device to the same channel, but with a different uid
Start recording
Before stopping your recording, make sure that the query request is returning data
If you are getting data from the query request, then your stream is recording.
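The "make sure the query request is returning data" step above can be sketched with the Python standard library. The endpoint path follows Agora's Cloud Recording REST API; APP_ID, CUSTOMER_KEY, CUSTOMER_SECRET, resource_id and sid are placeholders for your own values, and mode "mix" assumes composite recording:

```python
import base64
import urllib.error
import urllib.request

# Placeholder credentials and identifiers - substitute your own.
APP_ID = "your-agora-app-id"
CUSTOMER_KEY = "your-customer-key"
CUSTOMER_SECRET = "your-customer-secret"
resource_id = "resource-id-from-acquire"
sid = "sid-from-start"

# Agora Cloud Recording query endpoint (mode "mix" = composite recording).
url = (
    f"https://api.agora.io/v1/apps/{APP_ID}/cloud_recording/"
    f"resourceid/{resource_id}/sid/{sid}/mode/mix/query"
)

def recording_is_active():
    """Return True if /query finds the recording worker.

    A 404 here predicts the same "Failed to find worker." error
    you would get on /stop.
    """
    token = base64.b64encode(
        f"{CUSTOMER_KEY}:{CUSTOMER_SECRET}".encode()
    ).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

Calling recording_is_active() just before /stop tells you whether the worker is still alive, which is exactly the check the answer recommends.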

How can I get a partial response for Method: instances.aggregatedList Compute API in GCP

I am trying to get a specific response from the Compute API method instances.aggregatedList by setting the fields request param as per https://cloud.google.com/resource-manager/docs/performance#partial-response
But I am getting 400 BAD REQUEST.
Is there a sample I can refer to for getting a partial response from aggregated methods?
If you use the following CURL command:
curl -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://compute.googleapis.com/compute/v1/projects/[CHANGE-FOR-YOUR-PROJECT-ID]/aggregated/instances?maxResults=1"
You'll notice that the result will have a similar form to:
{
"id": "projects/[PROJECT-ID]/aggregated/instances",
"items": {
"zones/us-central1-a": {
"instances": [
{
"id": "[INSTANCE-ID]",
"creationTimestamp": "2020-09-21T06:22:21.604-07:00",
"name": "instance-1",
"description": "",
"tags": {
"items": [
"http-server",
"https-server"
],
"fingerprint": "XXXXXX"
},
"machineType": "https://www.googleapis.com/compute/v1/projects/[PROJECT-ID]/zones/us-central1-a/machineTypes/e2-medium",
"status": "RUNNING",
"zone": "https://www.googleapis.com/compute/v1/projects/[PROJECT-ID]/zones/us-central1-a",
"canIpForward": false,
"networkInterfaces": [
{
"network": "https://www.googleapis.com/compute/v1/projects/[PROJECT-ID]/global/networks/default",
"subnetwork": "https://www.googleapis.com/compute/v1/projects/[PROJECT-ID]/regions/us-central1/subnetworks/[SUBNETWORK_NAME]",
"networkIP": "10.8.0.13",
"name": "nic0",
... with a lot more fields
As you can see, the result is a little bit different from the response body described in the documentation:
{
"id": string,
"items": [
{
"scopeName": string,
"instances": [
{
"id": string,
"creationTimestamp": string,
"name": string,
"description": string,
"tags": {
"items": [
string
],
"fingerprint": string
},
"machineType": string,
"status": enum,
"statusMessage": string,
"zone": string,
"canIpForward": boolean,
"networkInterfaces": [
{
"network": string,
"subnetwork": string,
"networkIP": string,
"ipv6Address": string,
"name": string,
.... with a lot more fields
Notice that if you compare both results, the actual response you receive has an additional "zones/us-central1-a" field before the "instances" field, which I believe is causing the behavior you are experiencing.
If you are interested in working with partial resources and getting only particular fields in the response, you simply need to respect the syntax rules explained in the documentation you shared and use the escape characters accordingly in your query parameters.
E.g., if you are only interested in getting the id of your project as well as the instances' name, machineType and status, I tested the following curl command from Cloud Shell with my GCP project and it worked without issues:
curl -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://compute.googleapis.com/compute/v1/projects/[PROJECT-ID]/aggregated/instances?fields=id,items/zones%2Finstances(name,machineType,status)"
where I see that something similar to the following is returned:
{
  "id": "projects/[PROJECT-ID]/aggregated/instances",
  "items": {
    "zones/us-central1-a": {
      "instances": [
        {
          "name": "instance-1",
          "machineType": "https://www.googleapis.com/compute/v1/projects/[PROJECT-ID]/zones/us-central1-a/machineTypes/e2-medium",
          "status": "RUNNING"
        },
        {
          "name": "instance-2",
          "machineType": "https://www.googleapis.com/compute/v1/projects/[PROJECT-ID]/zones/us-central1-a/machineTypes/e2-medium",
          "status": "TERMINATED"
        }
      ]
    }
  }
}
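The same escaping can be reproduced in Python with the standard library before issuing the request (the project ID below is a placeholder). Note that only the '/' between zones and instances is percent-encoded; the parentheses selecting sub-fields stay literal:

```python
from urllib.parse import quote

project = "my-project-id"  # placeholder project ID

# Escape only the '/' inside the zones/instances path segment (-> %2F),
# matching the fields syntax used in the curl example above.
fields = "id,items/" + quote("zones/instances", safe="") + "(name,machineType,status)"

url = (
    f"https://compute.googleapis.com/compute/v1/projects/{project}"
    f"/aggregated/instances?fields={fields}"
)
```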

Error while importing formData from swagger json to AWS API Gateway

I am using flask-restx to build an app with a Swagger UI, and I am trying to upload this swagger file as a documentation part in AWS API Gateway. Through this Swagger UI, I enable the user to upload a CSV file for further data processing.
I have the following swagger json:
{
  "swagger": "2.0",
  "basePath": "/",
  "paths": {
    "/upload_profile/csv": {
      "post": {
        "responses": {
          "200": {
            "description": "Profile uploaded"
          },
          "400": {
            "description": "Validation Error"
          },
          "401": {
            "description": "Not authorized"
          }
        },
        "operationId": "Get uploaded profiles from user",
        "parameters": [
          {
            "name": "csv_file",
            "in": "formData",
            "type": "file",
            "required": true,
            "description": "CSV file"
          }
        ],
        "consumes": [
          "multipart/form-data"
        ],
        "tags": [
          "upload_profile"
        ]
      }
    }
  },
  "info": {
    "title": "Upload Profile",
    "version": "0.0.1"
  },
  "produces": [
    "application/json"
  ],
  "consumes": [
    "application/json"
  ],
  "tags": [
    {
      "name": "upload_profile",
      "description": "Uploading User Profiles"
    }
  ],
  "responses": {
    "ParseError": {
      "description": "When a mask can't be parsed"
    },
    "MaskError": {
      "description": "When any error occurs on mask"
    }
  }
}
When I go to API Gateway --> Documentation --> Import Documentation and paste the JSON, I get the following error:
How can this issue be solved? If formData isn't supported by API Gateway, is there an alternative for hosting the Swagger UI?
The problem is that AWS API Gateway expects Swagger/OpenAPI version 3, and your file is version 2. If you only want a way to host Swagger UI for documentation/collaboration purposes, take a look at SwaggerHub: https://swagger.io/tools/swaggerhub/.
But if you really have to use AWS API Gateway, then you need the spec in OpenAPI 3 format. Since the API is rather small, I'd suggest preparing the OpenAPI 3 spec yourself (rather than generating it) and testing it locally via Swagger UI.
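As a sketch of what that conversion looks like: in OpenAPI 3 the formData parameter becomes a requestBody with a binary schema (fragment below trimmed to the upload operation; the other responses and tags carry over unchanged):

```json
{
  "openapi": "3.0.1",
  "info": { "title": "Upload Profile", "version": "0.0.1" },
  "paths": {
    "/upload_profile/csv": {
      "post": {
        "requestBody": {
          "required": true,
          "content": {
            "multipart/form-data": {
              "schema": {
                "type": "object",
                "properties": {
                  "csv_file": {
                    "type": "string",
                    "format": "binary",
                    "description": "CSV file"
                  }
                },
                "required": ["csv_file"]
              }
            }
          }
        },
        "responses": {
          "200": { "description": "Profile uploaded" }
        }
      }
    }
  }
}
```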

List users as non admin with custom fields

As per the documentation, I should be able to get a list of users with a custom schema as long as the field in the schema has a value of ALL_DOMAIN_USERS in the readAccessType property. That is the exact setup I have in the admin console. Moreover, when I perform a GET request to the schema endpoint for the schema in question, I get confirmation that the schema fields are set to ALL_DOMAIN_USERS in the readAccessType property.
The problem is when I perform a users list request, I don't get the custom schema in the response. The request is the following:
GET /admin/directory/v1/users?customer=my_customer&projection=full&query=franc&viewType=domain_public HTTP/1.1
Host: www.googleapis.com
Content-length: 0
Authorization: Bearer fakeTokena0AfH6SMD6jF2DwJbgiDZ
The response I get back is the following:
{
"nextPageToken": "tokenData",
"kind": "admin#directory#users",
"etag": "etagData",
"users": [
{
"externalIds": [
{
"type": "organization",
"value": "value"
}
],
"organizations": [
{
"department": "department",
"customType": "",
"name": "Name",
"title": "Title"
}
],
"kind": "admin#directory#user",
"name": {
"fullName": "Full Name",
"givenName": "Full",
"familyName": "Name"
},
"phones": [
{
"type": "work",
"value": "(999)999-9999"
}
],
"thumbnailPhotoUrl": "https://photolinkurl",
"primaryEmail": "user#domain.com",
"relations": [
{
"type": "manager",
"value": "user#domain.com"
}
],
"emails": [
{
"primary": true,
"address": "user#domain.com"
}
],
"etag": "etagData",
"thumbnailPhotoEtag": "photoEtagData",
"id": "xxxxxxxxxxxxxxxxxx",
"addresses": [
{
"locality": "Locality",
"region": "XX",
"formatted": "999 Some St Some State 99999",
"primary": true,
"streetAddress": "999 Some St",
"postalCode": "99999",
"type": "work"
}
]
}
]
}
However, if I perform the same request with a super admin user, I get an extra property in the response:
"customSchemas": {
"Dir": {
"fieldOne": false,
"fieldTwo": "value",
"fieldThree": value
}
}
My understanding is that I should get the custom schema with a non-admin user as long as the custom schema fields are set to be visible to all domain users. This is not happening. I opened a support ticket with G Suite, but the guy who provided "support" sent me in this direction. I believe this is a bug, or maybe I overlooked something.
I contacted G Suite support and, in fact, this issue was a domain-specific problem.
It took several weeks for the issue to be addressed by the support engineers at Google, but it was finally resolved. The behaviour is now the intended one.

Creating an eBay fulfilment policy. ("Please select a valid postage service.")

After a day of work with Postman, I have managed to reduce the number of errors down to one. No idea how to get past it.
It's definitely not an authorisation problem; I've been making lots of authorised calls.
URI:
POST https://api.ebay.com/sell/account/v1/fulfillment_policy
Body:
{
  "categoryTypes": [
    {
      "name": "ALL_EXCLUDING_MOTORS_VEHICLES"
    }
  ],
  "freightShipping": "false",
  "globalShipping": "false",
  "handlingTime": {
    "unit": "DAY",
    "value": "1"
  },
  "localPickup": "true",
  "marketplaceId": "EBAY_AU",
  "name": "100 grams",
  "shippingOptions": [
    {
      "costType": "CALCULATED",
      "optionType": "DOMESTIC",
      "shippingServices": [
        {
          "shippingCarrierCode": "Australia Post",
          "shippingServiceCode": "AU_Regular"
        }
      ]
    }
  ]
}
Output:
{
  "errors": [
    {
      "errorId": 20403,
      "domain": "API_ACCOUNT",
      "category": "REQUEST",
      "message": "Invalid .",
      "longMessage": "Please select a valid postage service.",
      "inputRefIds": [
        "service"
      ],
      "parameters": [
        {
          "name": "XPATH",
          "value": "DomesticItemShippingService[0].shippingService"
        }
      ]
    }
  ]
}
Things I've Tried:
Deleting "shippingOptions": [...] (and everything inside the []s) got rid of the errors and resulted in the successful creation of a new fulfillment policy. However, I wanted to include shipping options in my call.
shippingCarrierCode doesn't seem to do anything. I've changed it to all sorts of sensible and nonsensical things, including deleting it entirely. No impact on the output.
Changing shippingServiceCode to anything non-standard (e.g. "shippingServiceCode": "potato") results in the exact same error, but twice instead of once. (See below.) How can I get the same error twice with only one shipping option?
Including a domestic and international option, I get the same error twice also. (Same output as below, except the second DomesticItemShippingService[1].shippingService is instead DomesticItemShippingService[0].shippingService)
Making an international option AND a domestic option, BOTH with silly service names results in 3 errors. (I was expecting 4.)
Code:
{
  "errors": [
    {
      "errorId": 20403,
      "domain": "API_ACCOUNT",
      "category": "REQUEST",
      "message": "Invalid .",
      "longMessage": "Please select a valid postage service.",
      "inputRefIds": [
        "service"
      ],
      "parameters": [
        {
          "name": "XPATH",
          "value": "DomesticItemShippingService[0].shippingService"
        }
      ]
    },
    {
      "errorId": 20403,
      "domain": "API_ACCOUNT",
      "category": "REQUEST",
      "message": "Invalid .",
      "longMessage": "Please select a valid postage service.",
      "inputRefIds": [
        "service"
      ],
      "parameters": [
        {
          "name": "XPATH",
          "value": "DomesticItemShippingService[1].shippingService"
        }
      ]
    }
  ]
}
What did I do wrong this time?
I did get an answer to my question. Not here, even though I sent them the link, but (eventually) through official eBay channels.
I'll post that answer here for everyone. I was thinking of adding proper Stack Overflow formatting, but decided it would be better to leave the answer untouched from the original.
Hello Jonathon ,
Ignore my last message and Apologize for the confusion.
Here is the reason why you are getting the error:
You have to set "localPickup": "false", refer doc for your help:
https://developer.ebay.com/api-docs/sell/account/resources/fulfillment_policy/methods/createFulfillmentPolicy#request.localPickup
And you will get response as:
{ "name": "100 grams", "marketplaceId": "EBAY_AU",
"categoryTypes": [
{
"name": "ALL_EXCLUDING_MOTORS_VEHICLES",
"default": true
} ], "handlingTime": {
"value": 1,
"unit": "DAY" }, "shippingOptions": [
{
"optionType": "DOMESTIC",
"costType": "CALCULATED",
"shippingServices": [
{
"sortOrder": 1,
"shippingCarrierCode": "Australia Post",
"shippingServiceCode": "AU_Regular",
"freeShipping": false,
"buyerResponsibleForShipping": false,
"buyerResponsibleForPickup": false
}
],
"insuranceOffered": false,
"insuranceFee": {
"value": "0.0",
"currency": "AUD"
}
} ], "globalShipping": false, "pickupDropOff": false, "freightShipping": false, "fulfillmentPolicyId": "6104871000",
"warnings": [] }
Let us know if you need further assistance!!!
Best Regards, eBay Developer Support
Check them out here. Make sure you have all the required fields, in particular:
pickupDropOff
shippingOptions.shippingServices.shipToLocations.regionIncluded
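Putting the fix together, a minimal sketch of the corrected request body; the key change from the question is localPickup set to false, per the support reply above, with the other fields mirroring the original post:

```python
# Corrected fulfillment-policy body. Setting localPickup to false is
# what resolved the "Please select a valid postage service." error
# when shippingOptions are included.
body = {
    "name": "100 grams",
    "marketplaceId": "EBAY_AU",
    "categoryTypes": [{"name": "ALL_EXCLUDING_MOTORS_VEHICLES"}],
    "handlingTime": {"unit": "DAY", "value": 1},
    "localPickup": False,   # must be false when shippingOptions are supplied
    "globalShipping": False,
    "freightShipping": False,
    "shippingOptions": [
        {
            "costType": "CALCULATED",
            "optionType": "DOMESTIC",
            "shippingServices": [
                {
                    "shippingCarrierCode": "Australia Post",
                    "shippingServiceCode": "AU_Regular",
                }
            ],
        }
    ],
}
```

This dict would then be posted to https://api.ebay.com/sell/account/v1/fulfillment_policy as in the original question.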