I am using AppSync for a GraphQL server, and I use `$util.error(String, String, Object, Object)` in the response mapping template to return errors to clients. However, the error response clients get looks like the JSON below. There is too much extra information there; what clients really care about is the message "I made this error". How can I return a simple JSON object like `{"errorMessage": "I made this error"}` to clients?
{
"data": {
"getError": null
},
"errors": [
{
"path": [
"getError"
],
"data": null,
"errorType": "ALWAYS_ERROR",
"errorInfo": null,
"locations": [
{
"line": 2,
"column": 3,
"sourceName": null
}
],
"message": "I made this error"
}
]
}
This is the standard error format for GraphQL per the spec, describing the code, the message, and where the error occurred. The primary reason you see this complex format is the nature of GraphQL, which can return very deeply nested data; it would be extremely hard to deduce where the issue was from a simple message. In something like REST, where the response is standardized, it is easier to provide that kind of message, but in GraphQL, since the user is essentially building a unique query, the error response has to be much more thorough.
That being said, you can, in your client, simplify it by grabbing the first error:
if (data.errors) {
  console.error(data.errors[0].errorType)
}
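If you want to hand the rest of your application exactly the {"errorMessage": ...} shape from the question, you can build it yourself from the first error's message field. A minimal sketch (the toSimpleError helper name is hypothetical):

// Collapse AppSync's GraphQL error array into the simple shape
// from the question; "response" is the parsed JSON response body.
function toSimpleError(response) {
  if (response.errors && response.errors.length > 0) {
    return { errorMessage: response.errors[0].message }
  }
  return null
}

// Usage:
const simple = toSimpleError(data)
if (simple) console.error(simple.errorMessage) // "I made this error"

Note that this is purely client-side; AppSync itself will always return the spec-compliant errors array.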
I am trying to write a "BatchPutItem" custom resolver, so that I can create multiple items (not more than 25 at a time); it should accept a list of arguments and then perform a batch operation.
Here is the code which I have in the custom resolver:
#set($pdata = [])
#foreach($item in ${ctx.args.input})
  ## stamp create/update times and the typename on every item
  $util.qr($item.put("createdAt", $util.time.nowISO8601()))
  $util.qr($item.put("updatedAt", $util.time.nowISO8601()))
  $util.qr($item.put("__typename", "UserNF"))
  ## generate an id when the client did not supply one
  $util.qr($item.put("id", $util.defaultIfNullOrBlank($item.id, $util.autoId())))
  $util.qr($pdata.add($util.dynamodb.toMapValues($item)))
#end
{
  "version": "2018-05-29",
  "operation": "BatchPutItem",
  "tables": {
    "Usertable1-staging": $util.toJson($pdata)
  }
}
Response in the query console:
{
"data": {
"createBatchUNF": null
},
"errors": [
{
"path": [
"createBatchUserNewsFeed"
],
"data": null,
"errorType": "MappingTemplate",
"errorInfo": null,
"locations": [
{
"line": 2,
"column": 3,
"sourceName": null
}
],
"message": "Unsupported operation 'BatchPutItem'. Datasource Versioning only supports the following operations (TransactGetItems,PutItem,BatchGetItem,Scan,Query,GetItem,DeleteItem,UpdateItem,Sync)"
}
]
}
And the query is :
mutation MyMutation {
createBatchUNF(input: [{seen: false, userNFUserId: "userID", userNFPId: "pID", userNFPOwnerId: "ownerID"}]) {
items {
id
seen
}
}
}
Conflict detection is also turned off.
And when I checked the CloudWatch logs, I found the same error there.
I was able to solve this issue by disabling DataStore for the entire API.
When we create a new project/backend app using the Amplify console, the Data tab asks us to enable DataStore so that we can start modelling our GraphQL API. That is what enables versioning for the entire API, and versioning is what prevents batch writes from executing.
So in order to solve this issue, we can follow these steps:
[Note: I am using @aws-amplify/cli]
$ amplify update api
? Please select from one of the below mentioned services: GraphQL
? Select from the options below: Disable DataStore for entire API
And then run amplify push.
This will fix the issue.
Basically, I'm using Postman to send POST requests to
https://api.notion.com/v1/pages
It works about 70% of the time; the rest of the time it returns the following error, for the same input.
{
"object": "error",
"status": 400,
"code": "validation_error",
"message": "body failed validation. Fix one: body.parent.type should be not present, instead was `\"database_id\"`. body.parent.page_id should be defined, instead was `undefined`."
}
Here's how my body starts
{
"parent": {
"type": "database_id",
"database_id": "a94c42320ef04b6a9c1a7e5e73455557"
},
"properties": {
"Title": {
..................
I'm not posting the entire body because it works flawlessly sometimes.
Please help me out. Is there a way to check logs of the requests that come to my page?
First, I found out that type: database_id is not necessary in parent.
I also found out that syntax errors in the payload return a 400 error:
body failed validation. Fix one: body.parent.type should be not present, instead was `\"database_id\"`. body.parent.page_id should be defined, instead was `undefined`.
In my case, I had wrongly added a value at the same level as parent and properties, like this:
{
"parent": {
"database_id": "<database_id>"
},
"properties": {
...
},
"wrong_value": {}
}
Since the errors are not that specific, check whether you made the same mistake as me, and please also double-check that the parent you are trying to post to is actually a database, not a page.
The issue was with having "type: database_id" inside "parent" in the request data.
{
"parent": {
"type": "database_id",(REMOVE THIS LINE)
"database_id": "a94c42320ef04b6a9c1a7e5e73455557"
},
"properties": {
"Title": {
..................
After removing "type", it worked fine. Notion needs to update their docs.
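For reference, a minimal corrected request body looks like the sketch below. The database id is the one from the question; the exact shape under "Title" assumes Title is the database's title property, so adjust it to your schema:

{
  "parent": {
    "database_id": "a94c42320ef04b6a9c1a7e5e73455557"
  },
  "properties": {
    "Title": {
      "title": [
        { "text": { "content": "Example page" } }
      ]
    }
  }
}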
I am working with AppSync, and it seems that validation errors don't trigger the error catch in the response mapping template.
The attribute values contain an AWSPhone input.
If I provide a badly formatted AWSPhone value, the error (as expected) is:
{
"data": null,
"errors": [
{
"path": null,
"locations": [
{
"line": 2,
"column": 17,
"sourceName": null
}
],
"message": "Validation error of type WrongType: argument 'input.company.phoneNumber' with value 'StringValue{value='+1-541-754-300'}' is not a valid 'AWSPhone' # 'createProfile'"
}
]
}
My request mapping template is like so:
{
  "version": "2018-05-29",
  "operation": "PutItem",
  "key": {
    "id": $util.dynamodb.toDynamoDBJson($ctx.args.input.client)
  },
  "attributeValues": $util.dynamodb.toMapValuesJson($ctx.args.input)
}
My response mapping template:
#if($ctx.error)
  $util.error("Custom error response")
#end
$util.toJson($ctx.result)
It's clear that an error does indeed happen but it doesn't trigger the case in my response template.
How do I go about returning a custom message for a validation error?
Is it even possible in this scenario?
Based on what you provided, it looks like your PutItem operation did succeed and returned the item you put in the $ctx.result field. Then, when AppSync tried to coerce one of the fields in the output of your response mapping template into an AWSPhone, it failed with the validation error you mentioned.
The #if($ctx.error) $util.error("Custom error response") will catch only DynamoDB errors; the GraphQL coercion from your result to the field's output type happens after the template has been evaluated.
One way to catch this would be to add validation to your request mapping template before saving the value to DynamoDB, or to transform the value coming back from DynamoDB into the proper AWSPhone scalar format.
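As a rough sketch of the first option, assuming the input path from your error message, you could guard the top of the request mapping template like this (the regex is only an illustrative placeholder; replace it with whatever phone format you actually accept):

## Hypothetical guard: fail early with our own message instead of the
## generic coercion error. The pattern below is an example, not AWSPhone's.
#if(!$util.matches("^\+?[0-9-]{7,16}$", $ctx.args.input.company.phoneNumber))
  $util.error("Custom error response: invalid phone number")
#end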
I want to make a model for a request where part of the request structure may change.
Since the structure is not uniform here, how can I define a JSON model for Amazon API Gateway?
Request:
The data inside items.{index}.data changes according to type_id. We are also not sure which item with a particular type_id will come at which {index}; even the type of items.{index}.data may change.
{
"name":"Jon Doe",
"items": [
{
"type_id":2,
"data": {
"km": 10,
"fuel": 20
}
},
{
"type_id": 5,
"data": [
{
"id": 1,
"value": 2
},
.....
]
},{
"type_id": 3,
"data": "data goes here"
},
....
]
}
How should I do this?
API Gateway uses JSON Schema for model definitions. You can use a union datatype to represent your data object. See this question for an example of such a datatype.
Please note that a data model such as this will pose problems for generated SDKs. If you need SDK support for strictly typed languages, you may want to reconsider this data model.
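As a rough sketch, a draft-04 JSON Schema model along these lines would accept all three shapes of data from the example request (the property names come from that request; tighten each oneOf branch as needed):

{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "title": "ItemsRequest",
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "items": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "type_id": { "type": "integer" },
          "data": {
            "oneOf": [
              { "type": "object" },
              { "type": "array" },
              { "type": "string" }
            ]
          }
        },
        "required": ["type_id", "data"]
      }
    }
  }
}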
Problem
Using the REST API, I have trained and deployed a model that I now want to use for prediction. I've defined the collections for prediction input and output and uploaded a JSON file, formatted accordingly, to Cloud Storage. However, when trying to create a prediction job, I cannot figure out what value to use for the dataFormat field, which is a required parameter. Is there any way to list all valid values?
What I've tried
My requests look like the one below. I've tried JSON, NEWLINE_DELIMITED_JSON (like when importing data into BigQuery), and even the JSON MIME type application/json, in pretty much all the different casings I can think of (upper and lower combined with snake, camel, etc.).
{
"jobId": "my_predictions_123",
"predictionInput": {
"modelName": "projects/myproject/models/mymodel",
"inputPaths": [
"gs://model-bucket/data/testset.json"
],
"outputPath": "gs://model-bucket/predictions/0/",
"region": "us-central1",
"dataFormat": "JSON"
},
"predictionOutput": {
"outputPath": "gs://my-bucket/predictions/1/"
}
}
All my attempts have only gotten me this back though:
{
"error": {
"code": 400,
"message": "Invalid value at 'job.prediction_input.data_format' (TYPE_ENUM), \"JSON\"",
"status": "INVALID_ARGUMENT",
"details": [
{
"#type": "type.googleapis.com/google.rpc.BadRequest",
"fieldViolations": [
{
"field": "job.prediction_input.data_format",
"description": "Invalid value at 'job.prediction_input.data_format' (TYPE_ENUM), \"JSON\""
}
]
}
]
}
}
Per the Cloud ML API reference document https://cloud.google.com/ml/reference/rest/v1beta1/projects.jobs#DataFormat, the dataFormat field in your request should be "TEXT" for all text inputs (including JSON, CSV, etc.).
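So the request from the question should work with only dataFormat changed:

{
  "jobId": "my_predictions_123",
  "predictionInput": {
    "modelName": "projects/myproject/models/mymodel",
    "inputPaths": [
      "gs://model-bucket/data/testset.json"
    ],
    "outputPath": "gs://model-bucket/predictions/0/",
    "region": "us-central1",
    "dataFormat": "TEXT"
  },
  "predictionOutput": {
    "outputPath": "gs://my-bucket/predictions/1/"
  }
}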