Below is the autogenerated AppSync resolver for a DynamoDB data source. I am unable to understand what object I have to pass to make it use the Query operation.
#set( $limit = $util.defaultIfNull($context.args.limit, 100) )
#set( $ListRequest = {
"version": "2018-05-29",
"limit": $limit
} )
#if( $context.args.nextToken )
#set( $ListRequest.nextToken = $context.args.nextToken )
#end
#if( $context.args.filter )
#set( $ListRequest.filter = $util.parseJson("$util.transform.toDynamoDBFilterExpression($ctx.args.filter)") )
#end
#if( !$util.isNull($modelQueryExpression)
&& !$util.isNullOrEmpty($modelQueryExpression.expression) )
$util.qr($ListRequest.put("operation", "Query"))
$util.qr($ListRequest.put("query", $modelQueryExpression))
#if( !$util.isNull($ctx.args.sortDirection) && $ctx.args.sortDirection == "DESC" )
#set( $ListRequest.scanIndexForward = false )
#else
#set( $ListRequest.scanIndexForward = true )
#end
#else
$util.qr($ListRequest.put("operation", "Scan"))
#end
$util.toJson($ListRequest)
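One thing worth noting about the template above: $modelQueryExpression is assigned by earlier template code that Amplify emits for the model's @key directive; it is not read from $ctx.args. A modelQueryExpression GraphQL variable sent by the client therefore never reaches this check, and the resolver falls through to Scan, which is why the whole table comes back. The branching itself can be sketched in JavaScript (an illustration of the logic, not AppSync's actual code):

```javascript
// Sketch of the generated template's branching. `modelQueryExpression` here is
// the value the template itself builds from the @key directive's fields; it is
// NOT read from GraphQL variables, so a client cannot supply it.
function buildListRequest(args, modelQueryExpression) {
  const request = { version: "2018-05-29", limit: args.limit ?? 100 };
  if (args.nextToken) request.nextToken = args.nextToken;
  // Mirrors: !$util.isNull(...) && !$util.isNullOrEmpty(...expression)
  if (modelQueryExpression && modelQueryExpression.expression) {
    request.operation = "Query";
    request.query = modelQueryExpression;
    request.scanIndexForward = args.sortDirection !== "DESC";
  } else {
    // No key condition available -> the resolver falls back to a full Scan.
    request.operation = "Scan";
  }
  return request;
}

console.log(buildListRequest({}, null).operation); // "Scan"
console.log(
  buildListRequest({ sortDirection: "DESC" }, { expression: "#pk = :pk" })
    .operation
); // "Query"
```

To actually run a Query against an index, define the index with @key (with a queryField) and call the generated query field, letting the template build the expression for you.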
I have tried passing the query expression below, but I got back all the data in the table. A simple filter with Scan works and returns data, but I want to use the index.
let modelQueryExpression = {
  "expression": "#bookingtype = :bookingtype AND #bookinguserid = :bookinguserid",
  "expressionNames": {
    "#bookinguserid": "bookinguserid",
    "#bookingtype": "bookingtype"
  },
  "expressionValues": {
    ":bookingtype": {
      "S": JSON.parse(user).id
    },
    ":bookinguserid": {
      "S": 'Trip'
    }
  }
}
const bookingdata = await API.graphql({ query: listBookings, variables: { modelQueryExpression } });
I am using AWS AppSync and DynamoDB. I have a table called Payment and I want to create each item with a unique sequential number, PaymentNO.
To achieve this, we have to read the last inserted item in order to update the PaymentNO count. But I don't actually know how to get the last inserted item from the table in the same resolver.
Could anyone help me write a resolver for this?
schema
type Payment #model(subscriptions: null) {
id: ID!
paid_amount: Float
paid_date: AWSDate
description: String
PaymentNO: Int
}
My request resolver
## [Start] Set default values. **
$util.qr($context.args.input.put("id", $util.defaultIfNull($ctx.args.input.id, $util.autoId())))
#set( $createdAt = $util.time.nowISO8601() )
## Automatically set the createdAt timestamp. **
$util.qr($context.args.input.put("createdAt", $util.defaultIfNull($ctx.args.input.createdAt, $createdAt)))
## Automatically set the updatedAt timestamp. **
$util.qr($context.args.input.put("updatedAt", $util.defaultIfNull($ctx.args.input.updatedAt, $createdAt)))
## [End] Set default values. **
## [Start] Prepare DynamoDB PutItem Request. **
$util.qr($context.args.input.put("__typename", "Payment"))
#if( $modelObjectKey )
#set( $condition = {
"expression": "",
"expressionNames": {},
"expressionValues": {}
} )
#foreach( $entry in $modelObjectKey.entrySet() )
#if( $velocityCount == 1 )
$util.qr($condition.put("expression", "attribute_not_exists(#keyCondition$velocityCount)"))
#else
$util.qr($condition.put("expression", "$condition.expression AND attribute_not_exists(#keyCondition$velocityCount)"))
#end
$util.qr($condition.expressionNames.put("#keyCondition$velocityCount", "$entry.key"))
#end
#else
#set( $condition = {
"expression": "attribute_not_exists(#id)",
"expressionNames": {
"#id": "id"
},
"expressionValues": {}
} )
#end
#if( $context.args.condition )
#set( $condition.expressionValues = {} )
#set( $conditionFilterExpressions = $util.parseJson($util.transform.toDynamoDBConditionExpression($context.args.condition)) )
$util.qr($condition.put("expression", "($condition.expression) AND $conditionFilterExpressions.expression"))
$util.qr($condition.expressionNames.putAll($conditionFilterExpressions.expressionNames))
$util.qr($condition.expressionValues.putAll($conditionFilterExpressions.expressionValues))
#end
#if( $condition.expressionValues && $condition.expressionValues.size() == 0 )
#set( $condition = {
"expression": $condition.expression,
"expressionNames": $condition.expressionNames
} )
#end
{
"version": "2018-05-29",
"operation": "PutItem",
"key": #if( $modelObjectKey ) $util.toJson($modelObjectKey) #else {
"id": $util.dynamodb.toDynamoDBJson($ctx.args.input.id)
} #end,
"attributeValues": $util.dynamodb.toMapValuesJson($context.args.input),
"condition": $util.toJson($condition)
}
## [End] Prepare DynamoDB PutItem Request. **
response resolver
#if( $ctx.error )
$util.error($ctx.error.message, $ctx.error.type)
#else
$util.toJson($ctx.result)
#end
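A note on the sequential-number question itself: reading back the last inserted item inside the same resolver is race-prone under concurrent writes. The usual DynamoDB pattern is an atomic counter: a separate UpdateItem with an ADD expression against a dedicated counter item, typically run as the first function of a pipeline resolver, with the incremented value then written into PaymentNO by the PutItem step. A minimal sketch (the counter item's key and attribute name here are hypothetical):

```vtl
{
  "version": "2018-05-29",
  "operation": "UpdateItem",
  "key": {
    "id": { "S": "PaymentCounter" }
  },
  "update": {
    "expression": "ADD #count :one",
    "expressionNames": { "#count": "currentValue" },
    "expressionValues": { ":one": { "N": "1" } }
  }
}
```

The incremented value comes back in $ctx.result and can be stashed for the next pipeline function. Since ADD is atomic, concurrent mutations each receive a distinct number.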
I am working on DynamoDB with the help of AppSync GraphQL queries.
I have one DynamoDB table in which username is the partition key (hash key) and timestamp_value is the sort key (range key).
I am saving two kinds of data in the same item: one is a reading and the other is an activity (like exercise, sports, etc.). We have different UI screens for adding these two things. Both might have the same timestamp, in which case they are saved in one item.
So now I need an upsert (insert-or-update) query that can be used for both of the above operations: when you insert a new reading, it should check whether the item is already present; if so, update it, otherwise insert it. The same has to happen when the user adds a new activity.
I am confused by the documentation and couldn't find an exact AppSync request mapping resolver for an upsert operation.
Below is the PutItem request mapping resolver:
{
"version": "2017-02-28",
"operation": "PutItem",
"key": {
"username": $util.dynamodb.toDynamoDBJson($ctx.identity.username),
"timestamp": $util.dynamodb.toDynamoDBJson($ctx.args.input.timestamp)
},
"attributeValues": $util.dynamodb.toMapValuesJson($ctx.args.input),
"condition": {
"expression": "attribute_not_exists(#timestamp)",
"expressionNames": {
"#timestamp": "timestamp"
}
}
}
And below is the UpdateItem request mapping resolver:
{
"version": "2017-02-28",
"operation": "UpdateItem",
"key": {
"username": $util.dynamodb.toDynamoDBJson($ctx.identity.username),
"timestamp": $util.dynamodb.toDynamoDBJson($ctx.args.input.timestamp)
},
## Set up some space to keep track of things we're updating **
#set( $expNames = {} )
#set( $expValues = {} )
#set( $expSet = {} )
#set( $expAdd = {} )
#set( $expRemove = [] )
## Iterate through each argument, skipping keys **
#foreach( $entry in $util.map.copyAndRemoveAllKeys($ctx.args.input, ["username", "timestamp"]).entrySet() )
#if( $util.isNull($entry.value) )
## If the argument is set to "null", then remove that attribute from the item in DynamoDB **
#set( $discard = ${expRemove.add("#${entry.key}")} )
$!{expNames.put("#${entry.key}", "${entry.key}")}
#else
## Otherwise set (or update) the attribute on the item in DynamoDB **
$!{expSet.put("#${entry.key}", ":${entry.key}")}
$!{expNames.put("#${entry.key}", "${entry.key}")}
$!{expValues.put(":${entry.key}", $util.dynamodb.toDynamoDB($entry.value))}
#end
#end
## Start building the update expression, starting with attributes we're going to SET **
#set( $expression = "" )
#if( !${expSet.isEmpty()} )
#set( $expression = "SET" )
#foreach( $entry in $expSet.entrySet() )
#set( $expression = "${expression} ${entry.key} = ${entry.value}" )
#if ( $foreach.hasNext )
#set( $expression = "${expression}," )
#end
#end
#end
## Continue building the update expression, adding attributes we're going to ADD **
#if( !${expAdd.isEmpty()} )
#set( $expression = "${expression} ADD" )
#foreach( $entry in $expAdd.entrySet() )
#set( $expression = "${expression} ${entry.key} ${entry.value}" )
#if ( $foreach.hasNext )
#set( $expression = "${expression}," )
#end
#end
#end
## Continue building the update expression, adding attributes we're going to REMOVE **
#if( !${expRemove.isEmpty()} )
#set( $expression = "${expression} REMOVE" )
#foreach( $entry in $expRemove )
#set( $expression = "${expression} ${entry}" )
#if ( $foreach.hasNext )
#set( $expression = "${expression}," )
#end
#end
#end
## Finally, write the update expression into the document, along with any expressionNames and expressionValues **
"update": {
"expression": "${expression}",
#if( !${expNames.isEmpty()} )
"expressionNames": $utils.toJson($expNames),
#end
#if( !${expValues.isEmpty()} )
"expressionValues": $utils.toJson($expValues),
#end
},
"condition": {
"expression": "attribute_exists(#username) AND attribute_not_exists(#timestamp)",
"expressionNames": {
"#username": "username",
"#timestamp": "timestamp"
}
}
}
So how can I update the resolver so that it performs an upsert operation?
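One observation that may simplify this: DynamoDB's UpdateItem already has upsert semantics. If no item matches the key, UpdateItem creates one and applies the SET clauses; if an item exists, it updates it. So a sketch of an upsert resolver is the UpdateItem template from the question with the condition block removed, keeping the expression-building sections unchanged:

```vtl
{
  "version": "2017-02-28",
  "operation": "UpdateItem",
  "key": {
    "username": $util.dynamodb.toDynamoDBJson($ctx.identity.username),
    "timestamp": $util.dynamodb.toDynamoDBJson($ctx.args.input.timestamp)
  },
  ## No "condition" block: the item is created when the key is absent
  ## and updated when it exists, i.e. an upsert. **
  "update": {
    "expression": "${expression}",
    "expressionNames": $utils.toJson($expNames),
    "expressionValues": $utils.toJson($expValues)
  }
}
```

If $expNames or $expValues can be empty, keep the #if guards from the original template around those two lines so the emitted JSON stays valid.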
Based on docs, forum posts, etc., I came up with this solution: first transform the URL-encoded form POST data into a variable, then Base64-encode it before sending it to Firehose (Firehose needs the payload in this format).
#set($data = {})
#foreach( $token in $input.path('$').split('&') )
#set( $keyVal = $token.split('=') )
#set( $keyValSize = $keyVal.size() )
#if( $keyValSize >= 1 )
#set( $key = $util.urlDecode($keyVal[0]) )
#if( $keyValSize >= 2 )
#set( $val = $util.urlDecode($keyVal[1]) )
#else
#set( $val = '' )
#end
#end
$util.qr($data.put("$key", "$val"))
#end
{
"DeliveryStreamName": "my-firehose",
"Record": {
"Data": "$util.base64encode($data)"
},
"PartitionKey": "1"
}
However, the result after transformation is
{
"DeliveryStreamName": "my-firehose",
"Record": {
"Data": ""
},
"PartitionKey": "1"
}
I have tried various permutations of $util.qr($data.put("$key", "$val")) but none seem to work:
$util.qr($data.put("$key", $val))
$util.qr($data.put("$key", $util.parseJson($val)))
$util.qr($data.put("$key", $util.toJson($util.parseJson($val))))
$util.qr($data.put("$key", "abc")) // hardcoded just to debug
But all of them result in an empty Data block in the final output.
And this one doesn't even transform (throws a 500)
#set($data = {
#foreach( $token in $input.path('$').split('&') )
#set( $keyVal = $token.split('=') )
#set( $keyValSize = $keyVal.size() )
#if( $keyValSize >= 1 )
#set( $key = $util.urlDecode($keyVal[0]) )
#if( $keyValSize >= 2 )
#set( $val = $util.urlDecode($keyVal[1]) )
#else
#set( $val = '' )
#end
#end
"$key": "$val" #if($foreach.hasNext),#end
#end
})
{
"DeliveryStreamName": "my-firehose",
"Record": {
"Data": "$util.base64encode($data)"
},
"PartitionKey": "1"
}
What am I messing up?
Updates
Based on #michael-sqlbot's pointers, I did find the magic recipe (not the full recipe though)
#set($data = {})
#foreach( $token in $input.path('$').split('&') )
#set( $keyVal = $token.split('=') )
#set( $keyValSize = $keyVal.size() )
#if( $keyValSize >= 1 )
#set( $key = $util.urlDecode($keyVal[0]) )
#if( $keyValSize >= 2 )
#set( $val = $util.urlDecode($keyVal[1]) )
#else
#set( $val = '' )
#end
#end
$!data.put("$key", "$util.parseJson($val)")
#end
{
"DeliveryStreamName": "my-firehose",
"Record": {
"Data": "$util.base64Encode($data)"
},
"PartitionKey": "1"
}
Also, base64encode needs to be base64Encode. With those changes, I'm seeing the data flowing through. The only remaining issue is that it's not exactly JSON:
"data": "{abc={user_info={session_id=}, event_id=77841543625, date_time=2019-12-16T21:26:17.911Z}, sb=, hello=world}"
It's also not quoting strings properly so maybe there's some more figuring out to do.
It took a bit of a nudge from Michael - sqlbot to set me in the right direction, so thanks for that.
A few things were not quite correct in my initial approach:
API Gateway has a smaller set of helpers than something like AppSync. It only supports the methods that are documented here: API Gateway WebSocket API Mapping Template Reference
Somewhere buried in there is also a reference to the Apache Velocity Template Language (VTL), which takes you to the Velocity Template Language reference page.
VTL doc encourages you to check out its User Guide and it is not immediately clear if AWS supports just the language or the Engine. The answer, after a LOT of trial & error, is that it supports a mix.
The bad news is that I cannot use things like $util.qr or $util.toJson, but for my purposes I was able to hack together a solution that meets my goals.
A Velocity object/map is not the same as a JavaScript object. When you use $myVar.put(...) you get an object that serializes to something like { key=Value, deepKey={nestedKey=nestedValue} }. It would be identical to a JS object if you substituted the = signs with :. To make it valid JSON, you'd also have to properly quote the keys and values. As you can imagine, this gets ugly very quickly.
I was pretty close to abandoning this approach and using a Lambda to do the transforms and write the records to Firehose instead, but my URL-encoded data was JSON underneath, which helped in this case.
So without further ado, here's the solution that worked for me:
#set($data = "")
## parse through url encoded data, split into kv pairs
#foreach( $token in $input.path('$').split('&') )
#set( $keyVal = $token.split('=') )
#set( $keyValSize = $keyVal.size() )
#if( $keyValSize >= 1 )
#set( $key = $util.urlDecode($keyVal[0]) )
#if( $keyValSize >= 2 )
#set( $val = $util.urlDecode($keyVal[1]) )
#else
#set( $val = '' )
#end
#end
## append to stringified JSON string
#set($data = "${data}\""${key}\"":$util.escapeJavaScript($val)#if($foreach.hasNext),#end")
#end
#set($data = "{$data.replaceAll('\\', '')}
")
{
"DeliveryStreamName": "my-firehose",
"Record": {
"Data": "$util.base64Encode($data)"
},
"PartitionKey": "1"
}
Explanation
Due to the limitations mentioned above, I abandoned my earlier approach and instead decided to build the stringified JSON myself; that is what the #foreach loop in the template does.
In order to get a proper string that can be serialized, you'd have to escape the double quotes - $util.escapeJavaScript($val) does exactly that.
#set($data = "${data}\""${key}\"":$util.escapeJavaScript($val)#if($foreach.hasNext),#end")
You'd have to wrap the key in double quotes and escape them as well, otherwise the resulting string would be incorrect; \""${key}\"" does that. Figuring out how to escape \" took me a few more tries, since AWS doesn't support the Velocity EscapeTool. The correct sequence is doubled double quotes (to escape the double quotes within the expression) plus a single backslash. (Code highlighting indicated a broken string expression, so I tried \\, but the syntax highlighting turned out to be wrong; a single backslash works fine.) This will result in a string like this:
"{\"abc\":{\"user_info\":{\"session_id\":\"\"},\"event_id\":\"77841543625\",\"date_time\":\"2019-12-16T21:26:17.911Z\",\"sb\":1,\"hello\":\"world\"}"
Next, I remove the slashes from the final JSON, add the wrapping braces and add a newline to each record:
#set($data = "{$data.replaceAll('\\', '')} // <- wrapping braces and slash removal
") // <- this is newline adding block, not a typo
Finally, the Firehose payload is what gets emitted:
{
"DeliveryStreamName": "my-firehose",
"Record": {
"Data": "$util.base64Encode($data)"
},
"PartitionKey": "1"
}
This results in records like this to the final destination:
{"abc":{"user_info":{"session_id": ""}, "event_id": "77841543625", "date_time":"2019-12-16T21:26:17.911Z"}, "sb": 1, "id": "home"}
{"abc":{"user_info":{"session_id": ""}, "event_id": "84154343625", "date_time":"2019-12-16T22:31:43.543Z"}, "sb": 1, "id": "sub"}
...
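For comparison, the transform those templates perform can be written in a few lines of plain JavaScript. This is only an illustration of the logic, not anything API Gateway runs:

```javascript
// Parse an application/x-www-form-urlencoded body into a JSON string,
// mirroring the VTL above: split on '&' and '=', URL-decode both halves.
// Like the template's $keyVal[1], anything after a second '=' is dropped.
function formBodyToJson(body) {
  const parts = [];
  for (const token of body.split("&")) {
    const [rawKey, rawVal = ""] = token.split("=");
    // Form encoding uses '+' for spaces; decode that before percent-decoding,
    // which is roughly what $util.urlDecode does.
    const key = decodeURIComponent(rawKey.replace(/\+/g, " "));
    const val = decodeURIComponent(rawVal.replace(/\+/g, " "));
    // Values that are themselves JSON are embedded verbatim;
    // plain strings are quoted.
    let rendered;
    try {
      JSON.parse(val);
      rendered = val;
    } catch {
      rendered = JSON.stringify(val);
    }
    parts.push(`${JSON.stringify(key)}:${rendered}`);
  }
  return `{${parts.join(",")}}`;
}

console.log(formBodyToJson("hello=world&sb=1"));
// {"hello":"world","sb":1}
```

Unlike the final VTL (which relied on $util.escapeJavaScript plus backslash-stripping because the values were JSON underneath), this sketch embeds values that parse as JSON verbatim and quotes everything else.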
I'm trying to set up a graphql subscription but I'm getting the error:
Unhandled GraphQL subscription error Error: Error during subscription handshake
I'm using AWS Cognito User Pools for the authorisation.
To create the Subscription I'm using:
this.subscription = this.props.client.subscribe({ query: gql(onCreateVehicle) }).subscribe({
next: response => {
console.log(response.data.onCreateVehicle);
},
error: error => {
console.warn(error);
}
});
'onCreateVehicle' was automatically generated by Amplify and looks like this:
export const onCreateVehicle = `subscription OnCreateVehicle($owner: String!) {
onCreateVehicle(owner: $owner) {
id
name
events {
items {
id
name
date
isDeleted
owner
}
nextToken
}
sessions {
items {
id
name
isDeleted
createdAt
owner
}
nextToken
}
isDeleted
owner
}
}
`;
Request resolver:
{
"version": "2018-05-29",
"payload": {}
}
Response resolver:
## [Start] Determine request authentication mode **
#if( $util.isNullOrEmpty($authMode) && !$util.isNull($ctx.identity) && !$util.isNull($ctx.identity.sub) && !$util.isNull($ctx.identity.issuer) && !$util.isNull($ctx.identity.username) && !$util.isNull($ctx.identity.claims) && !$util.isNull($ctx.identity.sourceIp) && !$util.isNull($ctx.identity.defaultAuthStrategy) )
#set( $authMode = "userPools" )
#end
## [End] Determine request authentication mode **
## [Start] Check authMode and execute owner/group checks **
#if( $authMode == "userPools" )
## No Static Group Authorization Rules **
## [Start] Owner Authorization Checks **
#set( $isOwnerAuthorized = false )
## Authorization rule: { allow: owner, ownerField: "owner", identityClaim: "cognito:username" } **
#set( $allowedOwners0 = $util.defaultIfNull($ctx.args.owner, null) )
#set( $identityValue = $util.defaultIfNull($ctx.identity.claims.get("username"),
$util.defaultIfNull($ctx.identity.claims.get("cognito:username"), "___xamznone____")) )
#if( $util.isList($allowedOwners0) )
#foreach( $allowedOwner in $allowedOwners0 )
#if( $allowedOwner == $identityValue )
#set( $isOwnerAuthorized = true )
#end
#end
#end
#if( $util.isString($allowedOwners0) )
#if( $allowedOwners0 == $identityValue )
#set( $isOwnerAuthorized = true )
#end
#end
## [End] Owner Authorization Checks **
## [Start] Throw if unauthorized **
#if( !($isStaticGroupAuthorized == true || $isOwnerAuthorized == true) )
$util.unauthorized()
#end
## [End] Throw if unauthorized **
#end
## [End] Check authMode and execute owner/group checks **
$util.toJson(null)
I know it has something to do with authorisation as it didn't throw the handshake error when I removed the #auth from the schema.graphql. Should there be something in the request resolver to handle the authorisation?
Adam
I just had to pass the 'owner' value through the variables object.
subscribeToVehicles: async () => props.data.subscribeToMore({
document: gql(onCreateVehicle),
variables: {
owner: (await Auth.currentSession()).getIdToken().payload.sub
},
...
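For reference, the owner being passed above is the sub claim from the Cognito ID token payload. Outside of Amplify, pulling a claim out of a raw JWT is just base64url-decoding the middle segment. A Node sketch with a fabricated token (no signature verification, purely to show where sub lives):

```javascript
// Decode the payload segment of a JWT. This does NOT verify the signature;
// it only illustrates where the `sub` claim sits inside the token.
function jwtPayload(token) {
  const segment = token.split(".")[1];
  // JWTs use base64url; convert to standard base64 before decoding.
  const base64 = segment.replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(Buffer.from(base64, "base64").toString("utf8"));
}

// A made-up token whose payload is {"sub":"1234"}:
const header = Buffer.from(JSON.stringify({ alg: "none" })).toString("base64");
const payload = Buffer.from(JSON.stringify({ sub: "1234" })).toString("base64");
console.log(jwtPayload(`${header}.${payload}.`).sub); // "1234"
```

In the Amplify snippet this decoding is done for you: `(await Auth.currentSession()).getIdToken().payload.sub` returns the same claim.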
I am doing this in my Appsync Resolver:
{
"version" : "2017-02-28",
"operation" : "UpdateItem",
"key" : {
"pk" : { "S" : "Container" },
"id" : { "S" : "${ctx.args.id}" }
},
"update" : {
"expression" : "SET #name = :name, description = :description",
"expressionNames": {
"#name" : "name"
},
"expressionValues": {
":name" : { "S": "${context.arguments.name}" },
":description" : { "S": "${context.arguments.description}" }
}
}
}
But sometimes I may not pass in both name and description. How would I make it not SET those columns when those args are null?
All you need to do is build your own SET expression, with condition checks based on your needs. The expression below checks whether an argument is null or empty; if it is, I don't update that attribute.
#set( $expression = "SET" )
#set( $expValues = {} )
## NAME
#if( !$util.isNullOrEmpty(${context.arguments.name}) )
#set( $expression = "${expression} name = :name" )
$!{expValues.put(":name", { "S" : "${context.arguments.name}" })}
#end
## DESCRIPTION
#if( !$util.isNullOrEmpty(${context.arguments.description}) )
#if( ${expression} != "SET" )
#set( $expression = "${expression}," )
#end
#set( $expression = "${expression} description = :description" )
$!{expValues.put(":description", { "S" : "${context.arguments.description}" })}
#end
{
"version" : "2017-02-28",
"operation" : "UpdateItem",
"key" : {
"pk" : { "S" : "Container" },
"id" : { "S" : "${context.arguments.id}" }
},
"update" : {
"expression" : "${expression}",
"expressionValues": $util.toJson($expValues)
}
}
Hope it is useful!
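The same skip-the-nulls logic can be written in plain JavaScript for comparison. This is a sketch of the expression building only (the field list is hardcoded for illustration), not the resolver itself:

```javascript
// Build a DynamoDB SET expression that only includes arguments that were
// actually provided, mirroring the VTL answer above.
function buildUpdate(args) {
  let expression = "SET";
  const expressionValues = {};
  for (const field of ["name", "description"]) {
    const value = args[field];
    if (value === null || value === undefined || value === "") continue;
    // Add a comma between clauses, but not after the leading "SET".
    if (expression !== "SET") expression += ",";
    expression += ` ${field} = :${field}`;
    expressionValues[`:${field}`] = { S: value };
  }
  return { expression, expressionValues };
}

console.log(buildUpdate({ name: "Box" }).expression);
// "SET name = :name"
```

If neither argument is present, the result is a bare "SET", which DynamoDB rejects; in the resolver you'd want to guard against that case as well.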
This is very much possible. You just have to add a simple if statement to check if the value is there. A parallel example can be seen in the docs here: https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-dynamodb-resolvers.html
Specifically, that example (below) shows optional arguments being applied to a list operation.
{
"version" : "2017-02-28",
"operation" : "Scan"
#if( ${context.arguments.count} )
,"limit": ${context.arguments.count}
#end
#if( ${context.arguments.nextToken} )
,"nextToken": "${context.arguments.nextToken}"
#end
}
Just applying that #if null check should work for you.