I have tried to use a direct Lambda resolver with my AppSync schema to get items from a DynamoDB table, but I am getting an error. I can see the results in the CloudWatch logs, but I can't get the result on the query page.
One other question: my Post has @hasMany comments and Comment @belongsTo Post. How can I get the comments directly with the post using a direct Lambda resolver? I think I have to do a separate query for the comments? The AppSync pipeline is good at resolving these queries, but it is very slow.
My lambda function:
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  const response = await dynamo.get({
    TableName: "Post-xxxxxxxxxxxxx",
    Key: {
      id: event.arguments.id
    }
  }).promise();
  console.log(response); // I can see the response here in CloudWatch.
  return JSON.stringify(response);
};
My query:
query MyQuery {
  getPost(id: "xxxxxx-xxxxx-xxxx-xxxxxxx") {
    id
    title
  }
}
Query result:
{
  "data": {
    "getPost": null
  },
  "errors": [
    {
      "path": [
        "getPost",
        "id"
      ],
      "locations": null,
      "message": "Cannot return null for non-nullable type: 'ID' within parent 'Post' (/getPost/id)"
    },
    {
      "path": [
        "getPost",
        "title"
      ],
      "locations": null,
      "message": "Cannot return null for non-nullable type: 'String' within parent 'Post' (/getPost/title)"
    }
  ]
}
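Based on the error, a likely cause is that the handler returns a JSON string rather than an object: with a direct Lambda resolver (and no response mapping template), the handler's return value is used as the field value, so getPost needs the Post object itself. A sketch of that idea, not a confirmed fix, using the same placeholder table name as above:
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  const response = await dynamo.get({
    TableName: "Post-xxxxxxxxxxxxx",
    Key: { id: event.arguments.id }
  }).promise();
  // Return the plain Item object (not JSON.stringify(response)) so that
  // getPost.id and getPost.title can be resolved by AppSync.
  return response.Item;
};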
Related
I'm using an Amplify stack and need to perform some actions on my GraphQL API, which has DynamoDB behind it. The request in my Lambda function returns an Unauthorized error: "Not Authorized to access getSourceSync on type SourceSync", where getSourceSync is the GraphQL query and SourceSync is the model name.
My schema.graphql for this particular model is set up as follows. Note the auth rule allowing private access with the iam provider:
type SourceSync @model(subscriptions: { level: off }) @auth(rules: [
  {allow: private, provider: iam},
  {allow: groups, groups: ["Admins"], provider: userPools},
  {allow: groups, groups: ["Users"], operations: [create], provider: userPools},
  {allow: groups, groupsField: "readGroups", operations: [create, read], provider: userPools},
  {allow: groups, groupsField: "editGroups", provider: userPools}]) {
  id: ID! @primaryKey
  name: String
  settings_id: ID @index(name: "bySettingsId", queryField: "sourceSyncBySettingsId")
  settings: Settings @hasOne(fields: ["settings_id"])
  childLookup: String
  createdAt: AWSDateTime!
  updatedAt: AWSDateTime!
  _createdBy: String
  _lastChangedBy: String
  _localChanges: AWSJSON
  readGroups: [String]
  editGroups: [String]
}
My lambda function's role has the following inline policy attached to it. (Actual ID values have been omitted for security purposes on this post):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "appsync:GraphQL"
      ],
      "Resource": [
        "arn:aws:appsync:us-east-1:111myaccountID:apis/11mygraphqlapiID/*"
      ],
      "Effect": "Allow"
    },
    {
      "Action": [
        "appsync:GetType"
      ],
      "Resource": [
        "*"
      ],
      "Effect": "Allow"
    }
  ]
}
And finally my lambda function is set up as follows with a simple query test:
/* stuff */
"use strict";
const axios = require("axios");
const awsAppSync = require("aws-appsync").default;
const gql = require("graphql-tag");
require("cross-fetch/polyfill");
const { PassThrough } = require("stream");
const aws = require("aws-sdk");

aws.config.update({
  region: process.env.AWS_REGION,
});

const appSync = new aws.AppSync();
const graphqlClient = new awsAppSync({
  url: process.env.API_GRAPHQLAPIENDPOINTOUTPUT,
  region: process.env.AWS_REGION,
  auth: {
    type: "AWS_IAM",
    credentials: aws.config.credentials,
  },
  disableOffline: true
});

exports.handler = async (event, context) => {
  console.log('context :: ' + JSON.stringify(context));
  console.log('aws config :: ' + JSON.stringify(aws.config));

  const sourceSyncTypes = await appSync
    .getType({
      apiId: process.env.API_GRAPHQLAPIIDOUTPUT,
      format: "JSON",
      typeName: "SourceSync",
    })
    .promise();
  console.log('ss = ' + JSON.stringify(sourceSyncTypes));

  try {
    const qs = gql`query GetSourceSync {
      getSourceSync(id: "ov3") {
        id
        name
      }
    }`;
    const res = await graphqlClient.query({ query: qs, fetchPolicy: 'no-cache' });
    console.log(JSON.stringify(res));
  }
  catch (e) {
    console.log('ERR :: ' + e);
    console.log(JSON.stringify(e));
  }
};
Found the solution: there seems to be an issue with triggering a rebuild of the resolvers on the API after permitting a function to access the GraphQL API. However, there is a distinction to note:
If the GraphQL API is part of an Amplify app stack, then only functions created through the Amplify CLI for that app (e.g. amplify add function) and given access to the API there will be able to access the API.
Additionally, when you create or update the function to give it permissions, you must ensure that during the amplify push operation the API stack is also updated. You can trigger this by simply adding or removing a space in a comment inside your amplify/backend/api/<api-name>/schema.graphql file.
If the function was created "ad hoc" directly through the AWS console, but it is trying to access a GraphQL API that was created as part of an Amplify app stack, then you will need to put that function's role in amplify/backend/api/<api-name>/custom-roles.json in the format
{
  "adminRoleNames": ["<role name>", "<role name 2>", ...]
}
Documentation references here.
If neither your API nor your Lambda function was created with the Amplify CLI as part of an app stack, then you just need to give the Lambda's role access to the GraphQL resources for Query, Mutation and Subscription in IAM, via inline policies or a pre-defined policy.
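For that last case, a rough example of such an inline policy (the account ID, API ID and region are placeholders; scoping to the three top-level types is one reasonable choice, not the only one):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "appsync:GraphQL",
      "Resource": [
        "arn:aws:appsync:us-east-1:<account-id>:apis/<api-id>/types/Query/fields/*",
        "arn:aws:appsync:us-east-1:<account-id>:apis/<api-id>/types/Mutation/fields/*",
        "arn:aws:appsync:us-east-1:<account-id>:apis/<api-id>/types/Subscription/fields/*"
      ]
    }
  ]
}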
I have created a schema and tested a query in it (by making a GET request to the HTTP endpoint used as the data source), which works fine. However, any mutation (containing POST/PUT/PATCH/DELETE requests) to the HTTP endpoint does not send the values in the request payload to the server.
This is the GraphQL mutation that I am running from AppSync Console -
mutation MyMutation {
  createMain(input: {displayName: "test user", firstName: "test", id: 0, lastName: "user"}) {
    displayName
    firstName
    id
    lastName
  }
}
Following is the response to the above mutation -
{
  "data": {
    "createMain": {
      "displayName": null,
      "firstName": null,
      "id": 11,
      "lastName": null
    }
  }
}
The point worth noting here is that these values are getting saved in the DB; that is how the "id" is getting generated.
Any help is highly appreciated. Thanks!
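One thing worth checking, as an assumption since the resolver templates are not shown here: with an HTTP data source, the mutation's request mapping template has to forward the arguments in params.body explicitly, roughly along these lines (the resourcePath "/main" is a made-up example):
{
  "version": "2018-05-29",
  "method": "POST",
  "resourcePath": "/main",
  "params": {
    "headers": {
      "Content-Type": "application/json"
    },
    "body": $util.toJson($ctx.args.input)
  }
}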
I have an API Gateway setup which sends to SQS, which fires a Lambda. I am trying to pass message attributes to SQS, but when I hit the endpoint in Postman I keep getting a 400 Bad Request. What is the right way to send the attributes in a JSON POST body?
Here is the body from Postman (I have tried a few options based on this link):
"message": "Message",
"MessageAttributes": {
"Name": "Name",
"Type": "String",
"Value": "my value"
}
}
Here is how API Gateway is configured
In case someone stumbles on this later, here is what worked from the CDK side:
let integration = new apiGateway.CfnIntegration(this, 'Integration', {
  apiId: props.httpApi.httpApiId,
  payloadFormatVersion: '1.0',
  integrationType: 'AWS_PROXY',
  credentialsArn: apigwRole.roleArn,
  integrationSubtype: 'SQS-SendMessage',
  requestParameters: {
    QueueUrl: sqsqueue.queueUrl,
    MessageBody: '$request.body',
    MessageAttributes: '$request.body.MessageAttributes'
  }
})

new apiGateway.CfnRoute(this, 'Route', {
  apiId: props.httpApi.httpApiId,
  routeKey: apiGateway.HttpRouteKey.with('/url/foo', apiGateway.HttpMethod.POST).key,
  target: `integrations/${integration.ref}`
}).addDependsOn(integration);
And the CloudFormation:
MessageBody: $request.body
MessageAttributes: $request.body.MessageAttributes
Then in Postman, the POST body with content type application/json:
{
  "message": "Message",
  "MessageAttributes": {
    "Attributes": {
      "DataType": "String",
      "StringValue": "my value"
    }
  }
}
The Lambda would log both out separately for each Record from the event body:
{
  Records: [
    {
      ...
      body: 'Message',
      attributes: [Object],
      messageAttributes: [Object]
    }
  ]
}
The messageAttributes object from above:
{
  Attributes: {
    stringValue: 'my value',
    stringListValues: [],
    binaryListValues: [],
    dataType: 'String'
  }
}
This is using AWS API Gateway v2 HTTP API also
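For completeness, a small sketch of how the consuming Lambda could pull the attribute back out of each record (the "Attributes" key matches the Postman body above; the handler itself is illustrative):
exports.handler = async (event) => {
  for (const record of event.Records) {
    // In the Lambda event, SQS lowercases the value keys: stringValue / dataType.
    const attrs = record.messageAttributes || {};
    const value = attrs.Attributes ? attrs.Attributes.stringValue : undefined;
    console.log('body:', record.body, 'Attributes value:', value);
  }
};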
I am using the AWS Console and NodeJS.
I have a DynamoDB table of users with a partition key (user_id), a sort key (company_id) and other attributes.
One of the attributes is the user's email. Email is a unique attribute.
I need to get the user_id by email, but I don't have the user_id or company_id.
I think that I should use a Global Secondary Index.
I clicked on the users table, opened the Indexes tab and created a GSI for this table (name: email, type: GSI, partition key: email string, attributes: user_id).
I am using the Query method from DocumentClient. This is my payload:
payload = {
  "TableName": "users",
  "IndexName": "email",
  "KeyConditionExpression": "#index = :index_value",
  "ExpressionAttributeNames": {
    "#index": "email"
  },
  "ExpressionAttributeValues": {
    ":index_value": {"S": "test@gmail.com"}
  },
  "ProjectionExpression": "user_id",
  "ScanIndexForward": false
};
This is my error from CloudWatch:
"errorMessage": "One or more parameter values were invalid: Condition parameter type does not match schema type"
I found a solution while I was writing this question.
Since I use DocumentClient, my payload should look like this:
payload = {
  "TableName": "users",
  "IndexName": "email",
  "KeyConditionExpression": "#index = :index_value",
  "ExpressionAttributeNames": {
    "#index": "email"
  },
  "ExpressionAttributeValues": {
    ":index_value": "test@gmail.com" // <----------------
  },
  "ProjectionExpression": "user_id",
  "ScanIndexForward": false
};
Hope it helps someone.
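As a usage sketch, assuming the corrected payload object above and the standard DocumentClient query call (variable and handler names are illustrative):
const AWS = require("aws-sdk");
const documentClient = new AWS.DynamoDB.DocumentClient();

exports.handler = async () => {
  const result = await documentClient.query(payload).promise();
  // DynamoDB does not enforce uniqueness on GSI keys, so take the first match.
  return result.Items.length > 0 ? result.Items[0].user_id : null;
};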
I have the following Lambda function configured in AWS Lambda :
var AWS = require('aws-sdk');
var DOC = require('dynamodb-doc');
var dynamo = new DOC.DynamoDB();

exports.handler = function(event, context) {
  var item = {
    id: 123,
    foo: "bar"
  };

  var cb = function(err, data) {
    if (err) {
      console.log(err);
      context.fail('unable to update hit at this time' + err);
    } else {
      console.log(data);
      context.done(null, data);
    }
  };

  // This doesn't work. How do I get current stage?
  tableName = 'my_dynamo_table_' + stage;
  dynamo.putItem({TableName: tableName, Item: item}, cb);
};
Everything works as expected (I insert an item in DynamoDB every time I call it).
I would like the dynamo table name to depend on the stage in which the lambda is deployed.
My table would be:
my_dynamo_table_staging for stage staging
my_dynamo_table_prod for stage prod
However, how do I get the name of the current stage inside the Lambda?
Edit: My Lambda is invoked by HTTP via an endpoint defined with API Gateway
If you have checked "Lambda Proxy Integration" in your method's Integration Request in API Gateway, you should receive the stage from API Gateway, as well as any stage variables you have configured.
Here's an example of an event object from a Lambda function invoked by API Gateway configured with "Lambda Proxy Integration":
{
  "resource": "/resourceName",
  "path": "/resourceName",
  "httpMethod": "POST",
  "headers": {
    "header1": "value1",
    "header2": "value2"
  },
  "queryStringParameters": null,
  "pathParameters": null,
  "stageVariables": null,
  "requestContext": {
    "accountId": "123",
    "resourceId": "abc",
    "stage": "dev",
    "requestId": "456",
    "identity": {
      "cognitoIdentityPoolId": null,
      "accountId": null,
      "cognitoIdentityId": null,
      "caller": null,
      "apiKey": null,
      "sourceIp": "1.1.1.1",
      "accessKey": null,
      "cognitoAuthenticationType": null,
      "cognitoAuthenticationProvider": null,
      "userArn": null,
      "userAgent": "agent",
      "user": null
    },
    "resourcePath": "/resourceName",
    "httpMethod": "POST",
    "apiId": "abc123"
  },
  "body": "body here",
  "isBase64Encoded": false
}
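With proxy integration the stage can then be read straight off the event. A short sketch tying that back to the table-name question above (assuming the event shape shown):
exports.handler = async (event) => {
  // "stage" comes from requestContext when Lambda Proxy Integration is enabled.
  const stage = event.requestContext.stage; // e.g. "dev"
  const tableName = 'my_dynamo_table_' + stage;
  console.log(tableName);
  // ... call dynamo.putItem({TableName: tableName, Item: item}, cb) as before
};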
I managed it after much fiddling. Here is a walkthrough:
I assume that you have API Gateway and Lambda configured. If not, here's a good guide. You need part-1 and part-2. You can skip the end of part-2 by clicking the newly introduced button "Enable CORS" in API Gateway
Go to API Gateway.
Open your method's Integration Request (the original post showed these two steps as screenshots).
Then expand Body Mapping Templates, enter application/json as the content type, click the add button, select the mapping template, and click edit.
And paste the following content in "Mapping Template":
{
  "body" : $input.json('$'),
  "headers": {
    #foreach($param in $input.params().header.keySet())
    "$param": "$util.escapeJavaScript($input.params().header.get($param))" #if($foreach.hasNext),#end
    #end
  },
  "stage" : "$context.stage"
}
Then click the button "Deploy API" (this is important for changes in API Gateway to take effect)
You can test by changing the Lambda function to this:
var AWS = require('aws-sdk');
var DOC = require('dynamodb-doc');
var dynamo = new DOC.DynamoDB();

exports.handler = function(event, context) {
  var currentStage = event['stage'];
  if (true || !currentStage) { // Used for debugging
    context.fail('Cannot find currentStage.' + ' stage is:' + currentStage);
    return;
  }
  // ...
}
Then call your endpoint. You should get an HTTP 200 response, with the following response body:
{"errorMessage":"Cannot find currentStage. stage is:development"}
Important note:
If you have a Body Mapping Template that is too simple, like this: {"stage" : "$context.stage"}, it will override the params in the request. That's why the body and headers keys are present in the Body Mapping Template above. If they are not, your Lambda has no access to them.
For those who use the Serverless Framework, it's already implemented and they can access event.stage without any additional configuration.
See this issue for more information.
You can get it from the event variable. I logged my event object and got this:
{
  ...
  "resource": "/test",
  "stageVariables": {
    "Alias": "beta"
  }
}
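A minimal sketch of reading a stage variable like that in the handler (the "Alias" name comes from the example above; using it to build the table name is just an illustration):
exports.handler = async (event) => {
  const stageVariables = event.stageVariables || {};
  const alias = stageVariables.Alias; // "beta" in the example above
  const tableName = 'my_dynamo_table_' + alias;
  return tableName;
};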