Unable to access DynamoDB from Lambda - amazon-web-services

I have created a Lambda Java project to get items from DynamoDB, but I am getting an error when accessing the table.
The code I wrote:
dbClient = AmazonDynamoDBClientBuilder.standard().withRegion(Regions.US_EAST_1).build();
DynamoDB dynamoDB = new DynamoDB(dbClient);
Table table = dynamoDB.getTable("TokenSystem");
Item item = table.getItem("TokenId", 123456);
return item.toJSON();
The error I get in the Lambda console:
com.amazonaws.services.dynamodbv2.model.AmazonDynamoDBException: The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 9OJVLPJHV011KLGKI20Q8QN2FNVV4KQNSO5AEMVJF66Q9ASUAAJG)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1658)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1322)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1072)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:745)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:719)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:701)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:669)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:651)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:515)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:3609)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:3578)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:3567)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.executeGetItem(AmazonDynamoDBClient.java:1869)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.getItem(AmazonDynamoDBClient.java:1840)
at com.amazonaws.services.dynamodbv2.document.internal.GetItemImpl.doLoadItem(GetItemImpl.java:77)
at com.amazonaws.services.dynamodbv2.document.internal.GetItemImpl.getItemOutcome(GetItemImpl.java:40)
at com.amazonaws.services.dynamodbv2.document.internal.GetItemImpl.getItemOutcome(GetItemImpl.java:99)
at com.amazonaws.services.dynamodbv2.document.internal.GetItemImpl.getItem(GetItemImpl.java:111)
at com.amazonaws.services.dynamodbv2.document.Table.getItem(Table.java:624)
at com.amazonaws.lambda.service.TokenValidatorService.retrieveItemFromDB(TokenValidatorService.java:82)
at com.amazonaws.lambda.service.TokenValidatorService.checkToken(TokenValidatorService.java:42)
In DynamoDB I have a table with the same name in the same region. I am using the AWS package com.amazonaws.services.dynamodbv2 for all the operations. Can anyone help me solve the issue?

The error "The provided key element does not match the schema" means the key you pass to getItem does not match the table's key schema.
Make sure you provide the right key name and attribute type for the table's partition key.
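If you are unsure which attribute and type the table actually uses as its key, describing the table shows the key schema. A minimal sketch (in Python/boto3 here, but the CLI or the Java SDK works just as well; the table name and region are taken from the question):
import boto3

# Describe the table to see which attribute(s) form the key schema
# and what their types are (S, N or B).
client = boto3.client("dynamodb", region_name="us-east-1")
description = client.describe_table(TableName="TokenSystem")

for key in description["Table"]["KeySchema"]:
    print(key["AttributeName"], key["KeyType"])
for attr in description["Table"]["AttributeDefinitions"]:
    print(attr["AttributeName"], attr["AttributeType"])
If the partition key turns out to be stored as a String (type S), then getItem("TokenId", 123456) with a numeric value fails with exactly this ValidationException; both the key name and the key type have to match.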

Related

How to access Columnar URL INDEX using Amazon Athena

I am new to AWS and I'm following this tutorial to access the columnar URL index dataset in Common Crawl. I executed this query:
SELECT COUNT(*) AS count,
url_host_registered_domain
FROM "ccindex"."ccindex"
WHERE crawl = 'CC-MAIN-2018-05'
AND subset = 'warc'
AND url_host_tld = 'no'
GROUP BY url_host_registered_domain
HAVING (COUNT(*) >= 100)
ORDER BY count DESC
And I keep getting this error:
Error opening Hive split s3://commoncrawl/cc-index/table/cc-main/warc/crawl=CC-MAIN-2018-05/subset=warc/part-00082-248eba37-08f7-4a53-a4b4-d990640e4be4.c000.gz.parquet (offset=0, length=33554432): com.amazonaws.services.s3.model.AmazonS3Exception: Please reduce your request rate. (Service: Amazon S3; Status Code: 503; Error Code: SlowDown; Request ID: ZSRS4FD2ZTNJY9PV; S3 Extended Request ID: IvDfkWdbDYXjjOPhmXSQD3iVkBiE2Kl1/K3xaFc1JulOhCIcDbWUhnbww7juthZIUm2hZ9ICiwg=; Proxy: null), S3 Extended Request ID: IvDfkWdbDYXjjOPhmXSQD3iVkBiE2Kl1/K3xaFc1JulOhCIcDbWUhnbww7juthZIUm2hZ9ICiwg=
What's the reason? And how do I resolve it?
You are hitting the request rate limit of S3, since your query is trying to access too many Parquet files at the same time. Consider compacting the underlying files into fewer, larger ones.
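Compaction is typically done with an Athena CTAS statement that rewrites the data into fewer, larger Parquet files in a bucket you own; since the Common Crawl bucket is not yours, this only helps for data you control or for a copy you materialize yourself. A rough sketch via boto3, where my_db and my-athena-results are hypothetical names:
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# CTAS rewrites the selected data into a smaller number of larger
# Parquet files under an S3 prefix you own.
ctas = """
CREATE TABLE my_db.ccindex_compacted
WITH (
    format = 'PARQUET',
    external_location = 's3://my-athena-results/ccindex-compacted/',
    bucketed_by = ARRAY['url_host_registered_domain'],
    bucket_count = 50
) AS
SELECT *
FROM "ccindex"."ccindex"
WHERE crawl = 'CC-MAIN-2018-05' AND subset = 'warc'
"""

athena.start_query_execution(
    QueryString=ctas,
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/query-output/"},
)
Subsequent queries against my_db.ccindex_compacted then touch far fewer S3 objects.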

Dynamically Insert/Update Item in DynamoDB With Python Lambda using event['body']

I am working on a Lambda function that gets called from API Gateway and updates information in DynamoDB. I have half of this working dynamically, and I'm a little stuck on updating. Here is what I'm working with:
A DynamoDB table with a partition key of guild_id
The dummy JSON I'm using:
{
  "guild_id": "126",
  "guild_name": "Posted Guild",
  "guild_premium": "true",
  "guild_prefix": "z!"
}
Finally, the Lambda code:
import json
import boto3

def lambda_handler(event, context):
    client = boto3.resource("dynamodb")
    table = client.Table("guildtable")
    itemData = json.loads(event['body'])
    guild = table.get_item(Key={'guild_id': itemData['guild_id']})
    # If guild exists, update
    if 'Item' in guild:
        table.update_item(Key=itemData)
        responseObject = {}
        responseObject['statusCode'] = 200
        responseObject['headers'] = {}
        responseObject['headers']['Content-Type'] = 'application/json'
        responseObject['body'] = json.dumps('Updated Guild!')
        return responseObject
    # New guild, insert guild
    table.put_item(Item=itemData)
    responseObject = {}
    responseObject['statusCode'] = 200
    responseObject['headers'] = {}
    responseObject['headers']['Content-Type'] = 'application/json'
    responseObject['body'] = json.dumps('Inserted Guild!')
    return responseObject
The insert part is working wonderfully. How would I accomplish a similar approach with update_item? I want this to be as dynamic as possible, so I can throw any JSON (within reason) at it and have it stored in the database. I also want the update method to handle fields that may be added down the road.
I get the following error:
Lambda execution failed with status 200 due to customer function error: An error occurred (ValidationException) when calling the UpdateItem operation: The provided key element does not match the schema.
A "The provided key element does not match the schema" error means something is wrong with Key (= primary key). Your schema's primary key is guild_id: string. Non-key attributes belong in the AttributeUpdate parameter. See the docs.
Your itemdata appears to include non-key attributes. Also ensure guild_id is a string "123" and not a number type 123.
goodKey={"guild_id": "123"}
table.update_item(Key=goodKey, UpdateExpression="SET ...")
The docs have a full update_item example.
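To keep the update as dynamic as the insert, one option (a sketch, assuming every non-key field in the body should be written) is to build the UpdateExpression from whatever attributes arrive:
def build_dynamic_update(table, item_data, key_name="guild_id"):
    """Update every non-key attribute present in item_data."""
    key = {key_name: item_data[key_name]}
    # Everything except the partition key becomes part of the update
    # (assumes at least one non-key attribute is present in the body).
    attrs = {k: v for k, v in item_data.items() if k != key_name}

    update_expr = "SET " + ", ".join(f"#{k} = :{k}" for k in attrs)
    expr_names = {f"#{k}": k for k in attrs}
    expr_values = {f":{k}": v for k in attrs}

    return table.update_item(
        Key=key,
        UpdateExpression=update_expr,
        ExpressionAttributeNames=expr_names,
        ExpressionAttributeValues=expr_values,
    )
In the handler above, build_dynamic_update(table, itemData) would then replace the failing table.update_item(Key=itemData) call.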

Terraform "primary workGroup could not be created"

I'm trying to execute a query on my table in Amazon Athena, but I can't execute any query. I got this error message:
Before you run your first query, you need to set up a query result location in Amazon S3.
Your query has the following error(s):
No output location provided. An output location is required either through the Workgroup result configuration setting or as an API input. (Service: AmazonAthena; Status Code: 400; Error Code: InvalidRequestException; Request ID: b6b9aa41-20af-4f4d-91f6-db997e226936)
So I'm trying to add a workgroup, but I have this problem:
'Error: error creating Athena WorkGroup: InvalidRequestException: primary workGroup could not be created
{
  RespMetadata: {
    StatusCode: 400,
    RequestID: "c20801a0-3c13-48ba-b969-4e28aa5cbf86"
  },
  AthenaErrorCode: "INVALID_INPUT",
  Message_: "primary workGroup could not be created"
}
'
My code:
resource "aws_s3_bucket" "tony" {
bucket = "tfouh"
}
resource "aws_athena_workgroup" "primary" {
name = "primary"
depends_on = [aws_s3_bucket.tony]
configuration {
enforce_workgroup_configuration = false
publish_cloudwatch_metrics_enabled = true
result_configuration {
output_location = "s3://${aws_s3_bucket.tony.bucket}/"
encryption_configuration {
encryption_option = "SSE_S3"
}
}
}
}
Is there a solution for this?
This probably happens because you already have a primary workgroup, so you can't create a new one with the same name. Just create a workgroup with a different name if you want:
name = "primary2"
@Marcin suggested a valid approach, but what may be closer to what you are looking for would be to import the existing workgroup into the state:
terraform import aws_athena_workgroup.primary primary
Once the state knows about the already existing resource, Terraform can plan and apply possible changes.

Get Item from a table in Dynamodb AWS

I am trying to get_item from a table in DynamoDB.
def read_table_item(table_name, pk_name, pk_value):
    """
    Return item read by primary key.
    """
    table = dynamodb.Table(table_name)
    response = table.get_item(Key={pk_name: pk_value})
    return response

print(read_table_item(table_name, pk_name="_id", pk_value={"S": str(1)}))
The error I get is
"botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the GetItem operation: The provided key element does not match the schema"
It would be helpful if somebody could review the above piece and help us rectify the issue.
Thanks
It seems to me that whatever you are passing as pk_name is not in the DynamoDB schema. See the following for more information:
Incorrect dynamo db key mapping
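In addition, with the resource-level Table API the key value is passed as a plain Python value; passing the low-level {"S": str(1)} form through it can also trigger this error, because the dict is serialized as a map rather than a string. A minimal sketch, assuming _id is a String partition key and my-table is a placeholder name:
import boto3

dynamodb = boto3.resource("dynamodb")

def read_table_item(table_name, pk_name, pk_value):
    """Return the item identified by its primary key."""
    table = dynamodb.Table(table_name)
    # The resource API handles type marshalling, so pass the raw value.
    return table.get_item(Key={pk_name: pk_value})

print(read_table_item("my-table", pk_name="_id", pk_value="1"))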

AppSync+ DynamoDB: The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException;

How do I map a Number type field to the AppSync table schema?
We have a DynamoDB table with a Number type field as the primary partition key. In the AppSync console, when we create an API via "Import DynamoDB table", I have tried:
(1) Mapping it to String/ID: Scan works but GetItem does not. Both mappings give this error message:
The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 4PA6K11N026Q9O2AOJ408PSUHNVV4KQNSO5AEMVJF66Q9ASUAAJG)
(2) Mapping it to Int: neither Scan nor GetItem works. Scan gives this error message:
Can't serialize value (/listInstruments/items[0]/instrumentPermId) : Expected type 'Int' but was 'Long'
And GetItem gives this error message:
Validation error of type WrongType: argument 'instrumentPermId' with value 'IntValue{value=192760238682}' is not a valid 'Int' # 'getInstrument'
Could anyone suggest a solution?