This is pretty much straight out of the documentation (http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Expressions.Modifying.html), but like most things AWS, it just doesn't work and isn't documented well at all.
The following fails with "The document path provided in the update expression is invalid for update":
var messageId = "f244ed33-d678-4763-8f46-0f06514d5139";
var sequenceId = "00000000-0000-0000-0000-000000000000";
var now = DateTime.UtcNow;
var lastActivityThreshold = now.Subtract(TimeSpan.FromSeconds(10));

var response = client.UpdateItem(new UpdateItemRequest
{
    TableName = "TheTable",
    Key = new Dictionary<string, AttributeValue> { { "SequenceId", new AttributeValue { S = sequenceId } } },
    ExpressionAttributeValues = new Dictionary<string, AttributeValue>
    {
        { ":MessageId", new AttributeValue { S = messageId } },
        { ":Now", new AttributeValue { N = now.Ticks.ToString() } },
        { ":LastActivityThreshold", new AttributeValue { N = lastActivityThreshold.Ticks.ToString() } },
    },
    UpdateExpression = "REMOVE Messages[0] SET LastActivity = :Now",
    ConditionExpression = "Messages[0] <> :MessageId AND (LastActivity <= :LastActivityThreshold OR attribute_not_exists(LastActivity))",
    ReturnValues = ReturnValue.UPDATED_NEW
});
This is the document I'm trying to update (as seen in JSON view in the AWS Management Console):
{
    "LastActivity": {
        "N": "635753575712635873"
    },
    "Messages": {
        "SS": [
            "f244ed33-d678-4763-8f46-0f06514d5139",
            "f668d2a5-3a4a-4564-8384-5b5a51c9bad3"
        ]
    },
    "SequenceId": {
        "S": "00000000-0000-0000-0000-000000000000"
    }
}
I've tried many variations of the code above, straight down to removing all ExpressionAttributeValues and the ConditionExpression and just using REMOVE Messages[0], but it still throws the same error.
It looks like you're trying to apply a document path to a non-document attribute: Messages is stored as a string set (SS), and an indexed path such as Messages[0] is only valid on a list. There's no concept of ordering in a set, so if you want to remove the "first" item you'll need to load the set into memory and iterate over it yourself. In short, you'll need to store Messages as a list in this case.
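For illustration, here is a minimal sketch (reusing the table and attribute names from the question; the put request itself is hypothetical) of writing Messages as a list type, after which the original REMOVE expression becomes valid:

// Sketch: store Messages as a list (L) rather than a string set (SS),
// so index-based document paths like Messages[0] are legal.
client.PutItem(new PutItemRequest
{
    TableName = "TheTable",
    Item = new Dictionary<string, AttributeValue>
    {
        { "SequenceId", new AttributeValue { S = sequenceId } },
        { "LastActivity", new AttributeValue { N = now.Ticks.ToString() } },
        { "Messages", new AttributeValue
            {
                L = new List<AttributeValue>
                {
                    new AttributeValue { S = "f244ed33-d678-4763-8f46-0f06514d5139" },
                    new AttributeValue { S = "f668d2a5-3a4a-4564-8384-5b5a51c9bad3" }
                }
            }
        }
    }
});
// With Messages as a list, "REMOVE Messages[0] SET LastActivity = :Now"
// and the condition "Messages[0] <> :MessageId" both work as written.

One caveat: unlike sets, lists allow duplicates, so de-duplication becomes the application's responsibility.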
Related
I have a DynamoDB table that I need to read/write to. I am trying to create a model for reading and writing from DynamoDB with Kotlin. But I keep encountering com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: MyModelDB[myMap]; could not unconvert attribute when I run dynamoDBMapper.scanPage(...). Sometimes the error names MyListOfMaps instead, but I guess that comes from iterating over the keys of a map.
My code is below:
@DynamoDBTable(tableName = "") // Non-issue, I am assigning the table name in the DynamoDBMapper
data class MyModelDB(
    @DynamoDBHashKey(attributeName = "id")
    var id: String,

    @DynamoDBAttribute(attributeName = "myMap")
    var myMap: MyMap,

    @DynamoDBAttribute(attributeName = "MyListOfMapItems")
    var myListOfMapItems: List<MyMapItem>,
) {
    constructor() : this(id = "", myMap = MyMap(), myListOfMapItems = mutableListOf())

    @DynamoDBDocument
    class MyMap {
        @get:DynamoDBAttribute(attributeName = "myMapAttr")
        var myMapAttr: MyMapAttr = MyMapAttr()

        @DynamoDBDocument
        class MyMapAttr {
            @get:DynamoDBAttribute(attributeName = "stringValue")
            var stringValue: String = ""
        }
    }

    @DynamoDBDocument
    class MyMapItem {
        @get:DynamoDBAttribute(attributeName = "myMapItemAttr")
        var myMapItemAttr: String = ""
    }
}
I am using the com.amazonaws:aws-java-sdk-dynamodb:1.11.500 package and my dynamoDBMapper is initialised with DynamoDBMapperConfig.Builder().build() (along with some other configurations).
My question is what am I doing wrong and why? I have also seen that some Java implementations use DynamoDBTypeConverter. Is it better and I should be using that instead?
Any examples would be appreciated!
A couple of comments here. First, you are not using the AWS SDK for Kotlin; you are using another SDK and simply writing Kotlin code. With that SDK you don't get the full benefit of Kotlin features such as coroutine support.
The AWS SDK for Kotlin (which does offer full support for Kotlin features) was just released as a developer preview this week. See the Developer Guide:
Setting up the AWS SDK for Kotlin
However, this SDK does not support this mapping as of now. To place items into an Amazon DynamoDB table using the AWS SDK for Kotlin, you need to use:
mutableMapOf<String, AttributeValue>
Full example here.
To map Java Objects to a DynamoDB table, you should look at using the DynamoDbEnhancedClient that is part of AWS SDK for Java V2. See this topic in the AWS SDK for Java V2 Developer Guide:
Mapping items in DynamoDB tables
You can find other examples of using the Enhanced Client in the AWS GitHub repo.
Ok, I eventually got this working thanks to some help. I edited the question slightly after getting a better understanding. Here is how my data class eventually turned out. For Java users: Kotlin compiles to the same JVM bytecode, so if you can figure out how the conversion works, the idea should be the same for your use too.
data class MyModelDB(
    @DynamoDBHashKey(attributeName = "id")
    var id: String = "",

    @DynamoDBAttribute(attributeName = "myMap")
    @DynamoDBTypeConverted(converter = MapConverter::class)
    var myMap: Map<String, AttributeValue> = mutableMapOf(),

    @DynamoDBAttribute(attributeName = "myList")
    @DynamoDBTypeConverted(converter = ListConverter::class)
    var myList: List<AttributeValue> = mutableListOf(),
) {
    constructor() : this(id = "", myMap = mutableMapOf(), myList = mutableListOf())
}
class MapConverter : DynamoDBTypeConverter<AttributeValue, Map<String, AttributeValue>> {
    override fun convert(map: Map<String, AttributeValue>): AttributeValue {
        return AttributeValue().withM(map)
    }

    override fun unconvert(itemMap: AttributeValue?): Map<String, AttributeValue>? {
        return itemMap?.m
    }
}
class ListConverter : DynamoDBTypeConverter<AttributeValue, List<AttributeValue>> {
    override fun convert(list: List<AttributeValue>): AttributeValue {
        return AttributeValue().withL(list)
    }

    override fun unconvert(itemList: AttributeValue?): List<AttributeValue>? {
        return itemList?.l
    }
}
This at least let me use my custom converters to get my data out of DynamoDB. I went on to define a separate data container class for use within my own application, and created methods to serialize and deserialize between the two data objects. This is more a matter of preference for how you want to handle the data, but here is what it looks like for me.
// For reading and writing to DynamoDB
class MyModelDB {
    ...
    fun toMyModel(): MyModel {
        ...
    }
}

// For use in my application
class MyModel {
    var id: String = ""
    var myMap: CustomObject = CustomObject()
    var myList: MutableList<CustomObject2> = mutableListOf()

    fun toMyModelDB(): MyModelDB {
        ...
    }
}
Finally, we come to the implementation of the two toMyModel.*() methods. Let's start with the input; this is what my columns looked like:
myMap:
{
    "key1": {
        "M": {
            "subKey1": {
                "S": "some"
            },
            "subKey2": {
                "S": "string"
            }
        }
    },
    "key2": {
        "M": {
            "subKey1": {
                "S": "other"
            },
            "subKey2": {
                "S": "string"
            }
        }
    }
}
myList:
[
    {
        "M": {
            "key1": {
                "S": "some"
            },
            "key2": {
                "S": "string"
            }
        }
    },
    {
        "M": {
            "key1": {
                "S": "some string"
            },
            "key3": {
                "M": {
                    "key4": {
                        "S": "some string"
                    }
                }
            }
        }
    }
]
The trick then is to use com.amazonaws.services.dynamodbv2.model.AttributeValue to convert each field in the JSON. So if I wanted to access the value of subKey2 in the key1 field of myMap, I would do something like this:
myModelDB.myMap["key1"]
    ?.m              // Null check and get the value of key1, a map
    ?.get("subKey2") // Get the AttributeValue associated with the "subKey2" key
    ?.s              // Get the value of "subKey2" as a String
The same applies to myList:
myModelDB.myList.forEach {
    it?.m             // Null check and get the map at the current index
        ?.get("key1") // Get the AttributeValue associated with "key1"
    ...
}
Edit: Doubt this will be much of an issue, but I also updated my DynamoDB dependency to com.amazonaws:aws-java-sdk-dynamodb:1.12.126
This appears to be a nightmare. Sure, it's easy to upgrade the NuGet package to 3.11 (I think that's the latest), but then nothing at all compiles. So you fix the compile errors, and then it still doesn't work. I'm getting an error when it tries to create the Power BI client.
Getting the token and also creating the client appear to be totally different from v2.
This is my code:
public PowerBiConfig GetPowerBiConfig(string reportId)
{
    var result = new PowerBiConfig();
    try
    {
        if (!Guid.TryParse(reportId, out var _))
        {
            result.ErrorMessage = $"Invalid report guid: {reportId}";
            return result;
        }

        var credential = new UserPasswordCredential(_powerBiProMasterUsername, _powerBiProMasterPassword);
        var authenticationContext = new AuthenticationContext(AuthorityUrl);

        // Taken from https://stackoverflow.com/questions/5095183/how-would-i-run-an-async-taskt-method-synchronously
        var authenticationResult = authenticationContext.AcquireTokenAsync(ResourceUrl, dataArchiverSettings.PowerBiApplicationId, credential).GetAwaiter().GetResult();

        if (authenticationResult == null)
        {
            result.ErrorMessage = "Authentication Failed.";
            return result;
        }

        var tokenCredentials = new TokenCredentials(authenticationResult.AccessToken, "Bearer");
        using (var client = new PowerBIClient(new Uri(ApiUrl), tokenCredentials))
        {
            var report = client.Reports.GetReportInGroup(dataArchiverSettings.PowerBiWorkspaceId, reportId);
            if (report == null)
            {
                result.ErrorMessage = $"No report with the ID {reportId} was found in the workspace.";
                return result;
            }

            var datasets = client.Datasets.GetDatasetById(dataArchiverSettings.PowerBiWorkspaceId, report.DatasetId);
            result.IsEffectiveIdentityRequired = datasets.IsEffectiveIdentityRequired;
            result.IsEffectiveIdentityRolesRequired = datasets.IsEffectiveIdentityRolesRequired;

            GenerateTokenRequest tokenRequest;
            if (datasets.IsEffectiveIdentityRequired == true)
            {
                var username = UserHelper.GetCurrentUser();
                var roles = _userService.GetRolesForUser(username);
                tokenRequest = new GenerateTokenRequest(accessLevel: "view",
                    identities: new List<EffectiveIdentity>
                    {
                        new EffectiveIdentity(username: username,
                            roles: new List<string>(roles.Select(x => x.RoleName)),
                            datasets: new List<string> { datasets.Id })
                    });
            }
            else
            {
                tokenRequest = new GenerateTokenRequest(accessLevel: "view");
            }

            var tokenResponse = client.Reports.GenerateTokenInGroup(dataArchiverSettings.PowerBiWorkspaceId, report.Id, tokenRequest);
            if (tokenResponse == null)
            {
                result.ErrorMessage = "Failed to generate embed token.";
                return result;
            }

            // Generate embed configuration.
            result.EmbedToken = tokenResponse;
            result.EmbedUrl = report.EmbedUrl;
            result.Id = report.Id.ToString();
            result.WorkloadResourceName = dataArchiverSettings.PowerBiWorkloadResourceName.Trim();
        }
    }
    catch (HttpOperationException exc)
    {
        result.ErrorMessage =
            $"Status: {exc.Response.StatusCode} ({(int)exc.Response.StatusCode})\r\n" +
            $"Response: {exc.Response.Content}\r\n" +
            $"RequestId: {exc.Response.Headers["RequestId"].FirstOrDefault()}";
    }
    catch (Exception exc)
    {
        result.ErrorMessage = exc.ToString();
    }
    return result;
}
The closest thing to an upgrade guide is the announcement on the Power BI blog. It looks like your code is written against v2 (e.g. reportId is a string, while in v3 it should be a Guid).
Here is a brief summary of the changes:
What you should know about v3
Here are the key changes with this version update:
Namespace renaming:
Microsoft.PowerBI.Api.V2 was changed to Microsoft.PowerBI.Api
Microsoft.PowerBI.Api.Extensions.V2 was changed to Microsoft.PowerBI.Api.Extensions
Microsoft.PowerBI.Api.V1 namespace was removed.
SetAllConnections and SetAllConnectionsInGroup operations are deprecated and marked as obsolete. You should use UpdateDatasources or UpdateParameters APIs instead.
Power BI artifact IDs were changed* from string to Guid; we recommend working with Guid where possible.
*Dataset ID is an exception and its type remains string.
ODataResponse[List[Object]] types were changed to object collections, so responses now return the collection directly. For example, a response of type ODataResponse[List[Report]] now returns a Reports collection.
New credential classes make it easier to build CredentialDetails. The new classes include BasicCredentials, WindowsCredentials, OAuth2Credentials, and more.
Read the Configure credentials article to learn more.
New encryption helper classes for easier encryption when creating CredentialDetails.
For example, using AsymmetricKeyEncryptor class with a gateway public key:
GatewayPublicKey publicKey = new GatewayPublicKey
{
    Exponent = "...",
    Modulus = "..."
};

CredentialsBase credentials = new BasicCredentials("<USER>", "<PASSWORD>");
var credentialsEncryptor = new AsymmetricKeyEncryptor(publicKey);
var credentialDetails = new CredentialDetails(credentials, PrivacyLevel.None, EncryptedConnection.Encrypted, credentialsEncryptor);
Read the Configure credentials article to learn more.
Consistency in field names.
For example, reportKey, datasetKey, dashboardKey and tileKey were changed to reportId, datasetId, dashboardId and tileId.
Consistency in operation names.
For example, use GetDataset instead of GetDatasetById. The affected operation names are in the imports, datasets, gateways and datasources APIs.
Use enum classes instead of strings for enumerated types.
For example, in GenerateTokenRequest we recommend using TokenAccessLevel.View rather than the literal string "view".
Required fields were marked: some fields were changed to required and are no longer nullable.
Examples
Change in the Get Reports call if WorkspaceId is a string:
// v2
var reports = await client.Reports.GetReportsInGroupAsync(WorkspaceId);
// v3
var reports = await client.Reports.GetReportsInGroupAsync(new Guid(WorkspaceId));
Change in response handling if a string is expected:
// v2
report = reports.Value.FirstOrDefault(r => r.Id.Equals(ReportId, StringComparison.InvariantCultureIgnoreCase));
// v3
report = reports.Value.FirstOrDefault(r => r.Id.ToString().Equals(ReportId, StringComparison.InvariantCultureIgnoreCase));
Change in Generate token:
// v2
var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(WorkspaceId, report.Id, generateTokenRequestParameters);
// v3
var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(new Guid(WorkspaceId), report.Id, generateTokenRequestParameters);
Change in Generate token response handling if a string is expected:
// v2
m_embedConfig.Id = report.Id;
// v3
m_embedConfig.Id = report.Id.ToString();
Required fields are not nullable, i.e. Expiration is no longer nullable, so the Value property should be removed:
// v2
var minutesToExpiration = EmbedToken.Expiration.Value - DateTime.UtcNow;
// v3
var minutesToExpiration = EmbedToken.Expiration - DateTime.UtcNow;
Consistency in operation names, i.e. use GetDataset instead of GetDatasetById:
// v2
var datasets = await client.Datasets.GetDatasetByIdInGroupAsync(WorkspaceId, report.DatasetId);
// v3
var datasets = await client.Datasets.GetDatasetInGroupAsync(new Guid(WorkspaceId), report.DatasetId);
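Putting these changes together, the token-generation portion of the code in the question would look roughly like this under v3. This is a minimal sketch, assuming the same client and settings objects as in the question (dataArchiverSettings, reportId) and that PowerBiWorkspaceId is already stored as a Guid:

// Hypothetical v3 sketch: artifact IDs become Guids, "ById" operation names are dropped,
// and the access level is the TokenAccessLevel enum instead of the string "view".
var workspaceId = dataArchiverSettings.PowerBiWorkspaceId;
var report = client.Reports.GetReportInGroup(workspaceId, new Guid(reportId));
var dataset = client.Datasets.GetDatasetInGroup(workspaceId, report.DatasetId); // dataset IDs remain strings
var tokenRequest = new GenerateTokenRequest(accessLevel: TokenAccessLevel.View);
var tokenResponse = client.Reports.GenerateTokenInGroup(workspaceId, report.Id, tokenRequest);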
I'm using DynamoDB to store data and trying to read items using JavaScript. I have a function that will read an item from a specified table, but I want to detect when an item doesn't exist in the table:
function getChapterLocations() {
    var table = bibleSelect.value.replace(/\s/g, '');
    var chapter = bookSelect.value + " " + chapterSelect.value;
    var params = {
        TableName: table,
        Key: {
            "chapter": chapter
        }
    };
    var docClient = new AWS.DynamoDB.DocumentClient();
    docClient.get(params, function(err, data) {
        if (err) {
            currentLocations = {};
        } else {
            currentLocations = data.Item.locations;
        }
        getText();
    });
}
The problem is, even when an item doesn't exist in the table, err is always null. When an item doesn't exist, all I get is an error that looks something like this in the console:
callListeners https://sdk.amazonaws.com/js/aws-sdk-2.164.0.min.js:48:1027
emit https://sdk.amazonaws.com/js/aws-sdk-2.164.0.min.js:48:695
emitEvent https://sdk.amazonaws.com/js/aws-sdk-2.164.0.min.js:47:18756
I would just ignore this error, but when an item doesn't exist, it prevents getText() from being called.
I am using SolrNet to query on my default search field rather than on any specific field. How can I apply a boost to a specific field in that case? Below is the code snippet.
var filter = BuildQuerySingleLine(arrParams);
var customer = solr.Query(parameters.SingleLineSearch, new QueryOptions
{
    FilterQueries = filter,
    SpellCheck = new SpellCheckingParameters { Collate = true },
    OrderBy = new[] { new SortOrder("score", Order.DESC), SortOrder.Parse("score DESC") },
    StartOrCursor = new StartOrCursor.Start(parameters.StartIndex),
    Rows = parameters.NumberOfRows
});
At last I found the solution to this problem: I used the dismax request handler and passed the qf param value through SolrNet.
With this you can pass dynamic boost values for different fields to the Solr query.
var extraParams = new Dictionary<string, string> { { "qt", "dismax" }, { "qf", "fieldName^1 FieldName^0.6" } };
var customer = solr.Query(parameters.SingleLineSearch, new QueryOptions
{
    StartOrCursor = new StartOrCursor.Start(parameters.StartIndex),
    Rows = parameters.NumberOfRows,
    ExtraParams = extraParams
});
According to this document: Querying and The DisMax Query Parser
var extraParams = new List<KeyValuePair<string, string>>();
extraParams.Add(new KeyValuePair<string, string>("bq", "SomeQuery^10"));
extraParams.Add(new KeyValuePair<string, string>("bq", "SomeOtherQuery^10"));

var options = new QueryOptions();
options.ExtraParams = extraParams; // since my List implements the right interface

solr.Query(myQuery, options);
the bq parameter should be used to boost the query. @Abhijit Guha has an excellent answer; to apply the same idea to a field, use qf (query fields with optional boosts):
QueryOptions options = new QueryOptions
{
ExtraParams = new KeyValuePair<string, string>[]
{
new KeyValuePair<string,string>("qt", "dismax"),
new KeyValuePair<string,string>("qf", "title^1")
},
Rows = 10,
Start = 0
};
Thank You!
I'm using AWS Lambda and trying to write something to AWS DynamoDB. I use the following code:
var tableName = "locations";
var item = {
    deviceId: {
        S: event.deviceId
    },
    timestamps: {
        S: event.timestamp
    }
};

var params = {
    TableName: tableName,
    Item: item
};

dynamo.putItem(params, function(err, data) {
    if (err) {
        context.fail(new Error('Error ' + err));
    } else {
        context.succeed(null);
    }
});
And I get the following error:
returns Error ValidationException: One or more parameter values were invalid: Type mismatch for key deviceId expected: S actual: M
This happened because the AWS SDK for Node.js changed!
If you are using:
var doc = require('dynamodb-doc');
var dynamo = new doc.DynamoDB();
then the parameters to the putItem call (and most other calls) have changed and instead need to be:
var tableName = "locations";
var item = {
    deviceId: event.deviceId,
    timestamp: event.timestamp,
    latitude: Number(event.latitude),
    longitude: Number(event.longitude)
};

var params = {
    TableName: tableName,
    Item: item
};
Read all about the new SDK here: https://github.com/awslabs/dynamodb-document-js-sdk