I am working on an application written in Flask and backed by Amazon's DynamoDB accessed through boto.
For a specific use case, we need to retrieve a value from a table and then make it unavailable for other users.
However, if we retrieve the value and then delete it in two separate calls, a race condition could occur between the retrieval and the deletion.
Is there any way to retrieve an item from a table and immediately delete or update it in an atomic fashion?
If your logic is simply:
get item
delete item
without any additional logic to determine whether the deletion should occur, then you can just send the delete request straight away and have DynamoDB return the deleted item's attributes. Here is an example (I haven't tested it; it is mostly taken from http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LowLevelJavaItemCRUD.html):
// Key of the item to delete
HashMap<String, AttributeValue> key = new HashMap<String, AttributeValue>();
key.put("Id", new AttributeValue().withN("101"));
// ReturnValue.ALL_OLD makes DynamoDB hand back the item's attributes as they were
// just before the delete, so the read and the delete happen in a single atomic call
DeleteItemRequest deleteItemRequest = new DeleteItemRequest()
    .withTableName(tableName)
    .withKey(key)
    .withReturnValues(ReturnValue.ALL_OLD);
DeleteItemResult deleteItemResult = client.deleteItem(deleteItemRequest);
Map<String,AttributeValue> deletedItem = deleteItemResult.getAttributes();
Documentation:
withReturnValues
getAttributes
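The question also asks about atomically updating the item to make it unavailable instead of deleting it. A conditional update gives the same "claim" semantics: the condition guarantees that only one caller succeeds, and ReturnValues hands back the item in the same call. Both ReturnValues and condition expressions are features of the DynamoDB API itself, so the same approach is available from boto as well. The sketch below is untested and reuses client and tableName from the snippet above; the status attribute and the AVAILABLE/CLAIMED values are placeholders I made up, not something from the question.
// Hypothetical sketch: atomically "claim" the item instead of deleting it
Map<String, AttributeValue> key = new HashMap<String, AttributeValue>();
key.put("Id", new AttributeValue().withN("101"));

Map<String, AttributeValue> values = new HashMap<String, AttributeValue>();
values.put(":available", new AttributeValue().withS("AVAILABLE"));
values.put(":claimed", new AttributeValue().withS("CLAIMED"));

UpdateItemRequest updateItemRequest = new UpdateItemRequest()
    .withTableName(tableName)
    .withKey(key)
    .withUpdateExpression("SET #s = :claimed")
    // the condition fails if another request already claimed or deleted the item
    .withConditionExpression("#s = :available")
    // "#s" maps to the placeholder attribute "status" (a DynamoDB reserved word)
    .withExpressionAttributeNames(Collections.singletonMap("#s", "status"))
    .withExpressionAttributeValues(values)
    .withReturnValues(ReturnValue.ALL_OLD);

try {
    UpdateItemResult updateItemResult = client.updateItem(updateItemRequest);
    // the item's attributes as they were before this update; only this caller gets them
    Map<String, AttributeValue> claimedItem = updateItemResult.getAttributes();
} catch (ConditionalCheckFailedException e) {
    // another request won the race; the item is no longer available
}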
Related
I want to load an entity by key, update it, save it back to Datastore and then return it to the client from an endpoint. However, I want to make absolutely sure that when I return the entity to the client, the updated entity has been saved and propagated across the datastore. This way, if the client queries for that entity with another endpoint immediately after, the updated one will return.
Using Objectify, this is what I have so far:
First I load the entity, update some values and then load it again by the entity key and return it. Will this second load and return of the entity be strongly consistent and reflect the new value?
// Load the entity by key
Key<Thing> thingKey = Key.create(Thing.class, id);
Thing thing = ofy().load().key(thingKey).now();
// Update some values
thing.setSomeProperty("new value");
// Save entity
ofy().save().entity(thing);
// Load entity by key - will this loaded entity be guaranteed to reflect the above update?
return ofy().load().key(thingKey).now();
Note: I do not want to return the local entity I set the new values to because I do not want the client to potentially query for the entity with the new updates and not get it because they haven't been committed yet due to eventual consistency.
Could this below be another option to achieve the same effect?
// Load the entity by key
Key<Thing> thingKey = Key.create(Thing.class, id);
Thing thing = ofy().load().key(thingKey).now();
// Update some values
thing.setSomeProperty("new value");
// Will this save wait until the entity is saved/propagated across the datastore and thus any queries to this entity after this statement will reflect the new update?
ofy().save().entity(thing).now();
// Just return the entity we updated
return thing;
There is no difference between those two sequences. The last load() simply loads the same object out of the session cache. You might as well just return the object.
Keep in mind that you need to run this inside a transaction to be safe. Otherwise you run the risk that some other conflicting write will get overwritten and therefore lost between the load and the save.
return ofy().transact(() -> {
Key<Thing> thingKey = Key.create(Thing.class, id);
Thing thing = ofy().load().key(thingKey).now();
thing.setSomeProperty("new value");
ofy().save().entity(thing).now();
return thing;
});
BTW, with Firestore, eventual consistency really isn't a thing anymore. But that doesn't mean data can't be stale; technically, any piece of data can be stale the moment after it is fetched from the datastore. A transaction ensures a consistent view of the data (i.e., all transactions are serializable, so data will not be lost).
I am trying to retrieve an entity immediately after it was saved. When debugging, I insert the entity, check the entities in the Google Cloud console, and I can see it was created.
Key key = datastore.put(fullEntity)
After that, I continue by getting the entity with
datastore.get(key)
but nothing is returned. How do I retrieve the saved entity within one request?
I've read this question Missing entities after insertion in Google Cloud DataStore
but I am only saving one entity, not tens of thousands like in that question.
I am using Java 11 and Google Datastore (the com.google.cloud.datastore package).
Edit: added the code showing how the entity was created.
public Key create.... {
// creating the entity inside a method
Transaction txn = this.datastore.newTransaction();
this.datastore = DatastoreOptions.getDefaultInstance().getService();
Builder<IncompleteKey> builder = newBuilder(entitykey);
setLongOrNull(builder, "price", purchase.getPrice());
setTimestampOrNull(builder, "validFrom", of(purchase.getValidFrom()));
setStringOrNull(builder, "invoiceNumber", purchase.getInvoiceNumber());
setBooleanOrNull(builder, "paidByCard", purchase.getPaidByCard());
newPurchase = entityToObject(this.datastore.put(builder.build()));
if (newPurchase != null && purchase.getItems() != null && purchase.getItems().size() > 0) {
for (Item item : purchase.getItems()) {
newPurchase.getItems().add(this.itemDao.save(item, newPurchase));
}
}
txn.commit();
return newPurchase.getKey();
}
after that, I am trying to retrieve the created entity
Key key = create(...);
Entity e = datastore.get(key)
I believe there are a few issues with your code, but since we are unable to see the logic behind many of your methods, here is my guess.
First of all, as you can see on the documentation, it's possible to save and retrieve an entity on the same code, so this is not a problem.
It seems like you are using a transaction, which is the right tool for performing multiple operations as a single action, but it doesn't look like you are using it properly: you only instantiate it and commit it, but you never put any operation on it. Furthermore, you are using this.datastore to save to the database, which bypasses the transaction entirely.
So either save the object once it already has all of its items added, or use the transaction to save all the entities at once.
I also believe you should use the entityKey to fetch the added purchase afterwards, but don't mix the two approaches.
Also you are creating the Transaction object from this.datastore before instantiating the latter, but I assume this is a copy-paste error.
Since you're creating a transaction for this operation, the entity put should happen inside the transaction:
txn.put(builder.build());
The operations inside the loop where you add the purchase.getItems() to the newPurchase object should likewise be done in the context of the same transaction.
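For reference, here is a minimal, untested sketch of what the write path could look like once the put goes through the transaction. It uses the com.google.cloud.datastore client mentioned in the question, but the kind name "Purchase", the property names, and the values are placeholders rather than the question's real model:
// all types below come from the com.google.cloud.datastore package
Datastore datastore = DatastoreOptions.getDefaultInstance().getService();

// allocate a complete key up front so the entity can be read back after the commit
KeyFactory keyFactory = datastore.newKeyFactory().setKind("Purchase");
Key key = datastore.allocateId(keyFactory.newKey());

Transaction txn = datastore.newTransaction();
try {
    Entity purchase = Entity.newBuilder(key)
        .set("price", 100L)
        .set("invoiceNumber", "INV-001")
        .build();
    // the write goes through the transaction, not through datastore.put(...)
    txn.put(purchase);
    // nothing is persisted until commit() succeeds
    txn.commit();
} finally {
    if (txn.isActive()) {
        txn.rollback();
    }
}

// a lookup by key is strongly consistent, so after a successful commit this returns the entity
Entity saved = datastore.get(key);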
Let me know if this resolves the issue.
Cheers!
Please consider the following scenario:
An IgniteUI 16.1 igGrid with the igGridUpdating feature and a RESTDataSource
The user creates a new record through a modal dialog
A POST request is initiated with the form data
The server processes the create request and returns an object populated with the correct ID
In the success handler on the client side, the newly added row in the grid has to be found and updated with the correct ID returned from the server
The ID column serves as the grid's primary key and it is hidden
What happens when a new row is being added?
We are looking at infragistics.lob-16.1.js.
In _dialogOpening(), line 68167, _originalValues are computed via $.extend(this._originalValues, values, this._originalValues), where values = _getDefaultValues(), or in other words values.id = this._pkVal, and _pkVal is a counter that is incremented each time a new row appears.
Keeping that in mind, _endEditDialog() is called later, where newValues, representing the data entered by the user, are merged with the default values of the input form: newValues = this._getNewValuesForRow(colElements), followed by newValues = $.extend({}, prevValues, newValues), where prevValues are the same _originalValues from above.
Then _addRow() is called, which in turn calls grid.dataSource.addRow(), and a transaction is created.
My point here is that the updating feature generates the ID for the new row automatically, with ID = CurrentRowsCount + 1.
So, if the grid contains 8 records, the newly created record will automatically be assigned ID = 9. Now imagine one of the existing records already has ID = 9: igGridUpdating's updateRow(rowId, values) will then update both rows, the existing one and the new one. And I really do want to call this method in order to update the row with the data returned from the server.
How can I intervene here and update just the new row?
The auto-generated primary keys are only meant to cover the most basic scenarios. If your app supports row deletion, you should replace them with something that will keep them unique, using the generatePrimaryKeyValue event.
Using updateRow after receiving the permanent keys from the server is the way to go; however, remember to pop the transaction from the allTransactions array so the update doesn't go to the server on the next saveChanges call.
I am trying to develop a tool (in Visual Studio 2010, C#) which can read all the items present in an AppFabric cache and store them in a table. I don't have to use PowerShell.
First I thought that if I could get all the regions present in the cache, I could make use of the DataCache.GetObjectsInRegion method to complete my task. But I was not able to get all the region names from the cache, as it does not show the user-defined region names but only the default ones, so I am giving up on this approach.
Can anyone please guide me here? My main goal is to read all the items present in a cache.
There is no single built-in method that lists every item in the cache, but you're correct that it's possible to list all items using GetObjectsInRegion for a named cache. You first have to know all the region names (if regions are used), or call GetSystemRegions to get all the (default) system regions. A simple foreach then lets you list all items. When you put something into the cache without a region name, it is added to a system region.
Here is a basic example:
// Declare array for cache host(s).
DataCacheServerEndpoint[] servers = new DataCacheServerEndpoint[1];
servers[0] = new DataCacheServerEndpoint("YOURSERVERHERE", 22233);
// Setup the DataCacheFactory configuration.
DataCacheFactoryConfiguration factoryConfig = new DataCacheFactoryConfiguration();
factoryConfig.Servers = servers;
factoryConfig.SecurityProperties = new DataCacheSecurity(DataCacheSecurityMode.None, DataCacheProtectionLevel.None);
// Create a configured DataCacheFactory object.
DataCacheFactory mycacheFactory = new DataCacheFactory(factoryConfig);
// Get a cache client for the default cache
DataCache myCache = mycacheFactory.GetDefaultCache(); //or change to mycacheFactory.GetCache(myNamedCache);
// insert some dummy test data
myCache.Put("key1", "myobject1");
myCache.Put("key2", "myobject2");
myCache.Put("key3", "myobject3");
// list all items in the cache: the important part
foreach (string region in myCache.GetSystemRegions())
{
foreach (var kvp in myCache.GetObjectsInRegion(region))
{
Console.WriteLine("data item ('{0}','{1}') in region {2} of cache {3}", kvp.Key, kvp.Value.ToString(), region, "default");
}
}
I am making a lot of async calls and using loadMany to preload the Ember Data store like this:
if(data.feed.activities.length > 0){
App.store.loadMany(App.Activity, data.feed.activities);
}
Some of my bindings break if I re-add the same item more than once, which is a possibility.
Is there a way of not reloading an item if it is already in the store? I would rather not have to iterate over each item and check, if that can be avoided.
This is from the load() documentation in store.js
"Load a new data hash into the store for a given id and type
combination. If data for that record had been loaded previously, the
new information overwrites the old. If the record you are loading data
for has outstanding changes that have not yet been saved, an exception
will be thrown."
As you can see, the new information overwrites the old, so it should be ok to reload the same data. Maybe you have another issue. Have you configured your id correctly?