How to know when an item in the cache expires in Django

I'm currently building an order and checkout system for a Django application. I need to know when an order expires in the cache so that I can trigger code that returns its items to the inventory. So far I haven't made any progress figuring out how to do this in Django.
I just need a way to trigger code after an item expires in the cache, similar to the signals for DB save and delete.
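Django's cache framework doesn't emit any signal when a key expires, so there is nothing equivalent to the DB save/delete signals to hook into. A common workaround is to record the expiry time yourself and let a periodic job (cron, Celery beat, or a management command) release expired orders. A minimal sketch, assuming a hypothetical Order model with a reserved_until timestamp, a status field, and a restock() method:

from django.utils import timezone
from shop.models import Order  # hypothetical app and model

def release_expired_orders():
    # Orders whose reservation window has passed; "pending" is an assumed status.
    expired = Order.objects.filter(status="pending",
                                   reserved_until__lt=timezone.now())
    for order in expired:
        order.restock()            # hypothetical: return items to inventory
        order.status = "expired"
        order.save()

If the cache lives in Redis, keyspace notifications can deliver real expiry events instead, but that happens outside Django's cache API.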

Related

People.Connections.List nextSyncToken expires after one week

I am working on a contact sync solution to keep the contacts in our app in sync with the user's Google contacts.
Our code uses the PHP library for the latest version of the Google People API (v1).
Everything is working fine for one week with each user but after that week we get:
400 - Error "Sync token is expired.
Clear local cache and retry call without the sync token".
My question now:
Is it intended behaviour that you have to clear your whole cache after one week with no changes, or am I doing something wrong?
Is there any possibility to renew a syncToken if there were no changes?
I already checked the whole code to be sure that the newly received nextSyncToken is saved on our side and used for the next incremental sync request. The new sync token is always the same as the one sent in the request, so it is no surprise that we get these errors once a sync token expires after exactly one week.
I also tried setting the option requestSyncToken to true for every list request, even when a syncToken is also set. No success: the sync token stays the same after each request with no changes.
Just in case someone is also facing this problem (syncToken expiration after one week without changes in the persons/contacts list):
Our solution was:
1. Save the creation date and time of each new syncToken together with the token itself.
2. When you receive a syncToken during an incremental sync, compare it to the stored one. If it is a new one, overwrite the old token and its creation date/time.
3. Run a continuous process that checks each syncToken. If one is about one week old (to be safe we used 6 days), create a new syncToken (see the process below). As the People API does not offer anything like the watch-channels of the Calendar API, you would need continuous processes doing list calls at fixed intervals for complete real-time synchronization anyway, so you may be able to combine these tasks, depending on your solution to this problem.
Process for creating a new syncToken:
1. Do a new list request without providing a syncToken.
2. For additional security, do some checks, e.g. compare the total number of persons received with the total expected from your old/current data, and run this renewal at a time of day when almost nobody makes changes, for example 2am.
3. Overwrite the old syncToken and its date/time with the new token and the current date/time.
That's it.
But beware: you can still miss changes if your syncToken renewal process runs at exactly the moment a change is made!
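A minimal sketch of the store-and-renew logic above, written in Python for brevity (the question uses the PHP client); record here stands for any hypothetical persisted object with token and created_at fields:

import datetime

MAX_AGE = datetime.timedelta(days=6)  # renew before the one-week expiry

def store_sync_token(record, new_token):
    # Only overwrite when the API returned a different token, so the
    # stored creation time reflects the token's real age.
    if new_token != record.token:
        record.token = new_token
        record.created_at = datetime.datetime.utcnow()
        record.save()

def needs_renewal(record):
    # Called from the continuous/scheduled checking process.
    return datetime.datetime.utcnow() - record.created_at > MAX_AGE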
Create a dummy contact before the sync to get a new syncToken. After the sync delete the dummy contact from both Google Contacts and your cache.

Changing Django session engine without destroying existing sessions

I'm currently running a Django application with SESSION_ENGINE configured as django.contrib.sessions.backends.db. I'd like to change this to django.contrib.sessions.backends.cached_db for a performance boost.
Can I make this change without destroying the existing sessions?
Yes, you can make this change without logged-in users suddenly finding themselves logged out. That's because cached_db checks the cache first for the key and, if it cannot be found there, falls back to the database. Making this change will therefore not cause any loss of session data. Here is a fragment of the code from cached_db:
def load(self):
    try:
        data = self._cache.get(self.cache_key)
    except Exception:
        # Some backends (e.g. memcache) raise an exception on invalid
        # cache keys. If this happens, reset the session. See #17810.
        data = None
    if data is None:
        # Duplicate DBStore.load, because we need to keep track
        # of the expiry date to set it properly in the cache.
However, please note that cached session backends are a bit overrated. Depending on the middleware you have, the session object may be updated very often, as often as every request, if only to change the expiry date. In that case you will find that the database is being written to all the time, which means the cached value has to be discarded too.
You should be able to. The cached_db backend is just a write-through cache in front of the persistent database backend, and it only speeds up your read queries. It will not speed up your write queries, so you should try to find out how much you read and write the session data.
Your Django SECRET_KEY setting determines your session key hashing parameters, and the session settings determine which cache is used for sessions and what your session TTLs are, so if you are not changing those variables, you should be good.
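For reference, the switch itself is a one-line settings change; a hedged sketch (the cache backend class and location are placeholders for whatever you already run, and the backend class name varies by Django version):

# settings.py
SESSION_ENGINE = "django.contrib.sessions.backends.cached_db"

CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.PyMemcacheCache",
        "LOCATION": "127.0.0.1:11211",
    },
}

# SECRET_KEY and the SESSION_* settings are left untouched, so
# existing session keys stay valid.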

Wait for datastore update before proceeding

I'm working on a lightweight app, and I have quite a few situations where the user submits a form, the form data is processed and pushed to the datastore, and the user is then redirected to a page that displays some of the same data. Quite often the user gets to the page before the datastore has been updated, so they see stale data. Is there any way to have the app wait for the datastore to update before proceeding? The obvious hacky solution is calling sleep(1), but that's far from ideal and waits longer than the update actually takes.
Just get the key the .put() returns:
key = mything.put()
and pass it on to the other page (e.g. via urlsafe). The other page can rebuild the key, and then
thething = key.get()
will always give you up-to-date data -- i.e., this (key.get()) is one simple way to get strong consistency from the GAE datastore!
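Spelled out as a hedged sketch using the ndb client (MyThing and the handler plumbing are hypothetical):

from google.appengine.ext import ndb

class MyThing(ndb.Model):
    name = ndb.StringProperty()

def handle_submit(form):
    key = MyThing(name=form["name"]).put()
    # Redirect to the next page, passing the key in the URL.
    return "/detail?k=" + key.urlsafe()

def handle_detail(urlsafe_key):
    key = ndb.Key(urlsafe=urlsafe_key)  # rebuild the key
    return key.get()  # get-by-key is strongly consistent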

Ember-Data .find() vs .all() - how to control cache?

I was told that, in order not to make a request every time, one can use the .all() method to load data that is kept in the store. But how does Ember deal with its cache? I have a couple of questions.
How do you control the cache? When do you use .find() and when .all()? Do you use .find() and then .all()? For how long?
Does .all() have some expiration date, so that after some time it makes a new request? Or does it use Local Storage, so that I have to clear it manually?
Suppose I have some data I'd like to refresh only once a week. How should I go about this? Right now a new request is made every time I enter or revisit the same route. How can I avoid this?
I'll start by answering the question from your comment:
I'd rather know how I can load data when the app starts (not via routes, as I don't have to update it so often). Is that possible?
OK, technically this is still via routes, but the best way to load data when an app "starts" is via the Application Route's model hook.
App.ApplicationRoute = Ember.Route.extend({
  model: function() {
    return App.Post.find();
  }
});
The router will wait for the promise returned by find() to resolve, so you can be sure that the response from the server has come back before any other routes are entered.
How do you control the cache?
Mostly you don't worry about it. You can refresh() an individual record after some timeout if needed.
When do you use .find() and when .all()? Do you use .find() and then .all()? For how long?
Depends what you want to achieve. In our app we use find() in the application route, then either all() or a filter() in other routes.
Does .all() have some expiration date, so that after some time it makes a new request?
Nope. It will never make a new request.
Or does it use Local Storage, so that I have to clear it manually?
It does not use local storage, records are in memory. So for sure an F5 will clear the cache.
Suppose I have some data I'd like to refresh only once a week. How should I go about this? Right now a new request is made every time I enter or revisit the same route. How can I avoid this?
OK, let's assume you use find() only in the application route, and that the user keeps the browser open for a week and the records have expired. There are many ways to refresh; what's easiest/best depends on whether they all expire at once or time out one at a time.
Have a timer that checks for expired records and calls refresh() as needed.
Have a Ping model that you update on some schedule. When the server responds to the update, it can sideload any changed records.
Or just refresh the browser once a week (via window.location...).
What you call the cache is the content of the store. There are usually two ways to update the store to reflect changes made on the backend:
The change happens through a user interaction/call to the server. If you update multiple records on the backend, you can sideload them with the response to that request.
The change happens asynchronously on the backend (a background job). You can use a websocket to push those changes to the client.

Accurate page view count in Django

What is a good approach to keeping accurate counts of how many times a page has been viewed?
I'm using Django. Specifically, I don't want refreshing the page to up the count.
As far as I'm aware, no browsers out there at the moment send any kind of message/header to the server saying whether the request was from a refresh or not.
The only way I can see to avoid counting refreshes is to track the IPs and times at which a user views a page; if the user last viewed the page less than, say, 30 minutes ago, you dismiss it as a refresh and don't increment the view count.
IMO most page refreshes should be counted as page views anyway, as the only reasons I have for refreshing are to see new data that might have been added, or the occasional accidental refresh/reload after a browser crash (which the above method would dismiss).
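A hedged sketch of that time-window check using Django's cache (the key format and the 30-minute window are assumptions):

from django.core.cache import cache

def should_count(request, page_id, window=30 * 60):
    # cache.add() stores the key only if it doesn't exist yet and
    # returns False otherwise, i.e. this IP viewed the page recently.
    key = "viewed:%s:%s" % (request.META.get("REMOTE_ADDR", ""), page_id)
    return cache.add(key, True, timeout=window)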
You could give each user a cookie that expires at the end of the day and contains a unique number. If they reload a page, you can check whether they have already been counted that day.
You could create a table of unique visitors per page, e.g. the visitor IP + X-Forwarded-For content and User-Agent string, along with a PageID of some sort. If the data itself is irrelevant, you can create an md5/sha1 hash from these values (besides the PageID, of course). Be warned, however, that this table will grow really fast.
I'd advise against setting cookies for that purpose. They have a limited size, and with many pages visited by the user you could reach that limit and make the solution unreliable. It also makes it harder to cache such a page on the client side (see Cacheability), since it becomes interactive content.
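A minimal sketch of the hashing variant described above (the PageVisit model is hypothetical):

import hashlib
from tracking.models import PageVisit  # hypothetical app and model

def visitor_hash(request):
    raw = "|".join([
        request.META.get("REMOTE_ADDR", ""),
        request.META.get("HTTP_X_FORWARDED_FOR", ""),
        request.META.get("HTTP_USER_AGENT", ""),
    ])
    return hashlib.sha1(raw.encode("utf-8")).hexdigest()

def count_unique_view(request, page_id):
    # get_or_create makes the (page, visitor) pair unique, so repeat
    # views and refreshes don't add rows.
    PageVisit.objects.get_or_create(page_id=page_id,
                                    visitor=visitor_hash(request))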
You can write a Django middleware that records the request URL, then set up a table with url/accesses columns. Beware of transactions for concurrent updates.
If you have load problems, you can use memcached with its incr and add functions and periodically update the database table to avoid transaction locks.
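A hedged sketch combining the two ideas: a middleware that counts hits with the cache's atomic incr, flushed to the url/accesses table out of band (key naming and the flush strategy are assumptions):

from django.core.cache import cache

class PageViewCountMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        key = "views:" + request.path
        cache.add(key, 0, timeout=None)  # no-op if the key already exists
        cache.incr(key)                  # atomic in memcached, no DB write per hit
        return self.get_response(request)

# A periodic job can then read these counters and update the
# url/accesses table in bulk, avoiding per-request transaction locks.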