Session based rollback through tastypie API call example code? - django

Let's say I have django.contrib.sessions.middleware.SessionMiddleware installed in Django and I'm using the SessionAuthentication class for API authentication in tastypie. Within a session I'm making some changes to models through my API, and afterwards I want to roll back. Can I do it through tastypie? If yes, what method should I execute? I can't find such a method in the tastypie docs. Do you have a working example of that?

Django supports database transactions, which will commit multiple state changes atomically. (Documentation...)
It is unclear from your question how you want to trigger the rollback. One option is to use per-request transactions, which roll back if an unhandled exception is raised by the view function. If you want more fine-grained control, familiarize yourself with the options in the linked documentation. For example, you may explicitly create a transaction and then roll it back inside your view.
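To illustrate the underlying mechanism, here is a minimal, self-contained sketch using Python's built-in sqlite3 module rather than the Django ORM (Django's transaction.atomic wraps the same commit/rollback semantics; the table and data here are made up):

```python
import sqlite3

# An in-memory database stands in for your real backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE project (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO project (title) VALUES ('original')")
conn.commit()

# Make a change inside a transaction, then roll it back.
conn.execute("UPDATE project SET title = 'changed'")
conn.rollback()

# The update was discarded; the committed value survives.
title = conn.execute("SELECT title FROM project").fetchone()[0]
print(title)  # original
```

In Django terms, raising an exception inside a `with transaction.atomic():` block produces the same rollback, which is what per-request transactions do for you automatically.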
With respect to Tastypie, you may need to place your transaction management inside the appropriate method on the Resource interface.
I hope this gives you some pointers. Please update your question with more details if necessary.

So you want to commit changes to your models to the database, and then roll them back on a future request? That's not something that TastyPie supports (or, for that matter, Django or SQL). It's not really possible to do a clean rollback like that, considering other requests could have interacted with, changed, or built relationships with those objects in the meantime.
The best solution would probably be to integrate something like Reversion that would allow you to restore objects to a previous state.
If you want to be able to roll back all of the operations during a session, you'd probably need to keep track of the session start time and the list of objects that have been changed. To do a rollback, you'd then iterate over that list and invoke reversion's revert method, like
reversion.get_for_date(your_model, session_start_datetime).revert()
However, again, that will also roll back any changes any other users have made in the same time frame, but that will be the case for any solution to this requirement.

Related

Detect Presence in AWS AppSync

I am working on an app that relies heavily on detecting when users go offline and go back online. I wanted to do this with AWS AppSync, but I can't seem to find a way to do this in the documentation. Is there a way to do it in AppSync?
Thanks for the question. Detecting presence is not currently supported out of the box, but you can likely build similar features yourself depending on the use case.
For example, a resolver on a subscription field is invoked every time a new device tries to open a subscription. You can use this resolver field to update some data source to tell the rest of your system that some user is currently subscribed. If using something like DynamoDB, you can use a TTL field to have records automatically removed after a certain amount of time and then require a user to "ping" every N minutes to specify that they are still online.
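As a rough, framework-agnostic sketch of that TTL idea (plain Python standing in for DynamoDB, which would expire the records for you; the function names and TTL value are invented for illustration):

```python
import time

TTL_SECONDS = 300  # consider a user offline 5 minutes after their last ping

# Maps user id -> expiry timestamp. In DynamoDB this would be an item
# with a TTL attribute that the service deletes automatically.
presence = {}

def ping(user_id, now=None):
    """Record that the user is online; they must ping again before the TTL lapses."""
    now = time.time() if now is None else now
    presence[user_id] = now + TTL_SECONDS

def is_online(user_id, now=None):
    """A user is online if their record exists and has not expired."""
    now = time.time() if now is None else now
    expiry = presence.get(user_id)
    return expiry is not None and expiry > now

ping("alice", now=1000)
print(is_online("alice", now=1100))  # True: still within the TTL window
print(is_online("alice", now=1400))  # False: the record has expired
```

In the AppSync version, the subscription resolver would perform the `ping` write, and any part of your system could perform the `is_online` read.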
You could also have your application call a mutation when it first starts to register the user as online, then have the application call another mutation when the app closes to register it as offline. You could combine this with TTLs to prevent stale records in situations where the app crashes or something prevents the call to register as offline.
Thanks for the suggestion and hope this helps in the meantime.

Django API beyond simple data handling

I have a django application that deploys the model logic and data handling through the administration.
I also have, in the same project, a Python file (scriptcl.py) that uses the model data to perform heavy calculations that take some time, for example 5 seconds, to be processed.
I have migrated the project to the cloud and now I need an API to call this file (scriptcl.py) passing parameters, process the computation accordingly to the parameters and data of the DB (maintained in the admin) and then respond back.
All examples of the django DRF that I've seen so far only contain authentication and data handling (Create, Read, Update, Delete).
Could anyone suggest an idea to approach this?
In my opinion, the correct approach would be to use Celery to perform these calculations asynchronously.
Write a class that inherits from the DRF APIView, which handles authentication; write whatever logic you want, or call whichever function, get the final result, and send back the JsonResponse. But, as you mentioned, if the API takes more time to respond, then you might have to think of something else, like giving back a request_id and hitting the server with that request_id every 5 seconds to get the data.
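A framework-agnostic sketch of that request_id pattern (an in-memory dict stands in for Celery's result backend; all names here are invented for illustration):

```python
import uuid

# request_id -> result; None means "still running".
# Celery's result backend plays this role in a real deployment.
jobs = {}

def submit(params):
    """Start the heavy calculation and immediately return a request_id.

    In a real setup, `params` would be handed to a Celery task here.
    """
    request_id = str(uuid.uuid4())
    jobs[request_id] = None  # the worker fills this in when it finishes
    return request_id

def complete(request_id, result):
    """Called by the worker when the calculation finishes."""
    jobs[request_id] = result

def poll(request_id):
    """What the client hits every few seconds: pending, or the final result."""
    result = jobs.get(request_id)
    return {"status": "pending"} if result is None else {"status": "done", "result": result}

rid = submit({"n": 42})
print(poll(rid))              # {'status': 'pending'}
complete(rid, {"answer": 4})
print(poll(rid))              # {'status': 'done', 'result': {'answer': 4}}
```

In DRF, `submit` and `poll` would each be a method on an APIView, with Celery doing the actual work between them.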
Just to give some feedback on this: the approach that I took was to build another API using Flask and plain Python scripts.
I also used SQLAlchemy to access the database and retrieve the necessary data.

How do I update a web API resource by request while also reacting with backend?

How do you update (RESTful) resources in a web API from the client, when you also need the backend to take actions regarding these changes?
Let's say I have a RESTful web API with the following resources:
Worker - id, first_name, last_name, ...
Project - id, title, due_date, ..., worker [ref to Worker]. A project can exist only if it belongs to a worker.
In the client (which is typically a mobile app), users can retrieve a list of some worker's projects. They can then modify each project in that list (update), delete, or create new ones.
Changes must take place locally on the client side until a "send" command is dispatched; only then should the server receive the updates. Kind of like a classic form use case.
The tricky part:
I need the backend to take actions according to each change, both individually and also as a whole. For example:
A user retrieved some worker's projects list, deleted a project, and also updated the due_date of another.
According to this change, the backend needs to both send push notifications to all of that project's members and also recalculate the relevant worker's priorities according to the total change in their projects (one was deleted, another got postponed...).
How do I achieve this in the best way?
If I update/delete/create each project by itself (with separate POSTs, PUTs and DELETEs), when will the backend do the overall recalculation task?
If I update them all together as a bulk (with PUT), the backend will then need to understand what exactly changed (which got deleted, which modified...), which is a hard chore.
Another option I heard is to create a third utility resource, that's something like "WorkerProjectUpdater" that holds the changes that need to be made, like transactions, and then have a "daemon" going through it and actually committing the changes. This is also hard to achieve as in the real story there are many many types of modifications, and it'll be quite complex to create a resource (with a model and DB records) for every type of change.
I'm using Django with Django Rest Framework for that web service.
Appreciate your help!

Using Django sessions to store logged in user

I'm creating a REST-centric application that will use a NoSQL data store of some kind for most of the domain-specific models. For the primary site that I intend to build around the REST data framework, I still want to use a traditional relational database for users, billing info, and other metadata that's outside the scope of the domain data model.
I've been advised that this approach is only a good idea if I can avoid performing I/O to both the RDBMS and NoSQL data stores on the same request as much as possible.
My questions:
Is this good advice? (I'm assuming so, but the rest of these questions are useless if the first premise is wrong.)
I'd like to cache at least the logged-in user as much as possible. Is it possible to use Django sessions to do this in a way that is secure, reliably correct, and fault-tolerant? Ideally, I would like to have the session API be a safe, drop-in replacement for retrieving the current user with as little interaction with the users table as possible. What legwork will I need to do to hook everything up?
If this ends up being too much of a hassle, how easy is it to store user information in the NoSQL store (that is, eliminate the RDBMS completely) without using django-nonrel? Can custom authentication/authorization backends do this?
I'm pondering using the same approach for my application, and I think it is generally safe but requires special care to tackle cache-consistency issues.
The way Django normally operates is that when a request is received, a query is run against the Session table to find a session associated with a cookie from the request. Then, when you access request.user, a query is run against the User table to find a user for the given session (if any, because Django supports anonymous sessions). So, by default, Django needs two queries to associate each request with a user, which is expensive.
A nice thing about the Django session is that it can be used as a key-value store without extending any model class (unlike, for example, the User class, which is hard to extend with additional fields). So you can, for example, set request.session['email'] = user.email to store additional data in the session. This is safe in the sense that what you read from the request.session dictionary is guaranteed to be what you put there; the client has no way to change these values. So you can indeed use this technique to avoid a query to the User table.
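A minimal sketch of that idea (a plain dict stands in for request.session so this runs standalone; the cached field names are just examples): copy the fields you need into the session at login, then read them back without touching the User table.

```python
def cache_user_in_session(session, user):
    """Call once at login: copy the needed fields out of the User row."""
    session["user_id"] = user["id"]
    session["email"] = user["email"]

def cached_user(session):
    """Later requests read from the session, skipping the User-table query."""
    if "user_id" not in session:
        return None  # anonymous session
    return {"id": session["user_id"], "email": session["email"]}

session = {}  # request.session behaves like this dict for our purposes
cache_user_in_session(session, {"id": 7, "email": "a@example.com"})
print(cached_user(session))  # {'id': 7, 'email': 'a@example.com'}
```

The trade-off is staleness: if the user's email changes in the database, the session copy must be refreshed explicitly, which is exactly the cache-consistency issue discussed below.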
To avoid query to the Session table, you need to enable session caching (or store session data in the client cookie with django.contrib.sessions.backends.signed_cookies, which is safe, because such cookies are cryptographically protected against modification by a client).
With caching enabled, you need zero queries to associate a request with user data. But the problem is cache consistency. If you use a local in-memory cache with the write-through option (django.core.cache.backends.locmem.LocMemCache with django.contrib.sessions.backends.cached_db), the session data will be written to the DB on each modification, but it won't be read from the DB if it is present in the cache. This introduces a problem if you have multiple Django processes: if one process modifies a session (for example, changes session['email']), another process can still use an old, cached value.
You can solve it by using shared cache (Memcached backend), which guarantees that changes done by one process are visible to all other processes. In this way, you are replacing a query to a Session table with a request to a Memcached backend, which should be much faster.
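In settings terms, the two combinations discussed above look roughly like this (the cache location is a placeholder, and the exact Memcached backend class depends on your Django version):

```python
# Option 1: session data lives in a signed client cookie; no Session-table query.
SESSION_ENGINE = "django.contrib.sessions.backends.signed_cookies"

# Option 2: cached_db sessions backed by a shared Memcached instance,
# so all Django processes see the same session data.
# SESSION_ENGINE = "django.contrib.sessions.backends.cached_db"
# CACHES = {
#     "default": {
#         "BACKEND": "django.core.cache.backends.memcached.PyMemcacheCache",
#         "LOCATION": "127.0.0.1:11211",
#     }
# }
```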
Storing session data in a client cookie can also solve cache-consistency issues. If you modify an email field in the cookie, all future requests sent by the client should have the new email. A client can, however, deliberately send an old cookie, which still carries the old values. Whether this is a problem is application-dependent.

Tracing requests of users by logging their actions to DB in django

I want to trace user's actions in my web site by logging their requests to database as plain text in Django.
I consider to write a custom decorator and place it to every view that I want to trace.
However, I have some troubles in my design.
First of all, is such a logging mechanism reasonable, or will it cause performance problems because my log table will grow rapidly?
Secondly, how should my log table be designed?
I want to keep the keywords if the user calls the search view, or keep the item's id if the user calls the item-details view.
Besides, users' IP addresses should be kept, but how can I separate users if they connect via a single IP address, as in many companies?
I am glad to explain in detail if you think my question is unclear.
Thanks
I wouldn't do that. If this is a production service then you've got a proper web server running in front of it, right? Apache, or nginx or something. That can do logging, and can do it well, and can write to a form that won't bloat your database, and there's a wealth of analytical tools for log analysis.
You are going to have to duplicate a lot of that functionality in your decorator, such as when you want to switch it on or off, or change the log level. The only thing you'll get by doing it all in django is the possibility of ultra-fine control, such as only logging views of blog posts with id numbers greater than X or something. But generally you'd not want that level of detail, and you'd log everything and do any stripping at the analysis phase. You've not given any reason currently why you need to do it from Django.
If you really want it in a RDBMS, reading an apache log file into Postgres or MySQL or one of those expensive ones is fairly trivial.
One thing you should keep in mind is that SQL databases don't offer very good write performance (in comparison with reading), so if you are experiencing heavy loads you should probably look for a better in-memory solution (e.g. a key-value store like Redis).
But keep in mind that, especially if you use a non-SQL solution, you should be aware of what you want to do with the collected data (just display something like a 'log', or do some more in-depth searching/querying on the data).
If you want to identify different users from the same IP address, you should probably look for a cookie-based solution (if you are using Django's session framework, sessions are by default identified through a cookie, so you could simply use sessions). Another option is doing the logging 'asynchronously' via JavaScript after the page has loaded in the browser (which could give you more possibilities for identifying the user and avoid additional load when generating the page).
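If you do go the decorator route from the question, a minimal sketch might look like this (pure Python with a stub request object so it runs standalone; in Django the session key would come from request.session.session_key, and the list would be a log table):

```python
import functools

log_records = []  # stand-in for your log table

def log_action(action_name):
    """Decorator that records who did what, keyed by session rather than IP."""
    def decorator(view):
        @functools.wraps(view)
        def wrapper(request, *args, **kwargs):
            log_records.append({
                "action": action_name,
                "session": request.session_key,  # distinguishes users behind one IP
                "params": kwargs,                # e.g. search keywords or an item id
            })
            return view(request, *args, **kwargs)
        return wrapper
    return decorator

class Request:
    """Stub; Django's HttpRequest carries the real session object."""
    def __init__(self, session_key):
        self.session_key = session_key

@log_action("search")
def search_view(request, q=""):
    return f"results for {q}"

search_view(Request("abc123"), q="django")
print(log_records[0]["session"], log_records[0]["params"])  # abc123 {'q': 'django'}
```

As the answers above note, though, appending to a database on every request is a write-heavy pattern, so the same decorator could just as easily push to Redis or a log file instead.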