django-webtest with multiple test client - django

In django-webtest, every TestCase subclass comes with self.app, an instance of webtest.TestApp, and I can log in as user A with self.app.get('/', user='A').
However, if I want to test behaviour for both user A and user B within one test, how should I do it?
It seems that self.app is just DjangoTestApp() with extra_environ passed in. Is it appropriate to just create another instance of it?

I haven't tried setting up another instance of DjangoTestApp as you suggest, but I have written complex tests where, after making requests as user A, I switched to making requests as user B with no issue, in each case passing the user or username in when making the request, e.g. self.app.get('/', user='A') as you have already written.
The only part that did not work as expected was making unauthenticated requests, e.g. self.app.get('/', user=None): instead of becoming anonymous, the request continued to use the user from the request immediately prior.
To reset the app state (which should allow you to emulate most multi-user workflows sequentially), you can call self.renew_app(), which refreshes the app state and effectively logs the current user out.
To test simultaneous access by more than one user (your question does not specify exactly what you are trying to test), setting up another instance of DjangoTestApp would seem worth exploring.
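As a minimal sketch of the sequential approach (the URL and usernames are illustrative, not from the question):
from django_webtest import WebTest

class TwoUserWorkflowTest(WebTest):
    def test_switch_users(self):
        # Requests made as user A
        response_a = self.app.get('/', user='A')

        # Reset the app state before making unauthenticated requests,
        # otherwise user A stays logged in (see above).
        self.renew_app()
        anonymous_response = self.app.get('/')

        # Requests made as user B
        response_b = self.app.get('/', user='B')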

Related

How to handle network errors when saving to Django Models

I have a Django .save() execution that loops n times.
My concern is how to guard against network errors during saving, as some entries could be saved while others are not, with no way of telling which.
What is the best way to make sure that the execution is completed?
Here's a sample of my code
# SAVE DEBIT ENTRIES
for i in range(len(debit_journals)):
    # UPDATE JOURNAL RECORD
    debit_journals[i].approval_no = journal_transaction_id
    debit_journals[i].approval_status = 'approved'
    debit_journals[i].save()
Either use bulk_create / bulk_update to execute a single DB query, or use transaction.atomic as a decorator on your function so that any error during a save rolls the database back to the state it was in before the function ran.
Try something like below (I suppose your model name is DebitJournal and debit_journals is a list).
for debit_journal in debit_journals:
    debit_journal.approval_no = journal_transaction_id
    debit_journal.approval_status = 'approved'
DebitJournal.objects.bulk_update(debit_journals, ["approval_no", "approval_status"])
If debit_journals is a QuerySet you can also try
debit_journals.update(approval_no=journal_transaction_id, approval_status='approved').
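For the transaction.atomic option mentioned above, a minimal sketch (the function name is illustrative):
from django.db import transaction

@transaction.atomic
def approve_debit_journals(debit_journals, journal_transaction_id):
    # If any save() raises (for example because the DB connection drops),
    # the whole block is rolled back and no journals are left half-approved.
    for debit_journal in debit_journals:
        debit_journal.approval_no = journal_transaction_id
        debit_journal.approval_status = 'approved'
        debit_journal.save()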
It depends on what you mean by a network error: one between the user and the Django application, or one between the Django application and the database. If it is only between the user and the app, note that once the request has been received correctly, the objects will be created even if the user loses the connection afterwards. So the user might not see the response, but the objects will still be created.
If it is between the database and the Django application, some objects might still be created before the error occurs.
Usually, if you want all-or-nothing behaviour, you should use manual transactions as described here: https://docs.djangoproject.com/en/4.1/topics/db/transactions/
Note that if the creation is really long you might hit the request timeout. If it takes more than a few seconds, you should consider making it a background task, with the request doing nothing more than creating the task.
See Python task queue alternatives and frameworks for 3rd party solutions.
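A minimal sketch of the background-task approach, assuming Celery as the task queue (the task name and model import path are illustrative):
# tasks.py
from celery import shared_task
from django.db import transaction

from myapp.models import DebitJournal  # assumed model location

@shared_task
def approve_debit_journals(journal_ids, journal_transaction_id):
    with transaction.atomic():
        DebitJournal.objects.filter(pk__in=journal_ids).update(
            approval_no=journal_transaction_id,
            approval_status='approved',
        )

# In the view, enqueue the work and return immediately:
# approve_debit_journals.delay([dj.pk for dj in debit_journals], journal_transaction_id)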

Check if user has a permission with specific codename in django

I am trying to check if a user has a permission, which I have defined in the class Meta of the model, with a specific codename. At the moment I have:
if request.user.has_perm('app_label.code_name'):
    # do something
What I am trying to avoid is using the app_label; however, using has_perm('code_name') does not seem to work.
The way this function works is that you need to pass the app_label, so there is not much you can do there.
One workaround can be to write your own wrapper function, something like:
def _has_perm(user, code_name, app_label="app_label"):
    return user.has_perm(app_label + "." + code_name)
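Usage would then look something like this (do_something is a placeholder):
if _has_perm(request.user, "code_name"):
    do_something()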
The reason you need to provide the app label is that permissions are application specific. That means if you have two apps, app_a and app_b, both with a model named Farm, they could both have a permission called can_create_new_chickens. It is very important to understand that there are two separate permissions here:
app_a.can_create_new_chickens (attached to app_a's Farm model)
app_b.can_create_new_chickens (attached to app_b's Farm model)
These are independent permissions, and a user can have neither, one, or both. This is why it would be insecure to check permissions without referring to the app label: permissions granted to a user in one application could otherwise affect their permissions in another application.
Back to your question, the answer is no, you cannot check permissions without the application name for the reasons given above.

Multi-tenant Django applications using Mongoengine

I want to build a multi-tenant architecture for a SaaS system. We are using Django as our backend, MongoEngine as our main database layer and Gunicorn as our web server.
Our clients are a few big companies, so the number of databases, and the space pre-allocated for each, shouldn't be a problem.
The first approach we took was to write a middleware to determine the source of the request to properly connect to a mongoengine database. Here is the code:
class MongoConnectionMiddleware(object):
    def process_request(self, request):
        if request.user.is_authenticated():
            mongo_connect(request.user.profile.establishment)
And the mongo_connect method:
def mongo_connect(establishment):
    db_name = 'db_client_%d' % establishment.id
    connect(db_name)
This will register the "default" alias as the db_name for every mongoengine request.
But it seems that when many concurrent users from different companies are making requests, each one sets the default db_name to its own name.
As an example:
Company A makes a request and connects to database A. While A is doing its work, company B connects to database B. This makes A also connect to B's database mid-request, so A fails to find some ids.
Is there a way to isolate the connection to the mongo database per request to avoid this problem?
Unfortunately MongoEngine seems to be designed around a very basic use case of a single primary connection and multiple auxiliary connections.
http://docs.mongoengine.org/en/latest/guide/connecting.html#connecting-to-mongodb
To get around the default connection logic, I define the first connection I come across as the default, I also add it as a named connection. I then add any subsequent connection as named connections only.
https://github.com/MongoEngine/mongoengine/issues/607#issuecomment-38651532
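A minimal sketch of that registration pattern (the alias and database names are illustrative):
from mongoengine import connect

# The first connection becomes the default and is also registered under its own alias.
connect('db_client_1', alias='default')
connect('db_client_1', alias='db_client_1')

# Any subsequent connection is registered as a named alias only.
connect('db_client_2', alias='db_client_2')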
You can use the switch_db context manager to switch from one connection to another, but as soon as you leave the with statement it reverts. It also still requires a default connection.
http://docs.mongoengine.org/en/latest/guide/connecting.html#switch-database-context-manager
You might be able to put it inside a function and then yield inside the with to prevent it reverting immediately, I'm not sure if this is valid.
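For reference, a minimal sketch of that context-manager usage (MyDocument and the alias are illustrative, and the alias must already be registered with connect):
from mongoengine.context_managers import switch_db

with switch_db(MyDocument, 'db_client_2') as MyDocumentB:
    MyDocumentB(name='example').save()  # saved to db_client_2
# Outside the with block, MyDocument reverts to its default alias.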
You could use a wrapper of some kind, either a function, class or a custom QuerySet, that checks the current django/flask session and switches the db to the appropriate connection.
I'm not sure if a QuerySet can do this, but it would probably be the nicest way if it can.
http://docs.mongoengine.org/en/latest/guide/querying.html#custom-querysets
I included some code in this issue here where I change the database connection for my models.
https://github.com/MongoEngine/mongoengine/issues/605
def switch(model, db):
    model._meta['db_alias'] = db
    # must set _collection to None so it is re-evaluated
    model._collection = None
    return model

MyDocument = switch(MyDocument, 'db-alias')
You'll also want to take a look at the code that mongoengine uses to switch dbs.
Beware that MongoEngine likes to cache things, so changing a few variables here and there doesn't always have an effect. It's full of surprises like this.
Edit:
I should also add that the connect call won't pick up value changes, so calling connect with new parameters won't take effect unless it's a new alias. Even the disconnect function (which isn't exposed publicly) doesn't let you do this, as the models will cache the connection. I mention this in some of the issues linked above and also here: https://github.com/MongoEngine/mongoengine/issues/566

RallyDev - Requesting List of Iterations for a User's DefaultProject

Using the RallyDev Web Services API v2.0, I would like to request the iterations for a user's default project.
I can do this now by first calling:
https://rally1.rallydev.com/slm/webservice/v2.0/iteration:current?pretty=true
Parsing out Iteration->Project->Ref, and then calling:
https://rally1.rallydev.com/slm/webservice/v2.0/project/[ProjectID]/Iterations?pretty=true
or
https://rally1.rallydev.com/slm/webservice/v2.0/iteration?query=(Project.Oid=[ProjectID])&pretty=true
Wondering if there is a better way?
I saw that UserProfile has DefaultProject and DefaultWorkspace, but I couldn't figure out how to use them, as fetching just returned 'null'.
Your queries on Iteration are spot on for looking up Iterations for a particular Project. Note that for UserProfile - the Default Workspace/Project settings are not required fields. They are empty unless the User has explicitly set these in his or her profile settings. Only the user him/herself can set these - a (Workspace/Subscription) Administrator cannot set them on behalf of the User. So if you're getting empty values back for these, it is likely because the User of concern does not have the Default Workspace/Project set.
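For completeness, a minimal sketch of the two-step lookup described in the question, using the Python requests library; the authentication method (HTTP Basic auth) and the JSON key names (Iteration, Project, _ref) are assumptions based on the question's description:
import requests

BASE = 'https://rally1.rallydev.com/slm/webservice/v2.0'
session = requests.Session()
session.auth = ('user@example.com', 'password')  # placeholder credentials

# Step 1: current iteration -> project ref
current = session.get(BASE + '/iteration:current', params={'pretty': 'true'}).json()
project_ref = current['Iteration']['Project']['_ref']

# Step 2: all iterations for that project
iterations = session.get(project_ref + '/Iterations', params={'pretty': 'true'}).json()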

How to avoid sending 2 duplicate POST requests to a webservice

I send a POST request to create an object. The object is created successfully on the server, but I never receive the response (it is dropped somewhere), so I send the POST request again (and again). The result is many duplicated objects on the server side.
What is the official way to handle this issue? I think it is a very common problem, but I don't know its exact name, so I cannot google it. Thanks.
In REST terminology (the name for interfaces where POST is used to create an object, PUT to modify, DELETE to delete and GET to retrieve), the POST operation is neither 'safe' nor 'idempotent': repeating it changes the collection of objects, whereas repeating any of the other operations has no further effect.
I doubt there is an "official" way to deal with this, but there are probably some design patterns to deal with it. For example, these two alternatives may solve this problem in certain scenarios:
Objects have uniqueness constraints. For example, a record that stores a unique username cannot be duplicated, since the database will reject it.
Issue a one-time-use token to each client before it makes the POST request, usually when the client loads the page with the input form. The first POST creates the object and marks the token as used. The second POST will see that the token has already been used, and you can answer with a "Yes, yes, ok, ok!" error or success message.
Useful link where you can read more about REST.
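A minimal sketch of the one-time-token idea above, for a Django view; the model and field names are illustrative, not from the question:
from django.db import models
from django.http import JsonResponse

class FormToken(models.Model):
    token = models.UUIDField(unique=True)      # issued when the form is rendered
    used = models.BooleanField(default=False)

def create_object(request):
    token = request.POST['form_token']
    # Atomically claim the token: the guarded update can only succeed once,
    # so a duplicate POST cannot claim it a second time.
    claimed = FormToken.objects.filter(token=token, used=False).update(used=True)
    if not claimed:
        return JsonResponse({'detail': 'Already processed.'}, status=409)
    # ... create the object here ...
    return JsonResponse({'detail': 'Created.'}, status=201)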
It is unreliable to fix these issues on the client only.
In my experience, RESTful services with lots of traffic are bound to receive duplicate incoming POST requests unintentionally - e.g. sometimes a user will click 'Signup' and two requests will be sent simultaneously; this should be expected and handled by your backend service.
When this happens, two identical users can be created even if you check for uniqueness on the User model. This is because a model-level uniqueness check runs as a separate query before the insert, so two simultaneous requests can both pass the check before either row is written (a race condition).
Solution: these cases should be handled in the backend with uniqueness checks backed by database-level unique constraints or indexes.
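A minimal sketch of that approach (model and field names are illustrative):
from django.db import IntegrityError, models

class SignupUser(models.Model):
    email = models.EmailField(unique=True)  # enforced by a unique index in the database

def signup(email):
    try:
        return SignupUser.objects.create(email=email)
    except IntegrityError:
        # A concurrent duplicate POST hit the unique index; return the existing row instead.
        return SignupUser.objects.get(email=email)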