I wrote a Django quiz app that checks the user's answers and updates scores as soon as the user submits an answer. Here is the corresponding view -
current_question_key = 0  # This is a global variable.

def check_answer(request):
    current_user = request.user
    current_team = Team.objects.get(user=current_user)
    current_score = current_team.score
    if request.method == "POST":
        answer = request.POST.get('answer')
        question = Question.objects.get(id=current_question_key)
        if answer == question.answer:
            # This is required to prevent the score from increasing if somebody
            # submits a correct answer to the same question more than once
            if question in current_team.questions_answered.all():
                pass
            else:
                current_team.score = current_score + question.score_increment
                current_team.questions_answered.add(question)
                current_team.save()
        else:
            # This is required to prevent the score from decreasing if someone has answered it correctly earlier
            if question in current_team.questions_answered.all():
                pass
            else:
                current_team.score = current_score - question.score_increment // negative_marking_factor
                current_team.save()
        return HttpResponse(status=204)  # The server processed the request successfully and returns no data.
    else:
        return HttpResponse("Error404")
The value of current_question_key is set by the view that sends the question to the front end -
def game(request):
    if request.method == "POST":
        key = request.POST.get('questionKey')
        global current_question_key
        current_question_key = key
        question = Question.objects.get(id=key)
        question_text = question.question_text
        data = {
            'question_text': question_text
        }
        return JsonResponse(data)
    else:
        current_user = request.user
        current_team = Team.objects.get(user=current_user)
        score = current_team.score
        name = current_user.username
        return render(request, 'Base/main.html', {'teamname': name, 'score': score})
When tested on Django's development server, this worked perfectly fine even with around 10 people using it simultaneously. But as soon as I tried to serve it with nginx (hosted on my laptop, with 5 simultaneous users), the app went completely haywire and even correct answers were evaluated as wrong.
I tried Apache too and had the same problem. Almost all requests were handled incorrectly. Could this be related to race conditions? What exactly might be going on here?
You cannot use a global variable like this in Django. A Django application usually runs in multiple server processes which do not share memory. Calling the game view would only set the global variable current_question_key in one of the processes. All other processes would still have old values. As a request can be served by any process, you get more or less random results.
The Django development server uses multi-threading instead of multi-processing. Threads, as opposed to processes, share the same memory, so all requests see the same value for current_question_key.
You have to store current_question_key for each user in a way that is accessible to all processes. The most obvious solution would be to store this information in the user's session:
request.session['current_question_key'] = ...
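For example, here is a minimal sketch of the two views reworked to use the session (only the relevant lines change; everything else stays as in your code):

def game(request):
    if request.method == "POST":
        key = request.POST.get('questionKey')
        request.session['current_question_key'] = key  # per-user, shared across all server processes
        question = Question.objects.get(id=key)
        return JsonResponse({'question_text': question.question_text})
    ...

def check_answer(request):
    key = request.session.get('current_question_key')
    if key is None:
        return HttpResponse(status=400)  # no question has been requested in this session yet
    question = Question.objects.get(id=key)
    ...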
Alternatively, you could store it in the database, e.g. with a ForeignKey on a custom user model, or, if you want to keep track of games, in a separate table like this:
from django.contrib.auth import get_user_model
from django.db import models

class Game(models.Model):
    user = models.ForeignKey(
        get_user_model(),
        on_delete=models.CASCADE
    )
    question = models.ForeignKey(Question, on_delete=models.CASCADE)
    created_at = models.DateTimeField(auto_now_add=True, db_index=True)
You can then get the current game for a user by sorting by creation date:
Game.objects.filter(user=request.user).order_by('-created_at').first()
Depending on how often the current question changes, you could also consider using a key-value store like Redis, although that complicates things a bit.
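If you do go that route, Django's cache framework (which can be backed by Redis) keeps the code small. A sketch, where the helper names, key format, and timeout are just illustrative choices:

from django.core.cache import cache

def set_current_question(user, question_id):
    # one key per user so players don't overwrite each other's state
    cache.set('current_question_key:%s' % user.pk, question_id, timeout=3600)

def get_current_question(user):
    return cache.get('current_question_key:%s' % user.pk)

game() would call set_current_question(request.user, key) and check_answer() would call get_current_question(request.user).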
I'm in the process of modifying the Flask app created while following along with Miguel Grinberg's Flask Mega-Tutorial so that it is possible to post tweets. I have imported tweepy for accessing the Twitter API and modified the database to hold the scheduled time of a tweet.
I wish to iterate over the current_user's posts and the corresponding times from the SQLAlchemy database and post when the current time matches the scheduled time.
The database model modifications in model.py are as follows:
class Post(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    body = db.Column(db.String(140))
    timestamp = db.Column(db.DateTime, index=True, default=datetime.utcnow)
    socialnetwork = db.Column(db.String(40))
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    # This is the stuff for scheduling, just the date and time parts
    hour = db.Column(db.Integer)
    minute = db.Column(db.Integer)
    day = db.Column(db.Integer)
    month = db.Column(db.Integer)
    year = db.Column(db.Integer)
    ampm = db.Column(db.String(2))
Just as a test, I wanted to iterate over the current user's posts and tweet them using tweepy:
@app.before_first_request
def activate_job():
    def run_job():
        posts = current_user.followed_posts().filter_by(socialnetwork='Twitter')
        for post in posts:
            tweepy_api.update_status(message)
            time.sleep(30)

    thread = threading.Thread(target=run_job)
    thread.start()
However, this returned the error:
AttributeError: 'NoneType' object has no attribute 'followed_posts'
on the terminal. This is perplexing me as I have used current_user multiple times in the same file to filter the posts by social network.
As in the following case in routes.py
@app.route('/<username>')
@login_required
def user(username):
    user = User.query.filter_by(username=username).first_or_404()
    socialnetwork = request.args.get("socialnetwork")
    if socialnetwork == 'Facebook':
        posts = current_user.followed_posts().filter_by(socialnetwork='Facebook')
    elif socialnetwork == 'Twitter':
        posts = current_user.followed_posts().filter_by(socialnetwork='Twitter')
    else:
        posts = current_user.followed_posts()
    return render_template('user.html', user=user, posts=posts, form=socialnetwork)
The above yields no error and works perfectly.
If anyone could shed some light on what I am doing wrong, I'd be truly grateful.
You're likely running into issues because you're trying to get current_user on a different thread (see the Flask docs for more details). You're calling run_job() in a different context that doesn't have any current user (because there's no active request).
I'd rework it so that you get the current user's posts on the main thread (i.e. in activate_job()), then pass the list of posts to the background thread to work on.
Something like:
def activate_job():
    # materialise the query on the main thread, while there is still a request context
    posts = list(current_user.followed_posts().filter_by(socialnetwork='Twitter'))

    def run_job(posts):
        for post in posts:
            tweepy_api.update_status(post.body)  # post.body holds the tweet text
            time.sleep(30)

    thread = threading.Thread(target=run_job, args=[posts])
    thread.start()
It's also worth noting that you may want to rethink your overall approach. Rather than checking on each request whether there are any scheduled tweets to send, you should use some sort of background task queue that can operate independently of the web process. That way you're not checking redundantly on each request, and you're not dependent on a user making requests around the scheduled time.
See The Flask Mega-Tutorial Part XXII: Background Jobs for more details, and look into Celery.
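As a rough sketch of what that could look like with Celery (the broker URL, import paths, and the sent flag on Post are assumptions, not things from your code):

# tasks.py - a minimal sketch, assuming a Redis broker and a hypothetical
# boolean 'sent' column on Post; adapt the imports to your project layout
from celery import Celery
from app import db, tweepy_api
from app.models import Post

celery_app = Celery('scheduler', broker='redis://localhost:6379/0')

@celery_app.task
def send_due_tweets():
    # inside a Celery worker you also need to push the Flask app context
    # before touching the database
    due_posts = Post.query.filter_by(socialnetwork='Twitter', sent=False).all()
    # compare your hour/minute/day/month/year/ampm columns against the
    # current time here to keep only the posts that are actually due
    for post in due_posts:
        tweepy_api.update_status(post.body)  # post.body holds the tweet text
        post.sent = True
    db.session.commit()

You would then run send_due_tweets periodically with Celery beat instead of spawning a thread on the first request.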
I have a simple Task model:
class Task(models.Model):
    name = models.CharField(max_length=255)
    order = models.IntegerField(db_index=True)
And a simple task_create view:
def task_create(request):
    name = request.POST.get('name')
    order = request.POST.get('order')
    Task.objects.filter(order__gte=order).update(order=F('order') + 1)
    new_task = Task.objects.create(name=name, order=order)
    return HttpResponse(new_task.id)
The view shifts the existing tasks that come after the newly created one by +1, then creates the new task.
There are lots of users of this view, and I suppose the ordering will go wrong one day, because the update and the create really should be performed together.
So I just want to be sure: will the following be enough to avoid any data corruption?
from django.db import transaction

def task_create(request):
    name = request.POST.get('name')
    order = request.POST.get('order')
    with transaction.atomic():
        Task.objects.select_for_update().filter(order__gte=order).update(order=F('order') + 1)
        new_task = Task.objects.create(name=name, order=order)
    return HttpResponse(new_task.id)
1) Probably something more should be done on the task creation line, like select_for_update before the filter on the existing Task.objects?
2) Does it matter where return HttpResponse() is located? Inside transaction block or outside?
Big thx
1) Probably something more should be done on the task creation line, like select_for_update before the filter on the existing Task.objects?
No - what you have currently looks fine and should work the way you want it to.
2) Does it matter where return HttpResponse() is located? Inside transaction block or outside?
Yes, it does matter. You need to return a response to the client regardless of whether your transaction was successful or not - so it definitely needs to be outside of the transaction block. If you did it inside the transaction, the client would get a 500 Server Error if the transaction failed.
However, if the transaction fails, then you will not have a new task ID and cannot return it in your response. So you probably need to return different responses depending on whether the transaction succeeds, e.g.:
from django.db import IntegrityError, transaction

try:
    with transaction.atomic():
        Task.objects.select_for_update().filter(order__gte=order).update(
            order=F('order') + 1)
        new_task = Task.objects.create(name=name, order=order)
except IntegrityError:
    # Transaction failed - return a response notifying the client
    return HttpResponse('Failed to create task, please try again!')

# If it succeeded, then return a normal response
return HttpResponse(new_task.id)
You could also try to change your model so you don't need to update so many other rows when inserting a new one.
For example, you could try something resembling a doubly linked list.
(I used long explicit names for fields and variables here).
# models.py
class Task(models.Model):
    name = models.CharField(max_length=255)
    task_before_this_one = models.ForeignKey(
        'self',
        null=True,
        blank=True,
        on_delete=models.SET_NULL,  # on_delete is required on newer Django versions
        related_name='task_before_this_one_set')
    task_after_this_one = models.ForeignKey(
        'self',
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name='tasks_after_this_one_set')
Your task at the top of the queue would be the one that has the field task_before_this_one set to null. So to get the first task of the queue:
# these will throw exceptions if there are many instances
first_task = Task.objects.get(task_before_this_one=None)
last_task = Task.objects.get(task_after_this_one=None)
When inserting a new instance, you just need to know after which task it should be placed (or, alternatively, before which task). This code should do that:
def task_create(request):
    new_task = Task.objects.create(
        name=request.POST.get('name'))
    task_before = get_object_or_404(
        Task,
        pk=request.POST.get('task_before_the_new_one'))
    task_after = task_before.task_after_this_one

    # modify the 2 other tasks
    task_before.task_after_this_one = new_task
    task_before.save()

    if task_after is not None:
        # 'task_after' will be None if 'task_before' is the last one in the queue
        task_after.task_before_this_one = new_task
        task_after.save()

    # update the newly created task
    new_task.task_before_this_one = task_before
    new_task.task_after_this_one = task_after  # this could be None
    new_task.save()

    return HttpResponse(new_task.pk)
This method only updates 2 other rows when inserting a new row. You might still want to wrap the whole method in a transaction if there is really high concurrency in your app, but this transaction will only lock up to 3 rows, not all the others as well.
This approach might be of use to you if you have a very long list of tasks.
EDIT: how to get an ordered list of tasks
This cannot be done at the database level in a single query (as far as I know), but you could try this function:
def get_ordered_task_list():
    # get the first task
    aux_task = Task.objects.get(task_before_this_one=None)
    task_list = []
    while aux_task is not None:
        task_list.append(aux_task)
        aux_task = aux_task.task_after_this_one
    return task_list
As long as you only have a few hundred tasks, this operation should not take so much time that it impacts the response time. But you will have to try that out for yourself, in your environment, with your database and your hardware.
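If the list does grow long, one way to avoid one query per task is to fetch all tasks once and walk the chain in Python. A sketch (the helper name is just illustrative):

def get_ordered_task_list():
    # one query: map each task's predecessor id to the task itself
    tasks_by_predecessor = {
        task.task_before_this_one_id: task for task in Task.objects.all()
    }
    task_list = []
    current = tasks_by_predecessor.get(None)  # the first task has no predecessor
    while current is not None:
        task_list.append(current)
        current = tasks_by_predecessor.get(current.pk)
    return task_list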
I've turned to the experts at stackoverflow in hopes I could figure this out.
Let's look at a popular application like Instagram for example. Let's say you decide to look for posts with the hashtag #love (about 551,677,074 posts and growing every second!)
Each page contains 15 results in descending order of post time. As you scroll down and load more, it returns the next set of 15 results, and so on. One might ask: well, if the queryset is growing every second, by the time I'm ready to view page 2, how can I be sure that page 2 will contain the next set of results in order relative to the results I just got on page 1? If the queryset is growing, isn't there a chance that I might see some or all of the results I previously got from page 1? In the eyes of the user, it's as if the queryset is constantly being pushed forward relative to where they were last time.
Well, that right there is my dilemma, folks. If the queryset is growing, how can I be sure that the next page I request will start off from where I left off on page 1?
I've created the following model below:
class Blog(models.Model):
    author = models.ForeignKey(User)
    published_time = models.DateTimeField(auto_now_add=True)
    text = models.CharField(max_length=1000, blank=True)
I would like to create a view that returns 10 blog objects at a time, ordered by published_time.
from django.core.paginator import Paginator

def BlogView(request):
    if 'HTTP_AUTHORIZATION' in request.META and request.method == 'GET':
        authmeth, auth = request.META['HTTP_AUTHORIZATION'].split(' ', 1)
        if authmeth.lower() == 'basic':
            auth = auth.strip().decode('base64')
            username, password = auth.split(':', 1)
            authuser = authenticate(username=username, password=password)
            if authuser:
                if authuser.is_active:
                    queryset = Blog.objects.all().order_by('-published_time')  # descending order
                    paginator = Paginator(queryset, 10)  # 10 objects per page
                    page_number = request.GET.get('page', 1)  # requested page from the query string
                    blogs = paginator.page(page_number)
                    data = []
                    for blog in blogs:
                        data.append(
                            {'id': blog.pk,
                             'author_id': blog.author_id,
                             'text': blog.text,
                             'published_time': str(blog.published_time)})
                    return HttpResponse(json.dumps({'results': data}), content_type="application/json")
Very simple setup, yet, I can't figure out on how to do this, any suggestions and examples would extremely be appreciated!
This question is answered in this post on how to set a cache in django:
Caching query results in django
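The core idea there is to take a snapshot of the result set when the user requests page 1 and paginate over that snapshot, so posts added afterwards can't shift the later pages. A rough sketch using Django's cache (the helper name, cache key, and timeout are arbitrary choices, not part of the linked answer):

from django.core.cache import cache
from django.core.paginator import Paginator

def get_blog_page(session_key, page_number):
    cache_key = 'blog_feed_snapshot:%s' % session_key
    blog_ids = cache.get(cache_key)
    if blog_ids is None or page_number == 1:
        # snapshot the ordering at the moment page 1 is requested
        blog_ids = list(
            Blog.objects.order_by('-published_time').values_list('pk', flat=True))
        cache.set(cache_key, blog_ids, timeout=600)
    page = Paginator(blog_ids, 10).page(page_number)
    # fetch the rows for this page and preserve the snapshot order
    blogs = Blog.objects.in_bulk(page.object_list)
    return [blogs[pk] for pk in page.object_list if pk in blogs]

BlogView would call this with request.session.session_key and the requested page number; the snapshot is rebuilt whenever the user starts again from page 1.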
Let's say I have two models:
class Thinclient(models.Model):
    hostname = models.TextField(_('hostname'), unique=True, editable=False)
    logs = models.ManyToManyField('Log', blank=True, null=True)

class Log(models.Model):
    logname = models.TextField(editable=False)
    created = models.DateTimeField(auto_now_add=True, editable=False)
As you can see, I have thin clients. Now, if one of those thin clients boots, it is supposed to send a POST request to my app with the thin client's name and a logname (e.g. "booting ...").
Now my view will handle all the work and that's where I have my problem. It currently looks like this:
def log(request):
    if request.method == 'POST':
        form = ThinclientForm(request.POST)
        if form.is_valid():
            message = form.cleaned_data['logname']
            Log.objects.get_or_create(logname=message)
            return HttpResponse(content="", mimetype=None, status=200)
        else:
            return HttpResponse(content="Unsuccessful", mimetype=None, status=400)
    return render_to_response('thin/status', {
        'form': form, })
However, that won't work, because I have to assign the message to one specific thin client. I suppose I have to write my own form with hostname and logname, and that is where I have my problem: how can I save my models in a way that the message is assigned to a thin client?
I hope I could explain what I need to know; if not, tell me. Thanks for any help with this.
Assuming your Thinclient name is also included in the POST, you can just get it from there and use it to look up the actual object, then assign that Log message to it.
log, created = Log.objects.get_or_create(logname=message)
client = Thinclient.objects.get(hostname=request.POST['clientname'])
client.logs.add(log)
(One note: you shouldn't use TextFields for things like client names - these are stored as BLOB/TEXT objects in the database, which is far less efficient than normal VARCHARs. Use CharField instead.)
There are several problems here. The first is in the design of your models: the link between Log and Thinclient should be a ForeignKey from Log to Thinclient.
class Thinclient(models.Model):
    hostname = models.TextField(_('hostname'), unique=True, editable=False)

class Log(models.Model):
    thin_client = models.ForeignKey(Thinclient, on_delete=models.CASCADE)  # on_delete is required on newer Django versions
    logname = models.TextField(editable=False)
    created = models.DateTimeField(auto_now_add=True, editable=False)
Do you even need a form here? Aren't your thin clients just doing a post to this url? Are they really doing a GET to get the form? The benefit of forms is the ability to turn them into HTML and data validation. I don't think you need either of those here.
The host name of the remote client is stored in the REMOTE_HOST request header so you can use this to pull it out.
def log(request):
    thin_client, _ = Thinclient.objects.get_or_create(hostname=request.META["REMOTE_HOST"])
    Log(thin_client=thin_client, logname=request.POST["logname"]).save()
    return HttpResponse(content="OK", mimetype="text/plain")
As a side note, you should always return some content - some proxies do not like zero-byte responses. You should also always specify a mimetype, even if it's the default of text/html.
I am sure that someone has a pluggable app (or tutorial) out there that approximates this, but I am having trouble finding it: I want to be able to track the number of "views" a particular object has (just like a question here on Stack Overflow has a view count).
If the user isn't logged in, I wouldn't mind attempting to place a cookie (or log an IP) so they can't inadvertently run up the view count by refreshing the page; and if a user is logged in, only allow them one "view" across sessions/browsers/IP addresses. I don't think I need it any fancier than that.
I figure the best way to do this is with Middleware that is decoupled from the various models I want to track and using an F expression (of sorts) -- other questions on StackOverflow have alluded to this (1), (2), (3).
But I wonder if this code already exists out in the wild - because I am not the savviest coder and I'm sure someone could do it better.
Have you seen it?
I am not sure if it's in the best taste to answer my own question but, after a bit of work, I put together an app that solves the problems in earnest: django-hitcount.
You can read about how to use it at the documentation page.
The ideas for django-hitcount came from the two original answers here (from Teebes and vikingosegundo), which really got me started thinking about the whole thing.
This is my first attempt at sharing a pluggable app with the community and hope someone else finds it useful. Thanks!
You should use Django's built-in session framework; it already does a lot of this for you. I implemented this in the following way in a Q&A app where I wanted to track views:
in models.py:
class QuestionView(models.Model):
    question = models.ForeignKey(Question, related_name='questionviews', on_delete=models.CASCADE)
    ip = models.CharField(max_length=40)
    session = models.CharField(max_length=40)
    created = models.DateTimeField(default=datetime.datetime.now)  # pass the callable, not now(), so it is evaluated per row
in views.py:
def record_view(request, question_id):
    question = get_object_or_404(Question, pk=question_id)
    if not QuestionView.objects.filter(
            question=question,
            session=request.session.session_key):
        view = QuestionView(question=question,
                            ip=request.META['REMOTE_ADDR'],
                            created=datetime.datetime.now(),
                            session=request.session.session_key)
        view.save()
    return HttpResponse(u"%s" % QuestionView.objects.filter(question=question).count())
Vikingosegundo is probably right, though, that using the contenttypes framework is the more reusable solution, but definitely don't reinvent the wheel for session tracking - Django already does that!
One last thing: you should probably have the view that records the hit be called via Ajax or a CSS link, so that search engines don't rev up your counts.
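For what it's worth, a sketch of how that record_view view might be wired up in urls.py (the URL pattern and app layout are assumptions):

# urls.py - assuming record_view lives in questions/views.py
from django.urls import path
from questions.views import record_view

urlpatterns = [
    path('questions/<int:question_id>/hit/', record_view, name='record_view'),
]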
Hope that helps!
You could create a generic Hit model
from django.contrib.contenttypes import generic
from django.contrib.contenttypes.models import ContentType
from django.db import models

class Hit(models.Model):
    date = models.DateTimeField(auto_now=True)
    content_type = models.ForeignKey(ContentType)
    object_id = models.PositiveIntegerField()
    content_object = generic.GenericForeignKey('content_type', 'object_id')
In your view.py you write this function:
def render_to_response_hit_count(request, template_path, keys, response):
    for key in keys:
        for i in response[key]:
            Hit(content_object=i).save()
    return render_to_response(template_path, response)
and the views whose hits you are interested in return:
return render_to_response_hit_count(request, 'map/list.html', ['list',],
    {
        'list': l,
    })
This approach gives you the power not only to count the hits, but also to filter the hit history by time, content type, and so on...
As the hit table might grow fast, you should think about a deletion strategy.
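For example, the cleanup could be as small as a management command that prunes hits older than some retention window (the module path and the 30-day cutoff are just illustrative):

# yourapp/management/commands/prune_hits.py
from datetime import timedelta

from django.core.management.base import BaseCommand
from django.utils import timezone

from yourapp.models import Hit  # wherever the Hit model above lives

class Command(BaseCommand):
    help = "Delete Hit rows older than 30 days"

    def handle(self, *args, **options):
        cutoff = timezone.now() - timedelta(days=30)
        deleted, _ = Hit.objects.filter(date__lt=cutoff).delete()
        self.stdout.write("Deleted %d old hits" % deleted)

Run it from cron (or a scheduler of your choice) with python manage.py prune_hits.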
I know this question is an old one, and thornomad has already published an app (django-hitcount) that solves the problem and inspired my solution. I would like to share my solution anyway, since I didn't find much information about this topic and it may help someone else.
My approach is a generic model that can be used with any view, based on the view's path (URL).
models.py
class UrlHit(models.Model):
    url = models.URLField()
    hits = models.PositiveIntegerField(default=0)

    def __str__(self):
        return str(self.url)

    def increase(self):
        self.hits += 1
        self.save()

class HitCount(models.Model):
    url_hit = models.ForeignKey(UrlHit, editable=False, on_delete=models.CASCADE)
    ip = models.CharField(max_length=40)
    session = models.CharField(max_length=40)
    date = models.DateTimeField(auto_now=True)
views.py
def get_client_ip(request):
    x_forwarded_for = request.META.get('HTTP_X_FORWARDED_FOR')
    if x_forwarded_for:
        ip = x_forwarded_for.split(',')[0]
    else:
        ip = request.META.get('REMOTE_ADDR')
    return ip

def hit_count(request):
    if not request.session.session_key:
        request.session.save()
    s_key = request.session.session_key
    ip = get_client_ip(request)
    url, url_created = UrlHit.objects.get_or_create(url=request.path)
    if url_created:
        track, created = HitCount.objects.get_or_create(url_hit=url, ip=ip, session=s_key)
        if created:
            url.increase()
            request.session[ip] = ip
            request.session[request.path] = request.path
    else:
        if ip and request.path not in request.session:
            track, created = HitCount.objects.get_or_create(url_hit=url, ip=ip, session=s_key)
            if created:
                url.increase()
                request.session[ip] = ip
                request.session[request.path] = request.path
    return url.hits
I did this by creating a PageView model with a hits column. Every time the homepage URL is hit, I increment the hits value of the first (and only) row and render it to the template. Here is how it looks.
Views.py
def Home(request):
    if PageView.objects.count() <= 0:
        x = PageView.objects.create(hits=1)  # first visit: create the row and count it
    else:
        x = PageView.objects.all()[0]
        x.hits = x.hits + 1
        x.save()
    context = {'page': x.hits}
    return render(request, 'home.html', context=context)
Models.py
class PageView(models.Model):
    hits = models.IntegerField(default=0)
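One caveat with the read-increment-save pattern above: two simultaneous requests can read the same hits value and one increment gets lost. An F() expression pushes the increment into the database and avoids that. A sketch against the same PageView model (the import path is an assumption):

from django.db.models import F
from django.shortcuts import render

from .models import PageView

def Home(request):
    # create the singleton row if it does not exist, then increment atomically
    page_view, _ = PageView.objects.get_or_create(pk=1)
    PageView.objects.filter(pk=page_view.pk).update(hits=F('hits') + 1)
    page_view.refresh_from_db()
    return render(request, 'home.html', {'page': page_view.hits})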
I did it using cookies. I don't know whether it's a good idea to do it this way or not. The following code looks for an already-set cookie first; if it exists, it increases the total_views counter, and if it is not there, it increases both total_views and unique_views. Both total_views and unique_views are fields on a Django model.
def view(request):
    ...
    cookie_state = request.COOKIES.get('viewed_post_%s' % post_name_slug)
    response = render_to_response('community/post.html', context_instance=RequestContext(request, context_dict))
    if cookie_state:
        Post.objects.filter(id=post.id).update(total_views=F('total_views') + 1)
    else:
        Post.objects.filter(id=post.id).update(unique_views=F('unique_views') + 1)
        Post.objects.filter(id=post.id).update(total_views=F('total_views') + 1)
        response.set_cookie('viewed_post_%s' % post_name_slug, True, max_age=2678400)
    return response