Run code automatically every hour in Django - django

I'm using Django to develop an app for booking cars. I need automated code that checks every hour whether a car is booked. I'm new to Django and have no idea how to do this.

I agree with abhijeetviswa's answer and also the linked answer that mentioned Celery (if you need something a bit more complex).
I'd also think very carefully about what it is you are trying to achieve and consider whether there is a different way to do it. Unless you are going to use Django signals to respond to the user when it finds that the car is booked, you might not actually need this to be a Django thing at all.
For example, if you just wanted to know whether the car was booked or not, you could refresh that information just before you need it (i.e. before building some results) rather than polling for it every hour.
Obviously, this depends heavily on what you want to achieve.
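As a rough sketch of that on-demand approach (the Booking model, its start/end fields, and the reverse relation name are all assumptions, not from your question):

```python
# Sketch only; adjust model and field names to your own schema.
from django.utils import timezone


def is_booked(car):
    """Return True if the car has a booking that covers the current moment."""
    now = timezone.now()
    # Assumes a Booking model with `car`, `start`, and `end` datetime fields.
    return car.booking_set.filter(start__lte=now, end__gte=now).exists()
```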

You can look into management commands and cron jobs. You basically create a management command that performs a specific function, then schedule a cron job that runs every hour and executes this management command.
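As a rough sketch (the app, command, and model names below are placeholders to adapt to your project), the command would live in something like bookings/management/commands/check_bookings.py:

```python
# bookings/management/commands/check_bookings.py  (hypothetical path)
from django.core.management.base import BaseCommand
from django.utils import timezone

from bookings.models import Car  # assumed app and model


class Command(BaseCommand):
    help = "Report whether each car is currently booked"

    def handle(self, *args, **options):
        now = timezone.now()
        for car in Car.objects.all():
            # Assumes a related Booking model with start/end datetime fields.
            booked = car.booking_set.filter(start__lte=now, end__gte=now).exists()
            self.stdout.write(f"Car {car.pk}: {'booked' if booked else 'free'}")
```

A crontab entry such as `0 * * * * /path/to/venv/bin/python /path/to/project/manage.py check_bookings` would then run it at the top of every hour (both paths are placeholders).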
Check out this answer as well.

Related

How to structure a daily habit tracking app

Background
I'm a personal trainer trying to build an app that tracks whether or not my clients are working on their daily habits (and bugs them about it at a chosen time every day). I'd love to hear if any of you have ideas on how to structure this, as I'm new to both Django and coding in general.
Models
I currently have two models: Habits and Checks.
Habits represent "What are you working on improving?" and have a ForeignKey to a user.
Checks represent "Did you complete your habit today?" and have a ForeignKey to a habit.
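Sketched out, the models look roughly like this (simplified; the fields other than the ForeignKeys are illustrative, not my exact code):

```python
from django.conf import settings
from django.db import models


class Habit(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    name = models.CharField(max_length=200)  # "What are you working on improving?"


class Check(models.Model):
    habit = models.ForeignKey(Habit, on_delete=models.CASCADE)
    date = models.DateField()  # the day this check covers
    completed = models.BooleanField(default=False)  # "Did you complete your habit today?"
```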
Current status
There is a nice solution where you create all the Checks for a Habit based on its end date, but I'm trying to structure this with an indefinite end date because, as a coach, I can then show hard data when someone isn't making progress. Though I am still willing to accept that maybe this app would work better if habits had deadlines.
I wrote a custom manage.py script that Heroku runs automatically at the same time every day, but that doesn't scale with users' individual time zones. I run it manually on my local computer.
I originally tried getting it to work with Celery but that did not go well on my Windows machine.
Should I push the script out a day or week in advance and hide the days that are in the future?
Should I avoid the script and just create a year's worth of rows and hope they don't want to track it for more than a year?
Is there a better option?
Help requested
The two issues I'm having at this point:
How can I have a Check created for each day? Is there a better way than what I've done already?
How can I make the timezone for each day relative to the user?
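One idea I'm weighing: store a time zone on each user (e.g. a CharField holding an IANA zone name) and run the creation command every hour instead of once a day, only creating a Check when that user's local date has rolled over. A rough sketch (the user `timezone` field, app name, and command name are placeholders):

```python
# habits/management/commands/create_daily_checks.py  (hypothetical path)
from zoneinfo import ZoneInfo

from django.core.management.base import BaseCommand
from django.utils import timezone

from habits.models import Habit, Check  # assumed app name


class Command(BaseCommand):
    help = "Create today's Check for each habit, using the habit owner's local date"

    def handle(self, *args, **options):
        now = timezone.now()
        for habit in Habit.objects.select_related("user"):
            # Assumes a `timezone` CharField on the user, e.g. "America/Chicago".
            local_date = now.astimezone(ZoneInfo(habit.user.timezone)).date()
            # get_or_create makes the hourly run idempotent.
            Check.objects.get_or_create(habit=habit, date=local_date)
```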

Django best practices to validate data in other tables - taking complexity out of the view file?

I was wondering about best practices in Django for validating table content.
I am creating a Sales Order (SO), and it should check the availability of the items I have in stock; if they are not in stock, it will trigger manufacturing orders and purchase orders.
I don't want to make a very complex view, so I'm looking for a way to decouple the logic from it; I also anticipate performance issues.
What are the best practices or ready-made solutions I can use in the Django framework to address view complexity?
I see several possibilities, but I am wondering which would be the best fit in my case:
managers
Celery - just to run a job occasionally; I want the app to be real time, so I don't like this option
signals (pre_save/post_save)
model validation
creating an extra layer such as a services.py file
Since I am new to Django, I am a bit puzzled about which route to take.
Not sure if this is the answer you are looking for.
Signals are for doing things automatically when events happen. They are most commonly used to do things before and after model operations. So if you need to do something every time you save, create, or delete a record, that is where you use signals.
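For example, a post_save receiver fires automatically after a record is saved (the SalesOrder model here is just an assumed stand-in for your own):

```python
from django.db.models.signals import post_save
from django.dispatch import receiver

from orders.models import SalesOrder  # assumed app and model


@receiver(post_save, sender=SalesOrder)
def check_stock_on_create(sender, instance, created, **kwargs):
    if created:
        # React to a newly created sales order, e.g. start an availability check.
        pass
```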
Managers are used to manage record retrieval and manipulation. If you want some clever way of retrieving data, you can define a custom manager and add custom methods to it. If you want to override some default behaviors of querysets, you would also do that with a custom manager.
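A small sketch of that idea, with made-up model and field names:

```python
from django.db import models


class ItemManager(models.Manager):
    def out_of_stock(self):
        # Encapsulates the "which items need reordering?" query.
        return self.filter(quantity_on_hand__lte=0)


class Item(models.Model):
    name = models.CharField(max_length=200)
    quantity_on_hand = models.IntegerField(default=0)

    objects = ItemManager()


# Usage: Item.objects.out_of_stock()
```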
Celery is for running things asynchronously. If you are worried that some processing you are doing might take a long time, that is where you might consider offloading things to Celery. A friendly warning, though: doing things asynchronously raises the complexity of your code quite a bit, since you need to add some mechanism to pass data back from the Celery tasks into your Django app and to your users.
The services.py link that you posted seems to do what you want; it just provides a place where you can put logic that is not specific to a particular view.
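A services.py function for your sales-order case might look roughly like this (all model, field, and app names are assumptions; the point is only that the view stays thin and calls a single function):

```python
# orders/services.py  (hypothetical module)
from django.db import transaction

from orders.models import SalesOrder, ManufacturingOrder  # assumed models


@transaction.atomic
def confirm_sales_order(order: SalesOrder) -> None:
    """Check stock for each order line and raise follow-up orders for shortfalls."""
    for line in order.lines.select_related("item"):
        shortfall = line.quantity - line.item.quantity_on_hand
        if shortfall > 0:
            # Trigger a manufacturing order for the missing quantity.
            ManufacturingOrder.objects.create(item=line.item, quantity=shortfall)
```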
Here on Stack Overflow, I got advice from some experienced developers that premature optimization is the root of all evil.
What I suggest is: keep it simple. Making the view a little more complex is actually better than effectively adding one more layer of complexity. I would suggest that you try to put most of your logic in models and whatever remains after that in views.
Also, unnecessarily using multiple packages will not solve much of your problem, so use them only when necessary. Otherwise, try to write the minimal logic yourself so that you do not have to use many apps.
Signals and the other options everybody mentions are not as great as they may seem, however promising. Just try to make things simpler.
One more point from my side: as you are just starting out, go through class-based views and try to use them once you get familiar with them. That will simplify your views the most. Plus, if you are new to Django, read a little code. https://github.com/vitorfs/bootcamp might help you get started.
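As a tiny example of the class-based-view point, a generic ListView replaces a hand-written listing view in a few lines (model and template names are assumed):

```python
from django.views.generic import ListView

from orders.models import SalesOrder  # assumed model


class SalesOrderListView(ListView):
    model = SalesOrder
    paginate_by = 25
    template_name = "orders/salesorder_list.html"
```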

Suggestion for Authnet AIM/CIM membership site

I have built a software solution for recurring billing using ARB, and I now have the task of using AIM and CIM. Just as an FYI, I searched for an hour before asking.
I assume for single transactions I use AIM. Then, to store the card on Authnet servers for future charges I use createCustomerProfile followed by createCustomerPaymentProfile.
My question is this:
Should I use AIM to charge the card, and if it's successful, then make the call to createCustomerProfile and use the returned id for createCustomerPaymentProfile?
I know this seems like a simple question but I just want to be sure before I start into it.
Yes. You always want to use AIM first, as it is fast and gives you the opportunity to verify that the card is valid before creating a CIM profile.

Multiple HITs or ExternalQuestion in Mechanical Turk?

First of all, I must say I am totally new to MTurk, so forgive me if I am thinking about this in a completely wrong way.
I have to create a task where workers classify a sentence as spam or as falling into a certain category. I will have about 2,500 sentences to classify per day.
What is the best way to use the API to do this? I understand how to create a HIT using the API, but my understanding is that I can't create a recurring HIT that changes itself once each sentence is classified. Do I need to create 2,500 HITs?
I researched and found out about the ExternalQuestion, which I can set up on my server and have it change with each form submission.
In that case, will it be just one HIT? Is that the correct way to do this?
I am confused about the dynamic part of MTurk.
Any tip, documentation (updated) or suggestion will be appreciated.
Thanks!
You likely want to create separate HITs.
If you create a single External HIT (hosted on your server), an MTurk Worker who takes your HIT will not be eligible to take another task (e.g. a classification task), since Workers are not allowed to take a single HIT more than once. However, if you create separate HITs, a Worker can take as many of them as they wish, which is probably what you want.
You are correct that you cannot automatically change a HIT dynamically unless it is run on your own server.
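As a rough sketch of the separate-HITs approach using the modern boto3 MTurk client (which postdates this question; the HTML, reward, and other parameters below are placeholders):

```python
# Sketch only: one classification HIT per sentence, via boto3.
import boto3

client = boto3.client("mturk", region_name="us-east-1")

QUESTION_XML = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <p>Is the following sentence spam, or which category does it belong to?</p>
      <p>{sentence}</p>
      <!-- answer form fields omitted for brevity -->
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>400</FrameHeight>
</HTMLQuestion>"""

sentences = ["example sentence 1", "example sentence 2"]  # your ~2,500 per day

for sentence in sentences:
    client.create_hit(
        Title="Classify a sentence",
        Description="Decide whether a sentence is spam or which category it fits",
        Keywords="classification, categorization, spam",
        Reward="0.02",
        MaxAssignments=1,
        LifetimeInSeconds=86400,
        AssignmentDurationInSeconds=300,
        Question=QUESTION_XML.format(sentence=sentence),
    )
```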

Optimisation tips when migrating data into Sitecore CMS

I am currently faced with the task of importing around 200K items from a custom CMS implementation into Sitecore. I have created a simple import page which connects to an external SQL database using Entity Framework and I have created all the required data templates.
During a test import of about 5K items, I realized that I needed to find a way to make the import run a lot faster, so I set about finding some information on optimizing Sitecore for this purpose. I have concluded that there is not much specific information out there, so I'd like to share what I've found and open the floor for others to contribute further optimizations. My aim is to create some kind of maintenance mode for Sitecore that can be used when importing large volumes of data.
The most useful information I found was on Mark Cassidy's blogpost http://intothecore.cassidy.dk/2009/04/migrating-data-into-sitecore.html. At the bottom of this post he provides a few tips for when you are running an import.
If migrating large quantities of data, try and disable as many Sitecore event handlers and whatever else you can get away with.
Use BulkUpdateContext()
Don't forget your target language
If you can, make the fields shared and unversioned. This should help migration execution speed.
The first thing I noticed from this list was the BulkUpdateContext class, as I had never heard of it. I quickly understood why, as a search on the SDN forum and in the PDF documentation returned no hits. So imagine my surprise when I actually tested it out and found that it improves item creation/deletion speed at least tenfold!
The next thing I looked at was the first point, where he basically suggests creating a version of web.config that has only the bare essentials needed to perform the import. So far I have removed all events related to creating, saving and deleting items and versions. I have also removed the history engine and system index declarations from the master database element in web.config, as well as any custom events, schedules and search configurations. I expect there are a lot of other things I could look at removing or disabling to increase performance. Pipelines? Schedules?
What optimization tips do you have?
Incidentally, BulkUpdateContext() is a very misleading name - as it really improves item creation speed, not item updating speed. But as you also point out, it improves your import speed massively :-)
Since I wrote that post, I've added a few new things to my normal routines when doing imports.
Regularly shrink your databases. They tend to grow large and bulky. To do this, first go to the Sitecore Control Panel -> Database and select "Clean Up Database". After this, do a regular ShrinkDB on your SQL server.
Disable indexes, especially if importing into the "master" database. For reference, see http://intothecore.cassidy.dk/2010/09/disabling-lucene-indexes.html
Try not to import into "master", however; you will usually find that imports into "web" are a lot faster, mostly because this database isn't (by default) connected to the HistoryManager or other gadgets.
And if you're really adventurous, there's a thing you could try that I'd been considering trying out myself but never got around to. It might work, but I can't guarantee that it will :-)
Try removing all your field types from App_Config/FieldTypes.config. The theory here is that this should essentially disable all of Sitecore's special handling of the content of these fields (like updating the LinkDatabase and so on). You would need to manually trigger a rebuild of the LinkDatabase when done with the import, but that's a relatively small price to pay.
Hope this helps a bit :-)
I'm guessing you've already hit this, but putting the code inside a SecurityDisabler() block may speed things up also.
I'd be a lot more worried about how Sitecore performs with this much data... assuming you only do the import once, who cares how long that process takes. Is this going to be a regular occurrence?