How do I implement logic in Django?

So I have an assignment to build a web interface for a smart sensor.
I've already written the Python code that reads the data from the sensor and writes it into SQLite3, controls the sensor, etc.
I've built the HTML/CSS template and implemented it in Django.
My goal is to run the sensor-reading script in parallel with the Django interface on the same server, so that the server does all the communication with the sensor and the user can read and configure the sensor from the web interface. (Same logic as modern routers: control and configure from a web interface.)
Q: Where do I put my sensor_ctl.py script in my Django project, and how do I make it run independently on the server (to read sensor data 24/7)?
Q: Where in my Django project do I use the classes and methods from sensor_ctl.py to write/read data to Django's database instead of the local SQLite3 database I used to test sensor_ctl.py?

Place your code in the appname/management/commands folder of your app, following the official guide for management commands. You will then be able to run your custom command like this:
./manage.py getsensorinfo
Once the command is registered, you can put it in cron and have it executed every minute.
Secondly, you need to rewrite your code to use Django ORM models, i.e.
Stat.objects.create(temp1=60, temp2=70) instead of raw INSERT INTO ... SQL.
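For illustration, a minimal sketch of such a command, assuming a Stat model with temp1/temp2 fields and a hypothetical sensor_ctl.read_temperatures() helper:

# appname/management/commands/getsensorinfo.py
from django.core.management.base import BaseCommand

from appname import sensor_ctl            # your existing sensor code
from appname.models import Stat           # assumed ORM model

class Command(BaseCommand):
    help = "Read the sensor once and store the values in the database"

    def handle(self, *args, **options):
        # read_temperatures() is a hypothetical helper; adapt it to whatever
        # your sensor_ctl module actually exposes.
        temp1, temp2 = sensor_ctl.read_temperatures()
        Stat.objects.create(temp1=temp1, temp2=temp2)
        self.stdout.write(self.style.SUCCESS("Stored one sensor reading"))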

Related

Restoring test database for unit testing in Neo4j

I would like to unit test CRUD operations against a pre-populated Neo4j database.
I am thinking that a way to do this might be to:
- Create an empty database (let's call it testDB)
- Create a database backup (let's call it testingBackup)
Then, on running tests:
- Delete any data from testDB
- Populate testDB from testingBackup
- Run unit test queries on the now-populated testDB
I am aware of the backup/restore functions, the load/dump functions, the export-to-CSV/load-from-CSV functions, etc. However, I'm not sure which of these would be most appropriate and easiest to automate. I'm on Ubuntu and using Python.
I would need to be able to quickly and easily alter the backup data as the application evolves.
What is the best approach for this please?
I have done something similar, with some caveats: I did these tests using Java and Testcontainers, and I didn't use Neo4j; I have used Postgres, SQL Server, and MongoDB for my tests. Using the same technique for Neo4j should be similar to one of those. I will post the link to my GitHub examples for MongoDB/Spring Boot/Java below; take a look.
The idea is to spin up a testcontainer from the test (i.e. a Docker container for tests), populate it with data, make the application use it for its database, then assert at the end.
In your example there is no testingBackup, only a CSV file with data. The flow looks like this (a Python sketch follows the list):
- Your test spins up a testcontainer with Neo4j (this is your testDB).
- Load the CSV into this container.
- Get the IP, port, user, and password of the testcontainer. (This part depends on the database image available for Testcontainers: some images let you set your own port, user ID, and password; some won't.)
- Pass these details to your application and start it. (I am not sure how this part works for a Python app; here you are on your own. See the link below to a blog with a Python/Testcontainers example. I used a Spring Boot app; you can see my code on GitHub.)
- Once done, execute queries against your containerized Neo4j and assert.
- When the test ends, the container is disposed of along with its data.
- Any change to the CSV file can create new scenarios for your test.
- Create another CSV file/test as needed.
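Here is a minimal sketch of that flow in Python, assuming the testcontainers package (with its Neo4j module) and the neo4j driver are installed; the load_csv_fixture helper, the CSV columns, and the Stat label are hypothetical:

import csv
from testcontainers.neo4j import Neo4jContainer

def load_csv_fixture(session, path):
    # Hypothetical fixture loader: creates one node per CSV row.
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            session.run("CREATE (:Stat {name: $name, value: $value})",
                        name=row["name"], value=row["value"])

def test_crud_against_populated_db():
    with Neo4jContainer("neo4j:5") as container:
        driver = container.get_driver()   # driver wired to the container
        with driver.session() as session:
            load_csv_fixture(session, "fixtures/stats.csv")
            record = session.run("MATCH (s:Stat) RETURN count(s) AS n").single()
            assert record["n"] > 0
        driver.close()
    # The container, and all its data, is disposed of when the block exits.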
Here are the links:
Testcontainers: https://www.testcontainers.org/
The Testcontainers Neo4j module: https://www.testcontainers.org/modules/databases/neo4j/
A blog detailing Testcontainers and Python: https://medium.com/swlh/testcontainers-in-python-testing-docker-dependent-python-apps-bd34935f55b5
My GitHub link to MongoDB/Spring Boot and SQL Server/Spring Boot examples (one of these days I will add a Neo4j sample as well): https://github.com/snarasim123/testcontainers

Good way to deploy a django app with an asynchronous script running outside of the app

I am building a small financial web app with Django. The app requires that the database have a complete history of prices, regardless of whether someone is currently using the app. These prices are freely available online.
The way I am currently handling this is by simultaneously running a separate Python script (outside of Django) which downloads the price data and records it in the Django database using the sqlite3 module.
My plan for deployment is to run the app on an AWS EC2 instance, change the permissions of the folder where the db file resides, and separately run the download script.
Is this a good way to deploy this sort of app? What are the downsides?
Is there a better way to handle the asynchronous downloads and the deployment? (PythonAnywhere?)
You can write the daemon code and follow this approach to push data to the DB as soon as you get it from the Internet. Since your daemon would be running independently of Django, you'd need to take care of data-synchronisation issues as well. One possible solution could be a DateTimeField in your Django model with auto_now_add=True, which will tell you when each row was entered in the DB. Hope this helps you or someone else looking for a similar answer.
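For illustration, a minimal sketch of the model side; the Price model and its fields are hypothetical names:

from django.db import models

class Price(models.Model):
    symbol = models.CharField(max_length=16)
    value = models.DecimalField(max_digits=12, decimal_places=4)
    # auto_now_add records when the daemon inserted the row, which helps
    # when reconciling data written outside the web process.
    fetched_at = models.DateTimeField(auto_now_add=True)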

Django WebServer During Testing

I'm writing a complicated web application in Django. There are many components. Two in particular are the Django server (let's call this Server) and a C++ application server (let's call this Calculator) which serves calculations to Server. When Server wants a calculation done, it sends a command to a socket on which Calculator is listening, like this:
{
    "command": "doCalculations"
}
Now, Calculator might need different pieces of information at different times to do its work. So instead of passing the data directly to Calculator in the command, it is up to Calculator to ask for what it needs. It does this by calling a RESTful API on Server:
https://Server/getStuff?with=arguments
Calculator then uses the data from this call to do its calculations and responds to Server with an answer.
The problems begin when I try to do unit testing using Django's unittest framework. I set up a bunch of data structures in my test, but when Server calls Calculator, that data needs to be available through the REST API so Calculator can get what it needs. The trouble is that the Django test framework doesn't spin up a web server, and if I do it manually it reads the data from the real database, not from the test case.
Does anybody know how to run a unit test with the data made available to external people/processes?
I hope that makes sense...
You need to specify the fixtures to load in your test class:
https://docs.djangoproject.com/en/1.7/topics/testing/tools/#fixture-loading
from django.test import TestCase

class MyTest(TestCase):
    fixtures = ['data.json']

    def setUp(self):
        pass  # do stuff

    def tearDown(self):
        pass  # do stuff
Here data.json can be generated using python manage.py dumpdata; it will be filled with data from your main DB in JSON format.
data.json should live in the fixtures folder of the app you are testing (create one if necessary).
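For example (the app name and path are assumptions):
python manage.py dumpdata appname --indent 2 > appname/fixtures/data.json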

Django Directory Structure - Non-website code

I have a Django project that includes code for processes that are scheduled to run (via cron) independently from the website. The processes update the database using the models from one of my apps, so I guess the code for these processes could be considered part of that app even though it's not part of the website. Should I create a package inside the app directory to hold these modules?
If the code you're supposed to run is tied to the models of a certain app, you can write a custom management command for it.
The code lives inside your app (in myapp/management/commands/command_name.py) and you'll be able to call it using manage.py or django-admin.py, which makes it very easy to add an entry to cron.
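For example, a hypothetical crontab entry (the interpreter and project paths are assumptions) that runs the command every five minutes:
*/5 * * * * /usr/bin/python /path/to/project/manage.py command_name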

Rather than using crontab, can Django execute something automatically at a predefined time?

How can I make Django execute something automatically at a particular time?
For example, my Django application has to upload files via FTP to remote servers at predefined times. The FTP server addresses, usernames, passwords, times, days, and frequencies have been defined in a Django model.
I want to run the file upload automatically based on the values stored in the model.
One way to do this is to write a Python script and add it to the crontab. The script would run every minute and keep an eye on the time values defined in the model.
The other thing I can roughly think of is Django signals. I'm not sure if they can handle this; is there a way to generate signals at predefined times? (I haven't read about them in depth yet.)
Just for the record: there is also Celery, which allows you to schedule messages for future dispatch. It is, however, a different beast than cron, as it requires/uses RabbitMQ and is meant for message queues.
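For illustration, a minimal sketch of a scheduled task with Celery beat; the broker URL, module name, and task body are assumptions:

from celery import Celery

app = Celery("proj", broker="amqp://localhost")

@app.task
def upload_to_ftp():
    # Hypothetical task body: read the schedule model and upload what is due.
    pass

# Run with `celery -A proj worker` plus `celery -A proj beat`.
app.conf.beat_schedule = {
    "ftp-upload-every-minute": {
        "task": "proj.upload_to_ftp",
        "schedule": 60.0,
    },
}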
I have been thinking about this recently and have found django-cron, which seems as though it would do what you want.
Edit: Also, if you are not specifically looking for a Django-based solution, I have recently used scheduler.py, a small single-file script which works well and is simple to use.
I've had really good experiences with django-chronograph.
You need to set up only one crontab task: calling the chronograph management command, which then runs your other custom management commands based on an admin-tweakable schedule.
The problem you're describing is best solved using cron, not Django directly. Since it seems that you need to store data about your FTP uploads in your database (using Django to access it for logs, graphs, or whatever), you can write a Python script that uses Django and runs via cron.
James Bennett wrote a great article on how to do this, which you can read in full here: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
The main gist of it is that you can write standalone Django scripts that cron can launch and run periodically, and these scripts can fully utilize your Django database, models, and anything else they need. This gives you the flexibility to run whatever code you want and populate your database, while not trying to make Django do something it wasn't meant to do (Django is a web framework; it is event-driven, not time-driven).
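A minimal sketch of such a standalone script, updated for current Django (the article predates django.setup()); the project, app, and model names are assumptions:

# upload_prices.py, launched periodically by cron
import os
import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()  # must run before any models are imported

from myapp.models import FtpSchedule  # hypothetical model holding the settings

def run_due_uploads():
    # Check the admin-defined schedule and upload anything that is due.
    for job in FtpSchedule.objects.filter(enabled=True):
        print("Would upload to", job.host)  # replace with real FTP logic

if __name__ == "__main__":
    run_due_uploads()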
Best of luck!