I need to feed the Django models with data from an Excel file when the file is uploaded. This Excel file contains the weekly schedule of employees, so the data changes every week. How do I update the database every time the file is uploaded?
The easy way out is to use django-import-export, a library that does just that and comes with admin-interface integration.
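A sketch of what the admin integration looks like, assuming a Django project with a hypothetical `Shift` model for the schedule (the model name and its field names are placeholders, not from the original question):

```python
# admin.py (sketch): wires django-import-export into the admin so the
# weekly Excel file can be uploaded and imported from the change list.
from django.contrib import admin
from import_export import resources
from import_export.admin import ImportExportModelAdmin

from .models import Shift  # placeholder: your schedule model

class ShiftResource(resources.ModelResource):
    class Meta:
        model = Shift
        # Treat these columns as the natural key, so re-uploading a
        # week's file updates existing rows instead of duplicating them.
        import_id_fields = ("employee", "day")

@admin.register(Shift)
class ShiftAdmin(ImportExportModelAdmin):
    resource_class = ShiftResource
```

With this in place, the model's admin change list gains Import/Export buttons, and xlsx is among the supported upload formats.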
I have a form that allows uploading Excel files. How can I read that file in Power Automate and import it into Dataverse automatically?
I have tried this, but it does not work.
I have started to learn Django. I want to read records from a CSV file and save them in SQLite when the server has just loaded (after running the command python manage.py runserver).
I don't understand where I should write the code that does this.
Should it be in a view? Should it be in a management command?
Thanks for any help!
Receive the file (from the form, or get it from disk) (entry point: views.py)
Read the CSV data into pandas (an external module in an external file, which receives the CSV file from the view and saves it to a pandas DataFrame)
Save from pandas to SQLite in an external module
Regarding the separation of business logic and views, you can check: Separation of business logic and data access in django
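The last two steps above can be sketched like this (a minimal illustration, assuming pandas is installed; `csv_to_sqlite` and the column names are made up for the example):

```python
import io
import sqlite3

import pandas as pd

def csv_to_sqlite(csv_file, db_path, table_name):
    """Read a CSV file-like object into a DataFrame and persist it to SQLite."""
    df = pd.read_csv(csv_file)
    with sqlite3.connect(db_path) as conn:
        # if_exists="replace" rebuilds the table on every upload;
        # use "append" to accumulate rows instead.
        df.to_sql(table_name, conn, if_exists="replace", index=False)
    return len(df)

# Example with an in-memory "upload" and an in-memory database.
sample = io.StringIO("name,hours\nalice,40\nbob,35\n")
rows_saved = csv_to_sqlite(sample, ":memory:", "schedule")
print(rows_saved)  # 2
```

In the view, you would pass `request.FILES["file"]` (or an open file from disk) into such a function, keeping the parsing and saving logic out of views.py.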
I am creating a scientific website that does some calculations and plots. In order to do those calculations, I need to use some data that is in a CSV file. The CSV file has around 10,000 columns and 7 rows. Will I be able to do this with Django?
Simple: Django's backend is just Python, which means you can use any Python packages in your views, for example. You can probably start off here.
Edit: If you just parse the CSV, you can use that data any way you'd like in Django: passing it to the frontend template, storing it in the database, etc.
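For instance, the parsing step needs nothing beyond the standard library; a minimal sketch (the function name and sample columns are illustrative, and the tiny sample stands in for the real ~10,000-column file):

```python
import csv
import io

def load_wide_csv(fileobj):
    """Parse a CSV with few rows but many columns into a header and float rows."""
    reader = csv.reader(fileobj)
    header = next(reader)
    data = [[float(value) for value in row] for row in reader]
    return header, data

# Tiny stand-in for the real wide file.
sample = io.StringIO("s1,s2,s3\n1,2,3\n4,5,6\n")
header, data = load_wide_csv(sample)
print(header)      # ['s1', 's2', 's3']
print(data[1][2])  # 6.0
```

In a Django view you would call this on the file, run your calculations, and put the results into the template context for plotting.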
I have a Django model that represents data files on a server, with some metadata about each file. These files are generated by an instrument and can appear at any time throughout the day. I would like the Django table to reflect the files that are actually available for the user to select.
Here is what I have so far:
I have a Python script that scans the directory, produces an initial_data.json file, and puts it in the app/fixtures directory. (The script pulls out important metadata from each file to make it easy for the user to make selections.)
I have fixtures working so that when I run syncdb, it loads the data into the model.
My question is: how do I do this repeatedly (hourly? on demand? for example, triggered by clicking a button on the page?)
My impression is that syncdb is only meant to be run occasionally, like, for a data migration. Am I wrong - can I run it "at the click of a button"?
Is there a better way of keeping my table in sync with the file system? I have considered using FileField or FilePathField, but these do not seem workable, because I want to pre-load the table with the file metadata.
I don't understand why you want to use syncdb for this. That's really for creating tables. If all you're doing is loading a fixture, why not do that directly? You can use django.core.serializers to parse and load your JSON file (and I'd recommend calling it something other than initial_data, since a fixture with that name is reloaded automatically on every syncdb).
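A minimal sketch of such a loader (the function name, the `app.datafile` model label, and the field names are placeholders; `serializers.deserialize` is the real Django API, and the function assumes a configured Django project, so its import is kept inside the function here):

```python
import json

def refresh_table(fixture_path):
    """Load a fixture-format JSON file into the database directly.

    Call this from a view (e.g. behind a "refresh" button) or from a
    management command run hourly via cron -- no syncdb involved.
    """
    from django.core import serializers  # lazy: needs a configured project
    with open(fixture_path) as fh:
        for deserialized in serializers.deserialize("json", fh):
            deserialized.save()  # inserts or updates by primary key

# The fixture format the loader expects (one entry shown):
entry = json.loads(
    '[{"model": "app.datafile", "pk": 1, "fields": {"name": "scan_001.dat"}}]'
)
print(entry[0]["fields"]["name"])  # scan_001.dat
```

Because each fixture entry carries its primary key, re-running the loader updates existing rows rather than duplicating them, which is what you want for an hourly or on-demand sync.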
I'm working on a simple Django application in which the user uploads an SQLite file; the data is read and added to the main database (PostgreSQL).
My idea is to use two databases: one for the main application and the other to manage the uploaded file (the structure is always the same, so I can create models for it).
What do you think about this solution? Is it possible to dynamically change settings.py for the second database, so I can modify the path and easily read the data inside it?
Thanks!
Django supports multiple databases in one project: you can set up Postgres as the default DB and SQLite as a secondary one (just for the upload).