Should I have a separate table for each media type - django

Initially I thought this would be simple, but it is killing me right now.
I have a Post model which contains a field (a foreign key to Media).
Media has 3 columns, Image, Video, and Audio (each related to its own table).
And each of those three tables has one file field (except Image, because that one also holds x and y coordinates for tagging purposes). I don't want to disturb the order of the uploads. (The size of these fields is going to change in future, because I have to store a thumbnail for each file.)
Let's say a user uploaded a picture first, then an audio clip, and then a video. I want to display them in the order they were uploaded, but I really don't know how to achieve this.
Can anyone please help me figure out what I should do to implement this?
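For reference, here is a minimal sketch of the structure being described, with all names assumed for illustration. The foreign key direction is flipped (Media points at Post) on the assumption that one post can have several media items, and an uploaded_at timestamp is one possible way to keep mixed media types in upload order:

from django.db import models

class Image(models.Model):
    file = models.ImageField(upload_to="images/")
    x = models.FloatField()  # tag coordinates, as described above
    y = models.FloatField()

class Video(models.Model):
    file = models.FileField(upload_to="videos/")

class Audio(models.Model):
    file = models.FileField(upload_to="audio/")

class Post(models.Model):
    title = models.CharField(max_length=200)

class Media(models.Model):
    post = models.ForeignKey(Post, on_delete=models.CASCADE, related_name="media")
    # Exactly one of these three is expected to be set per row.
    image = models.ForeignKey(Image, null=True, blank=True, on_delete=models.CASCADE)
    video = models.ForeignKey(Video, null=True, blank=True, on_delete=models.CASCADE)
    audio = models.ForeignKey(Audio, null=True, blank=True, on_delete=models.CASCADE)
    uploaded_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        ordering = ["uploaded_at"]  # post.media.all() comes back in upload order

With that ordering in place, iterating over post.media.all() yields the picture, audio, and video in the order they were uploaded, regardless of type.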

Django - set model's `CharField` primary key from processed `ImageField` data

This is maybe a poor way to ask this question, as I haven't tried anything yet; I'm not sure it's even possible without a bunch of custom JS added to the admin.
Given a model such as:
class MyModel(models.Model):
    sku = models.CharField('SKU', max_length=20, primary_key=True)
    bar_code = models.ImageField(upload_to='images/barcodes')
I want the user to select an image which will be a photo of a product's barcode. The image then needs to be processed to scan for the barcode (I'm using pyzbar) and this value should be saved to the primary key field sku.
Since I can't save the model without a primary key, and I need to upload the image in order to scan the barcode and discover the value to use for the primary key, I'm thinking the only way to do this would be some client-side JS in the admin: upload the image to a temp location using a DRF endpoint (or similar), read the barcode, and return that value to the client, which could then set the value of sku with some basic JavaScript. Then the model can be saved and the image uploaded (a second time).
Is there a more straightforward way to do this in Django without adding my own client side JS to the admin and having to upload the photo twice?
Aside from the primary key, which may complicate things, you can extend ImageField and see how it populates its width and height columns, then do the same for the pyzbar-generated barcode. See the update_dimension_fields docstring for inspiration:
Update field's width and height fields, if defined.

This method is hooked up to model's post_init signal to update dimensions after instantiating a model instance. However, dimensions won't be updated if the dimensions fields are already populated. This avoids unnecessary recalculation when loading an object from the database.

Dimensions can be forced to update with force=True, which is how ImageFileDescriptor.__set__ calls this method.
You would add a method to do the work for pyzbar and then connect the signal in the same way.
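A minimal sketch of that approach, assuming pyzbar and Pillow are installed; the BarcodeImageField name, its barcode_field argument, and the handler are all illustrative, not an existing Django API:

from django.db import models
from django.db.models.signals import post_init
from PIL import Image
from pyzbar.pyzbar import decode

class BarcodeImageField(models.ImageField):
    """ImageField that fills a sibling field with the decoded barcode value."""

    def __init__(self, *args, barcode_field=None, **kwargs):
        self.barcode_field = barcode_field
        super().__init__(*args, **kwargs)

    def contribute_to_class(self, cls, name, **kwargs):
        super().contribute_to_class(cls, name, **kwargs)
        # Mirror how ImageField wires update_dimension_fields to post_init.
        if not cls._meta.abstract and self.barcode_field:
            post_init.connect(self.update_barcode_field, sender=cls)

    def update_barcode_field(self, instance, force=False, *args, **kwargs):
        # Same guard as update_dimension_fields: skip if already populated,
        # so loading an object from the database doesn't re-decode the image.
        if getattr(instance, self.barcode_field) and not force:
            return
        file = getattr(instance, self.attname)
        if not file:
            return
        results = decode(Image.open(file))
        if results:
            setattr(instance, self.barcode_field, results[0].data.decode("ascii"))

# Usage on the model from the question:
class MyModel(models.Model):
    sku = models.CharField('SKU', max_length=20, primary_key=True)
    bar_code = BarcodeImageField(upload_to='images/barcodes', barcode_field='sku')

Note that, per the docstring above, ImageField re-runs its update with force=True from ImageFileDescriptor.__set__ when a new file is assigned; a complete implementation would need an equivalent trigger for the barcode field (for example, calling update_barcode_field(instance, force=True) before the first save).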

Power BI parameter for image

I wanted to know if there's a way to have a parameter that holds an image.
The problem: I have 10 PBIX files, each one containing over 20 screens. Every report has the client's logo, which is an image. Say the client changes their logo; that means I'll need to change the logo 200 times (10 PBIX × 20 screens)!
Wanted solution: Is there a way to have a parameter that contains the logo, so I would only change the logo 10 times (once per PBIX)? That would be a HUGE improvement for my time and productivity. Sorry for not including pictures; this is to keep my client anonymous.
Thank you!
Here is how you can achieve your requirement, as described below.
Step 1: Create a custom function in Power Query as shown below. The function is taken from a blog post, which you can check for more details if you are interested.
let
    BinaryToImage = (BinaryContent as binary) as text =>
        let
            Base64 = "data:image/jpeg;base64, " & Binary.ToText(BinaryContent, BinaryEncoding.Base64)
        in
            Base64
in
    BinaryToImage
Step 2: Create a per-report folder in your local directory and keep your image there. I am considering one folder here, but you can set up your own layout once you get the idea of the workaround.
Step 3: Create a Folder data connection and point it to the folder where you keep your logo/image.
You will now have a table with one row, since there is only one image. I named the table PBI_Images.
The Content column holds the binary data for the image, and we have to convert it to Base64 using the function we created earlier.
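Purely as an illustration of what the M function above does (Power BI itself needs the M version), the same transformation sketched in Python:

import base64

def binary_to_image(binary_content: bytes) -> str:
    # Same idea as the M function: a data-URI header plus the Base64 payload,
    # which an image visual can render directly.
    return "data:image/jpeg;base64, " + base64.b64encode(binary_content).decode("ascii")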
Step 4: Now let's invoke the function on our table. Select the table and invoke the custom function on the Content column (Add Column > Invoke Custom Function).
You now have a new column holding the Base64 data URI for each image.
Step 5: Get back to the report by clicking the "Close & Apply" button.
Step 6: Download a custom image visual from AppSource (it needs to accept an "Image URL" field).
Step 7: Add the newly downloaded visual to your report and put the newly created column into its "Image URL" field.
Step 8: Add the image visual to all your report pages and bind the Image URL as in Step 7.
You are all done. Just change the image in your source folder, keeping the same file name, then refresh your table (PBI_Images in my case) and you will see the image change everywhere in the report.
Finally, if everything works as expected, you can first try keeping all logos in one single folder. If that does not work, go for a folder per customer as mentioned earlier.

Django - Determine model fields and create model at runtime based on CSV file header

I need to determine the best approach for deriving the structure of my Django app's models at runtime from the structure of an uploaded CSV file; the models will then be held constant once they are created in Django.
I have come across several questions about dynamically creating/altering Django models at run-time. The consensus was that this is bad practice and one should know beforehand what the fields are.
I am creating a site where a user can upload a time-series based CSV file with many columns representing sensor channels. The user must then be able to select a field to plot the corresponding data of that field. The data will be approximately 1 billion rows.
Essentially, I am seeking to implement the following steps, but information is scarce and I have never done a job like this before:
1. User selects a CSV (or DAT) file.
2. The app loads only the header row (these files are > 4 GB); see the sketch after this list.
3. The header row is split by ",".
4. I use the results from step 3 to create a table with a column for each channel, where each field name matches the corresponding header entry.
5. I then load the corresponding data into the respective tables, and I have my models for my app, which will then not be changed again.
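A short sketch of steps 2 and 3, with the file path as a placeholder; reading a single line keeps memory use flat even for files over 4 GB:

def read_header(path, delimiter=","):
    # Only the first line is read; the rest of the file is never loaded.
    with open(path, "r", newline="") as f:
        header = f.readline().rstrip("\r\n")
    return [name.strip() for name in header.split(delimiter)]

channels = read_header("uploads/example.dat")  # hypothetical path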
Another option I am considering is creating a model with 10 fields, as I know there will never be more than 10 channels, then reading my CSV into the table when a user loads a file and simply leaving the unused fields empty.
Has anyone had experience with similar applications?
That is a lot of records; I have never worked with so many. For performance, the fixed-fields idea sounds best. If you use PostgreSQL you could look at the JSON field, but I don't know its impact on so many rows.
For flexible models you could use the EAV pattern, but in my experience this works only for small data sets.
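For reference, a minimal Django sketch of the EAV pattern mentioned above; all model and field names are assumptions. At roughly a billion readings the value table becomes enormous, which is exactly the small-data-sets caveat:

from django.db import models

class Upload(models.Model):
    # One row per uploaded CSV file.
    name = models.CharField(max_length=255)

class Channel(models.Model):
    # The "attribute": one row per CSV column/sensor channel.
    upload = models.ForeignKey(Upload, on_delete=models.CASCADE)
    name = models.CharField(max_length=100)

class Reading(models.Model):
    # The "value": one row per cell, hence the scaling problem.
    channel = models.ForeignKey(Channel, on_delete=models.CASCADE)
    timestamp = models.DateTimeField(db_index=True)
    value = models.FloatField()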

Alternatives to dynamically creating model fields

I'm trying to build a web application where users can upload a file (specifically in the MDF file format) and view the data in the form of various charts. The files can contain any number of time-based signals (various numeric data types), and users may name the signals however they like.
My thought on saving the data involves 2 steps:
Maintain a master table as an index, to save such meta information as file names, who uploaded it, when, etc. Records (rows) are added each time a new file is uploaded.
Create a new table (I'll refer to these as data tables) for each file uploaded; within each table, every column will be one signal (the first column being timestamps).
This brings up the problem that I can't pre-define the Model for the data tables, because the number, names, and datatypes of the fields will differ among virtually all uploaded files.
I'm aware of some libraries that help build dynamic models at runtime, but they're all dated, and questions about them on SO get basically zero answers. So, given the effort it would take to make one work, I'm not even sure my approach is the optimal way to do what I want.
I also came across this Postgres-specific model field which can take nested arrays (which I believe fits the 2-D time-based signal lists). In theory I could parse the raw uploaded file, construct such an array, and basically save all the data in one field. Not knowing the limit on data size, this could also be a nightmare for queries later on, since creating the charts usually takes only a few columns of signals at a time, out of a total of up to hundreds of signals.
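That field is presumably ArrayField from django.contrib.postgres; a rough sketch of the everything-in-one-field idea (model and field names assumed):

from django.contrib.postgres.fields import ArrayField
from django.db import models

class SignalData(models.Model):
    file_name = models.CharField(max_length=255)
    # A 2-D array: one inner list per sample, e.g. [timestamp, signal1, ...].
    samples = ArrayField(ArrayField(models.FloatField()))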
So my question is:
Is there a better way to organize the storage of data? And how?
Any insight is greatly appreciated!
If the name, number, and datatypes of the fields will differ for each user, then you do not need an ORM. What you need is a query builder or SQL string composition, like Psycopg provides. You will be programmatically creating a table for each combination of user and uploaded file (if they differ) and programmatically inserting the records.
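A sketch of that kind of composition with psycopg2's sql module; the table layout here (a timestamp plus one double-precision column per signal) is an assumption. The point is that user-supplied names go through sql.Identifier instead of being pasted into the query string:

from psycopg2 import sql

def create_data_table(conn, table_name, signal_names):
    # Quote every user-supplied name as an identifier to avoid SQL injection.
    columns = [
        sql.SQL("{} double precision").format(sql.Identifier(name))
        for name in signal_names
    ]
    query = sql.SQL("CREATE TABLE {table} (ts timestamptz, {cols})").format(
        table=sql.Identifier(table_name),
        cols=sql.SQL(", ").join(columns),
    )
    with conn.cursor() as cur:
        cur.execute(query)
    conn.commit()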
Using PostgreSQL might be a good choice; you might also create a GIN index on the arrays to speed up queries.
However, if you are primarily working with time-series data, then using a time-series database like InfluxDB or Prometheus makes more sense.

Mapping user spreadsheet columns to database fields

I’m not sure where to start on this project. I know how to read the contents of the Excel spreadsheet, I know how to identify the header row, and I know how to loop over the contents. I believe I have the UX portion worked out, but I am not sure how to process the data.
I’ve googled and only found .Net solutions but I’m looking for a ColdFusion/Lucee solution.
I have a working form allowing me to map a user's spreadsheet columns to my database values (this is being kept simple for this post; the user does not have direct access to the database).
Now that I have my data, I'm not sure how to loop over the results. I believe there will be several loops (an outer and an inner). Then of course I also need to loop over the file contents, but I think if I can get the headings mapped out, I can figure out the rest.
Any good links, tutorials, or guides would be greatly appreciated.
Some pseudo code might be enough to get me started.
User uploads form
System reads headers and content.
User is presented a form with a list of columns from their uploaded spreadsheet to match with available database fields (e.g. “column1” matches “customer name”).
User submits form.
Now what?
UPDATED
Here is what the data looks like AFTER the mapping has been done in my form. The column delimiter is ::: and, within each column, ||| separates the selected column value from its associated ID. I've included both the ID and the column value since I plan on displaying the mapping again as a confirmation; having the ID saves a trip to the database.
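Since pseudo code was welcomed above, here is a rough Python sketch of parsing that submitted string; the exact sample wasn't shown, so the shape (and the value/ID order within each column) is an assumption, and the same split logic carries over to ColdFusion's list functions:

def parse_mapping(raw):
    # Assumed shape: "customer name|||12:::order date|||37"
    # ::: separates columns; ||| separates the column value from its ID.
    mapping = {}
    for column in raw.split(":::"):
        value, field_id = column.split("|||")
        mapping[value] = int(field_id)
    return mapping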
If I understand correctly, your question is: how do you provide the user a form allowing them to map their spreadsheet columns to those of the database?
Since you have their spreadsheet column names, and you have the database column names, then this problem is essentially a UI/UX problem. You need to show both lists, and allow the user to map them. I can imagine several approaches to this. My first thought would be some sort of drag/drop operation, as follows:
Create a list of boxes, one for each field in your database table, and include the field name in (or above) the box. I'll call this the db field list. Then, create another list for each column from the spreadsheet, which I'll call the spreadsheet column list. The user would drag/drop items from the spreadsheet column list to the db field list.
When a mapping has been completed by the user, you would store the column/field names as data on the DOM element of the db field list box. Then upon submission, you would gather the mapping data by visiting each box and adding it to an array. Then you would serialize that array into JSON and send it to your form submission handler.
This could be difficult or easy, depending on your knowledge of UI implementations using JavaScript. jQuery makes this easy (if you know jQuery). There's even a jQuery UI plugin that does this: https://jqueryui.com/droppable/.
A quick search for "javascript drag drop" would help, and here are a few articles I found:
https://www.w3schools.com/html/html5_draganddrop.asp
https://medium.com/quick-code/simple-javascript-drag-drop-d044d8c5bed5
You would also need to submit the array of mappings using JavaScript. You could search for that as well, and here's an article I found:
https://codereview.stackexchange.com/questions/94493/submit-an-array-as-an-html-form-value-using-javascript