Auto-printing orders from SQL using Django/Laravel

Is it possible to create an app (Django or Laravel) that would read data from a database table called orders and automatically create a PDF with that data and download it?
Let's say I've got 5 new orders, each for a different address and different goods.
The app should automatically create and download a PDF for each of those 5 orders.
It could also be a React app getting the orders from an API.
Is this possible?
Thanks in advance.
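To make the idea concrete, here is a minimal sketch of what such a job could look like in Django: a management command that writes one PDF per unprinted order. The Order model, its fields, and the use of the reportlab library are assumptions for illustration, not part of the question.

    # Hypothetical sketch: a Django management command that turns each new
    # order into its own PDF file. The Order model, its fields (address,
    # goods, printed) and the output file naming are assumptions.
    from django.core.management.base import BaseCommand
    from reportlab.pdfgen import canvas

    from orders.models import Order  # hypothetical app/model


    class Command(BaseCommand):
        help = "Generate a PDF for every order that has not been printed yet"

        def handle(self, *args, **options):
            for order in Order.objects.filter(printed=False):
                pdf = canvas.Canvas(f"order_{order.pk}.pdf")
                pdf.drawString(72, 800, f"Order #{order.pk}")
                pdf.drawString(72, 780, f"Address: {order.address}")
                pdf.drawString(72, 760, f"Goods: {order.goods}")
                pdf.save()
                order.printed = True
                order.save(update_fields=["printed"])
                self.stdout.write(f"Wrote order_{order.pk}.pdf")

A command like this could be run from cron (or a Celery beat schedule) so new orders are picked up automatically; a Laravel equivalent would be an Artisan console command using a PDF package.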

Related

Importing data from Smartsheet into Power BI using the web option and not the connector

The Power BI connector doesn't work, since it defaults to the IE browser, and IE is locked down by our IT department (it can't connect to the Smartsheet site). So I need to use the "web" API option to get my Smartsheet data. I have the data coming in, but I'm brand new at this and I'm having trouble organizing it. I have 8 columns that are repeating as rows, and the cell data contains a bunch of metadata values that I don't want. How do I get the columns to become columns again, with the cell data as rows (just the face values) and not repeating?
[Screenshot of the data in Power BI]
I think you need to read a good part of the documentation in order to get used to the responses you get from Smartsheet; in brief, you get a JSON data structure.
First, read https://smartsheet-platform.github.io/api-docs/#how-to-read-a-sheet-response
Then you need to make the JSON connection to Power BI, as in this example: https://www.mssqltips.com/sqlservertip/4621/using-power-bi-with-json-data-sources-and-files/
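(Not part of the original answer.) It can help to see, outside of Power BI, the flattening the query has to perform on that sheet response: each row carries a list of cells keyed by columnId, and the column titles live in a separate columns array. A minimal Python sketch, where the sheet ID and token are placeholders:

    # Illustration only: the Smartsheet sheet response nests cells inside
    # rows, so "making columns become columns again" is a pivot like this.
    # SHEET_ID and the access token are placeholders.
    import requests

    resp = requests.get(
        "https://api.smartsheet.com/2.0/sheets/SHEET_ID",
        headers={"Authorization": "Bearer YOUR_ACCESS_TOKEN"},
    ).json()

    # Map column ids to titles, then keep only each cell's face value
    titles = {col["id"]: col["title"] for col in resp["columns"]}
    records = [
        {titles[c["columnId"]]: c.get("value") for c in row["cells"]}
        for row in resp["rows"]
    ]
    # 'records' is now one dict per sheet row, keyed by column title --
    # the metadata fields are discarded.

In Power Query the equivalent steps are roughly: expand the rows list, expand each cell record keeping only the value, and pivot on the column title.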

Power BI Report Service - data flow questions

This is what I am trying to do: I have various SQL Server databases with data. I created views in all of them. All the views will need to be imported, and I will specify their relationships. I want this to be refreshed nightly, and I want to build various reports from the same data source.
Do I have to use the Power BI Desktop application to import data into the Power BI Report Service? [I have done this so far, and can then create new reports in the cloud on the existing data. It would make sense to connect directly from the Power BI Report Service to my SQL Servers.]
Once I have uploaded data using the desktop application (as I have done so far), how can I view the data model in the Report Service once it is in the cloud?
In order to get routinely refreshed data I need to set up a gateway. Is the local Power BI Desktop application still involved in this process, or could I [in theory] delete the local desktop application that pushed the data in initially?
For your questions:
You have two options: use Power BI Desktop to connect to the data using Import or DirectQuery, then load it to the service; or use dataflows to create an import based on your views, but you will then need to create reports on top of those. Using dataflows, you'll have to set up a refresh schedule, and then set another refresh schedule for the dataset(s) built on top of them.
If importing data, you will be limited to a dataset size of 1 GB for the workspace. You cannot use DirectQuery on dataflows (unless you have enhanced compute with Power BI Premium). Once the dataset is loaded, you can create new reports in the service, or via Desktop on top of that dataset. If possible, it is recommended to use DirectQuery.
To see the data model, you can use Desktop to connect to the Power BI service dataset. This connects in 'Live Connection' mode and is limited to that one dataset; you can't add others to it (Excel, CSV, SQL, etc.). You can also use Analyze in Excel, an Excel plug-in that can connect to the data model. You can create new reports in the service for existing data models as well.
When creating the report in Power BI Desktop it does not use the gateway; you connect to your data sources as normal, and once you load the dataset to Power BI it matches the data sources in the file to the ones set up in the gateway admin settings. So you will still need Power BI Desktop to create reports, but the gateway is there for the refreshing; Desktop is not used in the refresh process. You could delete the workbook or application, but if you have to make changes, what will you refer to? (You could download a copy of the report from the service.) It is also easier to make changes in the Desktop app than in the service, as there is a feature difference between dataset creation in Desktop and in the service.

Need guidance on creating a Django-based dashboard

I'm a beginner at Django, and as a practice project I would like to create a webpage with a dashboard to track investments in a particular P2P platform. They do not have a nice dashboard (but they provide an Excel file with all the data). As I see it, the main steps I need to take in this project are as follows:
Create a login so that users have an account where they upload their Excel files.
Make it possible to import the Excel file into a database.
Manipulate/calculate the data so it can later be used in the dashboard.
Create the dashboard.
Host the webpage.
After some struggle I have implemented point 2, and will deal with 1 and 5 later. But point 3 is my biggest issue now.
I'm completely unsure what I need to do, and Google did not help. I need to calculate the data before I can build a dashboard from it: union two of the tables, then join them with a third table, creating some additional calculated fields that are needed. Do I create a view in the database and somehow fetch this data into Django? Or do I need to create some rules so that a new table is created at import time? I think having a table instead of a view would give better performance. Or maybe I'm doing this completely wrong and should take a different approach to this kind of task? Also, is SQLite a good database for the task (I'm using it because it is the default in Django)?
I assume that for the visualization part I will need to use some JavaScript library, such as D3, which would then use the data from step 3?
For part 3 there are two ways: either do these calculations and save the result in your database, or do them when you need them using Django model features like annotation and aggregation.
Option 1 requires adding a table for your calculations, which means a model in Django.
Option 2 requires doing the annotations in a view or in model managers and then using them in your views.
Django docs: Aggregation
Which is best depends on how big your data is, how complicated the calculations are, and how often you need them.
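To make option 2 concrete, here is a minimal sketch of aggregating at query time; the Investment model and its field names are hypothetical, for illustration only.

    # Hypothetical sketch of option 2: let the database aggregate at request
    # time instead of storing a pre-computed table. The Investment model and
    # its fields (platform, amount, interest) are assumptions.
    from django.db.models import Sum

    from dashboard.models import Investment  # hypothetical model


    def portfolio_summary():
        # GROUP BY platform, summing amounts and interest in the database
        return (
            Investment.objects
            .values("platform")
            .annotate(invested=Sum("amount"), earned=Sum("interest"))
            .order_by("platform")
        )

The resulting queryset can be passed straight to a template or serialized to JSON for a charting library such as D3, so the calculated fields never need their own table unless performance demands it.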
As for the database: SQLite is a database for development use, not for production, and certainly not with a lot of data and a lot of calculations. The recommended database for Django is PostgreSQL, which is pretty good at handling millions and even billions of rows and doing heavy calculations.
And for visualization, you should handle it on the template side, which is basically HTML, CSS and JS.

How to use a real database in Power BI and make it refresh dynamically?

I'm new to Power BI (free version) and I have been asked to develop a reporting system which generates reports from an Excel sheet; the reports work well, but only for the data I have already collected.
My question is how to connect to the data directly from SQL Server, without needing to convert it to Excel and then import it into Power BI. I also want the data to be refreshed dynamically.
One of the solutions I tried is to add a new dataset, but I get the following message:
Refresh can't be scheduled because the data set doesn't contain any data model connections, or is a worksheet or linked table. To schedule refresh, the data must be loaded into the data model.
I have looked for many solutions but none has worked.
Am I missing a concept? Thank you.
If this data is stored in a SQL Server table, it is a pretty straightforward process.
When you create a new Power BI report (.pbix) you should see a prompt asking if you want to "Get Data". You would select the 'SQL Server database' option.
Then you will be asked to enter the server and database name, and to specify either 'Import' or 'DirectQuery' mode. If you choose 'Import', the data will be refreshed every time you access the report, or on 'Refresh' within a report session. If you choose the latter, the connection will always be live, i.e. any changes to the data in your database will be reflected in the report.
Once you get past this window, you will be asked either to specify credentials or to use Windows authentication to access the database and server. After that you can either specify a query to pull in some data or select from a list of tables.
I hope this helps!

How to execute a user-defined stored procedure using Sync Framework

My database has 6-7 tables on the server side. I only want a short list of 10-50 customers, which I get from a stored procedure (selecting records by joining those 6-7 tables).
I created an application (used in both online and offline environments) which syncs tables from server to client and vice versa, and which displays those customer names in a combo box (the records from the stored procedure).
I am using Sync Framework, but these 6-7 tables contain a huge number of records, around 67k. I don't want to sync those 6-7 tables; I want to sync only the list of customers for the logged-in user.
I created one table like this:
Customer_List (user_Id, Customer_Name, customer_Id)
and the stored procedure returns the list of customers matching the table structure above.
I want to sync this table, populated by my stored procedure, using Sync Framework.
How can I do this?
There is no publicly available API in Sync Framework for you to specify/invoke a custom stored proc.
It seems like what your SP does can be represented as a filter,
e.g., side.CustomerId IN (SELECT CustomerId FROM Customer_List WHERE User_Id = @User_Id)