Using Python to connect to a SAS/IntrNet server - python-2.7

I am looking for a way to access data stored on a SAS/IntrNet server using Python, but I can't find any information on how to achieve it.
Currently, I access the data through a web viewer at an address like:
http://remote.server/cgi-bin/broker.dll?_service=prod_sas9_ux&_program=prog.include.sas
Does anyone know a way to access the data from a Python script?

Try the Python requests library (http://docs.python-requests.org/en/master/). It will allow you to call the SAS service via HTTP.
You will then have to parse the results to build your Python data structure.
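A minimal sketch of such a call (in Python 3 syntax, even though the question is tagged python-2.7), reusing the broker URL and the _service/_program parameters from the question. What the response body looks like depends entirely on the SAS program, so the parsing step is left as a placeholder:

```python
import urllib.parse  # in Python 2.7 this would be urllib/urlparse


def build_broker_url(base, service, program, **extra):
    """Build the SAS/IntrNet broker URL with its _service/_program parameters."""
    params = {"_service": service, "_program": program, **extra}
    return base + "?" + urllib.parse.urlencode(params)


def fetch_sas_data(url):
    """Call the SAS service over HTTP and return the raw response body."""
    import requests  # third-party: pip install requests

    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # Parse this (CSV, HTML, XML, ... depending on the SAS program)
    # into your own data structure.
    return response.text


url = build_broker_url(
    "http://remote.server/cgi-bin/broker.dll",
    service="prod_sas9_ux",
    program="prog.include.sas",
)
print(url)
# -> http://remote.server/cgi-bin/broker.dll?_service=prod_sas9_ux&_program=prog.include.sas
```

You can pass any additional broker parameters (e.g. a _debug flag) as extra keyword arguments.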

Related

How do I implement logic in Django?

So I have an assignment to build a web interface for a smart sensor.
I've already written the Python code that reads the data from the sensor and writes it into SQLite3, controls the sensor, etc.
I've built the HTML/CSS template and implemented it in Django.
My goal is to run the sensor-reading script in parallel with the Django interface on the same server, so the server does all the communication with the sensor and the user can read and configure the sensor from the web interface. (The same logic as modern routers: control and configuration from a web interface.)
Q: Where do I put my sensor_ctl.py script in my Django project, and how do I make it run independently on the server (to read sensor data 24/7)?
Q: Where in my Django project do I use the classes and methods from sensor_ctl.py to write/read data to Django's database instead of the local SQLite3 database I used to test sensor_ctl.py?
Place your code in the appname/management/commands folder and follow the official guide for management commands. Then you will be able to run your custom command like this:
./manage.py getsensorinfo
Once this command is registered, you can put it in cron and it will be executed every minute.
Secondly, you need to rewrite your code to use Django ORM models, like this:
Stat.objects.create(temp1=60, temp2=70) instead of INSERT INTO ...
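For example, a crontab entry that runs the command once a minute could look like this (project path, Python path and command name are hypothetical and depend on your setup):

```
# m h dom mon dow  command
* * * * * cd /srv/myproject && /usr/bin/python manage.py getsensorinfo
```

For a 24/7 reading loop you could alternatively make the command itself loop forever and keep it alive with a process supervisor, but cron is the simplest starting point.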

Can we use pyspark in the backend to build the models and then use Rshiny in the frontend to build an interactive dashboard?

I am trying to use SparkR to build the backend, where I have a random forest model running. But I need a decision tree, and SparkR does not have one. SparkR also lacks proper documentation, and I don't know if there is anything as easy as R Shiny in Python. So I want to know whether it is possible to build an application with PySpark and R Shiny.
For a quick working web app I found that Shiny was pretty useful. I called .py scripts from the R server and it worked well, but it was a bit slow, since each call has to start up a session and allocate resources.
I found the best workaround was to use Spark Streaming instead. It checks whether new files have been written by Shiny into a directory, reads them, processes them and writes the output results (you can lag the R server so that it has time to finish writing each request before reading the results back into R).
For an easy-to-use Python-based framework, you can check out python-dash.
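The file-based hand-off described above (Shiny writes request files into a directory, the streaming job picks them up and writes results) boils down to a directory-polling loop. A plain-Python sketch of the pattern, with a trivial "square the input" step standing in for the real Spark job and hypothetical inbox/outbox directory names:

```python
import json
import pathlib
import time


def process_once(inbox: pathlib.Path, outbox: pathlib.Path, seen: set) -> int:
    """Handle any request files in `inbox` not yet seen; write results to `outbox`."""
    handled = 0
    for req in sorted(inbox.glob("*.json")):
        if req.name in seen:
            continue
        payload = json.loads(req.read_text())
        # Stand-in for the real Spark job: square the input value.
        result = {"result": payload["value"] ** 2}
        (outbox / req.name).write_text(json.dumps(result))
        seen.add(req.name)
        handled += 1
    return handled


def watch(inbox, outbox, interval=1.0):
    """Poll forever, like the streaming job; Shiny would read from `outbox`."""
    seen = set()
    while True:
        process_once(pathlib.Path(inbox), pathlib.Path(outbox), seen)
        time.sleep(interval)
```

Spark Streaming's textFileStream does essentially this (minus the bookkeeping) at scale; the lag on the R side is what guarantees each request file is complete before it is picked up.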

SQLite vs LocalStorage with Ionic/Firebase

I am writing a chat app in Ionic 2, and I want to save some of the messages locally on the phone.
I am using Firebase as the messaging system; it stores the messages in JSON notation. When I read a message, I want to store it locally and delete it from Firebase.
Should I use SQLite or Local Storage?
I would normally say SQLite because it's more reliable, but since Firebase uses JSON, should I rather store the local messages as JSON in Local Storage?
Any advice appreciated.
Because you are using Ionic, I would suggest that you use the ionic-storage module.
By default it uses the most advanced storage mechanism available. So in Chrome, it will use IndexedDB and fall back to WebSQL or LocalStorage.
If you install the cordova-sqlite-storage plugin, ionic-storage will use SQLite as the storage engine when running on the device.
You can only store key/value pairs with ionic-storage, so you can't use custom SQL queries, but storing JSON works without any modification.

Online / Offline managing Mysql database

I'm creating a piece of software in C++ with the Qt library, where the database is loaded when you log in to the software.
Once loaded, you can use and edit your database without a connection; then, when you have internet access, you can update the online database.
How can I do that?
Write the changes to a file in my own format, then parse it and run the MySQL requests once I have an internet connection?
Use a local database and then copy it to the remote one?
Another solution?
Also, when two users update their databases offline, how can I make sure they don't use the same ID? I need the IDs for parent/child relations.
Thanks.
I found a good solution, which is to use an SQLite database in my C++ program.
That way I have the same requests in both directions ("upload", "download") and access to all my data throughout the program (using a singleton for database access).

Using MS Access to return on-demand reports via a web server?

I have built a MS Access 2007 application that can create reports files in various formats (PDF, XLS, CSV, XML).
I would like to allow the creation of these reports to be accessible from a web page where users would just click on a link and get a download of the report produced by my Access application.
I would like to keep it simple and I'm not interested at this stage in rewriting the data processing in .Net. I'd just like to find a way to automate the creation of the user report to return a file that can be downloaded.
In essence, my Access application would act as a web service of some kind.
The web server is IIS on Windows 2003.
Any pointers or ideas would be welcome. I'm not well versed in IIS administration or ASP pages.
The first quick-and-dirty method I could think of would be to call Access from a shell and pass it a few parameters to open read-only and run a macro.
That macro would have to pull its report parameters from somewhere (possibly environment variables), run the report, and save it as Excel, PDF or whatever under a unique name. To do this you'll need to pass the report name, a unique request ID, and a parameter array to handle multiple (or no) parameters.
Last but not least, your Access macro / VBA sub will need to shut Access down.
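Such a shell invocation might look like this (the Office path, database path, macro name and parameter string are all placeholders; /ro opens read-only, /x runs a macro, and the /cmd string can be read inside VBA via the Command function):

```
"C:\Program Files\Microsoft Office\Office12\MSACCESS.EXE" ^
    "D:\reports\reporting.mdb" /ro /x ExportReport ^
    /cmd "report=MonthlySales;requestid=4711;format=PDF"
```

The ExportReport macro would then parse Command(), run the report, save the output under the request ID, and quit Access.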
This isn't a good solution, though, as starting one copy of Access per request isn't really advisable.
Another option is to start Access on the server with a VBA sub that runs on opening. This sub could poll a directory for requests written by your web server; on receiving a request, it would run the report and write the output somewhere. Again, you'd have to base this around a unique request ID.
I'm not really sure which "solution" would be better: Access as a command-line report generator, or Access as a batch reporting service. Both would be nasty, but either would get you over the hump until you can migrate to a proper reporting service.
This is a kind of roundabout way to achieve what you're asking. You can use the free SQL Server Express 2005, or the 2008 Advanced Edition, which includes the Reporting Services component. Using the report-generation tools, you can convert your Access 2007 reports to SQL Server reports and have those reports feed off the Access database. You could also go as far as migrating the database itself to SQL Server, if you wanted to go that route. Reporting Services will generate PDF, XLS, CSV and XML output for your reports, and you can generate a report just by passing the parameters in the URL to the server, which will return the report in the requested format.
Link to sql server 2008 express advanced edition:
http://www.microsoft.com/express/sql/download/
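A Reporting Services URL-access request looks roughly like this (server name, folder, report and parameter names are placeholders; rs:Format selects the output renderer, e.g. PDF, EXCEL, CSV or XML):

```
http://yourserver/ReportServer?/SalesReports/MonthlySales&rs:Command=Render&rs:Format=PDF&Month=2009-01
```

A link like this on your web page is enough to trigger the render and return the file as a download.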
If you do not wish to rewrite in .Net, how about Classic ASP and VBScript? VBScript has a lot in common with VBA, so it should not take long to create something usable, and there is a great deal of help available for ASP and VBScript on the internet. For example, a simple search returned this method of creating a PDF with Adobe from ASP:
Creating a PDF with ASP