SAS Data Sets on SharePoint

I'm an IT person supporting researchers who use SAS. We have recently migrated most users' storage from on-premises SMB shares to MS Teams. The question has arisen whether it's possible to keep SAS data sets in Teams storage (a SharePoint library) and access them via the synced library.
Are there any pitfalls to this approach? Any steps that could/should be taken to ensure there are no problems?

It is possible, but not ideal. SAS 9.4 accesses data through a concept called a libname: a named reference to a location where data that SAS can access is stored. SAS 9.4 stores its own data in .sas7bdat files, but it can also access a wide variety of other databases natively.
If your users can set up SharePoint as a shared network disk, SAS can work with it as if it were local. If not, your users will need to download the .sas7bdat files to their system locally, then re-upload them back to SharePoint using the SharePoint REST API. SAS can do this through code.
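A rough sketch of the download half (the site URL, file path, and the &token macro variable are placeholders, not a tested recipe; you'd obtain an OAuth bearer token from Azure AD first, and the re-upload would be a similar POST request):

    /* Sketch only: URL, local path, and &token are placeholders */
    filename ds "C:\temp\mydata.sas7bdat";

    proc http
        url="https://tenant.sharepoint.com/sites/Research/_api/web/GetFileByServerRelativeUrl('/sites/Research/Shared%20Documents/mydata.sas7bdat')/$value"
        method="GET"
        out=ds
        oauth_bearer="&token.";
    run;

    /* Then read the downloaded file through an ordinary libname */
    libname local "C:\temp";
    proc contents data=local.mydata;
    run;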
There really is no issue with it other than a convenience factor. It's not as ideal as a shared disk or database access, but it can work in theory.
If they decide to mount it as a network drive, I would add the caveat that they should not use SharePoint as a place to store temporary data that needs high read/write speeds. In fact, I'd make it read-only to prevent them from doing so. If they need to pull the data locally, they can do so with libname access.
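To keep SharePoint from being used as scratch space, a read-only libname against the locally synced library is enough; the path below is a placeholder:

    /* Sketch: synced SharePoint library, writes blocked at the libref */
    libname shp "C:\Users\jdoe\Contoso\Research - Documents" access=readonly;

    /* Users who need to modify data pull it into WORK first */
    data work.mydata;
        set shp.mydata;
    run;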

Related

Django Framework: Reading Data from MSSQL-Database directly? Or import the dataset to sqlite3?

I would like to build an admin dashboard in Django framework.
So far I have only worked with sqlite3 databases in Django.
However, the admin dashboard should read statistics from an MSSQL database and display them accordingly. (Sales figures as a graph, other sales in a table, etc.)
The turnover figures are very extensive. There are several thousand entries per month and the client would like to be able to filter by any date period.
Only data from the MSSQL database should be read. Writing or updating the data is not desired.
So far this is no problem, but I wonder which is the better way to implement the project.
Should I connect the MSSQL database directly to Django or should I read the MSSQL database in certain intervals and cache the "results" in a sqlite3 database?
Caching seems to me to be the nicer solution, since we don't need real-time data and the MSSQL server's performance would not suffer as a result. But I would have to build an additional connector to transfer the data from MSSQL to sqlite3.
How would you approach such a project?
Short version: I'd like to display data in a Django app. Should I read directly from the MSSQL server, or import the MSSQL data into a local sqlite3 database?
Thanks in advance for your answers.
From the official SQLite page, under "Situations Where SQLite Works Well":

Application file format: SQLite is often used as the on-disk file format for desktop applications such as version control systems, financial analysis tools, media cataloging and editing suites, CAD packages, record keeping programs, and so forth.

Cache for enterprise data: Many applications use SQLite as a cache of relevant content from an enterprise RDBMS. This reduces latency, since most queries now occur against the local cache and avoid a network round-trip. It also reduces the load on the network and on the central database server. And in many cases, it means that the client-side application can continue operating during network outages.
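If the caching route is taken, the "additional connector" mentioned in the question can stay very small. A minimal sketch, assuming pyodbc for the MSSQL side and with invented server, table, and column names:

    # Hypothetical connector: refresh a local sqlite3 cache from MSSQL.
    # Connection string, table, and column names are placeholders.
    import sqlite3
    import pyodbc

    MSSQL_CONN = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=mssql.example.com;DATABASE=sales;UID=reader;PWD=secret"
    )

    def refresh_cache(sqlite_path="cache.sqlite3"):
        # Read-only pull from the source database
        src = pyodbc.connect(MSSQL_CONN)
        rows = src.execute("SELECT order_date, amount FROM turnover").fetchall()
        src.close()

        # Rebuild the local cache table from scratch
        dst = sqlite3.connect(sqlite_path)
        dst.execute("DROP TABLE IF EXISTS turnover")
        dst.execute("CREATE TABLE turnover (order_date TEXT, amount REAL)")
        dst.executemany("INSERT INTO turnover VALUES (?, ?)",
                        [tuple(r) for r in rows])
        dst.commit()
        dst.close()

    if __name__ == "__main__":
        refresh_cache()  # schedule via cron / Task Scheduler at the desired interval

Django would then serve its dashboard queries from cache.sqlite3, and the refresh job runs at whatever interval the client finds acceptable.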

Store all the updated data from Cloud Firestore in real-time

I am working on a project in which an Android app updates a location field inside Cloud Firestore in real time. Currently it overwrites the location, which means the previous location data is lost.
I want to maintain a history of all the locations so I need to store the data before it is updated.
Does anyone know how I can store this data?
Also, this data will be used for analysis later on, so a SQL-type structure would be preferred.
Thanks in advance.
What you could do is store the information in a different collection. Then, if you want to analyze the information, you can export the data from Firestore to BigQuery, Google Cloud's fully managed, petabyte-scale, cost-effective analytics data warehouse that lets you run analytics over vast amounts of data in near real time. BigQuery lets you focus on finding meaningful insights using standard SQL.
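A minimal sketch with the Python client (the "devices" and "location_history" collection names and the field names are invented for illustration): each update appends to a history subcollection while still keeping the latest value on the parent document:

    # Hypothetical sketch using the google-cloud-firestore Python client;
    # collection and field names are placeholders.
    from google.cloud import firestore

    db = firestore.Client()

    def update_location(device_id: str, lat: float, lng: float) -> None:
        device = db.collection("devices").document(device_id)

        # Append to a history subcollection instead of overwriting,
        # so earlier locations are preserved.
        device.collection("location_history").add({
            "lat": lat,
            "lng": lng,
            "recorded_at": firestore.SERVER_TIMESTAMP,
        })

        # Keep the latest reading on the parent document for cheap lookups.
        device.set({"lat": lat, "lng": lng}, merge=True)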

Is it possible to access SAS datasets from a SQL client?

We have datasets that are created and stored in SAS. I'm curious if it is possible to access those datasets from a remote SQL client. Why? To make those datasets more broadly accessible within our organization.
Yes, you can license a product called SAS/SHARE that includes something called SHARE*NET. This is a very useful product that typically is installed in a BI server environment but I suppose it's possible to run on a local desktop.
Basically, you "register" SAS libraries with a service, which then makes the data available to external clients over ODBC. This exposes the data sets as "tables" to applications like Excel, so I'm sure you can use other clients as well.
The SAS ODBC driver itself does not require a license, but the SAS/SHARE software does. I use it to make data available to many users who do not have direct access to my UNIX server.
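A minimal sketch of the SAS side, with an invented server id, host, and path (check the exact PROC SERVER options against the SAS/SHARE documentation for your release); ODBC clients would then point a SAS ODBC driver DSN at the same server:

    /* Server session (sketch): predefine the library, then start the server */
    libname research '/data/research';
    proc server id=shr1;
    run;

    /* A SAS client session reaches the same library via the REMOTE engine */
    libname research server=unixhost.shr1 slibref=research;
    proc print data=research.mydata;
    run;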
It might be possible through SAS/ACCESS (or something similar), but SAS datasets typically cannot be understood by third-party software.

Adapt Access 2000 Split Database to SQL Backend

I have an Access 2000 database created over a decade ago; the original creators are long gone. It's stored in a shared folder on a file server on the local network. The file is over 250 MB in size and is accessed by anywhere from a dozen to two dozen local users on the network. We still use Access 2000 loaded on the machines to open the file. I've also already split the database into front-end (FE) and back-end (BE) files using Access's split tool, to try to speed things up. We're currently using the FE/BE approach, but everyone still opens a single FE file stored on the server.
I'm trying to figure out what else I can do to try and optimize it. Move the BE into a SQL server? Put a local copy of the FE on each machine? Set myself on fire and hope for the best?
We are looking to replace the DB, but plans are at least a year or longer down the pipeline at best.

Large Volume Excel Data Pulls - Avoiding ODBC

We have a requirement to give users ad-hoc access to large subsets of a system's data for analysis in Excel.
We do not want to grant direct ODBC access, since that would curb our ability to make DB layout changes without breaking our users' processes.
Web services seem ill-suited to the volume of data at stake, in the region of hundreds of thousands of records.
What would you suggest as an alternative to direct ODBC access?
There is a database concept of a "view" which does exactly what you need: it allows you to expose a large set of data while giving you freedom to change the DB schema, as long as you take care to keep exposing the same data to the user.
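As a trivial sketch with invented table and column names, the view becomes the stable contract the Excel users query, while the underlying tables remain free to change:

    -- Hypothetical view: the column list is the contract exposed to users
    CREATE VIEW dbo.OrderSummary AS
    SELECT o.OrderID,
           o.OrderDate,
           c.CustomerName,
           o.TotalAmount
    FROM   dbo.Orders    AS o
    JOIN   dbo.Customers AS c
        ON c.CustomerID = o.CustomerID;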
I agree with you regarding web services: it is not only the volume of data, but also the fact that getting web services to work with Excel (2007 and above) is far from trivial. Also, you would lock in your DB schema just as much as you would with a view.
For really huge numbers of records, you can consider data warehousing: a separate database where you provide read-only access for reporting purposes, fed from your read/write database. The feed can be done easily and quickly via SSIS.
HTH