Fetching data from database using SSRS web service - web-services

Is there a way to fetch data from database by executing query without running any report using SSRS web service?

No, this is not possible: the SSRS web service only executes deployed reports, not arbitrary queries.
You may need to explain what you are trying to do, and why, to get a better response.
If you have access to the server to deploy a report, the report could be a very simple table that could then be exported to Excel.
If you can't deploy a report to use in this manner, there may be a good reason for that; security would certainly be high on the list.
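If deploying a simple tabular report is an option, its data can then be pulled programmatically via SSRS URL access, which renders a deployed report in a requested export format. A minimal sketch, assuming a server reachable over HTTP with basic authentication (the server name, report path, and credentials below are placeholders; on a domain-joined server you would likely need NTLM or Kerberos instead):

```python
from urllib.parse import quote

def build_export_url(server: str, report_path: str, fmt: str = "CSV") -> str:
    """Build an SSRS URL-access link that renders a deployed report.

    rs:Format selects the export format (CSV, EXCELOPENXML, PDF, ...).
    """
    return (
        f"http://{server}/ReportServer?{quote(report_path, safe='')}"
        f"&rs:Command=Render&rs:Format={fmt}"
    )

def fetch_report_csv(server: str, report_path: str, user: str, password: str) -> bytes:
    """Download the rendered report.

    Needs the third-party `requests` package and an account that is
    allowed to run the report on the server.
    """
    import requests  # third-party

    url = build_export_url(server, report_path, "CSV")
    resp = requests.get(url, auth=(user, password), timeout=60)
    resp.raise_for_status()
    return resp.content
```

The report itself still has to exist on the server; this only automates the "render and export" step.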

Related

Google Merchant Center - retrieve the BestSellers_TopProducts_ report without the BigQuery Data Transfer service?

I'm trying to find a way to retrieve a specific Google Merchant Center report (BestSellers_TopProducts_) and upload it to BigQuery as part of a specific ETL process we're developing for a customer we have at my workplace.
So far, I know you can set up the BigQuery Data Transfer service to automate downloading this report, but I was wondering if I could accomplish the same with Python and some of Google's API libraries (like python-google-shopping). I may be overdoing it, though, and setting up the service may simply be the way to go.
Is there a way to accomplish this rather than resorting to the aforementioned service?
On the other hand, and assuming the BigQuery Data Transfer service is the way to go, I see (in the examples) you need to create and provide the dataset you're going to extract the report data to so I guess the extraction is limited to the GCP project you're working with.
I mean... you can't extract the report data for a third-party even if you had the proper service account credentials, right?
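For what it's worth, the Data Transfer service itself can be set up from Python rather than the console, which keeps it scriptable inside an ETL pipeline. A hedged sketch, assuming the `google-cloud-bigquery-datatransfer` package, credentials with permission on the target project, and an existing destination dataset in that same project (the project, merchant, and dataset IDs are placeholders):

```python
def transfer_parent_path(project_id: str, location: str = "us") -> str:
    """Resource path under which transfer configs are created."""
    return f"projects/{project_id}/locations/{location}"

def create_merchant_center_transfer(project_id: str, merchant_id: str, dataset: str):
    """Create a Merchant Center -> BigQuery transfer config.

    Requires the third-party google-cloud-bigquery-datatransfer
    package; the destination dataset must already exist in the
    same project the transfer config is created in.
    """
    from google.cloud import bigquery_datatransfer  # third-party

    client = bigquery_datatransfer.DataTransferServiceClient()
    config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id=dataset,
        display_name="Merchant Center best sellers",
        data_source_id="merchant_center",
        params={"merchant_id": merchant_id},
    )
    return client.create_transfer_config(
        parent=transfer_parent_path(project_id), transfer_config=config
    )
```

This does not get around the project restriction noted above: the dataset the report lands in belongs to the project the transfer runs under.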

Can SSO be used to create a dataflow that will reside in the PBI service?

Our client would like us to use dataflows for data reuse and other reasons. We will be connecting to a Snowflake database from PBI service. However, they also want to be able to use SSO (Single Sign-On). So, when a user creates a dataset referencing a dataflow, they want the credentials from the currently logged in user to be picked up via SSO and passed along to Snowflake when the dataflow retrieves data from Snowflake. I don't think this can be done but I wanted to verify.
BTW, I know that SSO can be used with PBI Desktop. Just curious if dataflows can use it.
Yes, it seems it is possible to use dataflows with SSO for Snowflake. I am coming to this conclusion from the following reference: https://learn.microsoft.com/en-us/power-query/connectors/snowflake, which lists dataflows under Summary - Products.

Synchronise Workday Report with Oracle database

Is it possible to synchronise an Oracle database table with a Workday Report Web Service export?
I assume you mean Workday --> Oracle. If you want the flow of information to go the other way around or be bi-directional, the answer is "yes, but not without a lot of work and custom development."
You're probably already delivering the data using an outbound EIB to an sftp landing zone somewhere.
At that point, it's just ETL as usual. No different than if you extracted the data from any other application. Probably the easiest solution would be to use an Oracle External Table on one of the formats recognized by Oracle (CSV is probably the simplest).
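As a sketch of that external-table approach: the DDL below (wrapped in Python so it can be run via the third-party `python-oracledb` package) maps a CSV dropped by the EIB onto a queryable table. All names here are hypothetical — `workday_dir` stands for an Oracle DIRECTORY object pointing at the sftp landing folder, and the columns stand in for whatever the report actually delivers:

```python
# Hypothetical table, directory, and column names for illustration.
EXTERNAL_TABLE_DDL = """
CREATE TABLE workday_report_ext (
    employee_id   VARCHAR2(32),
    employee_name VARCHAR2(200),
    hire_date     VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY workday_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('workday_report.csv')
)
REJECT LIMIT UNLIMITED
"""

def create_external_table(dsn: str, user: str, password: str) -> None:
    """Run the DDL once; needs python-oracledb and CREATE TABLE plus
    READ privileges on the directory object."""
    import oracledb  # third-party

    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        conn.cursor().execute(EXTERNAL_TABLE_DDL)
```

Once created, the table re-reads the file on every query, so each new EIB drop is picked up automatically; a scheduled `INSERT ... SELECT` or `MERGE` from it can then maintain the real Oracle table.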

How to synchronize the local DynamoDb and Amazon DynamoDb web service

Hello, and thanks for viewing my question!
I am running Amazon DynamoDB locally, and all databases are saved locally. With the local DynamoDB I have to inspect everything through code, but the interface of the web service is much better: there I can perform operations and see the tables directly and clearly.
May I ask how I can connect them, so that I can practice coding and check the status easily?
Looking forward to your reply, and thank you so much!
Sincerely
You cannot connect them as they are completely separate databases. However, you can put a simple user interface on top of your local DynamoDB database.
I use the SQLite Browser: http://sqlitebrowser.org/. Once you have it installed, open the .db file located in the folder where you are running DynamoDBLocal.jar. You should be able to see all your tables and the data within them. You won't be able to see DynamoDB specific things like your provisioned capacity, but I think this will give you enough of what you're looking for.
Does this help?
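If you want to check the tables from code rather than a GUI, boto3 can talk to the local instance by overriding the endpoint URL. A minimal sketch, assuming DynamoDB Local is running on its default port 8000 (DynamoDB Local accepts any credentials, but boto3 requires some to be set, so dummies are used):

```python
def local_endpoint(port: int = 8000) -> str:
    """DynamoDB Local listens on this HTTP endpoint by default."""
    return f"http://localhost:{port}"

def list_local_tables(port: int = 8000):
    """List table names in the local DynamoDB instance.

    Needs the third-party boto3 package; the region and credentials
    are placeholders that DynamoDB Local does not validate.
    """
    import boto3  # third-party

    client = boto3.client(
        "dynamodb",
        endpoint_url=local_endpoint(port),
        region_name="us-east-1",
        aws_access_key_id="dummy",
        aws_secret_access_key="dummy",
    )
    return client.list_tables()["TableNames"]
```

The same `endpoint_url` trick works for `boto3.resource("dynamodb", ...)` if you prefer the higher-level API.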

Prestashop Web Service (CRUD/REST) API query across multiple tables

I have gone through the tutorial for PrestaShop at http://forge.prestashop.com:8081/display/PS14/Using+the+REST+webservice, but it doesn't explain how to query multiple tables.
How should I perform it?
The trouble with per-table RESTful web services is they only provide terse access to your data layer. For multi-table joins and subqueries, you're left to either query multiple REST endpoints and perform the reconciliation on the client side or to perform the desired query in SQL and expose it directly through your web service.
From this post:

While importing products is a built-in feature in PrestaShop, exporting products unfortunately is not. There are quite a few modules available for PrestaShop that offer this feature, but finding a free one is definitely a challenge. Most likely this is due to the intricacies of the PrestaShop database table structure: product data is stored in multiple tables, which means the query to extract that data is not easy to create.

If you are comfortable running SQL queries, you can use the SQL tab of phpMyAdmin in your cPanel to query the database tables for the product information you want to retrieve. However, for most people, this will not be a workable solution.
You might want to look at the code they provide. It may give you some idea of how to do this in a way that plays nicely with PrestaShop.