Synchronise Workday Report with Oracle database - web-services

Is it possible to synchronise an Oracle database table with a Workday Report Web Service Export?

I assume you mean Workday --> Oracle. If you want the flow of information to go the other way around or be bi-directional, the answer is "yes, but not without a lot of work and custom development."
You're probably already delivering the data using an outbound EIB to an sftp landing zone somewhere.
At that point, it's just ETL as usual. No different than if you extracted the data from any other application. Probably the easiest solution would be to use an Oracle External Table on one of the formats recognized by Oracle (CSV is probably the simplest).
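A minimal sketch of that external-table approach, assuming the EIB drops a headered CSV named workday_report.csv into a directory the database can read (all object, file, and column names here are placeholders for your actual report layout):

    -- Directory object pointing at the SFTP landing zone
    CREATE OR REPLACE DIRECTORY wd_landing AS '/data/workday/landing';

    -- External table over the CSV the outbound EIB delivers
    CREATE TABLE wd_report_ext (
      employee_id  VARCHAR2(32),
      cost_center  VARCHAR2(64),
      amount       NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY wd_landing
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1                           -- skip the header row
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('workday_report.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- From here it is ETL as usual, e.g. a MERGE into the real table
    MERGE INTO wd_report t
    USING wd_report_ext s
       ON (t.employee_id = s.employee_id)
    WHEN MATCHED THEN
      UPDATE SET t.cost_center = s.cost_center, t.amount = s.amount
    WHEN NOT MATCHED THEN
      INSERT (employee_id, cost_center, amount)
      VALUES (s.employee_id, s.cost_center, s.amount);

Because the external table reads the file on every query, simply re-delivering the EIB output refreshes what the MERGE sees; a scheduled job running the MERGE keeps the Oracle table in sync.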

Related

Google Merchant Center - retrieve the BestSellers_TopProducts_ report without the BigQuery Data Transfer service?

I'm trying to find a way to retrieve a specific Google Merchant Center report (BestSellers_TopProducts_) and upload it to BigQuery as part of an ETL process we're developing for a customer at my workplace.
So far, I know you can set up the BigQuery Data Transfer service so it automates the process of downloading this report, but I was wondering whether I could accomplish the same with Python and some of Google's API libraries (like python-google-shopping). Then again, I may be overdoing it, and setting up the service may be the way to go.
Is there a way to accomplish this rather than resorting to the aforementioned service?
On the other hand, assuming the BigQuery Data Transfer service is the way to go, I see (in the examples) that you need to create and provide the dataset you're going to extract the report data to, so I guess the extraction is limited to the GCP project you're working with.
I mean... you can't extract the report data for a third party even if you had the proper service account credentials, right?
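For reference, here is the kind of query I expect to run once the transfer has landed the report in BigQuery, assuming the transfer writes date-partitioned tables into a dataset I create and suffixes the table name with the merchant ID (merchant_center, 1234, and the partition filter are all placeholders):

    -- Read one day of the transferred best-sellers report
    SELECT *
    FROM merchant_center.BestSellers_TopProducts_1234
    WHERE _PARTITIONDATE = '2021-01-01'   -- adjust if partitioned differently
    LIMIT 100;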

How are you coping with BigQuery, especially coming from a traditional RDBMS background like Oracle/MySQL?

I am new to BQ. I have a table with around 200 columns; when I wanted to get the DDL of this table, there was no ready-made option available. CTAS is not always desirable: sometimes we don't have a reference table to create from, and sometimes we just want a simple DDL statement to recreate a table.
I also wanted to edit the schema of a BigQuery table to change a column's mode: the previous mode was NULLABLE and now it should be REQUIRED (the column has only been loaded with non-null values so far).
Looking at all these scenarios, the lengthy solutions in the Google documentation, and the lack of direct solutions in terms of SQL statements (only API calls, UI steps, scripts, etc.), I am not impressed with BigQuery; it has many limitations. And the BigQuery web UI is so small that you need to scroll many times to see a query as a whole, along with many other web UI issues, as you know.
Just wanted to know how you are all handling/coping with BQ.
I would like to elaborate a little more on Pentium10's and guillaume blaquiere's comments.
BigQuery is a serverless, highly scalable data warehouse that comes with a built-in query engine, which is capable of running SQL queries on terabytes of data in a matter of seconds, and petabytes in only minutes. You get this performance without having to manage any infrastructure.
BigQuery is based on Google's column-based data processing technology called Dremel and is able to run queries against up to 20 different data sources and 200 GB of data concurrently. The Prediction API allows users to create and train a model hosted within Google's system; the API recognizes historical patterns to make predictions about patterns in new data.
BigQuery is unlike anything that has been used as a big data tool before; nothing seems to compare to the speed and the amount of data that can fit into BigQuery. Views over the data are possible and are recommended for use with basic data visualization tools.
This product typically comes at the end of the Big Data pipeline. It is not a replacement for existing technologies but complements them. Real-time streams representing sensor data, web server logs, or social media graphs can be ingested into BigQuery to be queried in real time. After running ETL jobs on a traditional RDBMS, the resultant data set can be stored in BigQuery. Data can also be ingested from data sets stored in Google Cloud Storage, through direct file import, or through streaming.
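To make the Cloud Storage path concrete, here is a minimal sketch using BigQuery's LOAD DATA SQL statement, assuming a reasonably recent BigQuery release where that statement is available (dataset, table, and bucket names are placeholders):

    -- Batch-load CSV files from Cloud Storage into a BigQuery table
    LOAD DATA INTO mydataset.sensor_readings
    FROM FILES (
      format = 'CSV',
      skip_leading_rows = 1,
      uris = ['gs://my-bucket/sensor/*.csv']
    );

The same table can also receive streaming inserts concurrently, which is what makes the "end of the pipeline" pattern described above workable.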
I recommend having a look at the book Google BigQuery: The Definitive Guide: Data Warehousing, Analytics, and Machine Learning at Scale, which includes a walkthrough of how to use the service and a deep dive into how it works.
Beyond that, I found a really interesting article on Medium for data engineers new to BigQuery, where you can find considerations regarding DDL, the UI, and best practices.
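On the DDL point specifically, it is worth noting that newer BigQuery releases expose generated DDL through INFORMATION_SCHEMA, which covers the "give me a simple CREATE TABLE statement" case. A minimal sketch, assuming your release has this view (dataset and table names are placeholders):

    -- Retrieve the CREATE TABLE statement BigQuery generates for a table
    SELECT ddl
    FROM mydataset.INFORMATION_SCHEMA.TABLES
    WHERE table_name = 'my_wide_table';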
I hope you find the above pieces of information useful.

Fetching data from database using SSRS web service

Is there a way to fetch data from a database by executing a query, without running any report, using the SSRS web service?
No, this is not possible.
You may need to explain what/why you want to do this to get a better response.
If you have access to the server to deploy a report, the report could be a very simple table that could be exported to excel.
If you can't deploy a report to use in this manner, then maybe there's a good reason for that; certainly, security would be high on the list.
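To make the workaround concrete: the "very simple table" report would have a dataset whose query is exactly the ad-hoc statement you wanted to run, for example (table and column names here are placeholders):

    -- Dataset query behind a minimal pass-through report
    SELECT order_id, customer_name, order_total
    FROM dbo.Orders
    WHERE order_date >= DATEADD(day, -7, GETDATE());

Once deployed, the report can be rendered to Excel or CSV through the normal SSRS interfaces, which effectively gets the raw data out without any separate query-execution endpoint.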

PrestaShop Web Service (CRUD/REST) API: query multiple tables

I have gone through the PrestaShop tutorial at http://forge.prestashop.com:8081/display/PS14/Using+the+REST+webservice, but it doesn't explain how to query multiple tables.
How should I perform it?
The trouble with per-table RESTful web services is they only provide terse access to your data layer. For multi-table joins and subqueries, you're left to either query multiple REST endpoints and perform the reconciliation on the client side or to perform the desired query in SQL and expose it directly through your web service.
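To illustrate the second option, here is a minimal sketch of the kind of multi-table join you would run directly against the database and expose through your own endpoint, assuming the default ps_ table prefix (the exact tables vary by PrestaShop version; ps_stock_available, for instance, exists from 1.5 onwards):

    -- Core product data joined with its translated name and stock level
    SELECT p.id_product,
           p.reference,
           pl.name,
           sa.quantity
    FROM ps_product p
    JOIN ps_product_lang pl
      ON pl.id_product = p.id_product
     AND pl.id_lang = 1                -- language id; 1 is typically the default
    LEFT JOIN ps_stock_available sa
      ON sa.id_product = p.id_product;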
From this post:
While importing products is a built-in feature in PrestaShop, unfortunately, exporting products is not. There are quite a few modules available for PrestaShop that offer this feature, but finding a free one is definitely a challenge. Most likely this is due to the intricacies of the PrestaShop database table structure. Product data is stored in multiple tables, which means the query to extract that data is not easy to create.

If you are comfortable running SQL queries, you can use the SQL tab of phpMyAdmin in your cPanel to query the database tables for the product information you want to retrieve. However, for most people, this will not be a workable solution.
You might want to look at the code they provide. It may give you some idea of how to do this in a way that plays nicely with PrestaShop.

Is it possible to create a cron job on an AS/400 (IBM) system in order to update/insert a large quantity of data on an online server?

I was looking into whether it is possible to insert/update a large quantity of rows from an AS/400 system.
I have a website hosted on another server online, and that website must be updated with the new stock of each article. But this data only exists in the AS/400 system.
For security reasons, I would like the AS/400 system to connect to the web server rather than the web server connecting to the AS/400.
Ideally, an update/insert would happen every time a change is made on the AS/400, but if that is not possible, an update every 3 hours would do in order to maintain consistency between the two servers.
Thanks
Yes, it can be done.
You can attach a database trigger to your stock table that pushes the key fields to a data queue anytime an insert, update or delete is performed. You can then process the data queue to send the updates to the web site using an HTTP POST or other means.
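A minimal sketch of such a trigger in SQL, assuming a recent IBM i release where CREATE OR REPLACE TRIGGER and the QSYS2.SEND_DATA_QUEUE procedure are available (library, table, column, and queue names are all placeholders):

    -- Push the key of any changed stock row onto a data queue
    CREATE OR REPLACE TRIGGER STOCKLIB.STOCK_CHANGED
      AFTER INSERT OR UPDATE OR DELETE ON STOCKLIB.STOCK
      REFERENCING NEW ROW AS N OLD ROW AS O
      FOR EACH ROW
    BEGIN
      DECLARE KEYVAL VARCHAR(32);
      -- On delete there is no new row, so take the key from the old row
      IF DELETING THEN
        SET KEYVAL = O.ARTICLE_ID;
      ELSE
        SET KEYVAL = N.ARTICLE_ID;
      END IF;
      CALL QSYS2.SEND_DATA_QUEUE(
        MESSAGE_DATA       => KEYVAL,
        DATA_QUEUE         => 'STOCKDTAQ',
        DATA_QUEUE_LIBRARY => 'STOCKLIB');
    END;

A separate always-running job can then receive entries from the queue (for example with QSYS2.RECEIVE_DATA_QUEUE or the RCVDTAQ API) and push each change to the website with an HTTP POST. On older releases, the trigger can call the QSNDDTAQ API instead. Some useful references: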
IBM Redbook: Stored Procedures, Triggers, and User-Defined Functions on DB2 Universal Database for iSeries
IBM i 7.1 Information Center: Data Queue APIs
Scott Klement's RPG IV Sockets Tutorial