Automate downloading of transaction details from Authorize.net - authorize.net

Has anybody had any luck in automatically downloading the daily transaction details from Authorize.net?
Right now we have to use the "Search" function, enter a date range, select the format, download the file, then import it.
Frustratingly, they do not have an API for this function.
Are there any tools or alternative approaches that you have implemented for downloading this daily detail?

They have a reporting API you can use. Just set up a cron job that gets the transactions for each day using the getTransactionsList() API call and you're all set.
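For reference, a minimal sketch of that cron-driven pull, assuming the JSON endpoint of the Transaction Reporting API (getSettledBatchListRequest to find the day's settlement batches, then getTransactionListRequest for each batch); the credentials, the sandbox/production endpoint and the output path are placeholders, and the response field names follow the API's XML schema:

import datetime
import json
import requests

API_URL = "https://apitest.authorize.net/xml/v1/request.api"  # sandbox; use api.authorize.net in production
AUTH = {"name": "YOUR_API_LOGIN_ID", "transactionKey": "YOUR_TRANSACTION_KEY"}

def call_api(payload):
    resp = requests.post(API_URL, json=payload, timeout=30)
    resp.raise_for_status()
    # The JSON responses tend to be prefixed with a byte-order mark, so strip it before parsing.
    return json.loads(resp.text.lstrip("\ufeff"))

def settled_batches(day):
    # List the settlement batches for the given day.
    return call_api({"getSettledBatchListRequest": {
        "merchantAuthentication": AUTH,
        "firstSettlementDate": day.strftime("%Y-%m-%dT00:00:00Z"),
        "lastSettlementDate": day.strftime("%Y-%m-%dT23:59:59Z"),
    }}).get("batchList", [])

def batch_transactions(batch_id):
    # List every transaction settled in one batch.
    return call_api({"getTransactionListRequest": {
        "merchantAuthentication": AUTH,
        "batchId": batch_id,
    }}).get("transactions", [])

if __name__ == "__main__":
    yesterday = datetime.date.today() - datetime.timedelta(days=1)
    rows = []
    for batch in settled_batches(yesterday):
        rows.extend(batch_transactions(batch["batchId"]))
    # Write the raw detail to a file your existing import step can consume.
    with open(f"authnet_{yesterday:%Y%m%d}.json", "w") as fh:
        json.dump(rows, fh, indent=2)

Run it once a day under cron and point your existing import at the file it writes.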

Related

GCP Pubsub Design Questions

We are planning to use GCP Pub/Sub to write events to GCS. I have the following questions.
We want to maintain an audit table in BigQuery so we can see how many messages arrived in a given time frame, broken down by day and hour.
How do we validate against GCS? Say we received 10 messages from Pub/Sub: how do we check they all landed in GCS and that we didn't drop any messages?
I would really appreciate your feedback.
To validate the number of records written to GCS, you can create a BigQuery external (temporary) table over the GCS files and query the record count. This sanity check needs to be done at regular intervals.
Second solution: you can also check the number of records written to GCS with the following command:
gsutil cat gs://folder/test.csv | wc -l
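A minimal sketch of the external-table check, assuming the google-cloud-bigquery client, CSV files, and placeholder bucket/prefix names; the expected count would come from your Pub/Sub-side audit table:

from google.cloud import bigquery

client = bigquery.Client()

# Define a temporary external table over the files the subscriber wrote to GCS.
external_config = bigquery.ExternalConfig("CSV")  # or "NEWLINE_DELIMITED_JSON"
external_config.source_uris = ["gs://my-bucket/events/2023-10-25/*.csv"]
external_config.autodetect = True
external_config.options.skip_leading_rows = 1  # skip CSV header rows

job_config = bigquery.QueryJobConfig(table_definitions={"gcs_events": external_config})
row = next(iter(client.query("SELECT COUNT(*) AS n FROM `gcs_events`", job_config=job_config).result()))

expected = 10  # e.g. the count recorded in your Pub/Sub-side audit table
print(f"GCS: {row.n} records, expected {expected} -> {'OK' if row.n == expected else 'MISMATCH'}")

Note that the gsutil cat | wc -l approach counts every line, so subtract the header rows if your CSV files have them.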

Integration of BigQuery and Dialogflow

I'm getting started on Dialogflow and I would like to integrate it with BigQuery.
I have some tables in BigQuery with different data, for instance a log of the alarms that a wind turbine raised over time.
In one of my test cases, let's say I want my chatbot to tell me what alarms were raised in the wind turbine number 5 of my farm, on the 25th of October.
I have already created a chatbot in Dialogflow that asks for all the necessary parameters of the enquiry, such as the wind farm name, the wind turbine number, the date, and the name of the alarm.
My question now is how to send those parameters to BigQuery in order to dig into my tables, extract the required information, and display it in Dialogflow.
I have been looking for documentation or tutorials, but nothing I found fits my case...
Thanks in advance!
You need to implement a fulfillment. It triggers a webhook, for example a Cloud Function or a Cloud Run service.
The webhook call contains the values gathered by your intent and its parameters. You have to extract them and perform your processing, for example a call to BigQuery, then format the response and return it to Dialogflow for display.
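A minimal sketch of such a webhook, written here as a Python Cloud Function; the parameter names (farm, turbine, date), the project/dataset and the column names are assumptions to be replaced with your own intent parameters and schema:

from flask import jsonify
from google.cloud import bigquery

bq = bigquery.Client()

def dialogflow_webhook(request):
    # Dialogflow ES sends the matched intent's parameters under queryResult.parameters.
    params = request.get_json()["queryResult"]["parameters"]
    turbine = int(params["turbine"])

    sql = """
        SELECT alarm_name, raised_at
        FROM `my_project.wind.alarms`
        WHERE farm = @farm AND turbine = @turbine AND DATE(raised_at) = @day
        ORDER BY raised_at
    """
    job_config = bigquery.QueryJobConfig(query_parameters=[
        bigquery.ScalarQueryParameter("farm", "STRING", params["farm"]),
        bigquery.ScalarQueryParameter("turbine", "INT64", turbine),
        # @sys.date usually arrives as an ISO timestamp string; keep just the date part.
        bigquery.ScalarQueryParameter("day", "DATE", params["date"][:10]),
    ])
    rows = list(bq.query(sql, job_config=job_config).result())

    if rows:
        text = "Alarms on turbine {}: {}".format(turbine, ", ".join(r.alarm_name for r in rows))
    else:
        text = "No alarms were raised for that turbine on that date."

    # Dialogflow ES expects the reply text in the fulfillmentText field.
    return jsonify({"fulfillmentText": text})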

How to monitor if a BigQuery table contains current data and send an alert if not?

I have a BigQuery table and an external data import process that should add entries every day. I need to verify that the table contains current data (with a timestamp of today). Writing the SQL query is not a problem.
My question is how best to set up such monitoring in GCP. Can Stackdriver execute custom BigQuery SQL? Or would a Cloud Function be more suitable? An App Engine application with a cron job? What's the best practice?
Not sure what the best practice is here, but one simple solution is to use a BigQuery scheduled query: schedule the query, make it fail if something is wrong using the ERROR() function, and configure the scheduled query to notify you (it sends an email) when it fails.
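A rough sketch of what that check could look like; the table and timestamp column are placeholders. The same SQL can be pasted into a scheduled query; here it is simply run through the Python client (e.g. from a Cloud Function), where staleness surfaces as an exception:

from google.cloud import bigquery

# ERROR() aborts the query when no rows carry today's timestamp, which makes a
# scheduled query run fail and trigger its e-mail notification.
FRESHNESS_SQL = """
    SELECT IF(
        (SELECT COUNT(*)
         FROM `my_project.my_dataset.my_table`
         WHERE DATE(imported_at) = CURRENT_DATE()) > 0,
        'fresh',
        ERROR('no rows with a current-date timestamp found'))
"""

bigquery.Client().query(FRESHNESS_SQL).result()  # raises if the table is stale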

How to programmatically get Google product reports (pulled via API) into AWS S3?

I am looking to pull a monthly performance report of a campaign running in DoubleClick Bid Manager. I'm pulling this report via the DoubleClick Bid Manager API (https://developers.google.com/bid-manager/v1/queries/createquery).
I just plug in the JSON with the account and various fields; when I click Execute, I'm able to download the report from the Reports tab of Bid Manager, and then I upload it to S3 manually so that I can later load it into Redshift.
I was wondering if there was a way I can do this programmatically, instead of uploading the report manually every time.
If there is a question already on this, please point me in the right direction.
Thanks in advance.
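One way this could be scripted, as a sketch assuming the v1 API linked above, an already-created query, and placeholder service-account file, query ID, bucket and key: the query metadata exposes the latest generated report file, which can be streamed straight into S3.

import requests
import boto3
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/doubleclickbidmanager"]
creds = service_account.Credentials.from_service_account_file("dbm-sa.json", scopes=SCOPES)
dbm = build("doubleclickbidmanager", "v1", credentials=creds)

QUERY_ID = 123456789  # the query created earlier with queries.createquery

# The query metadata points at the latest generated report file.
query = dbm.queries().getquery(queryId=QUERY_ID).execute()
report_url = query["metadata"]["googleCloudStoragePathForLatestReport"]
report = requests.get(report_url, timeout=60)
report.raise_for_status()

# Drop the CSV into S3 so the existing load into Redshift can pick it up.
boto3.client("s3").put_object(
    Bucket="my-reports-bucket",
    Key="dbm/monthly_performance.csv",
    Body=report.content,
)

Run it on a monthly schedule (cron, Lambda, etc.) and the rest of the S3-to-Redshift load stays as it is.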

Matillion for Amazon Redshift support for job monitoring

I am working on Matillion for Amazon Redshift and we have multiple jobs running daily, triggered by SQS messages. Now I am checking the possibility of creating a UI dashboard for stakeholders that will monitor the live progress of jobs and show reports on previous jobs: job name, tables impacted, job status/reason for failure, etc. Does Matillion maintain this kind of information implicitly? Or will I have to maintain this information for each job?
Matillion has an API which you can use to obtain details of all task history. Information on the tasks API is here:
https://redshiftsupport.matillion.com/customer/en/portal/articles/2720083-loading-task-information?b_id=8915
You can use this to pull data on either currently running jobs or completed jobs, down to component level, including the name of the job, the name of the component, how long it took to run, whether it ran successfully, and any applicable error message.
This information can be pulled into a Redshift table using the Matillion API profile which comes built into the product and the API Query component. You could then build your dashboard on top of this table. For further information I suggest you reach out to Matillion via their Support Center.
The API is helpful, but you can only pass a date as a parameter (this is for Matillion for Snowflake, assume it's the same for Redshift). I've requested the ability to pass a datetime so we can run the jobs throughout the day and not pull back the same set of records every time our API call runs.
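As a rough illustration of pulling that history into your own dashboard, a sketch along these lines could work; the endpoint path, the date parameter and the response field names are placeholders, so check them against the task API documentation linked above:

import requests

MATILLION = "https://my-matillion-host"   # placeholder instance URL
AUTH = ("api-user", "api-password")       # placeholder API credentials

# Placeholder endpoint: a task-history call filtered by date, as described above.
url = f"{MATILLION}/rest/v1/group/name/MyGroup/project/name/MyProject/task/filter/by/start/date/2019-10-25"

resp = requests.get(url, auth=AUTH, timeout=30)
resp.raise_for_status()

# Field names are placeholders; map them to whatever the task API actually returns.
for task in resp.json():
    print(task.get("jobName"), task.get("state"), task.get("startTime"), task.get("message"))

The rows could then be inserted into a Redshift table (as the answer suggests via the built-in API profile) or fed to the dashboard directly.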