Adding Google Analytics segments to AWS AppFlow

I am trying to add segments to my AWS AppFlow flow that pulls Google Analytics data.
This is because I am running into sampling problems (Google Analytics summarizes a lot of the data, which makes analysis impossible).
I can add date range filters, but even with those set to the minimum I still need to break the requests down further via segments. However, I cannot find any support articles or examples online of anyone doing something similar.
I have used the Google Analytics API by itself, without AppFlow, and was able to get all the data without sampling, but I need to do something similar using AppFlow.
What is the correct way to add segments to a Google Analytics AppFlow flow?
Thanks in advance for any help.
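For context, this is roughly what my direct Reporting API v4 calls look like: each request carries a `segments` clause plus the `ga:segment` dimension, which is what I would need AppFlow's connector to express. A minimal sketch only; the view id is a placeholder and `gaid::-1` is the built-in "All Users" segment.

```python
# Sketch: a segmented Reporting API v4 request (no AppFlow involved).
# The view id below is a placeholder; swap in a real segment id as needed.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

body = {
    "reportRequests": [{
        "viewId": "12345678",  # placeholder view (profile) id
        "dateRanges": [{"startDate": "2021-01-01", "endDate": "2021-01-01"}],
        "metrics": [{"expression": "ga:sessions"}],
        # Requests that use segments must also query the ga:segment dimension.
        "dimensions": [{"name": "ga:pagePath"}, {"name": "ga:segment"}],
        # "gaid::-1" is the built-in "All Users" segment; narrower segments
        # are what keep each request under the sampling threshold.
        "segments": [{"segmentId": "gaid::-1"}],
    }]
}
response = analytics.reports().batchGet(body=body).execute()
print(response)
```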

Related

How to move data from BigQuery to Google Ads or DV360?

Hi all,
I have been looking to move certain customer data to Google Ads/DV360.
I wanted to understand whether there is any way to move data from BigQuery >> Google Ads.
Using scripts we can automate bulk uploads via spreadsheets etc., but that only controls campaigns. Is there any method to automate "remarketing lists" (under "Segments") via scripts? Also, are there any limitations with respect to the number of records, file size, etc.?
Upon research I discovered methods to move the data the other way, i.e. Google Ads >> BigQuery, via the Transfer Service API, scripts, or a manual process.
I am therefore exploring the APIs, and wanted to understand whether an API can meet the above requirements. If yes, please guide me on the same.
Your help & response would be highly appreciated. Have a nice day ahead.
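For what it's worth, the Google Ads API does support this kind of upload via Customer Match ("remarketing list"-style audiences), and it can be fed straight from BigQuery. A hedged sketch with the google-ads and google-cloud-bigquery Python clients; the table, customer id, and user-list resource name are placeholders, and Customer Match requires normalized, SHA-256-hashed identifiers (record/size limits are account-dependent, so check the current quotas).

```python
# Sketch: BigQuery -> Google Ads Customer Match upload. All ids/names below
# are placeholders; error handling and batching are omitted for brevity.
import hashlib

from google.cloud import bigquery
from google.ads.googleads.client import GoogleAdsClient

CUSTOMER_ID = "1234567890"  # placeholder Google Ads customer id
USER_LIST = f"customers/{CUSTOMER_ID}/userLists/111111"  # existing list

def normalize_and_hash(email: str) -> str:
    # Customer Match expects SHA-256 hashes of trimmed, lower-cased emails.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

bq = bigquery.Client()
rows = bq.query("SELECT email FROM `my_project.crm.customers`").result()

ads = GoogleAdsClient.load_from_storage("google-ads.yaml")
job_service = ads.get_service("OfflineUserDataJobService")

job = ads.get_type("OfflineUserDataJob")
job.type_ = ads.enums.OfflineUserDataJobTypeEnum.CUSTOMER_MATCH_USER_LIST
job.customer_match_user_list_metadata.user_list = USER_LIST
job_name = job_service.create_offline_user_data_job(
    customer_id=CUSTOMER_ID, job=job
).resource_name

request = ads.get_type("AddOfflineUserDataJobOperationsRequest")
request.resource_name = job_name
for row in rows:
    op = ads.get_type("OfflineUserDataJobOperation")
    identifier = ads.get_type("UserIdentifier")
    identifier.hashed_email = normalize_and_hash(row.email)
    op.create.user_identifiers.append(identifier)
    request.operations.append(op)

job_service.add_offline_user_data_job_operations(request=request)
job_service.run_offline_user_data_job(resource_name=job_name)  # async job
```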

Google Merchant Center - retrieve the BestSellers_TopProducts_ report without the BigQuery Data Transfer service?

I'm trying to find a way to retrieve a specific Google Merchant Center report (BestSellers_TopProducts_) and upload it to BigQuery as part of an ETL process we're developing for a customer at my workplace.
So far, I know you can set up the BigQuery Data Transfer Service so that it automates downloading this report, but I was wondering if I could accomplish the same with Python and some of Google's API libraries (like python-google-shopping). I may be overdoing it, though, and setting up the service may be the way to go.
Is there a way to accomplish this without resorting to the aforementioned service?
On the other hand, assuming the BigQuery Data Transfer Service is the way to go, I see (in the examples) that you need to create and provide the dataset you're going to extract the report data to, so I guess the extraction is limited to the GCP project you're working with.
I mean... you can't extract the report data for a third party even if you had the proper service account credentials, right?
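On the automation side: if the transfer service does turn out to be the way to go, you can at least create and manage the transfer config from Python instead of the console, using the google-cloud-bigquery-datatransfer client. This is a sketch under assumptions; the project, dataset, `data_source_id` string, and `params` keys should be verified against the Merchant Center data source documentation.

```python
# Sketch: creating a Merchant Center -> BigQuery transfer programmatically.
# data_source_id and params are assumptions to verify; the dataset must exist.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("my-gcp-project")  # placeholder project

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="merchant_center_reports",  # placeholder dataset
    display_name="Merchant Center best sellers",
    data_source_id="merchant_center",       # assumed data source id
    params={"merchant_id": "1234567"},      # placeholder Merchant Center id
    schedule="every 24 hours",
)

config = client.create_transfer_config(
    parent=parent, transfer_config=transfer_config
)
print("Created transfer config:", config.name)
```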

How to generate uptime reports through Google Cloud Stackdriver?

I am a new user of Google Cloud (Stackdriver).
I would like to set up and generate uptime reports on a monthly basis, covering the past 4 weeks and delivered by e-mail, but I have not been able to find where I could do this.
I have done some research but have not managed to find what I am looking for. The closest I got was Trace, but it is still not what I would like to have.
It's not possible to generate that kind of report using the tools available in Google Cloud.
Using traces is probably the best you can do right now, although you could also try the Cloud Trace API, which may give you a way to extract the information in a more structured form.
If you want this feature included in GCP, please go to the IssueTracker and create a new feature request with a detailed explanation of your goal, and mention the time span you want to be able to get data from.
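In the meantime, the raw uptime-check results can be pulled and summarized yourself. Note that this sketch uses the Cloud Monitoring API's documented uptime-check metric rather than Trace; the project id is a placeholder.

```python
# Sketch: summarize the past 4 weeks of uptime-check results per check.
import time

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/my-gcp-project"  # placeholder project id

now = int(time.time())
interval = monitoring_v3.TimeInterval({
    "end_time": {"seconds": now},
    "start_time": {"seconds": now - 28 * 24 * 3600},  # past 4 weeks
})

results = client.list_time_series(request={
    "name": project_name,
    "filter": 'metric.type = "monitoring.googleapis.com/uptime_check/check_passed"',
    "interval": interval,
    "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
})

for series in results:
    check_id = series.metric.labels.get("check_id", "unknown")
    points = list(series.points)
    passed = sum(1 for p in points if p.value.bool_value)
    print(f"{check_id}: {passed}/{len(points)} checks passed")
```

From there, e-mailing the summary monthly is a matter of running the script on a schedule (e.g. Cloud Scheduler plus a mail library of your choice).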

Google Tag Manager clickstream to Amazon

The question has more to do with which services I should be using to get the most efficient performance.
Context and goal:
What I am trying to do, exactly, is use a Tag Manager custom HTML tag so that after each Universal Analytics tag (event or pageview) fires, an HTTP request is sent to my own EC2 server with a payload similar to what is sent to Google Analytics.
What I have thought about, planned, and researched so far:
At this moment I have two big options:
Use AWS Kinesis, which seems like a great idea, but the problem is that it only drops the information into one Redshift table, and I would like to have at least 4 or 5 so I can differentiate pageviews from events, etc. My solution to this would be to split each request, server-side, into a separate stream (see the sketch after this list).
The other option is to use Spark + Kafka (here is a detailed explanation).
I know that at some point this means I am building a parallel Google Analytics, with everything that implies. I still need to decide what information I should send (I am referring to which parameters, for example source and medium), how to format it correctly, and how to process it correctly.
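Here is a rough sketch of the server-side fan-out from option 1: a single collection endpoint that routes each hit type to its own Kinesis stream, so pageviews, events, etc. can land in separate Redshift tables downstream. Flask, the stream names, and relying on the GA-style `t` (hit type) and `cid` (client id) parameters are all my own assumptions.

```python
# Sketch: fan Measurement-Protocol-style hits out to per-type Kinesis streams.
import json

import boto3
from flask import Flask, request

app = Flask(__name__)
kinesis = boto3.client("kinesis", region_name="us-east-1")

# Assumed mapping from the GA "t" parameter to a dedicated stream.
STREAMS = {
    "pageview": "ga-pageviews",
    "event": "ga-events",
    "timing": "ga-timings",
}

@app.route("/collect", methods=["POST"])
def collect():
    hit = request.form.to_dict()            # GA-style key=value payload
    stream = STREAMS.get(hit.get("t", "pageview"), "ga-other")
    kinesis.put_record(
        StreamName=stream,
        Data=json.dumps(hit).encode(),
        PartitionKey=hit.get("cid", "anonymous"),  # GA client id
    )
    return "", 204
```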
Questions and debate points:
Which option is more efficient and easiest to set up?
Should I send this information directly from the server of the page/app, or send it from the user side, making the browser do the requests as I explained before?
Has anyone done something like this in the past? Any personal recommendations?
You'd definitely benefit from Google Analytics' customTask feature instead of custom HTML. More on this from Simo Ahava. Also, Google BigQuery is quite a popular destination for streaming hit data, since it allows many on-the-fly computations such as sessionization, and there are many ready-to-use cases for BQ.
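To make the BigQuery suggestion concrete, streaming parsed hits into a table is only a few lines with the google-cloud-bigquery client; the table id and fields here are invented for illustration.

```python
# Sketch: stream hit rows into BigQuery (table and fields are placeholders).
from google.cloud import bigquery

bq = bigquery.Client()
table_id = "my-project.analytics.hits"  # assumed existing table

rows = [
    {"client_id": "555.666", "hit_type": "pageview", "page": "/home"},
]
errors = bq.insert_rows_json(table_id, rows)  # streaming insert
if errors:
    print("Insert errors:", errors)
```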

What is the best tool to use for real-time web statistics?

I operate a number of content websites that have several million user sessions and need a reliable way to monitor some real-time metrics on particular pieces of content (the key metrics being: pageviews/unique pageviews over time, unique users, and referrers).
The use case here is for the stats to be visible to authors/staff on the site, as well as to act as source data for real-time content popularity algorithms.
We already use Google Analytics, but it does not update quickly enough (4-24 hours depending on traffic volume). Google Analytics does offer a real-time reporting API, but it is currently in closed beta (I have requested access several times, but no joy yet).
New Relic appears to offer a few analytics products, but they are quite expensive ($149 per 500k pageviews; we have several times that volume).
Other answers I found on Stack Overflow suggest building your own, but those are 3-5 years old. Any ideas?
I've heard some good things about Woopra, and they offer 1.2m page views for the same price as New Relic.
https://www.woopra.com/pricing/
If that's too expensive, then the alternative is live-loading your logs and using an Elasticsearch service to read them to get the data you want, but you will need access to your logs whilst they are being written to.
A service like Loggly might suit you; it would enable you to "live tail" your logs (view them whilst they are being written), but again there is a cost to that.
Failing that, you could build something yourself, or get someone on Freelancer to knock something up for you that reads the logs and displays them in a format you recognise.
https://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the metrics you need to track are limited to the ones you have listed (pageviews, unique users, referrers), you might consider collecting your web servers' logs and using a log analyzer.
There are several free tools available on the Internet to get real-time statistics out of those logs.
Take a look at www.elastic.co, for example.
Hope this helps!
Google Analytics offers real-time data viewing now, if that's what you want:
https://support.google.com/analytics/answer/1638635?hl=en
I believe their API has now been released, as we are now looking at incorporating it!
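If it is available on your account, querying it is straightforward with the google-api-python-client. A sketch; the view id is a placeholder, and access to the `rt:` metrics may still depend on your account being whitelisted.

```python
# Sketch: poll the GA Real Time Reporting API for active users per page.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=credentials)

result = analytics.data().realtime().get(
    ids="ga:12345678",  # placeholder view (profile) id
    metrics="rt:activeUsers,rt:pageviews",
    dimensions="rt:pagePath",
).execute()

for row in result.get("rows", []):
    print(row)
```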
If you have access to web server logs, then you can set up Elasticsearch as a search engine, along with Logstash as a log parser and Kibana as a front-end tool for analyzing the data.
For more information, please go through the Elasticsearch link:
Elasticsearch weblink
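To make that concrete: once Logstash has indexed the access logs, the metrics from the question (top pages, unique users) are a single aggregation query away. A sketch assuming an index pattern of `weblogs-*` and typical Logstash field names, which you would adjust to your own schema.

```python
# Sketch: real-time page stats from the last 5 minutes of indexed logs.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder cluster address

response = es.search(
    index="weblogs-*",
    size=0,
    query={"range": {"@timestamp": {"gte": "now-5m"}}},
    aggs={
        "top_pages": {"terms": {"field": "request.keyword", "size": 10}},
        "unique_users": {"cardinality": {"field": "clientip.keyword"}},
    },
)

for bucket in response["aggregations"]["top_pages"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
print("unique users:", response["aggregations"]["unique_users"]["value"])
```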