I have two GCP projects, and both of them recently enabled Query Insights.
I was wondering: what is the
__google_connectivity_prober
user shown by Query Insights for Cloud SQL? It only appears in one of my projects, and the documentation does not explain this user.
DB version: MySQL 8.0.26
Region: asia-east2-b
Info: I've looked through the Query Insights documentation, but it does not explain this user.
Why does only one of my projects have the __google_connectivity_prober user, and what is it?
thanks.
I have one general question; actually, I am hunting for a solution to a problem.
Currently, we generate reports directly from an Oracle database. For performance reasons, we want to migrate the data from Oracle to an AWS service that could perform better, and we will then pass the data from that AWS service to our reporting software.
Could you please advise which service would be ideal for this?
Thanks,
Vishwajeet
To answer well, additional info is needed:
How much data is needed to generate a report?
Are there any transformed/computed values needed?
What is good performance? 1 second? 30 seconds?
What is the current query time on Oracle, and what kind of queries are involved? Joins, aggregations, etc.
I have just been given admin access to a Google Analytics portal that tracks the corporate website's activity. The tracked data are to be moved to Amazon S3 via AppFlow.
I followed the official AWS documentation on how to set up the connection between GA and AWS. We created the connection successfully, but I came across an issue I can't find an answer to:
The Subobject field is empty. There are already ~4 months' worth of data, so I don't think it's an empty-data issue. This also blocks me from creating the flow, since it is a required field. Any thoughts?
Note: the client and the team are new to AWS, so we are setting it up as we go and learning along the way. Thank you for the help!
Found the answer! The Google Analytics account needs to have a Universal Analytics property available. Here are a few links:
https://docs.aws.amazon.com/appflow/latest/userguide/google-analytics.html
https://support.google.com/analytics/answer/6370521?hl=en
I have uploaded a number of jobs to CTS (Cloud Talent Solution) and am facing the issue that, for every job instance, the information listed in job['responsibilities'] is not found during searches.
Is there any way to add this field to the index?
It seems that the indexable fields are limited, and they are now also documented in the official docs.
For further info on implementing Google CTS, feel free to ask.
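If the responsibilities text needs to be matched by keyword search, one possible workaround is to fold it into a field that is indexed (such as the description) or to expose it as a keyword-searchable custom attribute. This is only a sketch, not an official recommendation: the project, tenant, company, and requisition IDs are placeholders, and it assumes the google-cloud-talent Python client (talent_v4).

# Sketch only: all IDs below are placeholders.
from google.cloud import talent_v4

client = talent_v4.JobServiceClient()
parent = "projects/my-project/tenants/my-tenant"  # placeholder

responsibilities_text = "Build and maintain data pipelines."

job = talent_v4.Job(
    company=f"{parent}/companies/my-company",  # placeholder
    requisition_id="req-001",
    title="Data Engineer",
    # Fold the responsibilities text into the description, which is indexed for search.
    description="Company intro...\n\nResponsibilities:\n" + responsibilities_text,
    responsibilities=responsibilities_text,
    # Or expose it as a custom attribute that keyword search can match.
    custom_attributes={
        "responsibilities": talent_v4.CustomAttribute(
            string_values=[responsibilities_text],
            filterable=False,
            keyword_searchable=True,
        )
    },
)

print(client.create_job(parent=parent, job=job).name)

Whichever route you take, it is worth double-checking the current CTS docs for which fields are searchable, since that list has changed over time.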
Hope someone can help me.
I have now tried several times to import my first Power BI model into Azure Analysis Services. I went through the steps as described by the Azure team. After I've chosen the right server, selected the model as a PBIX file from my local drive, and clicked Import, the following happens:
1. The import runs for several minutes.
2. After several minutes it stops without any message or anything similar. Afterwards, the model is not in Azure Analysis Services.
I have now tried it four times without success.
Has anyone had the same experience? Does anyone know a solution for this?
Thanks and best regards,
Christoph
There is a fairly short time-out on the import option for Azure Analysis Services. If it is a large model, I typically:
Export the model as a template (PBIT).
Open the PBIT and select Edit instead of Load.
Save the new PBIX without applying changes or refreshing data.
This new PBIX should be small enough to import, as long as there aren't any other issues with the dataset.
Firstly, could someone please advise on the best practice for Sitecore logging when hosted in Azure?
Ideally we would like to log to Table Storage. I tried using https://www.nuget.org/packages/log4net.Appender.Azure/.
However, the data isn't stored in Azure Table Storage until we invoke the buffer.flush() method, as described in the article below:
http://zacg.github.io/blog/2014/02/05/azure-log4net-appender/
Has anyone experienced this when logging to Table Storage in Sitecore? Any recommendations would be much appreciated.
Good question. We have just added a specific blob type that is optimized for logging, so our recommendation is to use Append Blob for logging. See here for more information: http://blogs.msdn.com/b/windowsazurestorage/archive/2015/04/13/introducing-azure-storage-append-blob.aspx.
Unfortunately, many people do try to use Table Storage for logging, and if you don't design your keys carefully you can end up with hot partitions and scalability problems. Take a look at the logging anti-pattern in this guide: https://azure.microsoft.com/en-us/documentation/articles/storage-table-design-guide/.
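For what it's worth, here is a minimal sketch of the append-blob pattern using the azure-storage-blob Python package; the connection string, container, and blob names are made up, and the same pattern exists in the .NET SDK that a Sitecore site would more likely use.

# Sketch only: connection string, container, and blob names are placeholders.
from datetime import date
from azure.storage.blob import BlobServiceClient
from azure.core.exceptions import ResourceExistsError

service = BlobServiceClient.from_connection_string("<connection-string>")

try:
    service.create_container("logs")
except ResourceExistsError:
    pass  # container already exists

# One append blob per day keeps any single blob from growing without bound.
log_blob = service.get_blob_client(container="logs", blob=f"app-{date.today().isoformat()}.log")
if not log_blob.exists():
    log_blob.create_append_blob()

# Each call appends a block to the end of the blob; there is no read-modify-write
# and no partition-key design to get wrong, which is why append blobs suit logging.
log_blob.append_block(b"INFO Something happened\n")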