Why are there so many inexplicable tables under my Crashlytics instance? - google-cloud-platform

I was experimenting with the BigQuery sandbox and ran into two problems:
Data from Crashlytics is automatically exported into BigQuery, but I don't know why there are so many inexplicable tables under the Crashlytics dataset. Shouldn't there be only two tables, appid_Android and appid_IOS?
Data from Analytics is also automatically exported into BigQuery, but I don't know why only the data for 11-08 and 11-11 was imported; the data for 11-09, 11-10, and 11-12 was not imported automatically, and I couldn't find Google's technical support...

Related

Batch delete BigTable tables and BigQuery datasets

I searched around for a way to batch delete BigTable tables and BigQuery datasets (using Python's client libraries) without any luck so far.
Is anyone aware of an efficient way to do that?
I looked into these links, but nothing promising:
BigQuery
BigTable
I'm looking for something similar to this example from the Datastore documentation:
from google.cloud import datastore
# For help authenticating your client, visit
# https://cloud.google.com/docs/authentication/getting-started
client = datastore.Client()
keys = [client.key("Task", 1), client.key("Task", 2)]
client.delete_multi(keys)
Batch delete
I think it's not possible natively; you have to develop your own script.
For example, you can list all the tables to delete, and then there are several options:
Develop a Python script that loops over the tables to delete and uses the Python BigQuery and Bigtable clients (a sketch follows this list): https://cloud.google.com/bigquery/docs/samples/bigquery-delete-dataset
https://cloud.google.com/bigtable/docs/samples/bigtable-hw-delete-table
Develop a shell script that loops over the tables to delete and uses bq and cbt (from the gcloud SDK):
https://cloud.google.com/bigquery/docs/managing-tables?hl=en#deleting_a_table
https://cloud.google.com/bigtable/docs/cbt-reference?hl=fr
If it's possible on your side, you can also use Terraform to delete multiple BigQuery and Bigtable tables, but it's better suited if you need to manage state for your infrastructure:
https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigquery_table
https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigtable_table
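A minimal Python sketch of the first option, assuming hypothetical project, instance, and resource names, and using only the delete calls of the BigQuery and Bigtable clients:
from google.cloud import bigquery
from google.cloud import bigtable

# Assumed identifiers -- replace with your own.
PROJECT_ID = "my-project"
BIGTABLE_INSTANCE_ID = "my-instance"
DATASETS_TO_DELETE = ["dataset_a", "dataset_b"]
BIGTABLE_TABLES_TO_DELETE = ["table_a", "table_b"]

# Delete BigQuery datasets (delete_contents=True also removes their tables).
bq_client = bigquery.Client(project=PROJECT_ID)
for dataset_id in DATASETS_TO_DELETE:
    bq_client.delete_dataset(dataset_id, delete_contents=True, not_found_ok=True)
    print(f"Deleted BigQuery dataset {dataset_id}")

# Delete Bigtable tables through the admin client.
bt_client = bigtable.Client(project=PROJECT_ID, admin=True)
instance = bt_client.instance(BIGTABLE_INSTANCE_ID)
for table_id in BIGTABLE_TABLES_TO_DELETE:
    table = instance.table(table_id)
    if table.exists():
        table.delete()
        print(f"Deleted Bigtable table {table_id}")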

Can't find BigQuery Logs for GA4 events_intraday tables

I am trying to create a trigger for a Cloud Function to copy events_intraday table data as soon as new data has been exported.
So far I have been following this answer to generate a sink from Cloud Logging to Pub/Sub.
I have only been able to find logs for events_YYYYMMDD tables, but none for events_intraday_YYYYMMDD, either in Cloud Logging or in the BigQuery Job History (here are my queries for events tables and events_intraday tables in Cloud Logging).
Am I looking at the wrong place? How is it possible for the table to be updated without any logs being generated?
Update: There is one (1) log generated per day when the table is created, but "table update" logs are yet to be found.
Try this Cloud Logging filter:
protoPayload.authorizationInfo.permission="bigquery.tables.create"
protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable"
protoPayload.resourceName : "projects/'your_project'/datasets/'your_dataset'/tables/events_intraday_"
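If that filter matches the table-creation entries you need, the sink from Cloud Logging to Pub/Sub can also be created programmatically. Below is a rough sketch using the google-cloud-logging Python client; the project, topic, and sink names are assumptions:
from google.cloud import logging

# Hypothetical names -- adjust to your project and Pub/Sub topic.
PROJECT_ID = "your_project"
SINK_NAME = "events-intraday-created"
DESTINATION = f"pubsub.googleapis.com/projects/{PROJECT_ID}/topics/bq-events-intraday"
LOG_FILTER = (
    'protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" '
    'AND protoPayload.resourceName:"tables/events_intraday_"'
)

client = logging.Client(project=PROJECT_ID)
sink = client.sink(SINK_NAME, filter_=LOG_FILTER, destination=DESTINATION)
if not sink.exists():
    # Remember to grant the sink's writer identity publish rights on the topic.
    sink.create()
    print(f"Created sink {SINK_NAME} -> {DESTINATION}")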

How to monitor if a BigQuery table contains current data and send an alert if not?

I have a BigQuery table and an external data import process that should add entries every day. I need to verify that the table contains current data (with a timestamp of today). Writing the SQL-query is not a problem.
My question is how best to set up such monitoring in GCP. Can Stackdriver execute custom BigQuery SQL? Or would a Cloud Function be more suitable? An App Engine application with a cron job? What's the best practice?
Not sure what the best practice is here, but one simple solution is to use a BigQuery scheduled query: schedule the query, make it fail if something is wrong using the ERROR() function, and configure the scheduled query to notify you (it sends an email) if it fails. A sketch of such a query follows.
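A minimal sketch of that approach, with hypothetical project, dataset, table, and column names; the same SQL can be pasted into a scheduled query, where a raised ERROR() makes the run fail and trigger the email notification:
from google.cloud import bigquery

# Hypothetical freshness check -- adjust the table and timestamp column names.
FRESHNESS_SQL = """
SELECT
  IF(
    COUNT(*) = 0,
    ERROR('No rows with a current-date timestamp in my_dataset.my_table'),
    COUNT(*)
  ) AS rows_today
FROM `my_project.my_dataset.my_table`
WHERE DATE(event_timestamp) = CURRENT_DATE()
"""

client = bigquery.Client()
# Running it ad hoc shows the same behavior: the query errors out
# whenever no rows with today's date are found.
rows = client.query(FRESHNESS_SQL).result()
for row in rows:
    print(f"Rows with today's timestamp: {row.rows_today}")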

How to deploy data from Django WebApp to a cloud database which can be accessed by Jupyter notebook such as Kaggle?

I have built a Django web app. It has an SQL database. I would like to analyze this data and share the analysis using an online Jupyter notebook platform such as Kaggle.
I have already deployed it to Google App Engine with an SQL instance, but I don't know how to view this SQL instance's tables in Kaggle. There is an option to view BigQuery databases in Kaggle, but I don't know how to get the data from my SQL instance into BigQuery.
To be able to access the data with Kaggle, you would need to import the data from the Cloud SQL instance into BigQuery.
Currently there are several options for importing data into BigQuery; the best choice depends on what type of analysis you want to do with it.
If you just want to import the data from the Cloud SQL instance into BigQuery, the easiest way would be to first export the data in CSV format and then import the CSV file into BigQuery.
In case you are working with a large database, you can also do it programmatically by using the client libraries, as sketched below.
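For the CSV route, a load job through the Python client library could look roughly like this; the bucket, file, and table names are assumptions:
from google.cloud import bigquery

# Assumed names -- replace with your own bucket, export file, and table.
GCS_URI = "gs://my-bucket/cloudsql_export.csv"
TABLE_ID = "my_project.my_dataset.webapp_data"

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    autodetect=True,      # let BigQuery infer the schema
)

# Load the CSV previously exported from the Cloud SQL instance.
load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
load_job.result()  # wait for the load to finish
print(f"Loaded {client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")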

Google Cloud Dataprep - error when importing dataset from BigQuery

When I try to import BigQuery tables as a dataset in my Dataprep flow, I get the following error:
Could not create dataset: I/o error.
I tried to import many BigQuery tables (all from the same BQ dataset); all of them imported successfully except this one, which has many columns (more than 2,700!).
Maybe that's because of the large number of columns, but I can't see any such limitation in the docs!
When I select the table, I see the message "Preview not available", and the error appears after I click "import".
Does anyone have any idea why this is happening, or any suggestion?
The Dataprep documentation doesn't mention a maximum column limit, so it's unlikely that this is the problem.
On the other hand, I have seen generic messages like 'I/O error', or simply red icons when importing data, that were related to data type conversion between Dataprep and BigQuery.
I think that finding the BQ types that are not compatible with Dataprep and converting them to compatible ones should solve your issue; a sketch for listing the table's column types follows.
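As a starting point (the table name here is hypothetical), you could list how many columns of each BigQuery type the table has, to spot types that may not convert cleanly in Dataprep:
from google.cloud import bigquery

# Hypothetical reference -- point this at the table that fails to import.
TABLE_ID = "my_project.my_dataset.wide_table"

client = bigquery.Client()
table = client.get_table(TABLE_ID)

# Group columns by BigQuery type so unusual types (e.g. RECORD, GEOGRAPHY)
# stand out; these are common candidates for conversion issues.
print(f"{TABLE_ID} has {len(table.schema)} columns")
columns_by_type = {}
for field in table.schema:
    columns_by_type.setdefault(field.field_type, []).append(field.name)
for field_type, names in sorted(columns_by_type.items()):
    print(f"{field_type}: {len(names)} columns, e.g. {names[:3]}")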