cloud-builds pub/sub topic appears to be unlisted or inaccessible - google-cloud-platform

I'm attempting to create an integration between Bitbucket Repo and Google Cloud Build to automatically build and test upon pushes to certain branches and report status back (for that lovely green tick mark). I've got the first part working, but the second part (reporting back) has thrown up a bit of a stumbling block.
Per https://cloud.google.com/cloud-build/docs/send-build-notifications, Cloud Build is supposed to automatically publish update messages to a Pub/Sub topic named "cloud-builds". However, trying to find it (both through the web interface and via the gcloud command-line tool) has turned up nothing. Copious amounts of web searching have turned up https://github.com/GoogleCloudPlatform/google-cloud-visualstudio/issues/556, which seems to suggest that the topic referenced in that doc is now being filtered out of results; however, that issue seems to be specific to the Visual Studio tools and not GCP as a whole. Moreover, https://cloud.google.com/cloud-build/docs/configure-third-party-notifications suggests that it's still accessible, but perhaps only to Cloud Functions? And maybe only manually via the command line, since the web interface for Cloud Functions also does not display this phantom "cloud-builds" topic?
Any guidance as to where I can go from here? As near as I can tell, there are two possibilities: either something is utterly borked in my GCP project and the Pub/Sub topic is invisible just to me (or has somehow been deleted), or I'm right and this topic just isn't accessible anymore.

I was stuck on the same issue. After a while, I created the cloud-builds topic manually and created a Cloud Function subscribed to that topic.
Build details are pushed to the topic as expected after that, and my Cloud Function gets triggered by new events.
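For reference, the manual setup described above amounts to something like this with the gcloud CLI. The function name and runtime are placeholders; the topic name cloud-builds is the one Cloud Build publishes to:

```shell
# Create the topic that Cloud Build publishes build updates to.
gcloud pubsub topics create cloud-builds

# Deploy a function that runs on every message published to that topic.
gcloud functions deploy on_build_event \
  --runtime python39 \
  --trigger-topic cloud-builds
```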

You can check for the existence of the cloud-builds topic another way, outside the UI: download the gcloud command-line tool and, after running gcloud init, run gcloud pubsub topics list to list all topics for the configured project. If the topic projects/{your project}/topics/cloud-builds is not listed, I would suggest filing a bug with the Cloud Build team here.

Creating the cloud-builds topic manually won't work, since it's a special topic that Google manages.
In this case, you have to go to the API library, disable the Cloud Build API, and then enable it again; the cloud-builds topic will be created for you. Enable and disable Cloud Build API
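The same disable/enable cycle can be done from the command line; a sketch, where the project ID is a placeholder:

```shell
gcloud services disable cloudbuild.googleapis.com --project my-project
gcloud services enable cloudbuild.googleapis.com --project my-project

# The managed cloud-builds topic should now show up again.
gcloud pubsub topics list --project my-project
```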

Related

Google Cloud Run service deployment, is it the best direction in my situation?

I have some experience with Google Cloud Functions (CF). I tried to deploy a CF function recently with a Python app, but it uses an NLP model so the 8GB memory limit is exceeded when the model is triggered. The function is triggered when a JSON file is uploaded to a bucket.
So, I plan to try Google Cloud Run but I have no experience with it. Also, I am not completely sure if it is the best course of action.
If it is, what is the best way of implementing it, given that the Cloud Run service will be triggered by a file uploaded to a bucket? In CF you can select the triggering event; in Cloud Run I didn't see anything like that. I could use some starting points, as I couldn't find my case in the GCP documentation.
Any help will be appreciated.
There are at least these two options:
The legacy one: create a GCS notification that publishes to Pub/Sub. Then create a push subscription with the Cloud Run URL as the HTTP push destination.
A more recent way is to use Eventarc to invoke a Cloud Run endpoint directly from an event (it creates roughly the same thing, a Pub/Sub topic and a push subscription, but it's fully configured for you).
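A sketch of the legacy route, with placeholder bucket, topic, and subscription names (the push endpoint is whatever URL your Cloud Run service was deployed at):

```shell
# 1. Have GCS publish a Pub/Sub message for each finalized (uploaded) object.
gsutil notification create -t gcs-uploads -f json -e OBJECT_FINALIZE gs://my-bucket

# 2. Push every message to the Cloud Run service over HTTP.
gcloud pubsub subscriptions create gcs-uploads-push \
  --topic gcs-uploads \
  --push-endpoint https://my-service-abc123-uc.a.run.app/
```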
EDIT 1
When you use push notifications, you will receive a standard Pub/Sub message. The format is described in the documentation, both for the attributes and for the body content; keep in mind that the raw content is base64-encoded and you have to decode it to get the final format.
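As a concrete sketch of that decoding step: the envelope below is a fabricated example of what a push subscription might deliver for a GCS upload (bucket, object, and subscription names are placeholders), and the helper unwraps the base64 data field back into JSON.

```python
import base64
import json

def decode_pubsub_push(envelope: dict) -> dict:
    """Return the decoded JSON body of a Pub/Sub push envelope."""
    raw = base64.b64decode(envelope["message"]["data"])
    return json.loads(raw.decode("utf-8"))

# Fabricated example of a push delivery for a GCS object-finalize event;
# the values are illustrative only.
envelope = {
    "message": {
        "data": base64.b64encode(
            json.dumps({"bucket": "my-bucket", "name": "input.json"}).encode("utf-8")
        ).decode("ascii"),
        "attributes": {"eventType": "OBJECT_FINALIZE"},
    },
    "subscription": "projects/my-project/subscriptions/my-sub",
}

payload = decode_pubsub_push(envelope)
print(payload["name"])  # the uploaded object's name: input.json
```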
I personally have a Cloud Run service that logs the contents of all requests, so I can find in the logs all the data I need during development. When there is a new message format, I point the push subscription at that Cloud Run endpoint and automatically capture the format.
For Eventarc, the format will be added to the UI soon (I saw that feature in preview, but it's not yet available). For now, the best solution is to log the content so you can see what you get and decide what to do with it!

Create dependent triggers in GCP cloud build

I need to create a dependent trigger in Cloud Build. Currently I have two triggers, as shown in the image below, both of which fire on push events to the master branch in their respective repos.
'app-engine-test' is triggered by pushing code to a cloud repository, whereas 'seleniumTest' is triggered by pushing code to a Git repository.
However, I want the 'seleniumTest' trigger to fire once the 'app-engine-test' build has completed. I could not find any such setting in the GCP UI.
Can anyone please help?
You may be able to do this by using a Pub/Sub message as the trigger for your dependent build.
When a Cloud Build build runs, it publishes messages to a Pub/Sub topic named cloud-builds - see https://cloud.google.com/build/docs/subscribe-build-notifications.
So if you have builds app and test: app would be triggered when you push to source control, and test would be triggered when a message is published on the cloud-builds topic.
I haven't tested this myself, but I need something similar, so I will update this answer as I go. If it turns out you can't subscribe to the cloud-builds events, then at the end of the app build you could instead publish a message to your own Pub/Sub topic and use that to trigger the second build.
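If you do end up publishing your own message, the final step of the app build could do it with the gcloud builder. A hedged sketch of the relevant cloudbuild.yaml step, where the topic name app-build-done is made up:

```yaml
steps:
  # ... your existing build/deploy steps for "app" ...

  # Final step: announce completion on a user-owned topic (placeholder name).
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['pubsub', 'topics', 'publish', 'app-build-done', '--message=app build succeeded']
```

Note that the topic has to exist beforehand and the Cloud Build service account needs the Pub/Sub Publisher role on it.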
Another solution in your case might be to merge the two projects and simply run the selenium tests as a final build step once you've successfully deployed the code.

How to test billing cap with Google Cloud Platform?

I'm trying to use Cloud Functions for a personal project, but that requires upgrading to the Blaze plan. I'd like to make sure I don't get unexpectedly charged, so I've been looking into capping the billing for GCP (I've made the mistake of accidentally setting up infinite loops in my code that repeatedly modified Firebase databases). The documentation is confusing to me at two points:
Depending on your runtime, the GCP_PROJECT environment variable might be set automatically. Review the list of environment variables set automatically and determine if you need to manually set the GCP_PROJECT variable to the project for which you want to cap (disable) Cloud Billing.
Since the runtime environment is Node v10, am I supposed to add GCP_PROJECT as an environment variable to the stopBilling cloud function? Would its value just be the project ID?
When the budget sends out a notification, the specified project will no longer have a Cloud Billing account. If you want to test the function, publish a sample message with the testing message above. The project will no longer be visible under the Cloud Billing account and resources in the project are disabled, including the Cloud Function if it's in the same project.
Is "the testing message above" referring to the message in 'Test your Cloud Function'? If so, it hasn't been working for me. I don't even receive any budget notifications from the testing message (should they be showing up in my email?)
Your first assumption is correct: you need to manually set the GCP_PROJECT environment variable in a Node.js 10 runtime, and its value is the project ID.
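The variable can be set at deploy time, for example. In this sketch, the topic and project names are placeholders, and the function name stopBilling comes from your question:

```shell
gcloud functions deploy stopBilling \
  --runtime nodejs10 \
  --trigger-topic budget-alerts \
  --set-env-vars GCP_PROJECT=my-project-id
```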
For your second question, I agree the documentation is a bit misleading on this particular topic, but you can use the instructions in the "Connect a Cloud Billing budget to a Pub/Sub topic" section of the "Manage programmatic budget alert notifications" documentation, which I believe is what the documentation you shared was referring to.
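Once the budget is connected to a topic, you can simulate an over-budget alert by publishing a message yourself. A hedged sketch: the topic name is a placeholder, the amounts are illustrative, and the JSON fields follow the budget-notification shape described in that documentation:

```shell
gcloud pubsub topics publish budget-alerts \
  --message='{"budgetDisplayName":"my-budget","costAmount":110.0,"budgetAmount":100.0,"currencyCode":"USD"}'
```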

Google Cloud service stopped and never restarting

I have been using the Google Cloud speech recognition service for some time, through a python application.
Because I accidentally copied my Google Cloud JSON key file to a shared GitHub location (I was doing a backup), I suddenly got a warning from Google Cloud that I was violating the rules, as the JSON key is private. I promptly removed the file, but nevertheless I got an email saying that the resources for my project "santo1" were suspended, citing "cryptocurrency mining" as the reason, which I know nothing about.
I applied to reactivate, and my appeal was accepted promptly, saying that my resources for santo1 were reinstated.
Unfortunately, the speech recognition still didn't work.
Launching it from Python, it records from the microphone but gets no answer from the service - and no error messages at all.
Then I attempted the following:
regenerated the API key
created a new JSON key
created a new project, with its own JSON key, under the same Google account
as suggested by the Google Cloud chat operator, manually clicked play on the VM resource that appeared stopped
created a new Gmail account with another new project, set up with billing and everything (also reconfigured through "gcloud init")
None of these attempts worked.
I need assistance on this, as the chat operator didn't seem capable of telling me more.
Thank you in advance
Best regards
I would recommend contacting GCP support for this case, as your cloud service could still be in suspended status even though your access is OK.
Apparently the access key was stolen and used by hackers, who did crypto mining on your GCP account; hence your service account was banned.
If it's your testing account/project, you should consider creating a new project rather than continuing with this one; the hacker could have created other services which you may not notice until it's too late.
The worse case is that it's your PROD service; then you'd better review the bill and transaction report thoroughly.

Export / Import tool with Google Spanner

I have several questions regarding the Google Spanner Export / Import tool. Apparently the tool creates a dataflow job.
1. Can an import/export Dataflow job be re-run after it has run successfully from the tool? If so, will it use the current timestamp?
2. How can I schedule a daily backup (export) of Spanner DBs?
3. How can I get notified of new enhancements within the GCP platform? I was browsing the web for something else and noticed that the export/import tool for GCP Spanner had been released 4 days earlier.
I am still browsing through the documentation for dataflow jobs and templates, etc.. Any suggestions to the above would be greatly appreciated.
Thx
My responses, based on limited experience with the Spanner export tool:
1. Re-running a job: I have not seen a way to do this. There is no option in the GCP console, though that does not mean it cannot be done.
2. Scheduling: there is no built-in scheduling capability. Perhaps this can be done via Google's managed Airflow service, Cloud Composer (https://console.cloud.google.com/composer)? I have yet to try this, but it is my next step, as I have similar needs.
3. Notifications of enhancements: I've made this request to Google several times and have yet to get a response. My best recommendation is to read the change logs when updating the gcloud CLI.
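On the scheduling point: until something built-in exists, one workaround is to launch the export's Dataflow template yourself from a cron job or a Composer task. A hedged sketch, where the instance, database, bucket, and region are placeholders and the template path is the public Cloud_Spanner_to_GCS_Avro template:

```shell
gcloud dataflow jobs run "spanner-export-$(date +%Y%m%d)" \
  --gcs-location gs://dataflow-templates/latest/Cloud_Spanner_to_GCS_Avro \
  --region us-central1 \
  --parameters instanceId=my-instance,databaseId=my-db,outputDir=gs://my-backups/$(date +%Y%m%d)
```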
Finally, there is an outstanding issue with the export tool that causes it to fail if you export a table with 0 rows. I have filed a case with Google (Case #16454353) and they confirmed the issue. Specifically:
After running into a similar error message during my reproduction of the issue, I drilled down into the error message and discovered that there is something odd with the file path for the Cloud Storage folder [1]. There seems to be an issue with the Java File class viewing 'gs://' as having a redundant '/', and that causes the 'No such file or directory' error message.
Fortunately for us, there is an ongoing internal investigation on this issue, and it seems like there is a fix being worked on. I have indicated your interest in a fix as well; however, I do not have any ETAs or guarantees of when a working fix will be rolled out.