We are using Bitbucket as our version control tool.
Builds and deployments are executing on Google Cloud Platform.
All builds are triggered via build triggers as specified in Google Cloud docs.
This works great, but we would like to get build status feedback for a specific branch in Bitbucket, or preferably on pull requests.
I did some research, but I didn't find anything useful on this specific topic.
There is a note in the Google Cloud docs concerning builds on pull requests which doesn't give us much hope:
"Whether based on branch commits or tag commits, builds are only triggered on pushes to the remote origin. Builds do not trigger on local changes, pre-submits, or pull requests."
GitHub seems to have a very good integration with Google Cloud Build which supports Build status feedback.
Is it possible to get Google Cloud Build status feedback in Bitbucket on pull request/remote push, and if it is, how?
How are you dealing with this problem?
Here we made use of the out-of-the-box integration between Google Cloud Build and Google Pub/Sub. Connected to the default cloud-builds topic, we have a Google Cloud Function responsible for making an HTTP request to the Bitbucket REST API. The API is currently at version 2.0, but several examples still reference the older 1.0 version, so be careful about potential issues.
@Moacir Rosa thank you for your answer.
Based on that suggestion we built the following GCP Cloud Function:
const axios = require('axios');

/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event Event payload.
 * @param {!Object} context Metadata for the event.
 */
exports.processPubSubMessage = (event, context) => {
  const pubsubMessage = event.data;
  const dataString = Buffer.from(pubsubMessage, 'base64').toString();
  const message = JSON.parse(dataString);
  const commitSha = message.sourceProvenance.resolvedRepoSource.commitSha;
  const repoName = message.sourceProvenance.resolvedRepoSource.repoName;
  const [bitbucket, username, repo_slug] = repoName.split('_');

  // Build Bitbucket payload data.
  const payload = {
    type: 'string',
    created_on: message.createTime,
    description: `Status: ${message.status}`,
    key: 'string',
    name: 'Google Cloud Build',
    refname: `buildTriggerId: ${message.buildTriggerId}`,
    state: getBitbucketState(message.status),
    updated_on: message.finishTime,
    url: message.logUrl,
    uuid: message.id,
  };

  // Send request to Bitbucket.
  const token = process.env.BITBUCKET_TOKEN;
  const url = getBuildUrl(username, repo_slug, commitSha);
  axios.post(url, payload, {
    headers: { Authorization: `Basic ${token}` }
  })
    .then(function (response) {
      console.log(response);
    })
    .catch(function (error) {
      console.log(error);
    });

  /**
   * Builds the Bitbucket build-status URL for a given commit.
   * See: https://developer.atlassian.com/bitbucket/api/2/reference/resource/repositories/%7Busername%7D/%7Brepo_slug%7D/commit/%7Bnode%7D/statuses/build
   *
   * @param {string} username
   * @param {string} repo_slug
   * @param {string} commitSha
   */
  function getBuildUrl(username, repo_slug, commitSha) {
    const baseUrl = 'https://api.bitbucket.org/2.0/repositories';
    return `${baseUrl}/${username}/${repo_slug}/commit/${commitSha}/statuses/build`;
  }

  /**
   * Translates states from the Google Cloud Build message to Bitbucket.
   * See: https://developer.atlassian.com/bitbucket/api/2/reference/resource/repositories/%7Busername%7D/%7Brepo_slug%7D/commit/%7Bnode%7D/statuses/build
   *
   * @param {string} status
   */
  function getBitbucketState(status) {
    switch (status.toLowerCase()) {
      case 'success':
        return 'SUCCESSFUL';
      case 'queued':
      case 'working':
        return 'INPROGRESS';
      default:
        return 'FAILED';
    }
  }
};
See: https://github.com/honest-food-company/gcp-cloud-build-function-bitbucket
I use Bitbucket Cloud which does not support Bitbucket Tokens. I forked Honest Food Company's excellent repo and modified it to support App Passwords for Bitbucket Cloud.
https://github.com/dgallegos/gcp-cloud-build-function-bitbucket
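With Bitbucket Cloud app passwords, the request to the build-status API is authenticated with HTTP Basic auth using your username and the app password rather than a token. A minimal sketch (the environment variable names here are assumptions, and the payload/URL are built as in the function above) could look like:

const axios = require('axios');

// Hypothetical env vars; configure them on the Cloud Function.
const username = process.env.BITBUCKET_USERNAME;
const appPassword = process.env.BITBUCKET_APP_PASSWORD;

// Posts the build-status payload to the Bitbucket Cloud REST API using Basic auth.
function postBuildStatus(url, payload) {
  return axios.post(url, payload, {
    auth: { username: username, password: appPassword }
  });
}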
Related
I am trying to deploy a simple containerized Express app using either GCE or Cloud Run.
It simply calls getSignedUrl with action set to 'read':
router.get('/', async function (req, res, next) {
  const gs = new Storage();
  const credentials = await gs.authClient.getCredentials();
  log(`Client email: ${credentials.client_email}`);
  const [url] = await gs.bucket(GS_BUCKET).file('0.txt').getSignedUrl({
    version: 'v4',
    action: 'read',
    expires: Date.now() + 60_000,
    contentType: 'text/plain',
  });
  res.render('index', { title: 'Express', ...credentials, url });
});
I set up my local development environment using the default service account for the project, as explained here.
Now, when I run it on my local machine, either directly (using Node.js) or in a container (using Docker), it works fine and generates a signed URL every time. When I try to build and deploy the container in the cloud (using Cloud Build + Cloud Run or GCE), however, I get the following error:
Error: The caller does not have permission
    at Gaxios._request (/workspace/node_modules/gaxios/build/src/gaxios.js:130:23)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Compute.requestAsync (/workspace/node_modules/google-auth-library/build/src/auth/oauth2client.js:382:18)
    at async GoogleAuth.signBlob (/workspace/node_modules/google-auth-library/build/src/auth/googleauth.js:721:21)
    at async sign (/workspace/node_modules/@google-cloud/storage/build/src/signer.js:181:35) {
The client_email property is the same in both environments: 6***********-compute@developer.gserviceaccount.com, i.e. the project's default service account, which seems to have the required permissions (as shown by the operation succeeding on my local machine).
What could cause this error and how can I find out?
I have written a small Cloud Function in GCP which is subscribed to a Pub/Sub event. Whenever any Cloud Build is triggered, the function posts a message into a Slack channel over a webhook.
In the response we get lots of details such as the trigger name, branch name, and variable details, but I am more interested in the build logs URL.
Currently the build logs URL in the response looks like: logUrl: https://console.cloud.google.com/cloud-build/builds/899-08sdf-4412b-e3-bd52872?project=125205252525252
which requires GCP console access to check the logs.
In the console there is an option, View Raw. Is it possible to get that direct URL in the event response, so that I can send it directly to Slack and anyone can access the logs without having GCP console access?
In your Cloud Build event message, you need to extract 2 values from the JSON message:
logsBucket
id
The raw file is stored here
<logsBucket>/log-<id>.txt
So, you can get it easily in your function with the Cloud Storage client library (preferred solution) or with a simple HTTP GET call to the Storage API.
If you need more guidance, let me know your dev language, I will send you a piece of code.
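For example, a minimal Node.js sketch (assuming the @google-cloud/storage client library and the JSON-parsed build message from the Pub/Sub event) that downloads the raw log file could look like:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// build is the JSON-parsed Cloud Build message from the Pub/Sub event.
async function getRawLog(build) {
  const bucketName = build.logsBucket.replace('gs://', '');
  const fileName = `log-${build.id}.txt`;
  const [contents] = await storage.bucket(bucketName).file(fileName).download();
  return contents.toString('utf8');
}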
As @guillaume blaquiere helped.
Just sharing the piece of code used in the Cloud Function to generate the signed URL of the Cloud Build logs.
// The Cloud Storage client and bucket name are assumed to be set up earlier, e.g.:
// const { Storage } = require('@google-cloud/storage');
// const gcs = new Storage();
// const BUCKET_NAME = build.logsBucket.replace('gs://', ''); // from the Cloud Build message
var filename = 'log-' + build.id + '.txt';
var file = gcs.bucket(BUCKET_NAME).file(filename);

const getURL = async () => {
  return new Promise((resolve, reject) => {
    file.getSignedUrl({
      action: 'read',
      expires: Date.now() + 76000000
    }, (err, url) => {
      if (err) {
        console.error(err);
        reject(err);
        return;
      }
      console.log('URL');
      resolve(url);
    });
  });
};

const signedUrl = await getURL();
If anyone is looking for the whole code, please follow this link: https://github.com/harsh4870/Cloud-build-slack-notification/blob/master/singedURL.js
I am currently using react-native to build a mobile application. I need to access a machine learning model in order to send pictures for segmentation. I want to be able to receive a segmented picture back so the background of the picture can be cut out. I am trying to use Amazon SageMaker (because it seems to be an easy service to work with, but if there are other ways to do it, please let me know).
On this SageMaker quick-start guide, in step 5a, it states:
5a. To deploy the model on a server and create an endpoint that you can access, copy the following code into the next code cell and select Run:
xgb_predictor = xgb.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')
I want to host everything on AWS and not have to run a separate server. What service/process could I use that would allow me to create an endpoint that I can access through react-native?
To summarize the conversation in the comments:
Once you have your model trained, tuned, and deployed (which is not a simple process), you can call the model's endpoint using the AWS SDK for JavaScript, which you install with:
npm install aws-sdk
var AWS = require('aws-sdk/dist/aws-sdk-react-native');
or include it in the HTML as:
<script src="https://sdk.amazonaws.com/js/aws-sdk-2.538.0.min.js"></script>
And when you want to call the endpoint, you invoke it like this:
var params = {
  Body: Buffer.from('...') || 'STRING_VALUE', /* required; strings will be Base-64 encoded on your behalf */
  EndpointName: 'STRING_VALUE', /* required */
  Accept: 'STRING_VALUE',
  ContentType: 'STRING_VALUE',
  CustomAttributes: 'STRING_VALUE'
};
sagemakerruntime.invokeEndpoint(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
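The snippet above assumes a sagemakerruntime client has already been created. A minimal sketch of that setup with the v2 JavaScript SDK, reusing the params object from above (the region is a placeholder, and reading data.Body is just one way to handle the response), might look like:

var AWS = require('aws-sdk');
// Use the region your SageMaker endpoint is deployed in.
var sagemakerruntime = new AWS.SageMakerRuntime({ region: 'us-east-1' });

sagemakerruntime.invokeEndpoint(params, function (err, data) {
  if (err) console.log(err, err.stack);      // an error occurred
  else console.log(data.Body.toString());    // the model's response payload
});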
You can check out the Amplify Library, which can take care of some of the heavy lifting, such as getting IAM permissions to call the API, user login, and many other things.
I have written an HTTP-triggered function in GCP which computes some data as expected, and after the computation I would like to publish the result to an MQTT topic.
I added the following snippet of code but it triggers an error:
Error: Error: Cannot find module '@google-cloud/pubsub'
Below is the code added
// decoding worked
const PubSub = require('@google-cloud/pubsub');

// Your Google Cloud Platform project ID
const projectId = 'XXXXX';

// Instantiates a client
const pubsubClient = new PubSub({
  projectId: projectId
});

// The name for the new topic
const topicName = 'XXXX';

// Creates the new topic
pubsubClient
  .createTopic(topicName)
  .then(results => {
    const topic = results[0];
    console.log(`Topic ${topic.name} created.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
If I get rid of the library import, I get
Error: ReferenceError: PubSub is not defined
So, how can I publish to a topic from an HTTP-triggered function in GCP?
You need to install the @google-cloud/pubsub library as a dependency so that your Cloud Function can import it successfully. You can do so by running the following command locally:
npm install --save @google-cloud/pubsub
This will include this library in the package.json file you upload with your function code.
If you are writing your function from the Developer Console directly, you'll need to add the following to your package.json file:
"dependencies": {
"#google-cloud/pubsub": "^0.19.0"
}
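With the dependency in place, publishing from the HTTP-triggered function could look roughly like the sketch below. This assumes a recent version of @google-cloud/pubsub, where PubSub is a named export and topics expose a publishMessage method; the exported function name and topic name are placeholders:

const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

exports.publishResult = async (req, res) => {
  // Publish the computed result to an existing topic.
  const data = Buffer.from(JSON.stringify({ result: 'computed value' }));
  const messageId = await pubsub.topic('XXXX').publishMessage({ data });
  res.status(200).send(`Published message ${messageId}`);
};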
I am using AWS resources for my Android project. I am planning to add a push notification service to my project with AWS SNS. There are a few questions bothering me, and I did not find any existing questions regarding these, except one or two with unclear explanations.
1. Does AWS support FCM? SNS works with GCM, but Google recommends using FCM instead of GCM. I did not find anything about AWS supporting FCM.
2. Does AWS store messages (or data) in its databases even after sending push notifications?
3. I tried putting the FCM API key in the SNS platform application, but it is showing invalid parameters. Why?
FCM is backwards compatible with GCM. The steps for setting up FCM on AWS are identical to the GCM setup procedure and (at least for the moment) FCM works transparently with GCM and SNS with respect to server-side configuration.
However, if you are sending data payloads to the Android device they will not be processed unless you implement a client-side service that extends FirebaseMessagingService. The default JSON message generator in the AWS console sends data messages, which will be ignored by your app unless the aforementioned service is implemented. To get around this for initial testing you can provide a custom notification payload, which will be received by your device (as long as your app is not in the foreground).
There are GCM-FCM migration instructions provided by Google; however, the changes you need to make are predominantly on the app side.
The steps you need to follow to test GCM/FCM on your app with SNS are:
1. Create a Platform Application in SNS, selecting Google Cloud Messaging (GCM) as the Push Notification Platform, and providing your Server API key in the API key field.
2. Select the Platform Application and click the Create platform endpoint button.
3. Provide the InstanceID (Device Token) generated by your app. You must extend FirebaseInstanceIdService and override the onTokenRefresh method to see this within your Android app. Once you have done this, uninstall and reinstall your app and your token should be printed to the Debug console in Android Studio on first boot.
4. Click the Add endpoint button.
5. Click on the ARN link for your platform application.
6. Select the newly created Endpoint for your device and click the Publish to endpoint button.
7. Select the JSON Message Format, and click the JSON message generator button.
8. Enter a test message and click the Generate JSON button.
Now comes the "gotcha part".
The message that is generated by SNS will be of the form:
{
  "GCM": "{ \"data\": { \"message\": \"test message\" } }"
}
As we mentioned earlier, data payloads will be ignored if no service to receive them has been implemented. We would like to test without writing too much code, so instead we should send a notification payload. To do this, simply change the JSON message to read:
{
  "GCM": "{ \"notification\": { \"title\": \"test title\", \"body\": \"test body\" } }"
}
(For more information about the JSON format of an FCM message, see the FCM documentation.)
Once you have done this, make sure your app is not running on the device, and hit the Publish Message button. You should now see a notification pop up on your device.
You can of course do all this programmatically through the Amazon SNS API, however all the examples seem to use the data payload so you need to keep that in mind and generate a payload appropriate to your use case.
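For reference, a rough JavaScript equivalent with the AWS SDK (the region and endpoint ARN below are placeholders for your own values) could look like:

var AWS = require('aws-sdk');
var sns = new AWS.SNS({ region: 'eu-west-1' });

// The inner GCM value must itself be a JSON string, as in the console examples above.
var gcmPayload = JSON.stringify({
  notification: { title: 'test title', body: 'test body' }
});

sns.publish({
  TargetArn: 'YOUR_PLATFORM_ENDPOINT_ARN',
  MessageStructure: 'json',
  Message: JSON.stringify({ default: 'test message', GCM: gcmPayload })
}, function (err, data) {
  if (err) console.log(err, err.stack);
  else console.log(data); // contains the MessageId of the published notification
});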
Now you can go to your Firebase console (https://console.firebase.google.com/), select your project, click the gear icon and choose Project settings, then click on the Cloud Messaging tab.
You'll see the legacy Server key, which is the GCM API key, and you'll have the option to generate new Server keys, which are the FCM versions.
SNS will accept both versions, but its menu option still categorizes them under GCM.
Note that you can "accidentally" remove your Server Keys but the Legacy server key is not deletable. Also, if you click the add server key button, you'll get a new server key BELOW the first one, WITH NO WARNING! ...Nice job Google ;)
One more note to Nathan Dunn's great answer.
How to send data with the notification from SNS to Firebase.
We need to add a data object to the JSON (alongside the notification):
{
  "default": "any value",
  "GCM": "{ \"notification\": { \"body\": \"message body\", \"title\": \"message title\", \"sound\": \"default\" }, \"data\": { \"key\": \"value\", \"key2\": \"value\" } }"
}
In your FirebaseMessagingService implementation (Xamarin example):
public override void OnMessageReceived(RemoteMessage message)
{
    try
    {
        var body = message?.GetNotification()?.Body;
        var title = message?.GetNotification()?.Title;
        var tag = message?.GetNotification()?.Tag;
        var sound = message?.GetNotification()?.Sound;
        var data = message?.Data;
        foreach (string key in data.Keys)
        {
            // get your data values here
        }
    }
    catch (Exception e)
    {
    }
}
I tried to use the solution with a notification payload instead of data, but I did not receive push notifications on the mobile device. I found this tutorial https://youtu.be/iBTFLu30dSg (with English subtitles) on how to use FCM with AWS SNS step by step, with an example of how to send push notifications from the AWS console and how to implement it in PHP with the AWS PHP SDK. It helped me a lot.
Just an additional note to Nathan Dunn's answer: to add sound, use the following JSON message:
{
  "GCM": "{ \"notification\": { \"text\": \"test message\",\"sound\":\"default\" } }"
}
It took me a while to figure out how to send the notification with the right payload (publish to topic). So I will put it here.
private void PublishToTopic(string topicArn)
{
    AmazonSimpleNotificationServiceClient snsClient =
        new AmazonSimpleNotificationServiceClient(Amazon.RegionEndpoint.EUWest1);
    PublishRequest publishRequest = new PublishRequest();
    publishRequest.TopicArn = topicArn;
    publishRequest.MessageStructure = "json";
    string payload = "\\\"data\\\":{\\\"text\\\":\\\"Test \\\"}";
    publishRequest.Message = "{\"default\": \"default\",\"GCM\":\"{" + payload + "}\"}";
    PublishResponse publishResult = snsClient.Publish(publishRequest);
}
Amazon does support FCM, as all previous code has been migrated from GCM to FCM. The article below explains this in detail.
Article Published by Amazon
To answer the questions:
AWS SNS does support FCM.
No, AWS does not store messages after sending push notifications.
For a detailed tutorial on setting up FCM with SNS please read this article.