Bitbucket Pipelines: pulling an image from GCR with environment variables fails - google-container-registry

So I'm trying to use an image from my Google Container Registry; since this is a private registry, I need to authenticate.
Obviously I don't want to renew my auth token every hour to keep my pipelines working, so I need to go for the JSON key file.
It works when I define the image as follows:
image:
  name: eu.gcr.io/project_id/image:latest
  username: _json_key
  password: >
    {JSON file content}
  email: pipelines@bitbucket.com
But that means your JSON key file is out in the open, visible to everyone with access to the pipelines file, which is not what I'd like.
So I put the contents of the JSON file into an environment variable and replaced the actual JSON with the variable, as follows:
image:
  name: eu.gcr.io/project_id/image:latest
  username: _json_key
  password: >
    ${JSON_KEY}
  email: pipelines@bitbucket.com
Somehow in the second scenario it doesn't work :(

After some more testing, I found that this worked:
image:
  name: eu.gcr.io/project_id/image:latest
  username: _json_key
  password: '$JSON_KEY'

Related

GCP workflow: load external sql file?

I am planning to have a Cloud Scheduler that calls a GCP Workflows every day at 8 a.m. My GCP Workflows will have around 15 different steps and will be only transformations (update, delete, add) on BigQuery. Some queries will be quite long and I am wondering if there is a way to load a .sql file into a GCP Workflows task1.yaml?
#workflow entrypoint
ProcessItem:
  params: [project, gcsPath]
  steps:
    - initialize:
        assign:
          - dataset: wf_samples
          - input: ${gcsPath}
          - sqlQuery: QUERY HERE
...
You need to do something similar (of course, you can assign the result to a variable like input):
#workflow entrypoint
main:
  steps:
    - getSqlfile:
        call: http.get
        args:
          url: https://raw.githubusercontent.com/jisaw/sqlzoo-solutions/master/select-in-select.sql
          headers:
            Content-Type: "text/plain"
        result: queryFromFile
    - final:
        return: ${queryFromFile.body}
For Cloud Storage it may look like this:
call: http.get
args:
  url: https://storage.cloud.google.com/................./q1.sql
  headers:
    Content-Type: "text/plain"
  auth:
    type: OIDC
result: queryFromFile
Or even in this format (different URL syntax + OAuth2):
call: http.get
args:
  url: https://storage.googleapis.com/................./q1.sql
  headers:
    Content-Type: "text/plain"
  auth:
    type: OAuth2
result: queryFromFile
Make sure that the invoker has the right permissions to access the Cloud Storage file.
Note: on further testing, for this to work correctly the text/plain
MIME type must be set on the GCS file.
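The pattern above keeps the SQL text out of the workflow YAML and fetches it at run time. The same separation can be sketched in plain Python, with a local file standing in for the GCS object (the file name task1.sql is hypothetical):

```python
from pathlib import Path

# Stand-in for the object fetched from GCS; the point is only that the
# query text lives outside the workflow definition and is read at run time.
sql_path = Path("task1.sql")
sql_path.write_text("SELECT 1 AS x")

query = sql_path.read_text()
print(query)  # → SELECT 1 AS x
```

In the real workflow, the http.get step plays the role of read_text() and the response body becomes the query string.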

How to trigger Email notification to Group of users from BigQuery on threshold?

I have written the query below in GCP BigQuery, where I use the ERROR function to raise an error message when the quantity column exceeds the threshold of 1000.
SELECT ERROR(CONCAT("Over threshold: ", CAST(quantity AS STRING)))
FROM `proj.dataset.table`
WHERE quantity > 1000
I get the email notification because I have scheduled this query in BigQuery, but I want BigQuery to send that notification to a group of users.
How can I achieve this?
You could achieve this, and a lot more, with the Cloud Workflows serverless product and an external email provider such as Sendgrid, Mailchimp, or Mailgun that offers a REST API.
You basically set up a Workflow that handles the steps for you:
- run the BigQuery query
- on error, trigger an email step
- you could even combine them: if the results returned are of a certain kind, execute another step
The main workflow would be like this:
#workflow entrypoint
main:
  steps:
    - getList:
        try:
          call: BQ_Query
          args:
            query: SELECT ERROR('demo') from (select 1) where 1>0
          result: result
        except:
          as: e
          steps:
            - sendEmail:
                call: sendGridSend
                args:
                  secret: sendgrid_email_dev_apikey
                  from: from@domain.com
                  to:
                    - email: email1@domain.com
                    - email: email2@domain.com
                  subject: "This is a test"
                  content: ${"Error message from BigQuery" + e.body.error.message}
                  contentType: "text/plain"
                result: callResult
    - final:
        return: ${callResult}
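The control flow this workflow encodes — run the query, and on failure branch to an email step — can be sketched in plain Python. Here run_query and send_email are hypothetical stand-ins, not real APIs:

```python
def run_query(sql):
    # Simulates BigQuery's ERROR() function aborting the job.
    raise RuntimeError("demo")

def send_email(recipients, body):
    # Stand-in for the Sendgrid call; just reports what would be sent.
    return {"to": recipients, "body": body}

try:
    run_query("SELECT ERROR('demo') FROM (SELECT 1) WHERE 1 > 0")
    call_result = {"to": [], "body": ""}
except RuntimeError as e:
    call_result = send_email(
        ["email1@domain.com", "email2@domain.com"],
        "Error message from BigQuery: " + str(e),
    )

print(call_result["body"])  # → Error message from BigQuery: demo
```

The try/except block above corresponds directly to the try/except keys in the workflow's getList step.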
sendgrid_email_dev_apikey is the secret label; I've used Secret Manager to store Sendgrid's API key. If you want to use MailChimp, there are examples in this GitHub repo.
The workflow invoker could be a Cloud Scheduler entry. So instead of launching the scheduled queries from the BigQuery interface, you set them up in a scheduled Workflow. You must give the invoker service account permission to read Secrets and to run BigQuery jobs.
The rest of the Workflow is here:
BQ_Query:
  params: [query]
  steps:
    - runBQquery:
        call: googleapis.bigquery.v2.jobs.query
        args:
          projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          body:
            useLegacySql: false
            query: ${query}
        result: queryResult
    - documentFound:
        return: ${queryResult}
sendGridSend:
  params: [secret, from, to, subject, content, contentType]
  steps:
    - getSecret:
        call: http.get
        args:
          url: ${"https://secretmanager.googleapis.com/v1/projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER") + "/secrets/" + secret + "/versions/latest:access"}
          auth:
            type: OAuth2
        result: sendGridKey
    - decodeSecrets:
        assign:
          - decodedKey: ${text.decode(base64.decode(sendGridKey.body.payload.data))}
    - sendMessage:
        call: http.post
        args:
          url: https://api.sendgrid.com/v3/mail/send
          headers:
            Content-Type: "application/json"
            Authorization: ${"Bearer " + decodedKey}
          body:
            personalizations:
              - to: ${to}
            from:
              email: ${from}
            subject: ${subject}
            content:
              - type: ${contentType}
                value: ${content}
        result: sendGridResult
    - returnValue:
        return: ${sendGridResult}
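The decodeSecrets step reverses Secret Manager's base64 encoding of the secret payload (the payload.data field of the access response). The same transformation in Python, with a fake key standing in for the real secret:

```python
import base64

def decode_secret_payload(data):
    """Decode the base64 payload.data field of a Secret Manager access response."""
    return base64.b64decode(data).decode("utf-8")

# Simulate what the API returns: the secret value, base64-encoded.
encoded = base64.b64encode(b"SG.fake-sendgrid-key").decode("ascii")
print(decode_secret_payload(encoded))  # → SG.fake-sendgrid-key
```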
Since you receive a mail notification, I guess you are using the BigQuery Data Transfer service.
According to this paragraph, only the person who set up the transfer will receive the mail notification. However, if you're using Gmail you can automatically forward these messages to a list of users.
This link should guide you through it.

Is there any way to encrypt the Http basic authentication password in rails?

I have written the following in my controller:
http_basic_authenticate_with name: "foo", password: "bar", except: [:new, :show, :edit, :create]
but when I push it to my repo, the password is there for everyone to see. Is there any way to encrypt the password?
You probably want to use environment variables for this :)
There's a gem for it (like for basically everything): https://github.com/bkeepers/dotenv
In your .env file you'd have the following:
AUTHENTICATION_USERNAME="foo"
AUTHENTICATION_PASSWORD="bar"
Whereas in your controller you write it like so:
http_basic_authenticate_with name: ENV['AUTHENTICATION_USERNAME'], password: ENV['AUTHENTICATION_PASSWORD'], except: [:new, :show, :edit, :create]
This way your code is completely separated from the actual information.
Make sure not to add the .env file to your git repository by adding this to your .gitignore:
.env
What this does is load the variables you set up in .env into your existing environment variables. That way, somebody would actually need to log into your server and get access to that particular file in order to obtain the username/password, which is more secure than having them in plain text inside your controller ;)
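The same pattern works in any language: credentials come from the process environment rather than from source code. A minimal Python illustration (the variable names mirror the .env file above; setdefault only simulates what dotenv would load):

```python
import os

# In a real deployment these would be set by dotenv or the hosting
# environment, never hard-coded like this simulation.
os.environ.setdefault("AUTHENTICATION_USERNAME", "foo")
os.environ.setdefault("AUTHENTICATION_PASSWORD", "bar")

username = os.environ["AUTHENTICATION_USERNAME"]
password = os.environ["AUTHENTICATION_PASSWORD"]
print(username)  # → foo
```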

Loopback angular sdk unable to login

I'm trying to use loopback angular SDK to login but instead I'm getting back a 401 unauthorized response.
User.login({ rememberMe: true }, { email: $scope.user.email, password: $scope.user.password })
  .$promise
  .then(function() {
    var next = $location.nextAfterLogin || '/';
    $location.nextAfterLogin = null;
    $location.path(next);
  })
  .catch(function(x) {
    $scope.authError = 'Wrong Credentials';
  });
Also, I can see the user with the given credentials in the datasource that I've defined for the User model.
So basically, I have a boot script that creates a default user
User.create([{
  username: 'admin',
  email: 'xxxx@gmail.com',
  password: 'admin'
}], on_usersCreated);
And I also have created a custom User model that extends the built-in User model.
Why am I getting a 401 unauthorized response when trying to login? I have even tried to login via the explorer, but with no success.
Update:
I debugged the built-in User model and the login method; it seems that it cannot find the user with the given email, although I can see it in the MongoDB database.
Inspect the ACL records of the User model by starting your app in debug mode and viewing the console: DEBUG=loopback:security:acl slc run.
I don't see anything wrong with your code. Are you sure the values you're passing into the function are correct? Try logging $scope.user.email and $scope.user.password before calling User.login.
See my new example here: https://github.com/strongloop/loopback-getting-started-intermediate/blob/master/client/js/services/auth.js#L6-L15
The tutorial part is not complete, you can look around the source code to get an idea of what you're doing different.

Regular expressions for long phrases in perl

I'm looking to extract the "Account Name" and "Source Network Address" from the following text using regular expressions in a Perl script. Writing a regular expression for such a long phrase seems to take a lot of effort.
I need your help finding the best regex for this; any ideas would help. Keep in mind that these are just 3 examples out of possibly 50 similar phrases (of different lengths).
Example phrase 1:
WinEvtLog: Security: AUDIT_SUCCESS(4624): Microsoft-Windows-Security-Auditing: admin: DOMAIN: hostname.domain.com: An account was successfully logged on. Subject: Security ID: S-1-0-0 Account Name: - Account Domain: - Logon ID: 0x0 Logon Type: 3 New Logon: Security ID: S-1-5-21-1130994204-1932287720-1813960501-1239 Account Name: admin Account Domain: DOMAIN Logon ID: 0x1d12cfff5 Logon GUID: {AF5E2CF5-1A54-2121-D281-13381F397F41} Process Information: Process ID: 0x0 Process Name: - Network Information: Workstation Name: Source Network Address: 101.101.101.101 Source Port: 52616 Detailed Authentication Information: Logon Process: Kerberos Authentication Package: Kerberos Transited Services: - Package Name (NTLM only): - Key Length: 0 This event is generated when a logon session is created. It is generated on the computer that was accessed.
Example phrase 2:
WinEvtLog: Security: AUDIT_SUCCESS(4634): Microsoft-Windows-Security-Auditing: admin: DOMAIN: hostname.domain.com: An account was logged off. Subject: Security ID: S-1-5-21-1130554204-1932287720-1813960501-4444 Account Name: admin Account Domain: DOMAIN Logon ID: 0x1d12d000a Logon Type: 3 This event is generated when a logon session is destroyed. It may be positively correlated with a logon event using the Logon ID value. Logon IDs are only unique between reboots on the same computer." 4646,1
Example phrase 3:
WinEvtLog: Security: AUDIT_SUCCESS(540): Security: Administrator: HOST88: HOST88: Successful Network Logon: User Name: Administrator Domain: HOST88 Logon ID: (0x14,0x6E6FB948) Logon Type: 3 Logon Process: NtLmSsp Authentication Package: NTLM Workstation Name: DESKHOST88 Logon GUID: - Caller User Name: - Caller Domain: - Caller Logon ID: - Caller Process ID: - Transited Services: - Source Network Address: 10.10.10.10 Source Port: 43221
The following regex will handle your posted cases:
if ( $string =~ /(?<=Account Name:)\s+([^-\s]+).+(?:Source Network Address:)\s+([\d.]+)\s+/ ) {
    $account_name = $1;
    $source_addr  = $2;
}
How rigorous do you want to be with your solution?
If you have log lines and want to extract the word that follows "Account Name:" and the address that follows "Source Network Address:" then you can do it with a very naive regex like this:
my ($account_name) = /Account Name:\s+(\S+)/;
my ($source_network_addr) = /Source Network Address:\s+(\S+)/;
That doesn't attempt to validate that anything else in the line is as you expect, but if the application is only parsing lines generated by IIS or whatever, it may not need to be all that precise.
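For illustration, here is the same naive extraction translated to Python, run against a synthetic line in the same format as the samples above (not one of the exact log lines):

```python
import re

# Synthetic Windows event log fragment in the same "Field Name: value" format.
line = ("WinEvtLog: Security: AUDIT_SUCCESS(4634): ... "
        "Account Name: admin Account Domain: DOMAIN Logon ID: 0x1d12d000a "
        "Source Network Address: 10.10.10.10 Source Port: 43221")

# Grab the first non-whitespace token after each field label.
account = re.search(r"Account Name:\s+(\S+)", line)
addr = re.search(r"Source Network Address:\s+(\S+)", line)
print(account.group(1), addr.group(1))  # → admin 10.10.10.10
```

Note that on lines like Example phrase 1, where the first "Account Name:" field is "-", the naive pattern matches that placeholder first; the lookbehind/character-class version in the earlier answer skips it.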