How to save Cloud Foundry username and password?

I can log in to Cloud Foundry/Bluemix using:
cf login -u my_cf_username -p my_cf_password
I can set up environment variables to store these values (e.g. by setting them in .bashrc or .profile):
export cf_user=my_cf_username
export cf_pass=my_cf_password
cf login -u $cf_user -p $cf_pass
However, ideally I would like the Cloud Foundry username and password to be retrieved from a properties file, e.g. $HOME/.cf - is this possible?

This isn't possible with the current CF CLI client.
It should be possible to write a custom CF CLI Plugin to handle this behaviour.

If you simply want to avoid the login step, wouldn't a bash script suffice (as suggested in Reading username and password from file and others)? A sketch follows.
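A minimal sketch, assuming $HOME/.cf is a properties file containing cf_user=... and cf_pass=... lines (the file format and variable names are illustrative):
#!/usr/bin/env bash
# Log in to Cloud Foundry with credentials kept in $HOME/.cf,
# a key=value file defining cf_user and cf_pass (illustrative format).
chmod 600 "$HOME/.cf"    # keep the file readable only by you
source "$HOME/.cf"
cf login -u "$cf_user" -p "$cf_pass"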
To the larger question of how to secure credentials beyond logging into the CLI, I recommend Pat Mueller's blog entry How your cloud application should access credentials and other private data: use a Cloud Foundry user-provided service and bind key/value pairs to it, with a locally defined VCAP for development. He recommends using the cfenv package as shown below:
var cfenv = require("cfenv")
var localVCAP = require("./local-vcap.json") // local stand-in for VCAP_SERVICES
var appEnv = cfenv.getAppEnv({vcap: localVCAP})
var creds = appEnv.getServiceCreds(/session-secret/) || {}
console.log("session secret is:", creds.secret)
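For local development, local-vcap.json mirrors the VCAP structure cfenv expects (an illustrative sketch; the service name and value are placeholders):
{
  "services": {
    "user-provided": [
      { "name": "session-secret", "credentials": { "secret": "dev-only-secret" } }
    ]
  }
}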
You then create a user-provided service with the cf cups command (cups = create-user-provided-service) and bind it to any app that needs the credentials. He notes several advantages of this approach; I especially like that the values are only visible to users who have the appropriate access to the service. For more details, see the source article cited above.
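For example (service name and values are illustrative):
# Create a user-provided service holding the secret
cf cups session-secret -p '{"secret":"my-session-secret"}'
# Bind it to the app that needs the credentials
cf bind-service my-app session-secret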

Creating credentials file on server via Dockerfile & Google cloud secret manager

I'm using Google cloud build for CI/CD for my django app, and one requirement I have is to set my GOOGLE_APPLICATION_CREDENTIALS so I can perform authenticated actions in my Docker build. For example, I need to run RUN python manage.py collectstatic --noinput which requires access to my Google cloud storage buckets.
I've generated the credentials, and it works well when I simply include the .json file in my (currently private) repo: it gets pulled into my Docker container by the COPY . . command, and I set the env variable with ENV GOOGLE_APPLICATION_CREDENTIALS=credentials.json. Ultimately, I want to grab the credential value from Secret Manager and create the credentials file during the build stage, so I can remove the credentials from the repo entirely. I tried doing this by editing cloudbuild.yaml (referencing this doc) with various combinations of the availableSecrets config, the $$SECRET syntax, and build-args in the docker build command, and trying to access the value in the Dockerfile with
ARG GOOGLE_BUILD_CREDS
RUN echo "$GOOGLE_BUILD_CREDS" >> credentials.json
ENV GOOGLE_APPLICATION_CREDENTIALS=credentials.json
with no success.
If someone could advise me how to implement this in my cloudbuild.yaml and Dockerfile if it's possible, or suggest another, better solution altogether, it would be much appreciated.
This is the relevant part of my cloudbuild.yaml
steps:
  - name: gcr.io/cloud-builders/docker
    args:
      - build
      - '--no-cache'
      - '-t'
      - '$_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA'
      - .
      - '-f'
      - Dockerfile
    id: Build
availableSecrets:
  secretManager:
    - versionName: projects/PROJECT_ID/secrets/CREDENTIALS/versions/latest
      env: 'CREDENTIALS'
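For reference, availableSecrets only exposes a secret to steps that declare it in secretEnv, and the $$VAR expansion only happens inside a shell entrypoint; a sketch of that wiring, reusing the names from the question (a multi-line JSON key may need base64-encoding in practice, and the answers below argue for dropping the key file entirely):
steps:
  - name: gcr.io/cloud-builders/docker
    entrypoint: bash
    args:
      - -c
      - >-
        docker build --no-cache
        --build-arg GOOGLE_BUILD_CREDS="$$CREDENTIALS"
        -t $_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA
        -f Dockerfile .
    secretEnv: ['CREDENTIALS']
    id: Build
availableSecrets:
  secretManager:
    - versionName: projects/PROJECT_ID/secrets/CREDENTIALS/versions/latest
      env: 'CREDENTIALS'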
If your container will run on Cloud Run, it's super easy: remove the service account key file (roughly speaking, in most use cases you never need it).
Keep in mind that a service account key file is a secret containing a private key. If you put it in your container, you are storing it in plain text. Very bad for a secret! (With dive, you can explore a container's contents and steal the secret if you have direct access to the container.)
But I'm sure you know that, because you want to store the secret in a secret manager. Now a question: how do you access Secret Manager? Do you need a service account key file to authenticate to it?
In fact, no.
The solution is to use ADC (Application Default Credentials). With the client libraries, use the default-credentials lookup to let the library automatically determine the platform and the credentials to use.
On Cloud Run (as any other Google Cloud services), you have a metadata server that allows client libraries to get credentials information from the runtime service account.
On your local environment, you have 2 options:
Use your own credentials. For that, run the command gcloud auth application-default login. These are your own credentials and permissions, not exactly the same as the Cloud Run runtime environment's.
Impersonate the Cloud Run runtime service account and act as it to run your container/code locally. For that, run the command gcloud auth application-default login --impersonate-service-account=<service account email>. Be sure to have the Service Account Token Creator role on the service account.
Then run your app locally and let ADC supply the credentials, as in the sketch below.
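A minimal Python sketch of that pattern (the secret name CREDENTIALS is a placeholder; requires google-auth and google-cloud-secret-manager):
import google.auth
from google.cloud import secretmanager

# ADC: uses the metadata server on Cloud Run, and your
# `gcloud auth application-default login` credentials locally.
credentials, project = google.auth.default()

client = secretmanager.SecretManagerServiceClient(credentials=credentials)
name = f"projects/{project}/secrets/CREDENTIALS/versions/latest"
payload = client.access_secret_version(name=name).payload.data.decode("utf-8")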
I think I've worked out a fix. To solve the error I mentioned in my reply to #guillaume-blaquiere, I updated my build args in cloudbuild.yaml to include --network=cloudbuild, allowing me access to the correct service account credentials (credit to this answer).
The next issue I faced is with the django-storages library, returning this exception
AttributeError: you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
I then came across this suggestion to add the setting GS_QUERYSTRING_AUTH = False to my Django config, and this seems to do the trick. My only concern is that the documentation here does not go into much detail on the impacts or risks of disabling this (the bucket is public-read, as it recommends). It seems to be working as intended, however, so I will go with this configuration unless a better solution is put forward.
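For reference, the relevant django-storages setting in the Django settings module (with a public-read bucket, objects are served via plain unsigned URLs):
# django-storages (Google Cloud Storage backend): don't generate signed URLs,
# so no private key is required; objects must be publicly readable.
GS_QUERYSTRING_AUTH = False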

gsutil login using installed application json file

I want to use gsutil to list files in a Google Cloud bucket.
I installed the Google Cloud SDK. My problem is that I have a credentials file client_secrets.json that is for an installed application; it looks something like this:
{
  "installed": {
    "client_id": "837647042410-75ifg...usercontent.com",
    "client_secret": "asdlkfjaskd",
    "redirect_uris": ["http://localhost", "urn:ietf:wg:oauth:2.0:oob"],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token"
  }
}
I know you can authenticate using
gcloud auth activate-service-account --key-file=credentials.json
but that only works for service accounts, and mine is an installed-application credential, so it didn't work.
Instead of using gcloud auth I've tried to switch to do authentication using boto by running
gcloud config set pass_credentials_to_gsutil false
and updating the .boto config file to set the client_id and client_secret manually from my file, but when I run gsutil config to use the new credentials, it keeps asking me to log in through the browser and provide a code after a manual login:
Please navigate your browser to the following URL:
https://accounts.google.com/o/oauth2/auth?client_id=20487....
Enter the authorization code:
but this is an app that should not need a user to log in, since I already have a credentials file.
What am I missing? How can I authenticate with gsutil using an installed-app client_secrets.json?
You don't have a credential. Your file is an OAuth2 client ID (of type Desktop). The tools use it to initiate an OAuth2 authentication, and yes, you need to log in, because there is no credential in this file.
The client ID and client secret are used to authenticate the login request made with your login and your password; they are not credentials for you, just for the OAuth2 protocol!
This documentation provides a complete workflow for using installed-application credentials: you configure your application and save your credentials in a client_secrets.json file, and after the whole configuration and authentication process you will be able to use the gcloud/gsutil commands. You can also refer to this public Google documentation on managing secret keys to create and authenticate a service account key.
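If a service account fits your use case, that key workflow looks roughly like this (service account, project, and bucket names are placeholders):
# Create a key for an existing service account
gcloud iam service-accounts keys create key.json \
  --iam-account=my-sa@my-project.iam.gserviceaccount.com
# Activate it for gcloud/gsutil and list the bucket
gcloud auth activate-service-account --key-file=key.json
gsutil ls gs://my-bucket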
You can also refer to this Stack Overflow question, where a Googler explains how to pass a key.json file to gsutil via the .boto configuration.

PermissionDenied: 403 IAM permission 'dialogflow.intents.list'

I'm trying to get the list of the intents in my Dialogflow agent using Dialogflow's V2 APIs but have been getting the following error:
PermissionDenied: 403 IAM permission 'dialogflow.intents.list' on 'projects/xxxx/agent' denied.
I adopted the following steps:
I created a new agent (with V2 APIs enabled) and a new service account for it.
I downloaded the JSON key and set my GOOGLE_APPLICATION_CREDENTIALS variable to its path.
Following is my code:
import os
import dialogflow_v2 as dialogflow

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/user/folder/service-account-key.json"
client = dialogflow.IntentsClient()
parent = client.project_agent_path('[PROJECT_ID]')
for element in client.list_intents(parent):
    pass
I have made various agents and service accounts and even changed the role from Admin to Client, but can't figure out any solution. I tried the following solution but it didn't work:
Tried Solution: DialogFlow PermissionDenied: 403 IAM permission 'dialogflow.sessions.detectIntent'
There is no need to create a new agent. You can edit the existing agent's IAM.
In Dialogflow's console, go to Settings ⚙ > under the General tab, you'll see the project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
In Google Cloud, go to IAM Admin > IAM, under the Members tab. Find the name of your agent's service account and click edit.
Give it admin permissions so it has permission to list intents (a command-line equivalent is sketched below).
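A command-line equivalent, assuming the service account email from your key file (project, email, and role are placeholders; roles/dialogflow.admin is one role that permits listing intents):
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/dialogflow.admin"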
The problem lies in the IAM section of GCP. You are probably making a POST request with a role that does not have the necessary authorizations.
Look in your key.json file for the field "client_email", then proceed to the IAM page and grant that email a role with the needed capabilities (e.g. Admin).
This solved my problem.
In Dialogflow's console, go to settings ⚙ > under the general tab, you'll see the project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
(Optional) In the Cloud console, go to the menu icon > APIs & Services > Library. Select any APIs (if any) > Enable.
In the Cloud Console, under the menu icon ☰ > APIs & Services > Credentials > Create Credentials > Service Account Key.
Under Create service account key, select New Service Account from the dropdown, enter a name, and for the role choose Owner > Create.
A JSON private key file will be downloaded to your local machine; you will need it.
For Javascript:
In the index.js file you can do service account auth with JWT:
const { google } = require('googleapis'); // googleapis provides google.auth.JWT
const serviceAccount = {}; // Starts with {"type": "service_account",...

// Set up Google Calendar service account credentials
const serviceAccountAuth = new google.auth.JWT({
  email: serviceAccount.client_email,
  key: serviceAccount.private_key,
  scopes: 'https://www.googleapis.com/auth/xxxxxxx'
});
For Python:
There's a Google Auth Python Library available via pip install google-auth and you can check out more here.
When you create the IntentsClient, pass the credentials explicitly, for example:
from google.oauth2 import service_account

key_file_path = "/home/user/folder/service-account-key.json"
# Build credentials from the key file instead of relying on the
# GOOGLE_APPLICATION_CREDENTIALS environment variable.
credentials = service_account.Credentials.from_service_account_file(key_file_path)
client = dialogflow.IntentsClient(credentials=credentials)
This error message is usually thrown when the application is not being authenticated correctly, due to reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes. Keep in mind that when you set an environment variable in a session, it is reset every time the session is dropped.
Based on this, I recommend validating that the credential file and file path are correctly assigned, and following the Obtaining and providing service account credentials manually guide to specify your service account file explicitly in your code; this way you can set it permanently and verify that you are passing the service credentials correctly.
Passing the path to the service account key in code example:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
Also try creating the project in the Dialogflow console:
https://dialogflow.cloud.google.com/
You need to set the following as environment variables:
googleProjectID: ""
dialogFlowSessionID: "anything"
dialogFlowSessionLanguageCode: "en-US"
googleClientEmail: ""
googlePrivateKey:
I think you might have missed the Enable the API section in the documentation setup.
Here is that link:
https://cloud.google.com/dialogflow/cx/docs/quick/setup#api
After clicking the link, select the chatbot project you created and follow the instructions given there.
The permissions I have given for that project are Owner and Editor.
After this, try the code in this link:
https://cloud.google.com/dialogflow/es/docs/quick/api#detect_intent
You should get a response from your chatbot.
Hope this helps!

Cloud Foundry: Credentials were rejected

I'm having trouble with cf login. Whenever I try to log in, I get the following error:
Credentials were rejected, please try again.
Now, I'm aware that this is a known issue, but even with alternatives I always get the same message.
What I've tried so far:
Git CMD; cmd.exe; Powershell;
cf login -u mail#adress.com; (and different variations)
cf auth "Mail#Adress.com" "MyPassword";
What else is there I can try or what am I missing?
When you run cf login or cf auth and specify a username/password, you are attempting to authenticate with UAA using standard password-based authentication. If your company uses single sign-on (i.e. SSO), this will not work.
When your company uses SSO, you need to run cf login --sso (that's it, no other flags). In this case, the cf CLI will give you a link to click. You'll go to this link in your browser and it will display a one-time passcode. The cf CLI will then prompt you for the one-time passcode; type it in, and the CLI will use it to log you in through your company's SSO system.
Again, if your company has set up SSO, you must use the cf login --sso option to log in. As far as I know, there's no way to tell from the cf CLI whether your company requires SSO, but you can tell by going to login.system_domain in your browser (where system_domain is the DNS entry set up for CF). On that page you'll see the login form, and when SSO is set up, the bottom of the form will have a link that says "click here to login with SSO" or something like that. That link only shows up when SSO is enabled.
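A sketch of what that flow looks like in practice (the API endpoint and passcode URL are placeholders, and the exact prompt text varies by CLI version):
$ cf login -a https://api.example.com --sso
API endpoint: https://api.example.com
Temporary Authentication Code ( Get one at https://login.example.com/passcode ):
Authenticating...
OK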

What is the default password for Jupyter created on Google's Dataproc

I set up Dataproc using the steps in the link here:
https://cloud.google.com/dataproc/docs/tutorials/jupyter-notebook
But my Jupyter keeps asking for a password.
I didn't set any password.
I tried my Google account password; that doesn't work.
I ran ../root$ sudo grep -ir password
and got the following, which confirmed that no password is set:
.jupyter/jupyter_notebook_config.py:## Hashed password to use for web authentication.
.jupyter/jupyter_notebook_config.py:# The string should be of the form type:salt:hashed-password.
.jupyter/jupyter_notebook_config.py:#c.NotebookApp.password = u''
.jupyter/jupyter_notebook_config.py:# Only used when no password is enabled.
.local/share/jupyter/runtime/nbserver-3668.json: "password": false,
Since the initialization action just installs the latest version with conda install jupyter, this appears to have been caused by a recent upstream change: upgrading the notebook component from 4.2.3 to 4.3.0 turned token-based auth on by default. A cluster I deployed a couple of weeks ago using the out-of-the-box init action didn't have the login prompt you're seeing. The design of the init action is to let Google Compute Engine firewalls be your layer of defense and the SSH tunnel your secure connection, rather than relying on the various third-party auth implementations of the different Hadoop/Spark tools and web UIs.
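That tunnel is typically a SOCKS proxy through the master node; a sketch, with the cluster name and zone as placeholders:
# Open a SOCKS proxy to the cluster's master node
gcloud compute ssh my-cluster-m --zone=us-central1-a -- -D 1080 -N
# Then browse via the proxy, e.g.:
# google-chrome --proxy-server="socks5://localhost:1080" http://my-cluster-m:8123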
The solution will be to add a line to setup-jupyter-kernel.sh:
echo "c.NotebookApp.token = u''" >> ~/.jupyter/jupyter_notebook_config.py
to disable Jupyter-side authentication altogether and revert to the behavior of a couple of weeks ago. Note that if you want to do this yourself, you'll have to fiddle with the INIT_ACTIONS_REPO and INIT_ACTIONS_BRANCH settings in jupyter.sh, which may take some getting used to if you haven't been customizing it already. We'll try to push a fix as soon as possible, and once that's done you should be able to use the out-of-the-box init action without hitting the login screen again.
If you already have a cluster running, you can disable the auth for your jupyter server by running that manually as root after SSH'ing into the master:
sudo su
killall -9 jupyter-notebook
echo "c.NotebookApp.token = u''" >> ~/.jupyter/jupyter_notebook_config.py
/dataproc-initialization-actions/jupyter/internal/launch-jupyter-kernel.sh
Alternatively, if you do want to keep the new default token-authorization approach, the Jupyter server logs a generated token to /var/log/jupyter_notebook.log; look for a line stating The Jupyter Notebook is running at: http://[all ip addresses on your system]:8123/?token=[some-token-string-here]. That token string can be entered in the password field or passed as the URL parameter, as shown.
EDIT: The fix has now been committed into Dataproc's init action repository and synced to gs://dataproc-initialization-actions. Deployments out-of-the-box once again work without an extra login page in the Jupyter UI.
A new metadata option has also been added if you do want to specify a token, which Jupyter also accepts in the password field, with key JUPYTER_AUTH_TOKEN. Use it as follows only if you want a login page requesting your specified token (no metadata keys are necessary if you just want the old behavior of no login page):
gcloud dataproc clusters create ${CLUSTER_NAME} \
  --initialization-actions gs://dataproc-initialization-actions/jupyter/jupyter.sh \
  --metadata JUPYTER_AUTH_TOKEN=foobarbaz
Then your login password will be foobarbaz.
When you don't set any password, you can log in with the credentials of the server where it is installed.