Is it possible to access key-value pair from secret manager file? - google-cloud-platform

I have a Secret Manager secret in GCP which stores different values such as db_username, db_password, gcp_project_id, etc. The secret's payload would look something like this:
db_username=admin
db_password=admin
gcp_project_id=my-project-2345
Is it possible to create a single secret named dev-secret and then access an individual key-value pair like dev-secret.db_username? I want to have one secret file holding multiple config values and then access it through Cloud Run's secrets support, exposed as an environment variable.
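For context, Secret Manager treats a secret version's payload as a single opaque blob, so there is no built-in way to address one key inside it, and Cloud Run can only inject the whole payload. A minimal sketch of fetching and parsing such a payload yourself, assuming the google-cloud-secret-manager client library and the dev-secret name from the question:

from google.cloud import secretmanager

def load_secret_pairs(project_id: str, secret_id: str) -> dict:
    """Fetch the latest secret version and parse key=value lines."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/latest"
    payload = client.access_secret_version(request={"name": name}).payload.data.decode("utf-8")
    pairs = {}
    for line in payload.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            pairs[key.strip()] = value.strip()
    return pairs

# Hypothetical usage with the names from the question:
# config = load_secret_pairs("my-project-2345", "dev-secret")
# db_username = config["db_username"]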

Related

Set custom environment variables in AWS

I am using AWS SageMaker. I have some secret keys and access keys for calling APIs that I don't want to expose directly in code.
What are the options, such as environment variables, for hiding these keys so that I can use them securely, and how do I set them?
AWS Systems Manager (SSM) is designed to store keys and tokens securely.
Depending on how your notebook is defined, you could use the 'env' property directly or on your training job, or you could access SSM directly from SageMaker. For example, this Snowflake KB article explains how to fetch auth info from SSM: https://community.snowflake.com/s/article/Connecting-a-Jupyter-Notebook-Part-3
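To make the SSM route concrete, here is a minimal sketch of reading a SecureString parameter with boto3 from inside a notebook; the parameter name /myapp/api_key is a hypothetical placeholder, and the notebook's execution role needs ssm:GetParameter permission:

import boto3

# Read a SecureString parameter from SSM Parameter Store.
ssm = boto3.client("ssm")
response = ssm.get_parameter(
    Name="/myapp/api_key",  # hypothetical parameter name
    WithDecryption=True,    # decrypt SecureString values
)
api_key = response["Parameter"]["Value"]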

Is there a way to access AWS resources with an Access Key ID and Secret Key stored in Parameter Store?

I currently have a Python utility running on my laptop that connects to AWS resources using the Access Key ID and Secret Access Key of an IAM user I have provisioned. Currently I am hard-coding the Access Key and Secret Key in the utility to establish a session using boto3 and connect to the database.
import boto3

# Session built from hard-coded IAM user credentials.
session = boto3.session.Session(
    aws_access_key_id='ABCDEFGHIJKLMNOPQ',
    aws_secret_access_key='ZYXWVUTSRNMPOQ12345678',
)
client = session.client('rds', region_name='us-east-1')
Now I have saved the Access Key and Secret Key in AWS Parameter Store, and I am trying to see if there is a way to retrieve them from there so that I don't have to hard-code them.
There are different approaches. For example, you can use AWS STS to get temporary credentials, or use a Cognito identity pool to configure identities and use access tokens to reach the resource without long-lived access keys. Sharing a few links below.
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp_request.html
https://docs.aws.amazon.com/cognito/latest/developerguide/tutorials.html
https://www.youtube.com/watch?v=abTy-Yyo6lI
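As a minimal sketch of the STS route, assuming boto3 can find the IAM user's long-term credentials through its default chain (for example a local profile) instead of hard-coded keys; note that GetSessionToken must be called with long-term IAM user credentials:

import boto3

# Exchange the caller's long-lived credentials for temporary ones via STS.
sts = boto3.client("sts")
creds = sts.get_session_token(DurationSeconds=3600)["Credentials"]

# Build a session from the temporary credentials.
session = boto3.session.Session(
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
client = session.client("rds", region_name="us-east-1")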

Can we pass GCP (Google Cloud Platform) service account credentials as key-value pairs, without using the credential file?

In GCP, can we fetch service account keys from AWS Secrets Manager as key-value pairs instead of passing them as a credential file? I'm trying to use the credentials in Packer/Jenkins by fetching them from AWS Secrets Manager. Please let me know; it would be very helpful.
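As a hedged sketch: the Google auth library can build credentials from an in-memory dict via service_account.Credentials.from_service_account_info, so the key JSON can be fetched from AWS Secrets Manager without ever writing a credential file. The secret name gcp/sa-key below is a hypothetical placeholder:

import json

import boto3
from google.oauth2 import service_account

# Fetch the service account key JSON from AWS Secrets Manager.
sm = boto3.client("secretsmanager")
secret_json = sm.get_secret_value(SecretId="gcp/sa-key")["SecretString"]

# Build GCP credentials from the in-memory dict, with no file on disk.
info = json.loads(secret_json)
credentials = service_account.Credentials.from_service_account_info(info)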

How to get only one secret value from one secret key using the AWS plug-in "AWS Secrets Manager- Get Secret" in Azure DevOps/Pipeline/Release

In Azure DevOps/Pipeline/Release I am using the AWS plug-in "AWS Secrets Manager- Get Secret".
My AWS secret has a secret name, and under it I have two secret keys with values.
Using the AWS plug-in "AWS Secrets Manager- Get Secret", is there any way to pull out the value for just one of the keys?
Say, for instance, that I only want to get the secret value of the "db_pw" key using the plug-in and assign it to the Azure variable "DB_password".
I understand that I can just point the plug-in at the secret "TVS_Live", but it puts both of the keys in the secret into the ADO variable "DB_password", and then I have to parse it all out later using something like jq.
I just want to pull out the secret value of one secret key - "db_pw" - so that I get "" in the Azure DevOps variable "DB_Password" (see images below).
Is there a way to do this?
Here is my ADO AWS secret plug-in configuration (which currently pulls in everything):
Here is my AWS secret configuration:
According to the task's documentation, this is by design:
Use this task to retrieve the value of a secret stored in AWS Secrets Manager and store it locally in an Azure DevOps build variable. The build variable will be automatically set to 'secret' mode to automatically mask the value when logged or otherwise displayed.
To achieve what you need, you would have to split the keys into separate secrets so that each retrieval returns a single value. You can also submit a feature request for this task on GitHub.
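If splitting the secret is not an option, a workaround is a script step that fetches the secret and extracts the single key itself. A minimal Python sketch, assuming the secret "TVS_Live" is stored as JSON and reusing the db_pw/DB_Password names from the question; the ##vso logging command is the standard way for a script to set a masked pipeline variable in Azure DevOps:

import json

import boto3

# Fetch the whole secret, then keep only the db_pw key.
sm = boto3.client("secretsmanager")
secret = json.loads(sm.get_secret_value(SecretId="TVS_Live")["SecretString"])
db_pw = secret["db_pw"]

# Set an Azure DevOps pipeline variable, masked as a secret.
print(f"##vso[task.setvariable variable=DB_Password;issecret=true]{db_pw}")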

Connect to S3 via NiFi without credentials

Today I have a NiFi flow that saves data to S3, but when changing environments and machines I have to place a directory with the credentials on each EC2 instance (and sometimes I need to change the credentials). I would like to know if there is a way to connect to S3 automatically without having to change the credentials file on every machine change.
Thanks
I'm not sure I understand the question. Do you want to set up the process so that NiFi is agnostic to the credentials and just "saves data to S3", being told the credentials by the particular machine the flow is running on? Or do you want to embed the credentials in NiFi so that, no matter which machine the flow is running on, it uses the same credentials? Both are possible.
Credentials provided by machine
You can populate the AWS credentials (Access Key and Secret Key) in three ways:
1. Provide a file path for the Credentials File processor property, pointing to a file on disk which contains these credentials.
2. Populate the appropriate properties of the PutS3Object component using parameters.
3. Create an AWSCredentialsProviderControllerService instance with those values and reference it from the processor.
Whatever credentials are in the credentials file on disk, the parameter context, or the referenced controller service will be used. If the flow segment is deployed to a different NiFi instance (and the appropriate credentials file exists, the parameters are populated, or the controller service is populated, depending on the scenario), those new values will be used.
Credentials embedded in NiFi flow
Either populate the AWS credentials (Access Key and Secret Key) in the appropriate properties of the PutS3Object component, or create an AWSCredentialsProviderControllerService instance with those values and reference it from this processor. If you deploy this flow to another NiFi instance, it will continue to use these same credentials.