How to set an environment variable in AWS MWAA

I'm trying to set the GOOGLE_APPLICATION_CREDENTIALS environment variable in MWAA to authenticate to Google Cloud, but I can't figure out how.
In GCP Composer I can just use the console to add an environment variable; is there nothing like that in AWS MWAA?
Thanks in advance

You could set a custom environment variable, as described in the docs.
Or you could store the Google Cloud credentials in Airflow's connection manager.
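A minimal sketch of the connection-manager route, assuming the apache-airflow-providers-google package is installed via the environment's requirements.txt and the service account key JSON is pasted into the connection's Keyfile JSON field (the bucket name here is hypothetical):
```python
# Sketch: authenticate to GCP via an Airflow connection instead of
# GOOGLE_APPLICATION_CREDENTIALS. The key JSON lives in the "Keyfile JSON"
# extra of the connection (here the provider's default id, google_cloud_default).
from airflow.providers.google.cloud.hooks.gcs import GCSHook

def list_bucket_objects():
    # The hook builds Google credentials from the connection's Keyfile JSON,
    # so no environment variable is needed on the MWAA workers.
    hook = GCSHook(gcp_conn_id="google_cloud_default")
    return hook.list(bucket_name="my-example-bucket")  # hypothetical bucket
```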

Related

How to auto-authenticate a GCP VM's service account with Ansible?

We have created a VM that has a service account assigned to it and Ansible installed on it.
I want Ansible to run inside that VM and, by default, use the identity (service account) of the VM.
Do I still have to set GCP's default environment variables for authentication?
If yes, I do not know what to point GCP_SERVICE_ACCOUNT_FILE at.
The VM does not contain an account file after its creation, as far as I know, so I'm not sure how I can automate this task.
Any help appreciated, thanks
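For context: on a GCE VM, the attached service account is exposed through the metadata server, so Application Default Credentials resolve without any key file on disk. A quick Python check of that, assuming the google-auth package is installed:
```python
# Sketch: confirm Application Default Credentials resolve on a GCE VM.
# With an attached service account, google.auth.default() falls back to the
# metadata server, so no GCP_SERVICE_ACCOUNT_FILE / key file is required.
import google.auth

credentials, project_id = google.auth.default()
print(project_id)  # the VM's project, resolved via the metadata server
```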

How to store GOOGLE_APPLICATION_CREDENTIALS in an AWS ECS environment on Fargate?

We have an API app that uses Firebase Admin to send messages to devices.
Earlier, we used to specify the service account key using an environment variable like GOOGLE_APPLICATION_CREDENTIALS="path_to_file.json".
But now, since we are shifting to AWS Elastic Container Service on Fargate, I am unable to figure out how to get this file into the container on AWS ECS.
Any advice highly appreciated.
Thanks
Solved it by storing the service account key as a JSON-stringified environment variable and using admin.credential.cert() instead of the default application credentials.
Refer: https://firebase.google.com/docs/reference/admin/node/admin.credential#cert
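A minimal sketch of that approach, shown here with the Python Admin SDK rather than the Node SDK's admin.credential.cert() used above (the environment variable name is an assumption):
```python
# Sketch: build Firebase Admin credentials from a JSON-stringified env var
# instead of a key file on disk. SERVICE_ACCOUNT_JSON is a hypothetical
# variable name, set in the ECS task definition.
import json
import os

import firebase_admin
from firebase_admin import credentials

cred = credentials.Certificate(json.loads(os.environ["SERVICE_ACCOUNT_JSON"]))
firebase_admin.initialize_app(cred)
```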
I would suggest AWS Secrets Manager instead, which is purpose-built for storing secrets. Take a look at this blog post:
https://aws.amazon.com/blogs/compute/securing-credentials-using-aws-secrets-manager-with-aws-fargate/
Even better than using environment variables, which have their own downsides, you can leverage AWS Parameter Store, a secure way to manage secrets in the AWS environment (secrets are encrypted both in transit and at rest).
You'd need to create an IAM role for Amazon ECS so your code has access to the Parameter Store.
You may want to check this article: https://aws.amazon.com/blogs/compute/managing-secrets-for-amazon-ecs-applications-using-parameter-store-and-iam-roles-for-tasks/
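A minimal boto3 sketch of reading such a parameter (the parameter name is hypothetical; the ECS task role needs ssm:GetParameter, plus decrypt rights on the KMS key encrypting it):
```python
# Sketch: read a SecureString parameter from AWS Systems Manager Parameter
# Store. "/myapp/google-credentials" is a hypothetical parameter name.
import boto3

ssm = boto3.client("ssm")
response = ssm.get_parameter(Name="/myapp/google-credentials", WithDecryption=True)
credentials_json = response["Parameter"]["Value"]
```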
Use the from_service_account_info method, as described here. You then pass the content of the credentials JSON file as a dictionary.
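A minimal sketch of that with google-auth (the environment variable name is an assumption):
```python
# Sketch: construct Google credentials from an in-memory dict (e.g. parsed
# from a secret) rather than from a key file on disk.
import json
import os

from google.oauth2 import service_account

# GOOGLE_CREDENTIALS_JSON is a hypothetical env var holding the key file's content.
info = json.loads(os.environ["GOOGLE_CREDENTIALS_JSON"])
credentials = service_account.Credentials.from_service_account_info(info)
```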

Can apache startup run a script to populate environment?

We're moving a website to the AWS environment and running apache2 on an EC2. We were planning on using AWS Secrets Manager to store some of the credentials, such as the RDS (database) and email credentials. I also use environment variables in Apache to store the AWS credentials, but since the AWS credentials are themselves what is used to retrieve the secrets, I was wondering whether there is any way to run a script on Apache start-up that uses the aws-sdk to retrieve those secrets.
Or are there any other suggestions on how to do it? I can do it after the fact in the PHP code that needs such access, but I'm just exploring what is possible at the moment. It would be nice if some configuration options were loaded when apache starts.
Your two options seem to be either to write a startup script that populates the env variables and then starts Apache as a child process so it can read those variables, or to modify the web application to read the secrets directly.
Reading the secrets directly in the application (possibly from some short lived cache) would allow you to rotate your DB creds without having to restart the application.
Also, if you are running on EC2, you do not need to populate env variables with the AWS creds. Just use IAM roles for EC2; the AWS CLI and SDKs already know how to retrieve those credentials directly.
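A sketch of the first option as a Python wrapper (the secret name and its JSON keys are assumptions; on EC2 with an instance role, boto3 picks up credentials automatically):
```python
# Sketch: fetch DB credentials from Secrets Manager, export them, then replace
# this process with Apache so the env vars are inherited by it.
# "prod/db-creds" and the JSON keys below are hypothetical.
import json
import os

import boto3

secret = boto3.client("secretsmanager").get_secret_value(SecretId="prod/db-creds")
creds = json.loads(secret["SecretString"])

os.environ["DB_USER"] = creds["username"]
os.environ["DB_PASSWORD"] = creds["password"]

# Exec Apache in the foreground; it inherits the environment set above.
os.execvp("apachectl", ["apachectl", "-D", "FOREGROUND"])
```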

How to set credentials to use GCP APIs from a Dataproc instance

I am trying to access some credentials stored in Google Secret Manager. To access them, credentials must be set up on the cluster machine where the jar is running.
I have SSHed into the master instance and seen that nothing is configured for GOOGLE_APPLICATION_CREDENTIALS.
I am curious how to assign GOOGLE_APPLICATION_CREDENTIALS, or any alternative that allows using GCP APIs that require credentials.
If you are running on Dataproc clusters, the default GCE service account should already be configured for you. If your clusters are running outside the GCP environment, follow this instruction to manually set up a service account that has an editor/owner role for Google Secret Manager, then download the credential key file and point GOOGLE_APPLICATION_CREDENTIALS at it.
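On a Dataproc node, the VM's service account supplies Application Default Credentials, so the Secret Manager client needs no key file, given the right IAM role. A minimal sketch with the google-cloud-secret-manager package (project and secret ids are hypothetical):
```python
# Sketch: read a secret using Application Default Credentials. On Dataproc,
# these come from the cluster VM's service account, so no
# GOOGLE_APPLICATION_CREDENTIALS variable is needed.
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
name = "projects/my-project/secrets/my-secret/versions/latest"
response = client.access_secret_version(name=name)
payload = response.payload.data.decode("utf-8")
```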

Can we update credentials in real time for Pivotal Cloud Foundry?

Can we update the PCF credential for Spring Cloud DataFlow in real time?
The credential is defined in the yaml file:
SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[default]_CONNECTION_PASSWORD: xxx
It would be very helpful to be able to rotate the password without restarting Spring Cloud DataFlow.
Thanks,
Many of our PCF customers rely on config-server, Vault, or CredHub to automatically resolve the value for user/pass or other sensitive credentials.
For instance, if you're using CredHub (service-bound to SCDF), you would have the value for this property something like:
SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[default]_CONNECTION_PASSWORD: ${vcap.services.YOUR_scdf-server_credhub-sb.credentials.YOUR_scdf_cloudfoundry_password}
With this type of setting, you can perform a rolling update of the sensitive credentials. Also, when using config-server or Vault, you can remotely update the credentials in the backing Git repo, and the latest values are taken into account at runtime.