Is there a way to pull secrets into Postman from HashiCorp Vault?
I see the feature request has been open since 2018:
https://github.com/postmanlabs/postman-app-support/issues/5577
Preferably without third-party tools.
Related
Currently, we heavily rely on AWS SSM to store and read secrets. All of our services and CodePipelines use AWS SSM for fetching secrets. Secret rotation with AWS SSM requires the use of Lambda functions, which would get quite tiresome since we have a large number of secrets.
We researched HashiCorp Vault, and the two features we would like to use are secret rotation and dynamic secrets. Secret rotation with HashiCorp Vault seems painless compared to SSM secret rotation.
Is it possible to use both of them together? Basically, the secret rotation would be done by HashiCorp Vault and the new values would be written back to AWS SSM. All the services (such as ECS and Beanstalk) would need to be restarted to fetch the new secrets.
In short, does Vault provide some sort of integration for the above, or do I have to include the write-back step in the same secret rotation cron job script for Vault?
HashiCorp Vault provides dynamic secrets: a Vault Agent fetches secrets from Vault and either writes them to a file or sets them as environment variables, and the application reads from that file or from the environment. So to fully leverage HashiCorp Vault, you should not need SSM at all.
But if you want to use both HashiCorp Vault and SSM, then the Vault Agent can fetch secrets from Vault and a cron job can push the rotated values into SSM, as in the sketch below.
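A minimal sketch of that cron job, assuming the hvac and boto3 libraries, a KV v2 secret at an illustrative path secret/myapp/db, and an illustrative SSM parameter name /myapp/db-password:

```python
# Sketch: copy a rotated secret from Vault into AWS SSM Parameter Store.
# Assumes VAULT_ADDR / VAULT_TOKEN are set; the paths below are illustrative.
import os

import boto3
import hvac

vault = hvac.Client(url=os.environ["VAULT_ADDR"], token=os.environ["VAULT_TOKEN"])
ssm = boto3.client("ssm")

# Read the current value from Vault (KV v2 engine mounted at "secret/").
secret = vault.secrets.kv.v2.read_secret_version(path="myapp/db")
password = secret["data"]["data"]["password"]

# Write it back to SSM as a SecureString so existing consumers keep working.
ssm.put_parameter(
    Name="/myapp/db-password",
    Value=password,
    Type="SecureString",
    Overwrite=True,
)
```

Scheduled right after the rotation runs, this keeps SSM as the single read interface for ECS and Beanstalk while Vault owns the rotation.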
We have a workflow in AWS that pulls Twitter-related data through the API. We are doing it for over 100 Twitter accounts and we expect the number to increase exponentially. Each Twitter account has its own API key, so we want to use each key to pull data relating to that Twitter account.
Our workflow previously used AWS Secrets Manager, but we only stored one account before. Because we are planning to run it dynamically for separate accounts using their own API keys, we are thinking of using HashiCorp Vault to store them.
We are wondering if anyone can point us to the best way to enable the integration. Is this the best approach, or is there another, more scalable approach?
HashiCorp provides a Lambda extension as a way to integrate with Vault:
https://learn.hashicorp.com/tutorials/vault/aws-lambda
This AWS blog post also details other options in this space and their advantages/disadvantages:
https://aws.amazon.com/blogs/compute/choosing-the-right-solution-for-aws-lambda-external-parameters/
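For context, the extension authenticates with the function's IAM role and writes the fetched secret to a local JSON file before your handler runs. A hedged sketch of a handler reading that file (the default file path and the field name are assumptions; verify them against the tutorial above):

```python
# Sketch: a Lambda handler that reads the secret file written by the
# Vault Lambda extension. The path below is the extension's documented
# default at the time of writing; treat it as an assumption.
import json

SECRET_FILE = "/tmp/vault/secret.json"  # written by the extension before the handler runs


def handler(event, context):
    with open(SECRET_FILE) as f:
        secret = json.load(f)
    # The payload mirrors Vault's HTTP API response; for a KV v2 secret the
    # values live under data.data. "api_key" is an illustrative field name.
    api_key = secret["data"]["data"]["api_key"]
    # ... use api_key to call the Twitter API for the account in `event` ...
    return {"statusCode": 200}
```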
Currently, our security engineering team is not allowing writes to Secrets Manager, but reads are fine.
We have a common Lambda role which is used by other modules as well.
Is there any way to configure writing limited to only particular secrets?
Alternatives to AWS Secrets Manager?
Is there any way to configure writing limited to only particular secrets?
Yes. This official documentation shows how to grant read access to only specific secrets. You could do the same thing with write access.
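For example, a hedged sketch of attaching an inline policy to the shared Lambda role so that only one secret can be written (the role name, policy name, and secret ARN below are illustrative):

```python
# Sketch: restrict write access to a single secret by attaching an inline
# policy to the shared Lambda role. Role name and secret ARN are illustrative.
import json

import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["secretsmanager:PutSecretValue", "secretsmanager:UpdateSecret"],
            # Only secrets matching this name prefix may be written to.
            "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-app/twitter-*",
        }
    ],
}

iam.put_role_policy(
    RoleName="common-lambda-role",          # hypothetical shared role name
    PolicyName="write-only-my-app-secret",  # hypothetical policy name
    PolicyDocument=json.dumps(policy),
)
```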
Alternatives to AWS Secrets Manager?
AWS SSM Parameter Store
Strong recommendation for using Secrets Manager or SSM Parameter Store to store secrets, but there are also other, non-AWS alternatives like HashiCorp's Vault. It can be found in the AWS Marketplace (https://aws.amazon.com/marketplace/pp/prodview-ngzq6n42psnxa) or downloaded from the vendor website (https://www.vaultproject.io/).
Don't forget to ask your security team for review and approval before using it in production environments. :-)
I have properties files specific to dev, test, and other environments. I have to store these files in some secure place in AWS. I am using AWS-native tools for build and deployment. Please let me know how to store these files in AWS.
There are many ways to deal with secrets on AWS, but one thing is clear: it depends on the service that will use and consume these secrets.
You can explore these options:
Environment variables (the simplest way)
AWS Secrets Manager
S3 (for keeping files)
One common approach is to pass your secrets as environment variables, but in the case of AWS I would recommend going with AWS Secrets Manager.
AWS Secrets Manager is an AWS service that makes it easier for you to
manage secrets. Secrets can be database credentials, passwords,
third-party API keys, and even arbitrary text. You can store and
control access to these secrets centrally by using the Secrets Manager
console, the Secrets Manager command line interface (CLI), or the
Secrets Manager API and SDKs.
Basic Secrets Manager Scenario
The following diagram illustrates the most basic scenario. It shows
how you can store credentials for a database in Secrets Manager, and
then use those credentials in an application that needs to access the
database.
Compliance with Standards
AWS Secrets Manager has undergone auditing for these standards and can be part of your solution when you need to obtain compliance certification.
You can explore this article to read and write secrets.
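For reference, a minimal boto3 sketch of reading and writing a secret (the secret name and fields are illustrative):

```python
# Sketch: read and write a secret with AWS Secrets Manager via boto3.
# The secret name and field names are illustrative.
import json

import boto3

secrets = boto3.client("secretsmanager")

# Read: SecretString holds the JSON (or plain text) you stored.
value = secrets.get_secret_value(SecretId="my-app/dev/config")
config = json.loads(value["SecretString"])

# Write: put a new version of the same secret.
secrets.put_secret_value(
    SecretId="my-app/dev/config",
    SecretString=json.dumps({"db_password": "new-value"}),
)
```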
If you need to maintain files, not just key-value pairs, then you can store them in S3 and pull the files during deployment, but it is better to enable server-side encryption.
Still, I would prefer Secrets Manager over S3 and environment variables.
You can find more on S3 here and here.
Bajjuri,
As Adil said in his answer,
AWS Secrets Manager -- if you want to store data as key-value pairs.
AWS S3 -- if you want to store files securely.
Adding to his answer, you can use AWS CodeDeploy environment
variables to fetch the files according to your environment.
Let's say you have a CodeDeploy deployment group for the dev environment
named "DEV" and a deployment group for the prod environment named "PROD";
you can use this variable in a bash script and call it in the lifecycle
hooks of the appspec file to fetch the files or secrets accordingly.
I've been using this technique in production for a long time and it works like a charm.
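A hedged sketch of such a hook script, assuming CodeDeploy's DEPLOYMENT_GROUP_NAME environment variable and an illustrative S3 bucket layout with one properties file per environment:

```python
# Sketch: a CodeDeploy lifecycle hook script that pulls the properties file
# for the current environment. Bucket name and key layout are illustrative;
# DEPLOYMENT_GROUP_NAME is set by CodeDeploy for lifecycle hook scripts.
import os

import boto3

s3 = boto3.client("s3")

# "DEV", "PROD", etc. -- the deployment group name doubles as the environment name.
environment = os.environ["DEPLOYMENT_GROUP_NAME"].lower()

s3.download_file(
    Bucket="my-app-config",                      # hypothetical bucket
    Key=f"properties/{environment}.properties",  # e.g. properties/dev.properties
    Filename="/opt/my-app/application.properties",
)
```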
I want to access secrets stored in Hashicorp Vault in Google Cloud Functions, and am wondering about the best way to authenticate and retrieve a token.
I think ideally I would use the default service account credentials inside the cloud function. Is it possible to use Vault's GCP auth backend and create a signed JWT from the default service account? I'm trying to avoid uploading any kind of credentials as part of the function source.
I think this project could give you good hints on what you want to do: https://github.com/kelseyhightower/vault-on-google-kubernetes-engine
Tell me if it helps.
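For what it's worth, the pattern you describe should be possible without bundling credentials: have the IAM Credentials API sign a short-lived JWT on behalf of the function's default service account, then exchange it at Vault's GCP auth endpoint. A hedged sketch follows; the Vault address, role, and auth mount are illustrative, the expected JWT claims should be checked against the Vault GCP auth docs, and the service account needs permission (roles/iam.serviceAccountTokenCreator on itself) to sign JWTs:

```python
# Sketch: obtain a Vault token from inside a Cloud Function using only the
# default service account. Vault address, role, and mount are illustrative.
import json
import time

import google.auth
import requests
from google.auth.transport.requests import Request
from google.cloud import iam_credentials_v1

VAULT_ADDR = "https://vault.example.com:8200"  # hypothetical Vault address
VAULT_GCP_ROLE = "my-cloud-function"           # hypothetical Vault role


def get_vault_token():
    # Default credentials come from the function's runtime service account.
    credentials, _ = google.auth.default()
    credentials.refresh(Request())  # populates service_account_email
    sa_email = credentials.service_account_email

    # Have the IAM Credentials API sign a short-lived JWT on the account's behalf.
    client = iam_credentials_v1.IAMCredentialsClient()
    payload = json.dumps({
        "sub": sa_email,
        "aud": f"vault/{VAULT_GCP_ROLE}",
        "exp": int(time.time()) + 600,
    })
    signed = client.sign_jwt(
        name=f"projects/-/serviceAccounts/{sa_email}",
        payload=payload,
    )

    # Exchange the signed JWT for a Vault token via the GCP auth method (iam type).
    resp = requests.post(
        f"{VAULT_ADDR}/v1/auth/gcp/login",
        json={"role": VAULT_GCP_ROLE, "jwt": signed.signed_jwt},
    )
    resp.raise_for_status()
    return resp.json()["auth"]["client_token"]
```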