Setting environment variables in Cloud Run, or just put them in a container file? [closed] - google-cloud-platform

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 years ago.
When using Cloud Run, what is the most secure way to deploy secrets such as API keys? Should I put them in a file in my project, or should I set them up as environment variables in Cloud Run?

Neither in the container nor in an environment variable. Use Secret Manager for this.
If you don't want to change your code, I wrote a wrapper (and an article) to do this.
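As a rough sketch of the Secret Manager approach, assuming the `google-cloud-secret-manager` client library and that the Cloud Run service account has the `roles/secretmanager.secretAccessor` role (project and secret names below are placeholders):

```python
# Sketch: reading an API key from Secret Manager at startup instead of
# baking it into the image or an environment variable.

def secret_version_path(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Build the full resource name of a secret version."""
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"

def load_secret(project_id: str, secret_id: str) -> str:
    # Imported here so the module loads even where the client isn't installed.
    from google.cloud import secretmanager

    client = secretmanager.SecretManagerServiceClient()
    name = secret_version_path(project_id, secret_id)
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")
```

You would then call something like `api_key = load_secret("my-project", "my-api-key")` once at container startup, so the secret never appears in the image or the service configuration.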

Related

HDFS for Big Data storage vs. Azure Storage [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 1 year ago.
I am using HDFS to store data files. Is there a way to use Azure Storage in place of HDFS? If so, how?
I am using Spark and Python.
Posting an answer to close out this question: as @Joel Cochran commented, you can use Azure Data Lake Storage Gen2, which is fully HDFS-compatible. You can refer to https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-introduction to get started.
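A minimal PySpark sketch of what this looks like, assuming an ADLS Gen2 account named `myaccount`, a container named `data`, and authentication via the storage account key (all placeholders); the `abfss://` scheme is the one Hadoop's ABFS driver understands:

```python
# Sketch: pointing Spark at Azure Data Lake Storage Gen2 instead of HDFS.

def abfss_uri(container: str, account: str, path: str) -> str:
    """Build an abfss:// URI understood by Hadoop's ABFS driver."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

def read_from_adls(spark, account: str, container: str, path: str, account_key: str):
    """Read a Parquet dataset from ADLS Gen2 via the ABFS driver."""
    # Authenticate to the storage account with its access key.
    spark.conf.set(f"fs.azure.account.key.{account}.dfs.core.windows.net", account_key)
    return spark.read.parquet(abfss_uri(container, account, path))
```

Paths that previously looked like `hdfs:///data/events` become `abfss://data@myaccount.dfs.core.windows.net/events`; the rest of the Spark code is unchanged.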

Replicate an AWS instance [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 years ago.
I have one EC2 p3.large instance on which I have installed several libraries, and I want to make an exact replica of this instance as a backup. I need this clone to include all the installed libraries; in that sense, something similar to what a Docker container does.
I have tried cloning the instance as shown here:
https://docs.bitnami.com/aws/faq/administration/clone-server/
But this did not keep the installed libraries and files from the original instance in the new one.
You can make an AMI of the current instance and use it as a backup at any time.
Related docs here
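A short boto3 sketch of creating that AMI, with the instance ID and name prefix as placeholders:

```python
# Sketch: snapshotting an EC2 instance as an AMI with boto3.
from datetime import date

def ami_name(prefix: str) -> str:
    """Date-stamped AMI name, e.g. 'myserver-backup-2024-01-01'."""
    return f"{prefix}-backup-{date.today().isoformat()}"

def create_backup_ami(instance_id: str, prefix: str) -> str:
    import boto3  # imported here so the sketch loads without the SDK

    ec2 = boto3.client("ec2")
    image = ec2.create_image(
        InstanceId=instance_id,
        Name=ami_name(prefix),
        NoReboot=True,  # skip the reboot; set False for a filesystem-consistent image
    )
    return image["ImageId"]
```

Launching a new instance from the returned AMI ID reproduces the root volume, including every installed library and file.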

List GCP project resource types using an API [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 3 years ago.
I want to retrieve all the resources for a specific project or for the account using a Java (API) call (e.g. buckets, storage resources, etc.).
There is no single API call that will give you a list of all resources in a project, as the different products use different API endpoints. You can, however, use Cloud Asset Inventory to export all asset metadata at a certain point in time into a GCS file.
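The question asks for Java, but as a sketch of the same Cloud Asset Inventory call (the Java client exposes an equivalent `AssetServiceClient`), here is a Python version using the `google-cloud-asset` package; project, bucket, and object names are placeholders:

```python
# Sketch: exporting all asset metadata for a project to a GCS file
# with Cloud Asset Inventory.

def asset_parent(project_id: str) -> str:
    """Cloud Asset Inventory scopes requests to a parent resource."""
    return f"projects/{project_id}"

def export_assets_to_gcs(project_id: str, gcs_uri: str):
    from google.cloud import asset_v1  # google-cloud-asset package

    client = asset_v1.AssetServiceClient()
    operation = client.export_assets(
        request={
            "parent": asset_parent(project_id),
            "output_config": {"gcs_destination": {"uri": gcs_uri}},
        }
    )
    return operation.result()  # blocks until the export finishes
```

For example, `export_assets_to_gcs("my-project", "gs://my-bucket/assets.json")` writes one record per resource in the project to that object.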

Add an existing API without server downtime [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
Can you add an existing API without server downtime on WSO2 ESB? We want to deploy it onto the server without interfering with incoming live requests.
Yes, it is possible. You can replace the existing API configuration in the deployment folder or in the CApp (if you used CApps to deploy the API), and you do not need a server restart to apply the changes.

Amazon S3 :: redirect on noSuchKeyError [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
I need to redirect to another object on S3 on a NoSuchKey error (object not found).
How can I do this?
Thanks
You have provided very little information, so I don't know whether this advice is useful or not. However, if you configure your S3 bucket for website hosting, you can define custom error pages that map to particular HTTP error codes, as explained here.
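Concretely, a website-hosted bucket supports routing rules, and NoSuchKey surfaces as an HTTP 404, so a rule conditioned on 404 can rewrite the request to a fallback object. A boto3 sketch, with bucket and key names as placeholders:

```python
# Sketch: S3 static-website configuration that redirects 404s
# (NoSuchKey) to a fallback object instead of showing an error page.

def website_config(index_key: str, fallback_key: str) -> dict:
    """WebsiteConfiguration with a routing rule that rewrites the
    requested key whenever S3 would return a 404."""
    return {
        "IndexDocument": {"Suffix": index_key},
        "RoutingRules": [
            {
                "Condition": {"HttpErrorCodeReturnedEquals": "404"},
                "Redirect": {"ReplaceKeyWith": fallback_key},
            }
        ],
    }

def apply_website_config(bucket: str, config: dict) -> None:
    import boto3  # imported here so the sketch loads without the SDK

    boto3.client("s3").put_bucket_website(
        Bucket=bucket, WebsiteConfiguration=config
    )
```

For example, `apply_website_config("my-bucket", website_config("index.html", "fallback.html"))` sends every missing-object request to `fallback.html` via the bucket's website endpoint.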