gcloud command to list all project owner - google-cloud-platform

I'm searching for a gcloud command to list all the active owners of a project. I have tried the command below, but it lists the entire IAM policy; I only need the project owner information.
gcloud projects get-iam-policy "${PROJECT_ID}"

Try:
PROJECT="[YOUR-PROJECT-ID]"
gcloud projects get-iam-policy ${PROJECT} \
--flatten="bindings" \
--filter="bindings.role=roles/owner" \
--format="value(bindings.members[])"
This uses gcloud's --flatten, --format and --filter. See [this] post for a very good explanation.
It's confusing, but --filter can only be applied to lists, so --flatten is used first to convert the single policy resource (with a single bindings list) into multiple documents rooted on bindings.
It's then possible to filter for bindings whose role has the value roles/owner.
Finally, the result is formatted to include only the members.
Note: members are prefixed with the type (user:, serviceAccount: etc.). You may want to further process these.
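If you only want the bare emails, a small post-processing step works. The members here are hypothetical, and this assumes the value(bindings.members[]) output joins the list with ';' on one line (adjust if your output differs):

```shell
# Hypothetical output of the gcloud command above: members joined by ';',
# each prefixed with its type.
members='user:alice@example.com;serviceAccount:sa@my-proj.iam.gserviceaccount.com'

# Split into one member per line, then drop everything up to the first ':'.
echo "${members}" | tr ';' '\n' | cut -d':' -f2-
# → alice@example.com
# → sa@my-proj.iam.gserviceaccount.com
```

Note that cut -d':' -f2- keeps everything after the first colon, so emails themselves are not truncated.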
Or:
PROJECT="[YOUR-PROJECT-ID]"
FILTER="
.bindings[]
|select(.role==\"roles/owner\").members"
gcloud projects get-iam-policy ${PROJECT} \
--format=json \
| jq -r "${FILTER}"
If you're willing to use jq to process JSON, you can have gcloud emit JSON with --format=json and then process it with jq.
The advantage of this approach is that you learn and use one tool (i.e. jq) to process JSON output from any number of commands (not just gcloud).
The disadvantage of this approach is that you need to use multiple tools (gcloud and jq) instead of just one (gcloud).
In the case of jq, it's easier (!?) to write a filter that extracts the email from the member:
FILTER="
.bindings[]
|select(.role==\"roles/owner\").members[]
|split(\":\")[1]"
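To see that filter in action without calling gcloud, here it is applied to a minimal hand-written policy document (the bindings are made up; jq must be installed):

```shell
# A tiny stand-in for `gcloud projects get-iam-policy ... --format=json`.
cat <<'EOF' | jq -r '.bindings[] | select(.role=="roles/owner").members[] | split(":")[1]'
{
  "bindings": [
    {"role": "roles/owner",  "members": ["user:alice@example.com"]},
    {"role": "roles/viewer", "members": ["user:bob@example.com"]}
  ]
}
EOF
# → alice@example.com
```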

Related

Terraform script to build and run Dataflow Flex template

Need to convert these 2 gcloud commands to build and run dataflow jobs using Terraform.
gcloud dataflow flex-template build ${TEMPLATE_PATH} \
--image-gcr-path "${TARGET_GCR_IMAGE}" \
--sdk-language "JAVA" \
--flex-template-base-image ${BASE_CONTAINER_IMAGE} \
--metadata-file "/Users/b.j/g/codebase/g-dataflow/pubsub-lite/src/main/resources/g_pubsublite_to_gcs_metadata.json" \
--jar "/Users/b.j/g/codebase/g-dataflow/pubsub-lite/target/debian/pubsub-lite-0.0.1-SNAPSHOT-uber.jar" \
--env FLEX_TEMPLATE_JAVA_MAIN_CLASS="com.in.g.gr.dataflow.PubSubLiteToGCS"
gcloud dataflow flex-template run "pub-sub-lite-flex-`date +%Y%m%d-%H%M%S`" \
--template-file-gcs-location=$TEMPLATE_FILE_LOCATION \
--parameters=subscription=$SUBSCRIPTION,output=$OUTPUT_DIR,windowSize=$WINDOW_SIZE_IN_SECS,partitionLevel=$PARTITION_LEVEL,numOfShards=$NUM_SHARDS \
--region=$REGION \
--worker-region=$WORKER_REGION \
--staging-location=$STAGING_LOCATION \
--subnetwork=$SUBNETWORK \
--network=$NETWORK
I've tried the resource google_dataflow_flex_template_job, with which I can run a Dataflow job from the stored flex template (the 2nd gcloud command). Now I need to create the template and the Docker image as per my 1st gcloud command using Terraform.
Any inputs on this? And what's the best way to pass the jars used in the 1st gcloud command (placing them in a GCS bucket)?
And what's the best way to pass the jars used in the 1st gcloud command (placing them in a GCS bucket)?
There is no need to manually store these jar files in GCS. The gcloud dataflow flex-template build command will build a docker container image including all the required jar files and upload the image to the container registry. This image (+ the metadata file) is the only thing needed to run the template.
Now I need to create the template and Docker image as per my 1st gcloud command using Terraform?
AFAIK there is no special terraform module to build a flex template. I'd try using the terraform-google-gcloud module, which can execute an arbitrary gcloud command, to run gcloud dataflow flex-template build.
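As a sketch of that approach (the source and argument names follow the terraform-google-gcloud module's documentation; the flags and variables are placeholders taken from the question, so treat this as untested):

```hcl
module "build_flex_template" {
  source  = "terraform-google-modules/gcloud/google"
  version = "~> 3.0"

  platform              = "linux"
  create_cmd_entrypoint = "gcloud"
  create_cmd_body       = <<-EOT
    dataflow flex-template build ${var.template_path}
      --image-gcr-path ${var.target_gcr_image}
      --sdk-language JAVA
      --flex-template-base-image ${var.base_container_image}
      --metadata-file ${var.metadata_file}
      --jar ${var.uber_jar}
      --env FLEX_TEMPLATE_JAVA_MAIN_CLASS=${var.main_class}
  EOT
}
```

Note the module re-runs the command only when its inputs change, so the template is rebuilt when you update one of the variables.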
If you build your project using Maven, another option is using jib-maven-plugin to build and upload the container image instead of using gcloud dataflow flex-template build. See these build instructions for an example. You'll still need to upload the json image spec ("Creating Image Spec" section in the instructions) somehow, e.g. using the gsutil command or maybe using terraform's google_storage_bucket_object, so I think this approach is more complicated.

How to authenticate a gcloud service account from within a docker container

I'm trying to create a Docker container that will execute a BigQuery query. I started with the Google-provided image, which already includes gcloud, and added a bash script that runs my query. I'm passing my service account key as an environment file.
Dockerfile
FROM gcr.io/google.com/cloudsdktool/cloud-sdk:latest
COPY main.sh main.sh
main.sh
gcloud auth activate-service-account X@Y.iam.gserviceaccount.com --key-file=/etc/secrets/service_account_key.json
bq query --use_legacy_sql=false
The gcloud command authenticates successfully but can't save the credentials to /.config/gcloud, saying it is read-only. I've tried modifying that folder's permissions during the build and am struggling to get it right.
Is this the right approach, or is there a better way? If this is the right approach, how can I ensure gcloud can write to the necessary folder?
See the example at the bottom of the Usage section.
You ought to be able to combine this into a single docker run command:
KEY="service_account_key.json"
echo "
[auth]
credential_file_override = /certs/${KEY}
" > ${PWD}/config
docker run \
--detach \
--env=CLOUDSDK_CONFIG=/config \
--volume=${PWD}/config:/config \
--volume=/etc/secrets/${KEY}:/certs/${KEY} \
gcr.io/google.com/cloudsdktool/cloud-sdk:latest \
bq query \
--use_legacy_sql=false
Where:
--env sets the container's value of CLOUDSDK_CONFIG; it depends on the first --volume flag, which maps the config file we created in the host's ${PWD} to the container's /config.
The second --volume flag maps the host's /etc/secrets/${KEY} (per your question) to the container's /certs/${KEY}. Change as you wish.
Suitably configured (🤞), you should then be able to run bq.
I've not tried this, but it should work :-)

How to list all projects in GCP that belongs to a specific organization

gcloud lets you list organizations, folders or projects, but I didn't find an option to list the projects inside an organization.
Something like:
gcloud projects list --organization=ORG
You can use Cloud Asset inventory. The base query is the following:
gcloud beta asset search-all-resources \
--asset-types=cloudresourcemanager.googleapis.com/Project \
--scope=organizations/920778098964
You can play with the page size if you expect a long list of results. More details here.
I personally prefer to export all the assets to BigQuery and then query whatever I want there: projects, but also VMs, firewall rules, and so on.
I think there's no quick way like the --organization argument you mentioned, but it can be accomplished with, for example, the following UNIX-like script:
for project_id in $(gcloud projects list --format='value(project_id)'); do
  org_id=$(gcloud projects describe "${project_id}" --format='value(parent.id)')
  if [ "${org_id}" = "${the_org_you_want_to_find_out}" ]; then
    echo "${org_id} > ${project_id}"
  fi
done
(Quoting the variables also keeps the test from breaking on projects that have no parent organization.)
You can use gcloud:
gcloud projects list --filter='parent.id=ORGANIZATION_ID AND parent.type=organization' | awk '{print $1}' > projects.txt
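One thing to watch with the awk approach: the first line of gcloud projects list is the column header, so PROJECT_ID ends up in projects.txt too. Skipping the first record avoids that (the sample output below is made up):

```shell
# Simulated `gcloud projects list` output; NR > 1 skips the header row
# before printing the first column.
printf 'PROJECT_ID  NAME      PROJECT_NUMBER\ndemo-one    demo-one  301123456789\n' \
| awk 'NR > 1 {print $1}'
# → demo-one
```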
Not what you asked, but if you want to filter to all projects assigned to a folder you can use --filter='parent.id:40123456789':
$ gcloud organizations list
DISPLAY_NAME ID DIRECTORY_CUSTOMER_ID
example.com 10123456789 C0123abc
$ gcloud resource-manager folders list --organization=10123456789
DISPLAY_NAME PARENT_NAME ID
demos organizations/10123456789 40123456789
test organizations/10123456789 50123456789
$ gcloud projects list --filter='parent.id:40123456789'
PROJECT_ID NAME PROJECT_NUMBER
demo-one demo-one 301123456789
demo-two demo-two 302123456789
demo-three demo-three 303123456789
You could list all the projects in an organization using the following command:
gcloud projects list
The definition of the command is:
Lists all active projects, where the active account has Owner, Editor
or Viewer permissions. Projects are listed in alphabetical order by
project name. Projects that have been deleted or are pending deletion
are not included.
If you only need the project ID, the name, or the project number, you can use:
gcloud projects list --format 'value(project_id)'
gcloud projects list --format 'value(name)'
gcloud projects list --format 'value(project_number)'

gcloud services enable/disable --async --format=json returns empty array

gcloud supports scripting with the --format option, and gcloud services enable --async returns a command that can be used to wait until the operation is completed.
E.g. the call gcloud services disable servicenetworking.googleapis.com --async may return something like:
Asynchronous operation is in progress... Use the following command to wait for its completion:
gcloud beta services operations wait operations/acf.<UID>
The problem here is that the output is returned as 2 rows and is not easy to consume from automated scripts. The first idea is to use the --format option with something like --format=json and jq afterward, but --format=json simply does nothing for gcloud services enable/disable; it always returns [].
So, I found out that gcloud services enable/disable has no actual output; instead, the message we receive with --async goes to the error stream.
So I've created this small script that grabs the operation ID from that output, stores it in a file, and then processes it in whatever way we want:
wait_operation_id_file="$(mktemp /tmp/enable_service_operation.XXXXXXX)"
gcloud services enable "servicenetworking.googleapis.com" --async 2>&1 \
| grep 'gcloud beta services operations wait' \
| sed 's/.*wait //' \
>> "${wait_operation_id_file}"
wait_id="$(cat "${wait_operation_id_file}")"
gcloud services operations wait "${wait_id}"
rm --force "${wait_operation_id_file}"
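The grep/sed pair can be checked in isolation against the message format shown above (the operation ID here is hypothetical):

```shell
# Simulate the two-line message gcloud prints to stderr with --async.
msg='Asynchronous operation is in progress... Use the following command to wait for its completion:
  gcloud beta services operations wait operations/acf.abc123'

# Keep only the line containing the wait command, then strip everything up to "wait ".
printf '%s\n' "${msg}" | grep 'services operations wait' | sed 's/.*wait //'
# → operations/acf.abc123
```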

Get the Default GCP Project ID with a Cloud SDK CLI One-Liner

I’m looking for a gcloud one-liner to get the default project ID ($GCP_PROJECT_ID).
The list command gives me:
gcloud config list core/project
#=>
[core]
project = $GCP_PROJECT_ID
Your active configuration is: [default]
While I only want the following output:
gcloud . . .
#=>
$GCP_PROJECT_ID
The easiest way to do this is to use the --format flag with gcloud:
gcloud config list --format 'value(core.project)' 2>/dev/null
The --format flag is available on all commands and gives you full control over what is printed, and how it is formatted.
You can see this help page for full info:
gcloud topic formats
Thanks to a comment from Tim Swast above, I was able to use:
export PROJECT_ID=$(gcloud config get-value project)
to get the project ID. Running the get-value command prints the following:
gcloud config get-value project
#=>
Your active configuration is: [default]
$PROJECT_ID
You can also run:
gcloud config get-value project 2> /dev/null
to just print $PROJECT_ID and suppress other warnings/errors.
With Google Cloud SDK 266.0.0 you can use the following command:
gcloud config get-value project
Not exactly the gcloud command you specified, but it will return the currently configured project:
gcloud info | tr -d '[]' | awk '/project:/ {print $2}'
Works for account, zone and region as well.
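To see why that pipeline works, here it is against a single line shaped like the gcloud info output (the exact formatting of gcloud info is an assumption here):

```shell
# A project line from `gcloud info` looks roughly like "  project: [my-project]";
# tr strips the brackets and awk prints the second field of the matching line.
printf '  project: [my-project]\n' | tr -d '[]' | awk '/project:/ {print $2}'
# → my-project
```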
From Cloud Shell or any machine where Cloud SDK is installed, we can use:
echo $DEVSHELL_PROJECT_ID
I got a question about how the environment variable $DEVSHELL_PROJECT_ID is set; here are the details:
If the URL contains the project variable set to some project ID, then the environment variable $DEVSHELL_PROJECT_ID will usually be set to that project ID.
If the project variable is not set in the URL, we can choose the project from the combo box (beside the Google Cloud Platform title), which will set the project variable in the URL. We may need to restart Cloud Shell or refresh the entire web page for $DEVSHELL_PROJECT_ID to be set.
Otherwise, if $DEVSHELL_PROJECT_ID is still not set, we can set it with the command below, replacing PROJECT_ID with the actual project ID:
gcloud config set project PROJECT_ID
A direct and easy way to get the default $PROJECT_ID is answered above.
In case you would like to get $PROJECT_ID from the info command, here is a way to do it:
gcloud info --format=flattened | awk '/config.project/ {print $2}'
or:
gcloud info --format=json | jq -r '.config.project'
Just run:
gcloud info --format={flattened|json}
to see the output, then use awk, jq or similar tools to grab what you need.
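For the json route, the extraction can be checked against a minimal stand-in for the gcloud info output (the shape is assumed; only the .config.project key matters, and jq must be installed):

```shell
# jq -r prints the raw string without surrounding quotes.
printf '{"config": {"project": "my-project"}}\n' | jq -r '.config.project'
# → my-project
```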