GCP PubSub not honoring inherited permissions - google-cloud-platform

Some of my service accounts are getting 403 (user not authorized) errors when trying to publish/subscribe to Pub/Sub. It appears Pub/Sub is not honoring "Inherited" permissions from the project-level IAM policy.
I have verified the service accounts have the Pub/Sub Subscriber and Viewer roles in IAM, and when I check the topic and subscriptions, they list the service accounts as type "Inherited". If I manually grant the service account the same permission from the Pub/Sub console, the UI lists it as "Mixed" and then it works.
Background - It was working before!
What's strange is that this was working fine before. I accidentally deleted these same service accounts yesterday. I recreated them the same way and set up permissions the same way, but it won't work. Also, the accounts that weren't deleted still work with "Inherited" permissions.
Some other things I've tried:
Created a service account with a different name from the one that was deleted - didn't work
Re-created topics/subs after creating the service accounts and giving them project-wide permissions - didn't work
Long term I guess I'd prefer to control permissions per Topic/Sub; but I'm still baffled why this isn't working or what I've done wrong.

There is currently a known limitation with project-level permissions when a service account is deleted and recreated: IAM role bindings reference the deleted account's unique ID, so a newly created account, even with the same email, is a different principal, and the old bindings are not applied to it.
If the service account is created with a different name and granted its roles fresh, inherited permissions should work correctly. Note that permission propagation is not immediate and can have a delay; you may have to wait a few minutes to see the changes reflected.
For further assistance, you may need to contact Cloud Support so they can look into the specifics of your situation.
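One way to confirm this is to look for stale bindings in the project's IAM policy and re-grant the roles explicitly. The sketch below uses hypothetical project and account names (substitute your own); the `OLD_UNIQUE_ID` placeholder is whatever unique ID appears in the stale `deleted:` member string:

```shell
# Hypothetical names -- substitute your own project ID and account email.
PROJECT_ID="my-project"
SA_EMAIL="my-sa@${PROJECT_ID}.iam.gserviceaccount.com"

# Bindings left behind by a deleted account show up with a "deleted:" prefix
# and the old account's unique ID; they never apply to a recreated account.
gcloud projects get-iam-policy "$PROJECT_ID" \
  --flatten="bindings[].members" \
  --filter="bindings.members:deleted:" \
  --format="table(bindings.role, bindings.members)"

# Remove a stale binding and grant the role to the new account explicitly.
gcloud projects remove-iam-policy-binding "$PROJECT_ID" \
  --member="deleted:serviceAccount:${SA_EMAIL}?uid=OLD_UNIQUE_ID" \
  --role="roles/pubsub.subscriber"
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/pubsub.subscriber"
```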

Related

Google cloud function deployment vs runtime permissions

I am paranoid about security and this makes no sense to me. I must be missing something here. I can get it working without problems, but I want to know why. What is the philosophy behind it, and how am I protected?
I wrote a Google Cloud Function that receives a POST request and publishes an event to a Pub/Sub topic. I've set up my topic and an IAM binding so that only my function's service account can publish to that topic - that is all good.
However, it does not let me deploy my function with that service account (using gcloud functions deploy --service-account=...). It says the account does not have secretAccessor, deploymentManager.editor, cloudfunctions.developer, etc.
My confusion is: why should it need development/deployment-related permissions? I am deploying the function and I have those permissions, so it should use my permissions to deploy. But when the function is actually running, I don't want it to have those development/deployment permissions, in case some vulnerability can be exploited. I want it to run as the service account I specify, restricted to only the permissions needed to receive requests and publish to my topic. Otherwise it would break the principle of least privilege.
When you create a service such as Functions, Run, or Compute Engine, you, as the deployer, need two types of permissions:
Permission to create the service
Permission to assign an existing identity (aka service account) to the service.
The service itself typically needs an identity (service account) with appropriate permissions - the ones required for the service to access other services and resources. The service runs independently of the identity that created it.
Two identities and two sets of permissions to manage. That means your goal of least privilege can be achieved.
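As a concrete sketch of the two-identity split (all names here are hypothetical placeholders): your own principal needs actAs on the runtime identity to attach it to the function, while the runtime identity only carries the permissions the function uses when it runs.

```shell
# Hypothetical names -- substitute your own project, accounts, and function.
PROJECT_ID="my-project"
RUNTIME_SA="fn-runtime@${PROJECT_ID}.iam.gserviceaccount.com"

# Identity 1: the deployer (you) needs iam.serviceAccounts.actAs on the
# runtime identity, typically via roles/iam.serviceAccountUser.
gcloud iam service-accounts add-iam-policy-binding "$RUNTIME_SA" \
  --member="user:you@example.com" \
  --role="roles/iam.serviceAccountUser"

# Identity 2: the runtime account only needs what the function uses at
# run time, e.g. publishing to one topic.
gcloud pubsub topics add-iam-policy-binding my-topic \
  --member="serviceAccount:${RUNTIME_SA}" \
  --role="roles/pubsub.publisher"

# Deploy with your own credentials; the function then runs as RUNTIME_SA.
gcloud functions deploy my-fn --runtime=python311 --trigger-http \
  --service-account="$RUNTIME_SA"
```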
My confusion is...why should it need development/deployment related permissions?
I do not know, because your question does not have the details required to answer, and the error you posted does not make sense in the context described. I am not aware of any case where, for example, deploying a Function requires Deployment Manager Editor for the function's identity. The function itself might need that IAM role, but the deployment command does not; the deployment command does not even know which permissions the function requires, beyond those implied by the command-line flags.
If you need more help on this, edit your question to clearly describe both identities and IAM roles, the deployment, which resources are accessed, and how you are deploying. Include the actual commands and error messages.

GCP Pub/Sub default service account is not getting created when enabling the API

We have two projects in our GCP account; one for our Dev environment and one for our Test environment at the moment. Terraform manages most of our infrastructure, so we have minimal clicking around the GUI, or CLI commands.
I have assumed we enabled the Pub/Sub API by deploying to it with Terraform in both of our environments, although we may have needed to do this manually. We noticed that Google created a default Pub/Sub service account for us in our Dev environment, but not in our Test environment. This docs page suggests it should make this service account.
Additionally, we have noticed multiple Pub/Sub subscriptions working, apparently without any service account. We believe that the service account is only needed for this particular Subscription because it is a push to an e-mail server. Therefore, it needs a service account with the 'Service Account Token Creator' role.
We've attempted to redeploy the whole infrastructure and to disable/re-enable the Pub/Sub API; neither seemed to kick GCP into creating the service account. We also attempted to create the default service account manually, but GCP constrains the names users can give their own service accounts, so we're unable to create one with the name the Pub/Sub service would expect.
Does anyone know of some project configuration we may have missed, or has anyone seen this previously?
Does it not exist, or do you just not see it?
I'm pretty sure it exists, but without any role granted on it you don't see it in the UI. Try granting a role to this default service account, and it will appear on the IAM page!
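You can also force the service agent to be created instead of waiting for the API to create it lazily. A sketch, assuming a hypothetical project ID:

```shell
# Hypothetical project -- substitute your Test project's ID.
PROJECT_ID="my-test-project"
PROJECT_NUMBER="$(gcloud projects describe "$PROJECT_ID" \
  --format='value(projectNumber)')"

# Explicitly trigger creation of the Pub/Sub service agent.
gcloud beta services identity create \
  --service=pubsub.googleapis.com --project="$PROJECT_ID"

# Granting it a role (needed for authenticated push anyway) also makes it
# visible on the IAM page.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:service-${PROJECT_NUMBER}@gcp-sa-pubsub.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"
```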

Can't create job on GCP Cloud Scheduler

When I try to create a job in the GCP Cloud Scheduler I get this error:
{"error":{"code":7,"message":"The principal (user or service account) lacks IAM permission \"iam.serviceAccounts.actAs\" for the resource \"[my service account]\" (or the resource may not exist)."}}
When I enabled the GCP Cloud Scheduler the service account was created (and I can see it in my accounts list). I have verified that it has the "Cloud Scheduler Service Agent" role.
I am logged in as an Owner of our project. It is when I try to create the job that I get this error. I tried adding the "Service Account User" role to my principal account, but to no avail.
Does anyone know if I have to add any additional permissions? Or if I have to allow my principal to act (impersonate?) this service account in some way?
Many thanks.
Ben
OK, I figured this out. The documentation is clear (sort of, in my view) if you read it a certain way / already know how GCP IAM works.
You actually need two service accounts: one that you set up yourself (it can have whatever name you like and doesn't require any special permissions), and the one that Cloud Scheduler itself uses.
Don't confuse the two. Use the one you created when specifying the service account that generates the OAuth / OIDC tokens.
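The steps above can be sketched as follows (all names and the target URI are hypothetical placeholders): create your own account, give your principal actAs on it, then reference it when creating the job.

```shell
# Hypothetical names -- substitute your own project, user, and endpoint.
PROJECT_ID="my-project"
JOB_SA="scheduler-invoker@${PROJECT_ID}.iam.gserviceaccount.com"

# The user-managed account the job authenticates as (no special roles needed).
gcloud iam service-accounts create scheduler-invoker --project="$PROJECT_ID"

# Your own principal needs iam.serviceAccounts.actAs on that account
# to attach it to the job -- this is what the error was about.
gcloud iam service-accounts add-iam-policy-binding "$JOB_SA" \
  --member="user:ben@example.com" \
  --role="roles/iam.serviceAccountUser"

# Create the job; it calls the endpoint with an OIDC token minted for JOB_SA.
gcloud scheduler jobs create http my-job \
  --schedule="*/5 * * * *" \
  --uri="https://example.com/handler" \
  --oidc-service-account-email="$JOB_SA"
```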

AWS Permissions no longer work after consolidated billing

So we have an AWS account with some permissions, and it was working fine at first: we were able to deploy to AWS using the Serverless Framework. Then the client decided to set up an organization, since they have other AWS accounts, and to consolidate billing under one account they added the account they gave us to the organization. Now, when we deploy using Serverless again, it can no longer see the deployment bucket and fails with an access denied error. When the account was removed from the organization, Serverless was able to locate the bucket again. Are there additional permissions, or changes to permissions, needed when an account is linked to an organization? Can someone explain this? I can't find any example of my scenario in a Google search. I am new to AWS, and this is the first time I've encountered Organizations in AWS.
The only implication for permissions from joining an organization (or an organizational unit within it) would be via a Service Control Policy (SCP). Verify that the SCP attached to the organization does not block the actions you are attempting to execute.
We would love to get more information if possible, but I would maybe start looking in the following places in your consolidated account:
Trusted access for AWS services - https://console.aws.amazon.com/organizations/home?#/organization/settings
https://console.aws.amazon.com/organizations/home?#/policies
See if anything was changed there - whether someone added a policy, or whether AWS Resource Access Manager is disabled.

Dataprep doesn't work - Cloud Dataflow Service Agent

I made a mistake and deleted the user service-[project number]@dataflow-service-producer-prod.iam.gserviceaccount.com in Service accounts; I should have deleted a different user.
After that, the Dataprep stopped running the jobs.
I've checked all the guidelines about Dataflow and Dataprep: whether the API is enabled (yes, it is), and whether there is a proper service account (yes). But I don't know what roles to assign to these accounts.
I tried assigning the "Cloud Dataflow Service Agent" role to this account, but it doesn't appear for me.
I also tried assigning other roles, but that didn't work.
It all started when I deleted this account erroneously.
Does anyone know how to solve this?
PS: I'm still working on my English, sorry for any mistakes.
If you accidentally deleted the Dataflow service account, disabling the Dataflow API and then re-enabling it will recreate the service account automatically.
Disabling/enabling the API is not recommended, as associated resources will be impacted. Instead, you should undelete the default service account within 30 days of deletion. You will need its ACCOUNT_UNIQUE_ID, which can be found in the logs generated when it was deleted. Find details here.
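A sketch of the undelete path, assuming a hypothetical project ID; the numeric unique ID comes from the audit log entry for the deletion:

```shell
# Hypothetical project -- substitute your own.
PROJECT_ID="my-project"

# Find the deleted account's unique ID in the Admin Activity audit logs.
gcloud logging read \
  'protoPayload.methodName="google.iam.admin.v1.DeleteServiceAccount"' \
  --project="$PROJECT_ID" --freshness=30d \
  --format="value(protoPayload.resourceName)"

# The resource name ends in .../serviceAccounts/ACCOUNT_UNIQUE_ID;
# undelete takes that numeric unique ID, not the account's email.
gcloud iam service-accounts undelete ACCOUNT_UNIQUE_ID --project="$PROJECT_ID"
```

Undeleting restores the account with its original unique ID, so existing role bindings start working again, which a recreated account would not do.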