Permission denied for Cloud Function to decrypt with KMS key - google-cloud-platform

I have a Python Cloud Function which uses a KMS key to decrypt some authentication tokens for other services from the environment, as in https://dev.to/googlecloud/using-secrets-in-google-cloud-functions-5aem
I keep getting a 403 Permission Denied whenever I run my function. When I call the function locally on my computer it works fine. I've tried adding the "Cloud KMS CryptoKey Decrypter" role to the default Compute Engine service account but that didn't work.
Any other ideas?
Edit: here's some code that shows what I'm doing. The environment variables are stored in an environment.yaml file which I point to when I run gcloud functions deploy:
import base64
import os

import boto3
from google.cloud import kms  # google-cloud-kms (the old kms_v1 GAPIC surface)


def decrypt_secret(key: str, secret: str):
    kms_client = kms.KeyManagementServiceClient()
    decrypted = kms_client.decrypt(key, base64.b64decode(secret))
    return decrypted.plaintext.decode("ascii")


def do_kms_stuff():
    key = os.environ["KMS_RESOURCE_NAME"]
    session = boto3.Session(
        profile_name="my-profile",
        aws_access_key_id=decrypt_secret(
            key, os.environ["AWS_ACCESS_KEY_ID_ENCRYPTED"]
        ),
        aws_secret_access_key=decrypt_secret(
            key, os.environ["AWS_SECRET_ACCESS_KEY_ENCRYPTED"]
        ),
    )
    # ...
And here's the error from the Cloud Functions console:
File "<string>", line 3, in raise_from: google.api_core.exceptions.PermissionDenied: 403 Permission 'cloudkms.cryptoKeyVersions.useToDecrypt' denied on resource 'projects/my-project/locations/my-location1/keyRings/my-keyring/cryptoKeys/my-key' (or it may not exist). at error_remapped_callable (/env/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59) at func_with_timeout (/env/local/lib/python3.7/site-packages/google/api_core/timeout.py:214) at retry_target (/env/local/lib/python3.7/site-packages/google/api_core/retry.py:182) at retry_wrapped_func (/env/local/lib/python3.7/site-packages/google/api_core/retry.py:277) at
__call__ (/env/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:143) at decrypt (/env/local/lib/python3.7/site-packages/google/cloud/kms_v1/gapic/key_management_service_client.py:1816) at decrypt_secret (/user_code/kms_stuff.py:17) at do_kms_stuff (/user_code/kms_stuff.py:48) at my_cloud_function (/user_code/main.py:46) at call_user_function (/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py:214) at invoke_user_function (/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py:217) at run_background_function (/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py:383)

As @DazWilkin and @pessolato mentioned, the issue was that I was using the wrong service account. Once I changed the code to use the default App Engine (appspot) service account, everything worked smoothly.
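For anyone hitting the same error, the fix boils down to granting the decrypter role to whichever service account the function actually runs as. A hedged sketch using gcloud and the resource names from the error message above, assuming the function runs as the App Engine default service account (substitute your own names):

gcloud kms keys add-iam-policy-binding my-key \
    --project my-project \
    --location my-location1 \
    --keyring my-keyring \
    --member serviceAccount:my-project@appspot.gserviceaccount.com \
    --role roles/cloudkms.cryptoKeyDecrypter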

Related

Is it possible to open the AWS Management Console website from AWS CLI?

Let's say I'm logged in to my AWS CLI tool with a particular account and I can execute commands like
aws ecr describe-repositories
Is there an AWS CLI command which opens up the AWS Management Console website in the default browser, already logged in with the same account?
E.g.: something like
aws web
Thanks!
While there is no such built-in command in the AWS CLI, you can give users direct access to the AWS Management Console if they have valid STS session IAM credentials (access and secret keys). You can read about using the getSigninToken action to generate a pre-signed AWS console URL in exchange for your IAM credentials here.
Here is a Python code example:
import urllib, json, sys
import requests # 'pip install requests'
import boto3 # AWS SDK for Python (Boto3) 'pip install boto3'

# Step 1: Authenticate user in your own identity system.

# Step 2: Using the access keys for an IAM user in your AWS account,
# call "AssumeRole" to get temporary access keys for the federated user

# Note: Calls to AWS STS AssumeRole must be signed using the access key ID
# and secret access key of an IAM user or using existing temporary credentials.
# The credentials can be in Amazon EC2 instance metadata, in environment variables,
# or in a configuration file, and will be discovered automatically by the
# client('sts') function. For more information, see the Python SDK docs:
# http://boto3.readthedocs.io/en/latest/reference/services/sts.html
# http://boto3.readthedocs.io/en/latest/reference/services/sts.html#STS.Client.assume_role
sts_connection = boto3.client('sts')
assumed_role_object = sts_connection.assume_role(
    RoleArn="arn:aws:iam::account-id:role/ROLE-NAME",
    RoleSessionName="AssumeRoleSession",
)

# Step 3: Format resulting temporary credentials into JSON
url_credentials = {}
url_credentials['sessionId'] = assumed_role_object.get('Credentials').get('AccessKeyId')
url_credentials['sessionKey'] = assumed_role_object.get('Credentials').get('SecretAccessKey')
url_credentials['sessionToken'] = assumed_role_object.get('Credentials').get('SessionToken')
json_string_with_temp_credentials = json.dumps(url_credentials)

# Step 4: Make request to AWS federation endpoint to get sign-in token. Construct the
# parameter string with the sign-in action request, a 12-hour session duration, and the
# JSON document with temporary credentials as parameters.
request_parameters = "?Action=getSigninToken"
request_parameters += "&SessionDuration=43200"
if sys.version_info[0] < 3:
    def quote_plus_function(s):
        return urllib.quote_plus(s)
else:
    import urllib.parse  # explicit import so urllib.parse is available on Python 3
    def quote_plus_function(s):
        return urllib.parse.quote_plus(s)
request_parameters += "&Session=" + quote_plus_function(json_string_with_temp_credentials)
request_url = "https://signin.aws.amazon.com/federation" + request_parameters
r = requests.get(request_url)
# Returns a JSON document with a single element named SigninToken.
signin_token = json.loads(r.text)

# Step 5: Create URL where users can use the sign-in token to sign in to
# the console. This URL must be used within 15 minutes after the
# sign-in token was issued.
request_parameters = "?Action=login"
request_parameters += "&Issuer=Example.org"
request_parameters += "&Destination=" + quote_plus_function("https://console.aws.amazon.com/")
request_parameters += "&SigninToken=" + signin_token["SigninToken"]
request_url = "https://signin.aws.amazon.com/federation" + request_parameters

# Send final URL to stdout
print(request_url)
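If you want something closer to the hypothetical aws web command, you could open the resulting URL in the default browser instead of printing it; a small optional addition using Python's standard library (not part of the original sample):

import webbrowser

# Opens the federation sign-in URL built above in the default browser.
webbrowser.open(request_url)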
I also wrote an AWS CLI plugin back in the day that does exactly what you need, but it does not work with AWS CLI v2:
https://github.com/b-b3rn4rd/awscli-console-plugin
This does not exist. The credentials used for the AWS CLI and for Console access are different.
For the CLI you use an access key and secret key.
For Console access (through a web browser) you use a username and password.
It is possible that in an AWS account you have programmatic access but not console access.

How to set credentials to use Gmail API from GCE VM Command Line?

I'm trying to enable my Linux VM on GCE to access my Gmail account in order to send emails.
I've found this article, https://developers.google.com/gmail/api/quickstart/python, whose sample code reads some Gmail account information (useful, since I just want to test my connection):
from __future__ import print_function
import pickle
import os.path
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/gmail.readonly']


def main():
    """Shows basic usage of the Gmail API.
    Lists the user's Gmail labels.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)

    service = build('gmail', 'v1', credentials=creds)

    # Call the Gmail API
    results = service.users().labels().list(userId='me').execute()
    labels = results.get('labels', [])

    if not labels:
        print('No labels found.')
    else:
        print('Labels:')
        for label in labels:
            print(label['name'])


if __name__ == '__main__':
    main()
However, I'm not sure which credentials I need to set. Here is what happens with each type I have tried:
Service account: I receive the following message: ValueError: Client secrets must be for a web or installed app.
OAuth client ID for the Web application type: the code runs well, but I receive the following message when first authorizing the application's access:
Error 400: redirect_uri_mismatch
The redirect URI in the request, http://localhost:60443/, does not match the ones authorized for the OAuth client. To update the authorized redirect URIs, visit: https://console.developers.google.com/apis/credentials/oauthclient/${your_client_id}?project=${your_project_number}
OAuth client ID for the Desktop type: the code runs well, but I receive the following message when first authorizing the application's access:
localhost connection has been refused
Does anyone know what the correct setup is and whether I'm missing anything?
Thanks
Edit (Nov 17th): after adding the gmail scope to my VM's scopes, I ran the Python script and got the following error:
Traceback (most recent call last):
  File "quickstart2.py", line 29, in <module>
    main()
  File "quickstart2.py", line 18, in main
    results = service.users().labels().list(userId="107628233971924038182").execute()
  File "/home/lucasnunesfe9/.local/lib/python3.7/site-packages/googleapiclient/_helpers.py", line 134, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/home/lucasnunesfe9/.local/lib/python3.7/site-packages/googleapiclient/http.py", line 915, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://gmail.googleapis.com/gmail/v1/users/107628233971924038182/labels?alt=json returned "Precondition check failed.">
I checked the error HTTP link and it shows:
{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "errors": [
      {
        "message": "Login Required.",
        "domain": "global",
        "reason": "required",
        "location": "Authorization",
        "locationType": "header"
      }
    ],
    "status": "UNAUTHENTICATED"
  }
}
Is any manual procedure needed in a "first authorization step"?
PS: note that I have already enabled my service account for G Suite domain-wide delegation. This action generated an OAuth 2.0 client ID, which is being used in the Python script (the userId variable).
Personally, I never understood this example. I think it's too old (it's even Python 2.6 compliant!!).
Anyway, you can forget the first part and get credentials like this:
from googleapiclient.discovery import build
import google.auth

SCOPES = ['https://www.googleapis.com/auth/gmail.readonly']


def main():
    creds, project_id = google.auth.default(scopes=SCOPES)
    service = build('gmail', 'v1', credentials=creds)

    # Call the Gmail API
    results = service.users().labels().list(userId='me').execute()
    labels = results.get('labels', [])

    if not labels:
        print('No labels found.')
    else:
        print('Labels:')
        for label in labels:
            print(label['name'])


if __name__ == '__main__':
    main()
However, because you will use a service account to access your Gmail account, you have to change the userId to the user you want, and, from the G Suite admin console, grant the service account access to the Gmail API.
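For illustration, here is a minimal sketch of what that delegation flow can look like with an exported service account key file instead of the VM's default credentials; the key file name and the impersonated user address are hypothetical placeholders:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/gmail.readonly']

# 'service-account.json' and the user address are placeholders, not real values.
creds = service_account.Credentials.from_service_account_file(
    'service-account.json', scopes=SCOPES)
delegated_creds = creds.with_subject('user@your-domain.example')  # user to impersonate
service = build('gmail', 'v1', credentials=delegated_creds)
results = service.users().labels().list(userId='me').execute()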
Separately, for google.auth.default() to work on the VM, you need to update the scopes of your VM's service account. Stop the VM, run this (long) command, and start your VM again. The command:
Takes the current scopes of your VM (and cleans/formats them)
Adds the gmail scope
Sets the scopes on the VM (it's a BETA command)
gcloud beta compute instances set-scopes <YOUR_VM_NAME> --zone <YOUR_VM_ZONE> \
--scopes=https://www.googleapis.com/auth/gmail.readonly,\
$(gcloud compute instances describe <YOUR_VM_NAME> --zone <YOUR_VM_ZONE> \
--format json | jq -c ".serviceAccounts[0].scopes" | sed -E "s/\[(.*)\]/\1/g" | sed -E "s/\"//g")
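Once the VM is back up, you can sanity-check the effective scopes from inside the instance by querying the metadata server; this uses the standard GCE metadata endpoint and is just an optional verification step (requires pip install requests):

import requests

# Lists the OAuth scopes attached to the VM's default service account.
resp = requests.get(
    'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes',
    headers={'Metadata-Flavor': 'Google'},
)
print(resp.text)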

Get secrets for GCP deployments from KMS

I want to deploy a Cloud VPN tunnel in GCP using Deployment Manager.
I set up a deployment script using Python for this, and I don't want the shared secret for the VPN tunnel in plain text in my configuration.
So I tried to include the secret encrypted via KMS and then make a call to KMS in the Python script to get the plain-text secret.
The python code to decrypt the secret looks like this:
import base64
import googleapiclient.discovery


def decryptSecret(enc_secret, context):
    """Decrypts the given secret via KMS."""
    # KMS configuration
    KEY_RING = '<Key Ring>'
    KEY_ID = '<Key>'
    KEY_LOCATION = REGION
    KEY_PROJECT = context.env['project']
    # Creates an API client for the KMS API.
    kms_client = googleapiclient.discovery.build('cloudkms', 'v1')
    key_name = 'projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}'.format(
        KEY_PROJECT, KEY_LOCATION, KEY_RING, KEY_ID)
    crypto_keys = kms_client.projects().locations().keyRings().cryptoKeys()
    request = crypto_keys.decrypt(
        name=key_name,
        body={'ciphertext': enc_secret})
    response = request.execute()
    plaintext = base64.b64decode(response['plaintext'].encode('ascii'))
    return plaintext
But if I deploy this code I just get the following error message from deployment manager:
Waiting for update [operation-<...>]...failed.
ERROR: (gcloud.deployment-manager.deployments.update) Error in Operation [operation-1517326129267-5640004f18139-450d8883-8d57c3ff]: errors:
- code: MANIFEST_EXPANSION_USER_ERROR
message: |
Manifest expansion encountered the following errors: Error compiling Python code: No module named googleapiclient.discovery Resource: cloudvpn-testenv.py Resource: config
I also tried to include the complete google-api-python-client library in my configuration, but I still get this error.
Does anyone have an idea?
To answer your question directly:
# requirements.txt
google-api-python-client

# main.py
import base64
import os

import googleapiclient.discovery

crypto_key_id = os.environ['KMS_CRYPTO_KEY_ID']


def decrypt(client, s):
    response = client \
        .projects() \
        .locations() \
        .keyRings() \
        .cryptoKeys() \
        .decrypt(name=crypto_key_id, body={"ciphertext": s}) \
        .execute()
    return base64.b64decode(response['plaintext']).decode('utf-8').strip()


kms_client = googleapiclient.discovery.build('cloudkms', 'v1')
auth = decrypt(kms_client, '...ciphertext...')
You can find more examples and samples on GitHub.
To indirectly answer your question, you may be interested in Secret Manager instead.
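If you do go the Secret Manager route, retrieval is a short snippet; a minimal sketch assuming a recent google-cloud-secret-manager client and a hypothetical secret named vpn-shared-secret in my-project (pip install google-cloud-secret-manager):

from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
# Hypothetical project and secret names; substitute your own.
name = 'projects/my-project/secrets/vpn-shared-secret/versions/latest'
response = client.access_secret_version(request={'name': name})
shared_secret = response.payload.data.decode('utf-8')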

AWS S3 - Able to Upload File from Local but not from Deployed (Access Denied)

My problem is that I cannot upload a file from my deployed project to an S3 bucket, even though I am able to upload from localhost. Except for the URL, everything remains the same (headers, body, etc.) when I call the method.
I am using boto3 to interact with S3 and the credentials of an IAM user I created. For deployment, I am using AWS Elastic Beanstalk.
Below is the code I am using for uploading;
def put(self, bytes, data, folder, file_name):
    self.ext = file_name.split(".")[-1]
    if self.__is_audio_ext(self.ext):
        if folder == self.__voice_record:
            self.__create_voice_record(data, folder, file_name)
        elif folder == self.__voice_message:
            self.__create_voice_message(data, folder, file_name)
        else:
            return "Response cannot be constructed."
        self.s3_client.put_object(Body=bytes, Bucket=self.bucket_name, Key=folder + "/" + file_name)
        return "Successfully created at URL " \
               + self.bucket_url + self.bucket_name + "/" + folder + "/" + file_name
    else:
        return "Invalid file type"
Also, below is how I set up boto3:
def __init__(self):
    self.ext = ""
    self.env = {
        "aws_access_key_id": settings.AWS_ACCESS_KEY_ID,
        "aws_secret_access_key": settings.AWS_SECRET_ACCESS_KEY,
        "region_name": 'eu-central-1'
    }
    self.bucket_name = "********"
    self.session = session.Session(region_name='eu-central-1')
    self.s3_client = self.session.client('s3', config=boto3.session.Config(signature_version='s3v4'))
    self.bucket_url = "http://s3-eu-central-1.amazonaws.com/"
When I make my PUT request to the server, this is the error I get:
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
Note that I created an IAM user, gave it full S3 permissions, and I am sure I am using the right credentials; the fact that I can upload from localhost shows as much.
That is why I believe the problem is somewhere between the file in my request and the deployed project, but even that does not seem right to me. Anyway, I am pretty confused here.
Please do not hesitate to ask about anything that is unclear; I may have skipped over some points.
I have been working on this for hours and could not come up with a proper solution, so I would be really glad for any help!
Thanks!
It's too late, but hopefully this is helpful to other new users. You should attach an instance profile to the EC2 instance with the right permissions for the S3 bucket, and make sure the bucket policy allows the role attached to the instance.
Follow this link
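To illustrate what this answer suggests, a minimal sketch under the assumption that the instance profile grants s3:PutObject on the bucket; with no explicit keys passed, boto3 discovers the instance profile credentials automatically (bucket and key names are placeholders):

import boto3

# No access keys in code: on EC2/Elastic Beanstalk the SDK falls back to
# the role attached to the instance via the instance metadata service.
s3_client = boto3.client('s3', region_name='eu-central-1')
s3_client.put_object(Body=b'hello', Bucket='my-bucket', Key='folder/file.txt')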

Google Python Admin SDK using OAuth2 for a Service Account (Education Edition) - "oauth2client.client.AccessTokenRefreshError: access_denied" exception

I have been trying to get the Service Account authentication working for the Google Admin SDK for a few days now to no avail. I am using the google-api-python-client-1.2 library freshly installed from Google.
I have been following Google's documentation on the topic. Links are here:
https://developers.google.com/accounts/docs/OAuth2ServiceAccount
https://developers.google.com/api-client-library/python/guide/aaa_oauth
http://google-api-python-client.googlecode.com/hg/docs/epy/oauth2client.client.SignedJwtAssertionCredentials-class.html
And have the tasks.py Service Account example working, which can be found here:
http://code.google.com/p/google-api-python-client/source/browse/samples/service_account/tasks.py?r=c21573904a2df1334d13b4380f63463c94c8d0e8
And have been closely studying these two Stack Overflow threads on a related topic here:
google admin sdk directory api 403 python
Google Admin API using Oauth2 for a Service Account (Education Edition) - 403 Error
And have studied the relevant code in gam.py (Dito GAM).
Yet I'm still missing something as I am getting an 'oauth2client.client.AccessTokenRefreshError: access_denied' exception in nearly every test case I write.
Here is a concise example of a test authentication:
import httplib2
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

f = file('myKey.p12', 'rb')
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    'myServiceAdmin@developer.gserviceaccount.com',
    key,
    sub='myAdminUser@my.googleDomain.edu',
    scope=['https://www.googleapis.com/auth/admin.directory.user'])

http = httplib2.Http()
http = credentials.authorize(http)

service = build('admin', 'directory_v1', http=http)
When I run the above code I get this stack dump and exception:
Traceback (most recent call last):
  File "./test.py", line 17, in <module>
    service = build('admin', 'directory_v1', http=http)
  File "/usr/lib/python2.7/dist-packages/oauth2client/util.py", line 132, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/apiclient/discovery.py", line 192, in build
    resp, content = http.request(requested_url)
  File "/usr/lib/python2.7/dist-packages/oauth2client/util.py", line 132, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/oauth2client/client.py", line 475, in new_request
    self._refresh(request_orig)
  File "/usr/lib/python2.7/dist-packages/oauth2client/client.py", line 653, in _refresh
    self._do_refresh_request(http_request)
  File "/usr/lib/python2.7/dist-packages/oauth2client/client.py", line 710, in _do_refresh_request
    raise AccessTokenRefreshError(error_msg)
oauth2client.client.AccessTokenRefreshError: access_denied
I've tried multiple super user accounts, service accounts, and keys and always end up with the same exception. If I add sub to the tasks.py example I end up with the same error. Replacing sub with prn also generates this exception and adding private_key_password='notasecret' does nothing (it is the default). The Admin SDK is activated in the Google Developers Console and the target accounts have super user privileges. This makes me think something is missing on the Google domain side but I cannot think of anything else to check.
Anyone have an idea what I am doing wrong?
Have you granted the service account third-party client access in your Admin Console?
My go-to instructions for setting up a service account are the ones Google has for the Drive API:
https://developers.google.com/drive/web/delegation
Take a look at the "Delegate domain-wide authority to your service account" part and see if you have completed those steps.
Maybe not the OP's problem, but I had this same error, and my issue was that I was setting the sub field in the credentials object:
credentials = SignedJwtAssertionCredentials(SERVICE_ACCOUNT_EMAIL, key,
                                            scope=SCOPES, sub=**<DON'T SET ME>**)
If you're using domain-wide delegation, you need to not set a sub (because your "user" is the domain administrator). The docs are a bit confusing on this point. I just removed the sub field and it worked for me.
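For reference, a minimal sketch of the corrected call under that assumption, with SERVICE_ACCOUNT_EMAIL, key, and SCOPES as in the snippets above and simply no sub argument:

credentials = SignedJwtAssertionCredentials(SERVICE_ACCOUNT_EMAIL, key, scope=SCOPES)
http = credentials.authorize(httplib2.Http())
service = build('admin', 'directory_v1', http=http)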