import boto3
client = boto3.client('cognito-idp')
def lambda_handler(event, context):
    response = client.list_users(
        UserPoolId='ap-south-1_w9By25vrR',
        AttributesToGet=['email', 'name'])
    return response
I am getting a dictionary of all the metadata in the output, but I want only the name and email.
Check the official AWS Code Library, which contains hundreds of these code examples in the different supported SDKs. It's the doc to refer to when looking for the latest AWS code examples.
Here is the Python example for this use case.
List users
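For this specific case, here is a minimal sketch built on the snippet from the question (same user pool ID and handler) that flattens each user's Attributes list into just the name and email values:

import boto3

client = boto3.client('cognito-idp')

def lambda_handler(event, context):
    response = client.list_users(
        UserPoolId='ap-south-1_w9By25vrR',
        AttributesToGet=['email', 'name'])
    # Keep only the requested attributes: one {'name': ..., 'email': ...} dict per user
    return [
        {attr['Name']: attr['Value'] for attr in user.get('Attributes', [])}
        for user in response['Users']
    ]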
My company has a Google Brand Account to manage our YouTube (YT) channel. We are trying to use the YouTube Analytics & Reporting API to automatically export metrics about our channel. To do so, we have created an App in our GCP Organisation and in that App we have created an OAuth client. Everything works fine when retrieving YouTube data for a channel owned by a user within our GCP Org. However, the Brand Account that owns the YT channel we are interested in is not a member of our GCP Org. This means that when trying to access that channel using our OAuth client we get the following error:
Error 403: org_internal
This client is restricted to users within its organization.
From searching online documentation it seems that we can do one of the following:
Make our App external from the APIs & Services -> OAuth consent screen section
Migrate the Brand Account to our GCP Org
I don't know how to do 2., or whether it is actually feasible at all. And 1. seems a bit overboard, as we don't really want to access data from any user with a Google Account, but it might be the only way. So I am looking for help on how best to proceed so that we can use an OAuth client within our GCP Org to get analytics data from our YT channel.
Looks like this was first documented here. It's marked "won't fix" by Google; I'm not sure why.
I was able to use this OAuth Playground (OAP) workaround to get my app to work. It's a pretty sad workaround because the token will only work for an hour and then you must manually refresh in the playground.
Once you get the token from OAP, here is the code I'm using.
import os, json
from pathlib import Path
import google.oauth2.credentials
from googleapiclient.discovery import build
# pasted from OAP, note only access_token is actually needed
Path('oap.json').write_text('''
{
"access_token": "tokenstring",
"scope": "https://www.googleapis.com/auth/youtube.readonly https://www.googleapis.com/auth/yt-analytics.readonly",
"token_type": "Bearer",
"expires_in": 3599,
"refresh_token": "refreshtoken"
}
''')
TOKEN_FILE = 'oap.json'
# For Reference
# SCOPES = ['https://www.googleapis.com/auth/yt-analytics.readonly',
# 'https://www.googleapis.com/auth/youtube.readonly']
API_SERVICES = [
    ('youtubeAnalytics', 'v2'),
    ('youtube', 'v3')
]
oap = json.load(open(TOKEN_FILE, 'r'))
creds = google.oauth2.credentials.Credentials(oap['access_token'])
service_list = []
for API_SERVICE_NAME, API_VERSION in API_SERVICES:
    service = build(API_SERVICE_NAME, API_VERSION, credentials=creds)
    service_list.append(service)
ytAnalytics, ytData = service_list
# test ytData
req = ytData.channels().list(
    part='id,snippet',
    mine=True)
res = req.execute()
print(res)

for channel in res['items']:
    print('Channel:', channel['snippet']['title'])
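For reference, here is a minimal sketch of an actual Analytics query against the ytAnalytics service built above; the date range and metrics are placeholders to adapt:

# Hypothetical report; adjust startDate/endDate and metrics to your needs
report = ytAnalytics.reports().query(
    ids='channel==MINE',
    startDate='2022-01-01',
    endDate='2022-01-31',
    metrics='views,estimatedMinutesWatched'
).execute()
print(report)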
I am trying to have a GitHub webhook launch an AWS Lambda function I have.
The best way I can figure out how to do that is to use AWS API Gateway; the issue is security.
GitHub webhooks will only send a secret with the POST call.
I can't find any way to have AWS API Gateway verify this signature, or where I could add this functionality.
I assume I can write an AWS Lambda Authorizer.
But this is a lot of code in different places; I'm starting to see the need for the Serverless Framework.
Is there an easier setup within AWS that I don't know about for this?
I came here because I was trying to integrate a GitHub webhook with AWS Lambda and ran into the same problem as the OP. At the time of writing, I believe the best solution is to include verification code in the main Lambda, as others have suggested.
On the AWS Compute Blog from September 2017:
Enhanced request authorizer Lambda functions receive an event object that is similar to proxy integrations. It contains all of the information about a request, excluding the body.
Source: Using Enhanced Request Authorizers in Amazon API Gateway (amazon.com)
You can't perform HMAC verification as recommended by GitHub, because AWS authorizer Lambdas don't give you access to the body of the HTTP request, which you need in order to compare digests.
This is a shame, because HMAC seems to be a pretty standard way of securing an endpoint that responds to a webhook. See for example this blog post, Webhooks do's and dont's: what we learned after integrating +100 APIs (restful.io). Twitter and Stripe do something similar:
Securing webhooks (twitter.com)
Checking Webhook Signatures (stripe.com)
To make the approach described above work, if you're using API Gateway you'll need to make sure that the header that contains the hash signature is forwarded as part of the event argument to the lambda. To do that, follow these instructions: How do I pass custom headers through Amazon API Gateway to an AWS Lambda function using custom Lambda integration for further processing? (amazon.com)
I couldn't find a way to do this with API Gateway, so I validated within the Lambda using Python.
High-level overview: calculate the HMAC signature with GITHUB_SECRET, then compare it to the signature passed from GitHub.
You can obviously simplify; it's intentionally verbose for readability. There may be better ways, but I couldn't find one.
Make sure your webhook is configured for application/json. Hopefully this helps someone else.
import logging
import json
import hmac
import hashlib
import re
from urllib.parse import unquote
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
GITHUB_SECRET = 'SECRET FROM GITHUB CONSOLE'
def lambda_handler(event, context):
    logger.info("Lambda execution starting up...")
    incoming_signature = re.sub(r'^sha1=', '', event['headers']['X-Hub-Signature'])
    incoming_payload = unquote(re.sub(r'^payload=', '', event['body']))
    calculated_signature = calculate_signature(GITHUB_SECRET, incoming_payload.encode('utf-8'))

    # hmac.compare_digest avoids leaking timing information during the comparison
    if not hmac.compare_digest(incoming_signature, calculated_signature):
        logger.error('Unauthorized attempt')
        return {
            'statusCode': 403,
            'body': json.dumps('Forbidden')
        }

    logger.info('Request successfully authorized')
    # do stuff in Lambda
    return {
        'statusCode': 200,
        'body': json.dumps('Work in progress')
    }

def calculate_signature(github_secret, github_payload):
    # Compute the HMAC-SHA1 hex digest of the payload using the webhook secret
    signature_bytes = bytes(github_secret, 'utf-8')
    digest = hmac.new(key=signature_bytes, msg=github_payload, digestmod=hashlib.sha1)
    signature = digest.hexdigest()
    return signature
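As an aside, GitHub now also sends an X-Hub-Signature-256 header; if you'd rather verify the SHA-256 variant, a minimal standalone sketch (the helper name is mine) could look like this:

import hashlib
import hmac

def verify_sha256_signature(secret, payload, signature_header):
    # signature_header arrives as 'sha256=<hexdigest>'; payload is the raw body bytes
    expected = hmac.new(secret.encode('utf-8'), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature_header.removeprefix('sha256='), expected)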
I have an application that must verify the identity of its caller. For this I need ID tokens (the JWTs from Google's OpenID Connect implementation) from each client calling the service.
I would like to write client code that works both locally using the default user credentials—for testing and development—and on a Compute Engine instance in production. The official Python auth SDK generally does a good job of handling those cases and saving me the trouble of checking the environment, e.g. I can just call google.auth.default and it figures out where to get credentials.
However, that google.auth package only seems to be able to give me access tokens, not ID tokens, in an environment-independent way. Here is what I tried:
import google.auth
from google.auth.transport import requests
credentials, project = google.auth.default(scopes=["openid"])
req = requests.Request()
credentials.refresh(req)
print(credentials.id_token)
This works on my laptop with my default credentials, but on the Compute Engine instance I instead get an error AttributeError: 'Credentials' object has no attribute 'id_token'
According to this page in the docs, you are supposed to fetch an ID token for an instance by requesting it from the metadata server...
import requests
audience = 'service_identifier'
metadata_server_token_url = 'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience='
token_request_url = metadata_server_token_url + audience
token_request_headers = {'Metadata-Flavor': 'Google'}
token_response = requests.get(token_request_url, headers=token_request_headers)
jwt = token_response.content.decode("utf-8")
print(jwt)
I don't want to do that. I don't want to manually check the environment. The SDK is supposed to handle that complexity for me. Shouldn't there be a way to leverage the google-auth SDK to generate an ID token in an environment-independent way?
EDIT 1: Why I need this
The application is based on Cloud Functions, and it returns highly sensitive data. Only a specific set of subjects—some trusted devs and services—should be able to access that data. So, the cloud functions must verify the ID of the caller (user or service account) using an ID token signed by Google. Specifically, I need to know the sub claim, the "subject" in the JWT. It is effectively the same issue that the IAM features are meant to solve, documented here. However, I cannot use these because they are still in beta. So, I'm writing identity checks manually into the cloud functions for the time being.
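For reference, a minimal sketch of the check each function could run on the incoming token, using google-auth's verifier; the ALLOWED_SUBJECTS set and check_caller name are mine:

from google.auth.transport import requests
from google.oauth2 import id_token

ALLOWED_SUBJECTS = {'1234567890'}  # hypothetical allow-list of trusted 'sub' claims

def check_caller(token, audience):
    # Verifies the Google-signed JWT (signature, expiry, audience) and returns its claims
    claims = id_token.verify_oauth2_token(token, requests.Request(), audience)
    if claims['sub'] not in ALLOWED_SUBJECTS:
        raise PermissionError('caller is not in the trusted set')
    return claims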
I think I have an answer to this question. I could always get it to work locally or in the cloud; the trick was finding a way to combine the two. A colleague of mine actually showed me how to do this and I just wanted to share it with others who are looking for a solution.
import google.auth
from google.auth.transport.requests import AuthorizedSession, Request
from google.oauth2.id_token import fetch_id_token
import requests
def GetIdToken(audience):
    credentials, _ = google.auth.default()
    session = AuthorizedSession(credentials)
    request = Request(session)
    credentials.refresh(request)
    if hasattr(credentials, "id_token"):
        return credentials.id_token
    return fetch_id_token(request, audience)

def ProcessPayload(url, payload):
    # Get the ID token
    id_token = GetIdToken(url)

    # Process post request
    headers = {'Authorization': f'Bearer {id_token}'}
    response = requests.post(url, json=payload, headers=headers)
    return response
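A hypothetical usage, where the Cloud Function's URL doubles as the ID token audience:

# Hypothetical Cloud Functions endpoint; the URL serves as the token audience
result = ProcessPayload(
    'https://us-central1-my-project.cloudfunctions.net/my-function',
    {'key': 'value'})
print(result.status_code)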
I am trying to get the list of user pools in AWS Cognito, but an error arises.
Host URL: https://cognito-idp.us-east-1.amazonaws.com/ListUserPools?Version=2016-04-18
I am using Postman; it calculates the AWS signature correctly, so there's no problem there...
AWS ListUserPools : https://docs.aws.amazon.com/cognito-user-identity-pools/latest/APIReference/API_ListUserPools.html
Where do I place the request syntax shown on the documentation link?
You have to use either the AWS CLI or one of the AWS SDKs to make this call; you can't just do it using Postman. Scroll down on the documentation page you posted, select the SDK you want to develop with, and follow the documentation for that language.
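For example, with the Python SDK (boto3) the equivalent call is list_user_pools; a minimal sketch, where the region is an assumption based on the endpoint in the question:

import boto3

# us-east-1 matches the endpoint shown in the question
client = boto3.client('cognito-idp', region_name='us-east-1')

# MaxResults is required (1-60); use NextToken in the response to paginate
response = client.list_user_pools(MaxResults=60)
for pool in response['UserPools']:
    print(pool['Id'], pool['Name'])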
I'm having difficulties using Google's new Admin SDK. In particular, the Directory API using OAuth2.
I think I'm almost there, but I've gotten stuck trying to retrieve a user's details using the Directory API (I'm using a Google Education Edition domain).
Basically what I'm trying to do is write a Python script that provisions or de-provisions users based on their enrollment status, which is managed by our AD. I've got a script that does this using OAuth1 but want to update it to use OAuth2.
Here is a code snippet based on some examples I found.
f = file('test_key.p12', 'rb')
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    '606346240424-10stfco1ukj9h4m4b4r40@developer.gserviceaccount.com',
    key,
    scope='https://www.googleapis.com/auth/admin.directory.user')

http = httplib2.Http()
http = credentials.authorize(http)

service = build(serviceName='admin', version='directory_v1', http=http)
lists = service.users().get(userKey='joe.blogs@mydomain.com').execute(http=http)
pprint.pprint(lists)
This piece of code appears to connect correctly but when I try to execute the query I get a 403 error.
ERROR: https://www.googleapis.com/admin/directory/v1/users/joe.blogs@mydomain.com?alt=json returned "Not Authorized to access this resource/api">
My first thought was that I hadn't turned on this API in the administrator's console (the Google APIs console), but I have. (Actually, I turned on the Admin SDK rather than the Directory API, because there is no separate Directory API to turn on; since the Directory API is part of the Admin SDK, I assumed that would be enough.)
Is there another step I'm missing or have I made a silly mistake somewhere?
Bruce,
you're pretty close.
Couple of items:
If you're using App Engine, you need to convert the p12 key to pem and strip the header
You need to include the user with super-user credentials (who has permission to do these operations) whom you're impersonating (not the user who is being changed), using the sub= parameter
So full code will look a bit like this:
# domain configuration settings
import domainconfig
f = file(domainconfig.KEY_FILE, "rb")  # 'b' reads the file in binary mode; not strictly necessary, but safer to avoid strange Windows EOL characters: https://stackoverflow.com/questions/9644110/difference-between-parsing-a-text-file-in-r-and-rb-mode
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    domainconfig.SERVICE_ACCOUNT_EMAIL,
    key,
    scope=domainconfig.SCOPE,
    sub=domainconfig.SUB_ACCOUNT_EMAIL  # 'sub' supersedes the deprecated 'prn'
)
http = httplib2.Http()
http = credentials.authorize(http)

directoryservice = build("admin", "directory_v1", http=http)
users = directoryservice.users()
response = users.get(userKey='joe.blogs@mydomain.com').execute()
This should be of help: https://developers.google.com/drive/delegation
When asserting the credentials, you need to connect them to the user that is going to be changed. From the link above, note this section:
credentials = SignedJwtAssertionCredentials(SERVICE_ACCOUNT_EMAIL, key,
    scope='https://www.googleapis.com/auth/drive', sub=user_email)