How to authenticate to AWS using AWS-ADFS to use Boto3 (Python)

I have the following Python code to connect to a DynamoDB table in AWS:
from __future__ import print_function  # Python 2/3 compatibility
import boto3
import json
import decimal
from boto3.dynamodb.conditions import Key, Attr

# Helper class to convert a DynamoDB item to JSON.
class DecimalEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, decimal.Decimal):
            return str(o)
        return super(DecimalEncoder, self).default(o)

def run():
    dynamodb = boto3.resource('dynamodb', region_name='ap-southeast-2')
    table = dynamodb.Table('work-gtfsr-tripupdates-dev-sydt')
    response = table.query(
        # ProjectionExpression="#yr, title, info.genres, info.actors[0]",  # THIS IS A SELECT STATEMENT
        # ExpressionAttributeNames={"#yr": "year"},  # Expression Attribute Names for Projection Expression only. #SELECT STATEMENT RENAME
        KeyConditionExpression=Key('pingEpochTime').eq(1554016605) & Key('entityIndex').between(0, 500)
    )
    for i in response[u'Items']:
        print(json.dumps(i, cls=DecimalEncoder))

run()
This code is confirmed to work when I connect to my personal AWS account (authenticating via the AWS CLI), but it does not work behind the corporate firewall when I authenticate via aws-adfs. When I run the code against the corporate AWS account, I get the error:
botocore.exceptions.ClientError: An error occurred (UnrecognizedClientException) when calling the Query operation: The security token included in the request is invalid.
When I run the 'aws-adfs login' script (which is confirmed to work), it correctly populates the .aws folder in my home directory, and it has worked when deploying Lambda functions in the past. Should I be doing something in the code to accommodate aws-adfs session tokens?

I found on another Stack Overflow page that the Boto library apparently requires a '[default]' entry in the ~/.aws/credentials file.
I tested the aws-adfs login script by authenticating with a profile called 'default', and everything works now.
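For reference, after a successful login the credentials file should contain something along these lines (a sketch; the key values are placeholders, and aws-adfs writes temporary credentials that include a session token):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id     = AKIA...
aws_secret_access_key = ...
aws_session_token     = ...
```

Boto3 resolves the '[default]' profile automatically when no profile is specified in code.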

With boto3 you can point a session at a specific profile and create the resource from it:
session = boto3.session.Session(profile_name="other_name")
dynamodb = session.resource('dynamodb', region_name='ap-southeast-2')
where "other_name" is the name of the profile you authenticated with.

Related

How to get a trigger or notification to Angular whenever a GCP bucket gets updated

I want to get a notification in Angular when a GCP bucket gets updated. Can anyone help me with that?
The idea is: whenever the bucket is updated, an HTTP POST request is sent to your Python web application. All you need to do in Python is handle that POST request.
For example:
import http.server
import json
import logging
import socketserver
from http import HTTPStatus

class Handler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        self.send_response(HTTPStatus.OK)
        self.end_headers()
        self.wfile.write(b'Hello world')

    def do_POST(self):
        # Object-change notifications carry the event type in this header.
        if 'X-Goog-Resource-State' in self.headers:
            resource_state = self.headers['X-Goog-Resource-State']
            if resource_state == 'sync':
                logging.info('Sync message received.')
            else:
                length = int(self.headers.get('Content-Length', 0))
                an_object = json.loads(self.rfile.read(length))
                bucket = an_object['bucket']
                object_name = an_object['name']
                logging.info('%s/%s %s', bucket, object_name, resource_state)
        else:
            logging.info('Other post.')
        self.send_response(HTTPStatus.OK)
        self.end_headers()

httpd = socketserver.TCPServer(('', 8093), Handler)
httpd.serve_forever()
After the web server above is up, register the notification channel in Cloud Console, pointing it at your URL:
gsutil notification watchbucket yourURL gs://BucketName
The POST-handling logic is based on the webapp2 example here:
https://cloud.google.com/storage/docs/object-change-notification
The web server code is copied from here:
https://gist.github.com/davidbgk/b10113c3779b8388e96e6d0c44e03a74
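The header/payload handling the server needs boils down to a small dispatch that can be sketched and tested on its own; `parse_notification` is a hypothetical helper for illustration, not part of the GCS API:

```python
import json

def parse_notification(headers, body):
    """Classify an object-change notification from its headers and JSON body."""
    state = headers.get('X-Goog-Resource-State')
    if state is None:
        return ('other', None)   # not a GCS notification
    if state == 'sync':
        return ('sync', None)    # channel handshake, no payload to parse
    payload = json.loads(body)
    return (state, '%s/%s' % (payload['bucket'], payload['name']))

print(parse_notification({'X-Goog-Resource-State': 'exists'},
                         '{"bucket": "my-bucket", "name": "photo.jpg"}'))
# → ('exists', 'my-bucket/photo.jpg')
```
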

directory_v1 always returns 403 from a cloud function

I have followed the steps mentioned in: https://developers.google.com/admin-sdk/directory/v1/guides/delegation
The service account has all the necessary domain-wide delegations.
I want to run the code below in a Cloud Function without passing credentials to the build method, but it always returns 403. Help appreciated.
import pickle
import os.path
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/admin.directory.user']

def directory_api(request):
    """Shows basic usage of the Admin SDK Directory API.
    Prints the emails and names of the first 10 users in the domain.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    print("before build")
    service = build('admin', 'directory_v1')
    # Call the Admin SDK Directory API
    print('Getting the first 10 users in the domain')
    try:
        results = service.users().list(domain="sss.com", viewType="domain_public").execute()
        print(results)
        users = results.get('users', [])
    except Exception as excs:
        print(excs)
    if not users:
        print('No users in the domain.')
    else:
        print('Users:')
        for user in users:
            print(u'{0} ({1})'.format(user['primaryEmail'],
                                      user['name']['fullName']))
    return "ok"
Issue:
You are trying to access a private resource without providing any credentials, which is why you are getting a 403.
Also, you mentioned a Service Account, but the code you shared corresponds to authenticating as a regular account (you are not actually using those credentials anyway, and that is not how you build credentials from a Service Account).
Solution:
Granting domain-wide delegation (DWD) to a Service Account is of no use unless you also:
Use the Service Account credentials to impersonate a regular account that has access to the resource you are trying to reach (probably an admin user in this case). The purpose of DWD is that the Service Account can act on behalf of any user in the domain, but you have to specify which user the Service Account should impersonate; otherwise the Service Account behaves as if you hadn't granted DWD at all.
Use the delegated credentials retrieved in the previous step when building the service.
The page you referenced has an example of how to delegate credentials with a Service Account; check this.
An example more adapted to your needs, using JSON instead of P12, could be:
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/admin.directory.user']
SERVICE_ACCOUNT_FILE = 'credentials.json'

def listUsers():
    creds = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES)
    # Impersonate an admin user; DWD only takes effect when a subject is set.
    delegatedCreds = creds.with_subject('your-admin-account@sss.com')
    service = build('admin', 'directory_v1', credentials=delegatedCreds)
    # Call the Admin SDK Directory API
    print('Getting the first 10 users in the domain')
    users = []
    try:
        results = service.users().list(domain="sss.com", viewType="domain_public").execute()
        print(results)
        users = results.get('users', [])
    except Exception as excs:
        print(excs)
    if not users:
        print('No users in the domain.')
    else:
        print('Users:')
        for user in users:
            print(u'{0} ({1})'.format(user['primaryEmail'],
                                      user['name']['fullName']))
    return "ok"
Reference:
Using OAuth 2.0 for Server to Server Applications
Perform G Suite Domain-Wide Delegation of Authority

Django: Stripe & POST request

I am currently trying to implement Stripe Connect in my Django project. Stripe documentations states for Standard accounts:
Assuming no error occurred, the last step is to use the provided code
to make a POST request to our access_token_url endpoint to fetch the
user’s Stripe credentials:
curl https://connect.stripe.com/oauth/token \
-d client_secret=sk_test_Dur3X2cOCwyjlf9Nr7OCf3qO \
-d code="{AUTHORIZATION_CODE}" \
-d grant_type=authorization_code
I now wonder how to send a POST request with Django without a form and without user action (clicking a submit button)?
Since Standard Connect relies on OAuth for its connection flow
(https://stripe.com/docs/connect/standard-accounts#oauth-flow),
you can use an OAuth Python library like Rauth, as you mentioned, to handle the flow.
Also note that the Stripe Python library provides an implementation of the OAuth flow here:
https://github.com/stripe/stripe-python/blob/a938c352c4c11c1e6fee064d5ac6e49c590d9ca4/stripe/oauth.py
You can see an example of its usage here:
https://github.com/stripe/stripe-python/blob/f948b8b95b6df5b57c7444a05d6c83c8c5e6a0ac/examples/oauth.py
The example uses Flask, not Django, but should give you a good idea of how to use it.
Regarding the advantages of using an existing OAuth implementation over making the calls directly yourself: one advantage is that you reuse a library that covers the different use cases (e.g. better error handling) and is well tested.
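For completeness, the curl call from the question maps to a plain server-side form-encoded POST. This sketch only builds the request parameters (the secret key and authorization code are placeholders); the actual exchange is shown in the comments rather than performed:

```python
TOKEN_URL = 'https://connect.stripe.com/oauth/token'

def build_token_request(client_secret, authorization_code):
    """Mirror the curl -d flags as the form fields of the POST."""
    return {
        'client_secret': client_secret,
        'code': authorization_code,
        'grant_type': 'authorization_code',
    }

payload = build_token_request('sk_test_your_key', 'ac_your_code')
# In a Django view you would then send it with e.g. the requests library:
#   resp = requests.post(TOKEN_URL, data=payload)
#   stripe_user_id = resp.json()['stripe_user_id']
print(payload['grant_type'])  # → authorization_code
```
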
Thanks to @psmvac I could now implement it the 'proper' way, using Stripe's OAuth support. Here is some reference/example Django code if anyone is trying the same. Obviously, urls.py has to be configured. This is in my views.py:
def stripe_authorize(request):
    import stripe
    stripe.api_key = ''
    stripe.client_id = 'XYZ'
    url = stripe.OAuth.authorize_url(scope='read_only')
    return redirect(url)

def stripe_callback(request):
    import stripe
    from django.http import HttpResponse
    stripe.api_key = 'XYZ'
    code = request.GET.get('code', '')
    try:
        resp = stripe.OAuth.token(grant_type='authorization_code', code=code)
    except stripe.oauth_error.OAuthError as e:
        return HttpResponse('Error: ' + str(e))
    full_response = '''
    <p>Success! Account <code>{stripe_user_id}</code> is connected.</p>
    <p>Click here to disconnect the account.</p>
    '''.format(stripe_user_id=resp['stripe_user_id'])
    return HttpResponse(full_response)

def stripe_deauthorize(request):
    from django.http import HttpResponse
    import stripe
    stripe_user_id = request.GET.get('stripe_user_id', '')
    try:
        stripe.OAuth.deauthorize(stripe_user_id=stripe_user_id)
    except stripe.oauth_error.OAuthError as e:
        return HttpResponse('Error: ' + str(e))
    full_response = '''
    <p>Success! Account <code>{stripe_user_id}</code> is disconnected.</p>
    <p>Click here to restart the OAuth flow.</p>
    '''.format(stripe_user_id=stripe_user_id)
    return HttpResponse(full_response)

How to stop/start services of ambari cluster using AWS Lambda and AWS API Gateway

I want to pass the web services to AWS Lambda so that I can use them to stop/start the services of an Ambari cluster.
Thanks in advance.
AWS Lambda can be easily integrated with almost all web services, including EC2, through programmatic API calls using boto3.
You just need to create a boto3 client for the AWS service in the Lambda function, and then you can use it the way you want (e.g. start/stop).
AWS Lambda also lets you schedule its invocation.
As you mentioned in the comment, you need to stop the services using an API.
Here is a basic code snippet that works:
# You need to create a zip that includes the requests lib and upload it to the Lambda function.
import requests

# lambda_handler is the entry point that Lambda invokes.
def lambda_handler(event, context):
    url = ''
    json_body = {}
    try:
        api_response = requests.put(url=url, json=json_body)
    except Exception as err:
        print(err)
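For Ambari specifically, services are typically stopped and started through its REST API by PUTting a target state ('INSTALLED' to stop, 'STARTED' to start) to the service resource. A sketch of how the URL and body could be assembled (the host, cluster, and service names are placeholders; check the API docs for your Ambari version):

```python
AMBARI_HOST = 'http://ambari-host:8080'  # placeholder

def build_service_request(cluster, service, action):
    """Build the URL and JSON body for an Ambari service state change."""
    state = {'stop': 'INSTALLED', 'start': 'STARTED'}[action]
    url = '%s/api/v1/clusters/%s/services/%s' % (AMBARI_HOST, cluster, service)
    body = {
        'RequestInfo': {'context': '%s %s via REST' % (action.title(), service)},
        'Body': {'ServiceInfo': {'state': state}},
    }
    return url, body

url, body = build_service_request('mycluster', 'HDFS', 'stop')
# Then, inside lambda_handler, something like:
#   requests.put(url, json=body, auth=('admin', 'admin'),
#                headers={'X-Requested-By': 'ambari'})
```
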
If you have these servers running in AWS, you can create a boto3 client and use:
import boto3
from boto3.session import Session
from botocore.exceptions import ClientError

aws_access_key = 'xxxxxxxxxxxxxxxxxxxx'
aws_secret_key = 'yyyyyyyyyyyyyyyyyyyyy'
region = 'xx-yyyyyy'

def lambda_handler(event, context):
    try:
        sess = Session(aws_access_key_id=aws_access_key,
                       aws_secret_access_key=aws_secret_key)
        ec2_conn = sess.client(service_name='ec2', region_name=region)
        response = ec2_conn.start_instances(
            InstanceIds=[
                'i-xxxxxx', 'i-yyyyyy', 'i-zzzzzz',
            ])
    except ClientError as e:
        print(e.response['Error']['Message'])
P.S.: this is a basic snippet and can vary according to your use case.

Is it possible to use Google+ Domains API to access a user not part of a Domain?

Hello to anyone who has any suggestion to help,
QUESTIONS:
Is it possible to use the 'Google+ Domains API' to access a non-Domain Google+ profile?
For Google+ accounts not associated with a Domain, can only the 'Google+ APIs' be used?
Is there a way the following could be accomplished using standard OAuth 2.0 client credentials instead of a Service Account?
BACKGROUND:
I'm trying to use the Google+ Domains API for Python to upload .jpg images from a Linux device and insert them into my personal Google+ profile/account.
This account is not part of a Domain, and no other Google+ users' data will be accessed by my application. I've created a Service Account and a PKCS #12 key with 'Owner' permission that I use for authorization. Note that I did not set up domain-wide delegation of authority, because I believe there is no Domain associated with my personal Google+ account.
I am getting a 403 error trying to access Google+ Domains using a Service Account:
googleapiclient.errors.HttpError: <HttpError 403 when requesting
https://www.googleapis.com/upload/plusDomains/v1/people/me/media/cloud?
uploadType=multipart&alt=json returned "Forbidden">
The "Forbidden" error is returned when executing media().insert() on the collection 'cloud'. Here is the Python code that I assembled from examples in the developer guides.
import httplib2
import pprint
import sys
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials
from oauth2client import client
from httplib2 import Http

SCOPES = ['https://www.googleapis.com/auth/plus.me',
          'https://www.googleapis.com/auth/plus.media.upload',
          'https://www.googleapis.com/auth/plus.stream.write']

def main(argv):
    f = open('Trebor NAO Access-c35352d48d33.p12', 'rb')
    key = f.read()
    f.close()
    credentials = SignedJwtAssertionCredentials(
        'trebor-nao-access-service-659@trebor-nao-access.iam.gserviceaccount.com',
        key, SCOPES)
    http = httplib2.Http()
    http = credentials.authorize(http)
    service = build('plusDomains', 'v1', http=http)
    try:
        user_id = 'me'
        print('Uploading a picture of a box of MiniWheats cereal')
        result = service.media().insert(
            media_body='MiniWheats.jpg',
            body={'displayName': 'MiniWheats.jpg'},
            userId=user_id,
            collection='cloud',
        ).execute()
        print('result = %s' % pprint.pformat(result))
        media_id = result.get('id')
        print('MediaID of MiniWheats.jpg: %s' % media_id)
    except Exception as e:
        print('Error: %s' % e)

if __name__ == '__main__':
    main(sys.argv)
Any advice you can give is greatly appreciated.
Robert Dixey