Alternative for context.succeed(event) - amazon-web-services

I am trying to create a user migration trigger in Python for my user pool. In this trigger, what is the alternative to context.succeed(event) that needs to be sent back to Cognito after successful verification of the user?

According to the docs, you can return the event:

def lambda_handler(event, context):
    # Your code here
    # Return to Amazon Cognito
    return event
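For the user migration trigger specifically, the usual pattern is to fill in event['response'] before returning the event. A minimal sketch, assuming the UserMigration_Authentication flow; the helper authenticate_against_legacy_store is hypothetical and stands in for whatever check you run against your existing user store:

def lambda_handler(event, context):
    if event['triggerSource'] == 'UserMigration_Authentication':
        # Verify the user against your existing user store (hypothetical helper)
        user = authenticate_against_legacy_store(event['userName'],
                                                 event['request']['password'])
        if user:
            event['response']['userAttributes'] = {
                'email': user['email'],
                'email_verified': 'true',
            }
            # Mark the user confirmed and suppress the welcome message
            event['response']['finalUserStatus'] = 'CONFIRMED'
            event['response']['messageAction'] = 'SUPPRESS'
    # Returning the event replaces context.succeed(event)
    return event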

Related

Subscribe to an Amazon SNS topic using an AWS Lambda function, i.e. sns.subscribe()?

I want to send an SNS subscription request to a topic for email subscription.
As I'm new to Lambda, can you please let me know which libraries need to be imported and which functions need to be triggered?
Thanks in advance.
Import the boto3 package
Create a reference to the SNS boto3 client
Create an SNS topic
Create an email subscriber to that SNS topic
(Make sure you follow the indentation in Python. It's running code.)
import boto3

snsClient = boto3.client('sns')

def lambda_handler(event, context):
    topicName = "myFirstTopic"
    emailId = "yourname@company.com"
    # Create the topic; if one already exists with the specified name,
    # that topic's ARN is returned without creating a new topic.
    snsArn = createSNSTopic(topicName)
    # Create a subscriber for the specified SNS topic
    snsEndpointSubscribe(snsArn, emailId)

# Function to create an SNS topic
def createSNSTopic(topicName):
    try:
        response = snsClient.create_topic(
            Name=topicName
        )
        print(response['TopicArn'])
        return response['TopicArn']
    except Exception as ex:
        print(ex)

# Function to create an SNS subscriber for that topic
def snsEndpointSubscribe(snsArn, emailId):
    response = snsClient.subscribe(
        TopicArn=snsArn,
        Protocol='email',
        Endpoint=emailId,
    )
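Note that for the email protocol the subscription is not active until the recipient clicks the confirmation link SNS mails out. Until then the subscribe response carries a pending ARN, which you can see by printing it at the end of snsEndpointSubscribe:

    # For email endpoints the ARN reads "pending confirmation"
    # until the recipient confirms the subscription.
    print(response.get('SubscriptionArn'))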

Perform an AppSync mutation from a Lambda (Python) function, triggered by an S3 upload

Greetings all!
I have a client application in React that uses an AppSync GraphQL API back end. I have added storage to Amplify and can upload files to the storage. When a file is uploaded to the storage, a Lambda function is triggered, and I am able to iterate through the contents. I have configured my AWS AppSync API with Cognito user pool authentication.
My requirement is that I need to make a GraphQL mutation from the Lambda function (written in Python) to an AppSync API.
The requirement in short:
Upload an Excel file to the S3 storage bucket using AppSync
Trigger a Lambda function (Python) on upload to extract the contents of the Excel file
Send a GraphQL mutation to the AppSync API from the Lambda function with the contents of the file, using the schema as required
I need to use the Cognito user credentials, as I need to get the details of the user making the mutations.
With the last update, I am able to upload content from the client app, which dumps the Excel sheet in the S3 bucket, triggering the Lambda function.
I tried to use "assume role" but got the error below:
[ERROR] ClientError: An error occurred (AccessDenied) when calling the AssumeRole operation: User: arn:aws:sts::#############:assumed-role/S3Accessfunction/uploadProcessor is not authorized to perform: sts:AssumeRole on resource: AWS:#######################:CognitoIdentityCredentials
I'm currently using this workaround in Python to authorize my AppSync calls with Cognito.
First, my authorization:
import boto3
import json
from botocore.exceptions import ClientError
import requests
from requests_aws4auth import AWS4Auth

# AppSync variables
s3 = boto3.client('s3')
APPSYNC_API_ENDPOINT_URL = ""
appSyncUsername = ""
appSyncPassword = ""
appSyncClientID = ""
appSyncUserPool = ""

# Create a token for AppSync functionality using an authorized user
def createToken():
    client = boto3.client('cognito-idp')
    response = client.initiate_auth(
        ClientId=appSyncClientID,
        AuthFlow='USER_PASSWORD_AUTH',
        AuthParameters={
            'USERNAME': appSyncUsername,
            'PASSWORD': appSyncPassword
        },
        ClientMetadata={
            "UserPoolId": appSyncUserPool
        }
    )
    token = response["AuthenticationResult"]["AccessToken"]
    return token
Using this token, I can then make authorized calls:
def getExample(Id, token):
    session = requests.Session()
    Id = "\"" + Id + "\""
    query = """query MyQuery {
      getSomething(id: """ + Id + """) {
        id
        something
      }
    }"""
    response = session.request(
        url=APPSYNC_API_ENDPOINT_URL,
        method='POST',
        headers={'authorization': token},
        json={'query': query}
    )
    something = response.json()["data"]["getSomething"]["something"]
    return something
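Since the original requirement was a mutation rather than a query, the same token/session pattern carries over. A sketch, assuming a placeholder mutation createSomething whose name and input fields you would replace with your own schema; passing GraphQL variables in the payload avoids the quote-escaping of the string-spliced query above:

def createExample(Id, contents, token):
    session = requests.Session()
    # Placeholder mutation; substitute your schema's mutation and fields
    mutation = """mutation MyMutation($id: ID!, $something: String!) {
      createSomething(input: {id: $id, something: $something}) {
        id
        something
      }
    }"""
    response = session.request(
        url=APPSYNC_API_ENDPOINT_URL,
        method='POST',
        headers={'authorization': token},
        json={'query': mutation,
              'variables': {'id': Id, 'something': contents}}
    )
    return response.json()["data"]["createSomething"]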

Send username to an AWS Lambda function triggered when a user registers through AWS Cognito

I am trying to write a Lambda function that makes a folder in an S3 bucket when a Cognito user is newly confirmed. This will allow me to keep that user's access limited to their folder. I have created a Lambda function that can list the current users registered in the user pool. I know Cognito has a "confirmation event" and a "post authentication" event trigger, and I have selected my function to run on that trigger.
But I do not know how to make the folder when the user authenticates or is confirmed from that event. My Lambda code is below.
Here is my code for the post authentication trigger, but it does not work:
from __future__ import print_function

def lambda_handler(event, context):
    # Send post authentication data to CloudWatch Logs
    print("Authentication successful")
    print("Trigger function =", event['triggerSource'])
    print("User pool =", event['userPoolId'])
    print("App client ID =", event['callerContext']['clientId'])
    print("User ID =", event['userName'])
    # Return to Amazon Cognito
    return event
Here is the code to list users. It works, but how do I fetch only the username and, on that basis, create a folder in an S3 bucket?
import json
import boto3
import re

def lambda_handler(event, context):
    client = boto3.client('cognito-idp')
    response = client.list_users(
        UserPoolId='us-east-1_EVPcl4p64',
        AttributesToGet=['email']
    )
    # default=str handles the datetime fields in the response
    print(json.dumps(response, default=str))
    # Pull each user's email attribute out of the response and
    # apply the intended gmail check to the address itself
    for user in response['Users']:
        for attr in user['Attributes']:
            if attr['Name'] == 'email':
                email = attr['Value']
                if re.search('@gmail.com$', email):
                    print(email)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
Use the PostConfirmation trigger instead. The post authentication trigger will fire every time a user signs in.
Here is how you get the email address:
email = event['request']['userAttributes']['email']
Here is how you create an S3 "folder" for that user using email as the folder name:
s3 = boto3.client('s3')
bucket_name = 'example-bucket-name'
directory_path = f"users/{email}/"
s3.put_object(Bucket=bucket_name, Key=directory_path)
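Putting those pieces together, a minimal PostConfirmation handler might look like the sketch below; the bucket name is a placeholder, and as with the other triggers, the function must return the event to Cognito:

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Only act on sign-up confirmation, not e.g. forgot-password confirmation
    if event['triggerSource'] == 'PostConfirmation_ConfirmSignUp':
        email = event['request']['userAttributes']['email']
        bucket_name = 'example-bucket-name'
        # A zero-byte object whose key ends in "/" acts as a folder in the console
        s3.put_object(Bucket=bucket_name, Key=f"users/{email}/")
    # Return the event to Amazon Cognito
    return event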

How to authenticate to AWS using AWS-ADFS to use Boto3 - Python

I have the following Python code to connect to a DynamoDB table in AWS:
# import requests
from __future__ import print_function  # Python 2/3 compatibility
import boto3
import json
import decimal
from boto3.dynamodb.conditions import Key, Attr

# Helper class to convert a DynamoDB item to JSON.
class DecimalEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, decimal.Decimal):
            return str(o)
        return super(DecimalEncoder, self).default(o)

def run():
    dynamodb = boto3.resource('dynamodb', region_name='ap-southeast-2')
    table = dynamodb.Table('work-gtfsr-tripupdates-dev-sydt')
    response = table.query(
        # ProjectionExpression="#yr, title, info.genres, info.actors[0]",  # SELECT-style projection
        # ExpressionAttributeNames={"#yr": "year"},  # attribute-name aliases, for the projection expression only
        KeyConditionExpression=Key('pingEpochTime').eq(1554016605) & Key('entityIndex').between(0, 500)
    )
    for i in response[u'Items']:
        print(json.dumps(i, cls=DecimalEncoder))

run()
The preceding code is confirmed to work when I connect to my personal AWS account (authenticating via the AWS CLI), but it does not work behind a firewall when I am authenticating via aws-adfs. When I run the code to connect to the corporate AWS instance, I get the error:
botocore.exceptions.ClientError: An error occurred (UnrecognizedClientException) when calling the Query operation: The security token included in the request is invalid.
When I run the 'aws-adfs login' script (which is confirmed to work), it correctly populates the .aws folder in my home drive, and it has worked for deploying Lambda functions in the past. Should I be doing something in the code to accommodate aws-adfs session tokens?
I found on another Stack Overflow page that the boto library apparently requires a '[default]' entry within the ~/.aws/credentials file.
I tested the aws-adfs login script by authenticating with a profile called 'default', and everything works now.
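For reference, the relevant part of ~/.aws/credentials then looks roughly like this; the values are placeholders, and aws-adfs writes temporary credentials that include a session token:

[default]
aws_access_key_id = <temporary-access-key>
aws_secret_access_key = <temporary-secret-key>
aws_session_token = <temporary-session-token>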
With boto3 you need to set up a session for your resource:
session = boto3.session.Session(profile_name="other_name")
dynamodb = session.resource('dynamodb', region_name='ap-southeast-2')
where "other_name" is the name of the profile you have authenticated with.

How to stop/start services of an Ambari cluster using AWS Lambda and AWS API Gateway

I want to pass the web services to AWS Lambda so that, using those web services, I can stop/start the services of the Ambari cluster.
Thanks in advance.
AWS Lambda can be easily integrated with almost all web services, including EC2, through programmatic API calls using boto3.
You just need to create a boto3 client for the AWS service in the Lambda function, and you can use it however you want (e.g. start/stop).
AWS Lambda also lets you schedule its invocation.
As you have mentioned in the comment, you need to stop them using an API.
Here is a basic code snippet that works:
# You need to create a zip that includes the requests lib and upload it to the Lambda function
import requests

# Lambda handler that gets invoked
def lambda_handler(event, context):
    url = ''
    json_body = {}
    try:
        api_response = requests.put(url=url, json=json_body)
    except Exception as err:
        print(err)
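To make that concrete for Ambari, here is a sketch of what url and json_body might look like, assuming Ambari's REST API conventions: the host, cluster, service, and credentials are placeholders, Ambari expects basic auth plus an X-Requested-By header, and setting a service's state to INSTALLED stops it while STARTED starts it.

import requests

def lambda_handler(event, context):
    # Placeholders: replace with your Ambari host, cluster and service
    url = 'http://my-ambari-host:8080/api/v1/clusters/my_cluster/services/HDFS'
    json_body = {
        "RequestInfo": {"context": "Stop HDFS via Lambda"},
        "Body": {"ServiceInfo": {"state": "INSTALLED"}}  # use "STARTED" to start
    }
    try:
        api_response = requests.put(
            url=url,
            json=json_body,
            auth=('admin', 'admin'),              # basic auth (placeholder credentials)
            headers={'X-Requested-By': 'lambda'}  # required by Ambari
        )
        print(api_response.status_code, api_response.text)
    except Exception as err:
        print(err)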
If you have these servers running in AWS, then you can create a boto3 client and use:

import boto3
from boto3.session import Session
from botocore.exceptions import ClientError

aws_access_key = 'xxxxxxxxxxxxxxxxxxxx'
aws_secret_key = 'yyyyyyyyyyyyyyyyyyyyy'
region = 'xx-yyyyyy'

def lambda_handler(event, context):
    try:
        sess = Session(aws_access_key_id=aws_access_key,
                       aws_secret_access_key=aws_secret_key)
        ec2_conn = sess.client(service_name='ec2', region_name=region)
        response = ec2_conn.start_instances(
            InstanceIds=[
                'i-xxxxxx', 'i-yyyyyy', 'i-zzzzzz',
            ])
    except ClientError as e:
        print(e.response['Error']['Message'])
P.S.: this is a basic snippet and can vary according to your use case.
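As a side note, inside Lambda you can usually drop the hardcoded keys entirely and rely on the function's execution role, assuming that role has been granted ec2:StartInstances permission; a minimal sketch:

import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    # Credentials come from the Lambda execution role, not hardcoded keys
    ec2 = boto3.client('ec2', region_name='xx-yyyyyy')
    try:
        ec2.start_instances(InstanceIds=['i-xxxxxx'])
    except ClientError as e:
        print(e.response['Error']['Message'])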