AWS API Gateway: unable to load FastAPI Swagger docs page - amazon-web-services

I have deployed a simple FastAPI app to AWS API Gateway. All the endpoints work fine; however, I am unable to load the Swagger docs page, which fails with an error.
API code:
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from mangum import Mangum
import os

stage = os.environ.get('STAGE', None)
openapi_prefix = f"/{stage}" if stage else "/"

app = FastAPI(title="MyAwesomeApp", root_path="stage")

@app.get("/")
def get_root():
    return {"message": "FastAPI running in a Lambda function"}

@app.get("/info")
def get_info():
    return {"message": "TestInfo"}

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

handler = Mangum(app)
I tried setting the root path as described here:
https://fastapi.tiangolo.com/advanced/behind-a-proxy/
Any help on this would be appreciated.

I found the solution: we need to pass root_path="/dev" when creating the FastAPI app, like app = FastAPI(title=settings.NAME, root_path="/dev").
This root path should be the same as the API Gateway stage name.
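A minimal sketch of the fix, assuming the stage name is passed to the Lambda through a STAGE environment variable (the variable name here is just an example):

import os
from fastapi import FastAPI
from mangum import Mangum

# root_path must match the API Gateway stage name, e.g. "/dev"
stage = os.environ.get('STAGE')
app = FastAPI(title="MyAwesomeApp", root_path=f"/{stage}" if stage else "/")
handler = Mangum(app)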
However, this approach does not work if you are using fastapi-versioning.

Related

aws cdk: serve swagger ui with api gateway

I want to use AWS API Gateway to serve a swagger-ui of an API from a Lambda function using Express. For this I want to use the endpoint GET /docs.
My Lambda function swaggerUiLambdaHandler looks like this:
import express from 'express';
import serverless from 'serverless-http';
import swaggerUI from 'swagger-ui-express';
const options = {
  swaggerOptions: {
    url: 'https://petstore.swagger.io/v2/swagger.json',
  },
};

const app = express();
app.use('/v1/docs', swaggerUI.serve, swaggerUI.setup(undefined, options));

module.exports.handler = serverless(app);
In the cdk stack of the api, I added the following 2 endpoints for serving the index.html and the .js/.css resources:
const docs = myApi.root.addResource('docs');
docs.addMethod('GET', new apigw.LambdaIntegration(swaggerUiLambdaHandler));
const docsProxy = docs.addResource('{proxy+}');
docsProxy.addMethod('GET', new apigw.LambdaIntegration(swaggerUiLambdaHandler));
However, when I call the endpoint in the browser, all network requests succeed, but I get a white canvas and the error
Uncaught SyntaxError: Unexpected token '<'
Do you have any idea? I think it might be related to the content type returned by API Gateway, but I would appreciate some help.

Why am I getting a CORS error despite using flask_cors?

I'm getting the error
Access to fetch at 'http://hpap-dev.pmacs.upenn.edu:5801/get-categories' from origin 'http://hpap-dev.pmacs.upenn.edu:5802' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
despite using flask_cors. My Flask app and route look like this:
# Start flask app
app = Flask(__name__)
# Open flask for querying from domains outside of the app
CORS(app)

<snip>

@app.route('/get-categories')
@cross_origin()
def get_categories():
    frame_dict = file_cat_wrangle.get_wrangle_dict()
    # orig
    # return jsonify(response=frame_dict)
    # stauffer - try this to resolve CORS problem
    response = jsonify(response=frame_dict)
    response.headers.add('Access-Control-Allow-Origin', '*')
    return response
As you can see, the code calls CORS(app) as well as @cross_origin() under the route. I also tried adding the Access-Control-Allow-Origin header explicitly to the response, per another Stack Overflow post.
I've rebuilt and restarted the flask server. AFAIK this used to work on this server (I've taken over this project at work and am pretty new to web dev). Other API calls that go between the front-end (node server) and back-end (the flask app) are working. I also checked that the route path and the call to it are identical.
Here's the call from the front end:
export const fetchCategories = () => {
  return fetch(`${flask_url}/get-categories`)
    .then(response => {
      if (response.status >= 400) {
        throw new Error(`${response.status}: ${response.statusText}`);
      }
      return response.json();
    }).then(categories => categories);
}
Any suggestions??
Check if the header is really present in the network tab in your browser.
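You can also check outside the browser with a quick script, for example (a sketch using the requests library and the URL from the error message):

import requests

# Look at the raw response headers, bypassing the browser entirely
r = requests.get('http://hpap-dev.pmacs.upenn.edu:5801/get-categories')
print(r.status_code)
print(r.headers.get('Access-Control-Allow-Origin'))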
You can use hooks to enable CORS as below:
@app.after_request
def after_request(response):
    response.headers['Access-Control-Allow-Origin'] = '*'
    return response
Turns out it was not actually a CORS-related error but some kind of odd error response. See my comment to the original question.

Invoking a Google Cloud Function from a Django View

I have created a Google Cloud Function that can be invoked through HTTP. Access to the function is limited to the service account only.
How can I invoke this function from a Django view and get its response?
Here is what I have tried
1) Before starting Django I set the environment variable
export GOOGLE_APPLICATION_CREDENTIALS
2) I tried invoking the function using standalone code, but soon realised this was going nowhere, because I could not figure out the next step after this:
from google.oauth2 import service_account
from apiclient.http import call

SCOPES = ['https://www.googleapis.com/auth/cloud-platform']
SERVICE_ACCOUNT_FILE = 'credentials/credentials.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
Google's documentation describes the API itself, but there is no sample code showing how to invoke the methods, what to import in your Python code, or the ways those methods can be called.
How do you send a POST request with JSON data to a Cloud Function, with authorization through a service account?
Edit:
A couple of hours later I did some more digging and partially figured this out:
from google.oauth2 import service_account
import googleapiclient.discovery
import json

SCOPES = ['https://www.googleapis.com/auth/cloud-platform']
SERVICE_ACCOUNT_FILE = 'credentials/credentials.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

cloudfunction = googleapiclient.discovery.build('cloudfunctions', 'v1', credentials=credentials)

# projects/{project_id}/locations/{location_id}/functions/{function_id}
path = 'some project path'
data = 'some data in json that works when invoked through the console'
data = json.dumps(data)

a = cloudfunction.projects().locations().functions().call(name=path, body=data).execute()
Now I get another error:
Details: "[{'@type': 'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations': [{'description': 'Invalid JSON payload received. Unknown name "": Root element must be a message.'}]}]"
I can't find any documentation on this. How should the JSON be formed? Making the JSON like {"message": {my actual payload}} doesn't work either.
The requested documentation can be found here.
The request body argument should be an object with the following form:
{ # Request for the `CallFunction` method.
    "data": "A String", # Input to be passed to the function.
}
The following modification of your code should work correctly:
from google.oauth2 import service_account
import googleapiclient.discovery

SCOPES = ['https://www.googleapis.com/auth/cloud-platform']
SERVICE_ACCOUNT_FILE = 'credentials/credentials.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

cloudfunction = googleapiclient.discovery.build('cloudfunctions', 'v1', credentials=credentials)

path = "projects/your-project-name/locations/cloud-function-location/functions/name-of-cloud-function"
data = {"data": "A String"}

a = cloudfunction.projects().locations().functions().call(name=path, body=data).execute()
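For example, to pass an actual JSON payload to the function, serialize it to a string and put that string in the "data" field (the payload below is just an illustration):

import json

payload = json.dumps({"message": "hello"})  # hypothetical input for your function
body = {"data": payload}
a = cloudfunction.projects().locations().functions().call(name=path, body=body).execute()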
Note that only very limited traffic is possible this way, since these API calls are subject to quota limits.

Presigned URL for DynamoDB put_item

There are a few examples of how to pre-sign the URL of an S3 request, but I couldn't find any working example for pre-signing other services in AWS.
I'm trying to write an item to DynamoDB using the Python SDK boto3. The SDK includes the option to generate the pre-signed URL here. I'm trying to make it work and I'm getting a URL, but the URL responds with 404 and the item does not appear in the DynamoDB table.
import json
import boto3

ddb_client = boto3.client('dynamodb')
response = ddb_client.put_item(
    TableName='mutes',
    Item={
        'email': {'S': 'g#g.c'},
        'until': {'N': '123'}
    }
)
print("PutItem succeeded:")
print(json.dumps(response, indent=4))
This code works when run directly. But when I try to presign it:
ddb_client = boto3.client('dynamodb')
params = {
    'TableName': 'mutes',
    'Item': {
        'email': {'S': 'g#g.c'},
        'until': {'N': '1234'}
    }
}
response = ddb_client.generate_presigned_url('put_item', Params=params)
and check the URL:
import requests
r = requests.post(response)
r
I'm getting: Response [404]
Any hint on how to get this working? I checked the IAM permissions, and they grant full access to DynamoDB.
Please note that you can sign a request to DynamoDB using Python, as you can see here: https://docs.aws.amazon.com/general/latest/gr/sigv4-signed-request-examples.html#sig-v4-examples-post. But for some reason the implementation in the boto3 library doesn't do that. Using the boto3 library is much easier than that code, as I don't need to provide the credentials for the function myself.
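For reference, a minimal sketch of signing a DynamoDB PutItem request by hand with botocore's SigV4 helpers (the us-east-1 region is an assumption; the table and item are from the question):

import json
import boto3
from botocore.awsrequest import AWSRequest
from botocore.auth import SigV4Auth
from botocore.endpoint import URLLib3Session

# DynamoDB's HTTP API: POST to the regional endpoint with an
# X-Amz-Target header naming the operation
payload = json.dumps({
    'TableName': 'mutes',
    'Item': {'email': {'S': 'g#g.c'}, 'until': {'N': '1234'}}
})
request = AWSRequest(
    method='POST',
    url='https://dynamodb.us-east-1.amazonaws.com/',
    data=payload,
    headers={
        'Host': 'dynamodb.us-east-1.amazonaws.com',
        'Content-Type': 'application/x-amz-json-1.0',
        'X-Amz-Target': 'DynamoDB_20120810.PutItem',
    },
)
SigV4Auth(boto3.Session().get_credentials(), 'dynamodb', 'us-east-1').add_auth(request)
r = URLLib3Session().send(request.prepare())
print(r.status_code, r.text)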
You send an empty POST request. You should add the data to the request:
import requests
r = requests.post(response, data=params)
I think you are having this issue, which is why you are receiving a 404.
They recommend using Cognito for authentication instead of IAM in cases like this.

Making a signed HTTP request to AWS Elasticsearch in Python

I'm trying to make a simple Python Lambda that makes snapshots of our Elasticsearch database. This is done through Elasticsearch's REST API using simple HTTP requests.
However, for AWS, I have to sign these requests. I have a feeling this can be achieved through boto3's low-level clients, probably with generate_presigned_url, but I cannot for the life of me figure out how to invoke that function correctly. For example, what are the valid ClientMethods? I've tried ESHttpGet, but to no avail.
Can anyone point me in the right direction?
Edit: Apparently this workaround has been broken by Elastic.
I struggled for a while to do a similar thing. Currently the boto3 library doesn't support making signed es requests, though since I raised an issue with them it's become a feature request.
Here's what I've done in the meantime using DavidMuller's library mentioned above and boto3 to get my STS session credentials:
import boto3
from aws_requests_auth.aws_auth import AWSRequestsAuth
from elasticsearch import Elasticsearch, RequestsHttpConnection

session = boto3.session.Session()
credentials = session.get_credentials().get_frozen_credentials()

es_host = 'search-my-es-domain.eu-west-1.es.amazonaws.com'
awsauth = AWSRequestsAuth(
    aws_access_key=credentials.access_key,
    aws_secret_access_key=credentials.secret_key,
    aws_token=credentials.token,
    aws_host=es_host,
    aws_region=session.region_name,
    aws_service='es'
)

# use the requests connection_class and pass in our custom auth class
es = Elasticsearch(
    hosts=[{'host': es_host, 'port': 443}],
    http_auth=awsauth,
    use_ssl=True,
    verify_certs=True,
    connection_class=RequestsHttpConnection
)
print(es.info())
Hope this saves somebody some time.
There are several Python extensions to the requests library that will perform the SigV4 signing for you. I have used this one and it works well.
While the other answers are perfectly fine, I wanted to eliminate the use of external packages. Obviously, botocore itself has all the required functionality to sign requests; it was just a matter of looking at the source code. This is what I ended up with for sending AWS API requests directly (things are hardcoded for demonstration purposes):
import boto3
from botocore.awsrequest import AWSRequest
from botocore.endpoint import URLLib3Session
from botocore.auth import SigV4Auth

params = '{"name": "hello"}'
headers = {
    'Host': 'ram.ap-southeast-2.amazonaws.com',
}
request = AWSRequest(method="POST", url="https://ram.ap-southeast-2.amazonaws.com/createresourceshare", data=params, headers=headers)
SigV4Auth(boto3.Session().get_credentials(), "ram", "ap-southeast-2").add_auth(request)

session = URLLib3Session()
r = session.send(request.prepare())
I recently published requests-aws-sign, which provides AWS V4 request signing for the Python requests library.
If you look at this code you will see how you can use Botocore to generate the V4 request signing.
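A usage sketch for illustration (the AWSV4Sign class name and constructor arguments below are recalled from the package's README, so treat them as assumptions and check the repository):

import boto3
import requests
from requests_aws_sign import AWSV4Sign

# Build an auth object from the ambient boto3 credentials, then let
# requests apply the SigV4 signature on each call
session = boto3.session.Session()
auth = AWSV4Sign(session.get_credentials(), session.region_name, 'es')
response = requests.get('https://search-my-es-domain.eu-west-1.es.amazonaws.com/', auth=auth)
print(response.json())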
Why not just use requests?
import requests

headers = {'Content-Type': 'application/json'}
data = '{"director": "Burton, Tim", "genre": ["Comedy","Sci-Fi","R-rated"], "profit": 98, "year": 1996, "actor": ["Jack Nicholson","Pierce Brosnan","Sarah Jessica Parker"], "title": "Mars Attacks!"}'
response = requests.post('https://search-your-awsendpoint.us-west-2.es.amazonaws.com/yourindex/_yourdoc/', headers=headers, data=data)
This worked for me; note that an unsigned request like this only succeeds if the domain's access policy allows it (for example, an IP-based policy).