405 error for POST on Docker Container in Cloud Run - google-cloud-platform

I tested a container I built locally. It accepts a POST request with a file and returns another processed file.
I uploaded the container to Artifact Registry on GCP. I have been trying to make some POST requests from my computer to test the service. Here is the cURL command below; I see the same issue with various client libraries. The same request works when I use a local port instead of the Cloud Run URL.
curl --globoff https://SERVICE_NAME.a.run.app \
-X POST \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $(gcloud auth print-identity-token)" \
-d '{"filename": "RANDOM_FILE_NAME.pdf"}'
I am receiving the 405 error pasted below.
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 405 HTTP method POST is not supported by this URL</title>
</head>
<body><h2>HTTP ERROR 405</h2>
<p>Problem accessing /. Reason:
<pre> HTTP method POST is not supported by this URL</pre></p>
</body>
</html>
What am I doing wrong? I haven't seen any further options on Cloud Run that I need to update, and I am certain my container accepts POST requests.

The “HTTP ERROR 405” occurs when the web server is configured in a way that does not allow you to perform a specific action on a particular URL.
It is an HTTP response status code indicating that the request method (here, the POST that takes a PDF as input and returns parsed JSON after processing) is known by the server but is not supported by the target resource.
Also make sure that the service account used by Pub/Sub has the proper IAM permissions so it can actually trigger your app, which is what matters for posting requests as per your requirement.
I would also suggest checking this tutorial, which outlines step by step how to achieve this.
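Since Cloud Run simply forwards the request to whatever web server is listening inside the container, it is also worth confirming that the server registers the POST method on the path being called ("/" here). A minimal sketch of such a handler in Flask; the route and field names are only illustrative, not taken from the question's container:

# Minimal sketch: the server inside the container must explicitly allow POST
# on the path the client calls ("/" in the question).
import os
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/", methods=["POST"])
def process_file():
    payload = request.get_json(silent=True) or {}
    filename = payload.get("filename", "")
    # ... run the actual processing here and return the result ...
    return jsonify({"received": filename})

if __name__ == "__main__":
    # Cloud Run passes the port to listen on via the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))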

Related

Tandem - Is there an equivalent to cURL in HP NonStop?

I need to execute a simple HTTP POST to a URL from the NonStop computer to make sure a service functions correctly. Is there a way to do this?
Use the built-in TACL commands to make an HTTP POST request. The TACL command POST allows you to send an HTTP POST request to a specified URL. For example:
POST url headers=content-type:text/plain
data=This is the request body

call AWS Elasticsearch Service API with cURL --aws-sigv4

when I execute
curl --request GET "https://${ES_DOMAIN_ENDPOINT}/my_index_pattern-*/my_type/_mapping" \
--user $AWS_ACCESS_KEY_ID:$AWS_SECRET_ACCESS_KEY \
--aws-sigv4 "aws:amz:ap-southeast-2:es"
where $ES_DOMAIN_ENDPOINT is my AWS Elasticsearch endpoint, I'm getting the following response:
{"message":"The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details."}
I'm confident that my $AWS_ACCESS_KEY_ID:$AWS_SECRET_ACCESS_KEY are correct.
However, when I send the same postman request with the AWS Authentication and the parameters above, the response is coming through. I compared the verbose output of both requests and they have very minor differences, such as timestamps and signature.
I'm wondering, what is wrong with the --aws-sigv4 config?
This issue happens due to the '*' character in the path. There is a bug report in the curl repository to fix this issue: https://github.com/curl/curl/issues/7559.
Meanwhile, to mitigate the error you should either remove the '*' from the path or build curl from the branch https://github.com/outscale-mgo/curl-appimage/tree/http_aws_sigv4_encoding.
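If building curl from that branch is not an option, another workaround is to sign the request outside of curl. A rough sketch with Python's requests and the requests-aws4auth package (the package choice is my assumption, not something from the question; the region and service match the --aws-sigv4 string above):

# Sketch: sign the same GET request with SigV4 from Python instead of curl,
# avoiding the '*'-in-path bug. Assumes `pip install requests requests-aws4auth`.
import os
import requests
from requests_aws4auth import AWS4Auth

auth = AWS4Auth(
    os.environ["AWS_ACCESS_KEY_ID"],
    os.environ["AWS_SECRET_ACCESS_KEY"],
    "ap-southeast-2",  # region
    "es",              # service
)

endpoint = os.environ["ES_DOMAIN_ENDPOINT"]
url = f"https://{endpoint}/my_index_pattern-*/my_type/_mapping"

resp = requests.get(url, auth=auth)
print(resp.status_code, resp.text)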

CSRF Token error when attempting to post from CURL to an API endpoint I control. How do I write the request?

I have not utilized this part of Django before, but I have an endpoint which is giving me a 403 error and telling me that my request needs a CSRF token. I was trying to figure out how best to get this, since I was attempting to set up a bunch of curl requests to handle some simple queries to the endpoint. Likewise, I was thinking of also using Postman, but I was not sure where the documentation is for handling these requests.
I have seen the csrftoken cookie, but when I attempted to curl with it, it still gave me a 403. I thought it would look something like this:
curl -d @profilepicturev2.png -b "csrftoken=Ebfn2OlfhSwFjAEQdoQon7wUjbynFoJqrtHMNPla3cy7ZfCMT9cxZ3OQHsbaedam" http://127.0.0.1:8000/api/files/uploader
Maybe I am mistaken? I am trying to send a photo to the server, so I was thinking this would be correct, and I wasn't sure whether I needed to add additional params to append more data.
I would need to see your code, but I think you need to install "Pillow" to send pictures in Django!
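On the CSRF error itself, Django normally expects the token echoed back in the X-CSRFToken header (or a form field), not just in the csrftoken cookie. A rough sketch with Python requests, reusing the URL and file name from the question (the form field name is a guess):

# Rough sketch: fetch the csrftoken cookie, then echo it back in the
# X-CSRFToken header when POSTing the file.
import requests

base = "http://127.0.0.1:8000"
session = requests.Session()

# Any GET to a page that sets the CSRF cookie works; adjust the path as needed.
session.get(f"{base}/api/files/uploader")
token = session.cookies.get("csrftoken", "")

with open("profilepicturev2.png", "rb") as fh:
    resp = session.post(
        f"{base}/api/files/uploader",
        headers={"X-CSRFToken": token},
        files={"file": fh},  # hypothetical field name; match what the view expects
    )
print(resp.status_code, resp.text)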

Can AWS API Gateway support `application/x-www-form-urlencoded` with body and query string parameters?

Numerous services can accept query string parameters in the URL when a POST request is made with Content-Type: application/x-www-form-urlencoded and other parameters in the body, but it seems AWS API Gateway cannot while also accepting query string parameters.
When I call the AWS API Gateway with a POST Mapping Template for application/x-www-form-urlencoded and query string URL parameters (with a Lambda function), I get the following error:
{
"message":"When Content-Type:application/x-www-form-urlencoded,
URL cannot include query-string parameters (after '?'):
'/prod/webhook?inputType=wootric&outputType=glip&url=...'"
}
Here is an example cURL:
curl -XPOST 'https://{myid}.execute-api.{myregion}.amazonaws.com/prod/webhook? \
inputType=wootric&outputType=glip&url=https://hooks.glip.com/webhook/ \
11112222-3333-4444-5555-666677778888' \
-d "#docs/handlers/wootric/event-example_response-created.txt" \
-H 'Content-Type: application/x-www-form-urlencoded' -v
The specific goal is to get a Wootric webhook event posted to a Lambda function using a URL with query string parameters.
You can get the code here:
https://github.com/grokify/chathooks
The Wootric event body file is here:
https://raw.githubusercontent.com/grokify/chathooks/master/docs/handlers/wootric/event-example_response-created.txt
The GitHub issue is here:
https://github.com/grokify/chathooks/issues/15
The error message seems pretty definitive but I wanted to ask:
Is there a workaround to configure an API Gateway to support both?
Is there a standards-based reason why AWS would not support this or is this just a design decision / limitation?
If there's no solution to this, is there a good lightweight option other than deploying a hosted server solution like Heroku? Also, do other cloud services, like Google, support this with their API gateway + cloud functions?
Some examples showing support for both:
jQuery example: jQuery send GET and POST parameters simultaneously at AJAX request
C# example: Accessing query string variables sent as POST in HttpActionContext
Yes, there is a workaround, and the key is to set up a mapping template that converts the string into JSON. A very detailed example is shown in
API Gateway any content type.
Set the request property "Content-Type" to "application/json" on your HttpURLConnection, like below:
connection.setRequestProperty("Content-Type", "application/json");
I had a similar problem with a 3rd-party provider using webhooks. It turns out that my provider was transforming the URL path from UPPERCASE to lowercase. For example, the endpoint should be apigateway.com/dev/0bscur3dpathRANDOM, but it was being called as apigateway.com/dev/0bscur3dpathrandom. You get the point.
I'm not sure if I got the point of the question correctly, but if you want to access a request body that is encoded as application/x-www-form-urlencoded (or anything, actually) in your Lambda function, you should use the LAMBDA_PROXY request integration type (i.e. tick the "Use Lambda Proxy integration" checkbox) when creating a method for your resource. Then you can access the request body in the event.body field as plain text in your Lambda function and parse it manually.
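For example, with the proxy integration in place, a minimal Python handler (the field names below are just placeholders) could parse the form body and the query string like this:

# Minimal sketch of a Lambda proxy handler that parses an
# application/x-www-form-urlencoded body plus query string parameters.
import json
from urllib.parse import parse_qs

def handler(event, context):
    # With Lambda proxy integration, the raw body arrives as a string.
    form = parse_qs(event.get("body") or "")
    query = event.get("queryStringParameters") or {}  # e.g. inputType, outputType, url

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"form": form, "query": query}),
    }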

Upload to S3 bucket through API Gateway AWS Service Proxy

As in the title, I can't seem to get it to work. I'm following the high-level guide detailed here, but any images uploaded seem to be blank.
What i've set up:
/images/{object} - PUT
> Integration Request
AWS Region: ap-southeast-2
AWS Service: S3
AWS Subdomain: [bucket name here]
HTTP method: PUT
Path override: /{object}
Execution Role [I have one set up]
> URL Path Parameters
object -> method.request.path.object
I'm trying to use Postman to send a PUT request with Content-Type: image/png and the body is a binary upload of a png file.
I've also tried using curl:
curl -X PUT -H "Authorization: Bearer [token]" -H "Content-Type: image/gif" --upload-file ~/Pictures/bart.gif https://[api-url]/dev/images/cool.gif
It creates the file on the server, but the size seems to be double whatever was uploaded, and when viewed I just get "image has an error".
When I try with .txt files (Content-Type: text/plain), it seems to work, though.
Any ideas?
After reading a lot and chatting with AWS technical support, the problem seems to be that you can't do binary uploads through API Gateway, as anything that passes through is automatically UTF-8 encoded.
There are a few workarounds I can think of; my solution will be to base64-encode the files before upload and trigger a Lambda to decode them when they hit the bucket.
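A rough sketch of that decoding Lambda in Python (the bucket and key come from the S3 event; writing to a separate prefix is my assumption, to avoid re-triggering the function on its own output):

# Sketch of an S3-triggered Lambda that decodes base64-encoded uploads
# back into raw binary.
import base64
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        encoded = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        decoded = base64.b64decode(encoded)

        # Write the decoded bytes under a different prefix so this upload
        # does not trigger the function again.
        s3.put_object(Bucket=bucket, Key=f"decoded/{key}", Body=decoded)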
This is an old post, but I have a solution.
AWS now supports binary uploads through API Gateway (READ).
In general, go to your API settings and add a binary media type.
After that, you can handle the file as base64.
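For reference, with a binary media type registered and a Lambda proxy integration, the upload arrives base64-encoded with isBase64Encoded set; a small Python sketch of handling it (the response shape is just illustrative):

# Sketch: with a binary media type registered, a Lambda proxy handler
# receives the raw upload base64-encoded and flagged via isBase64Encoded.
import base64

def handler(event, context):
    body = event.get("body") or ""
    data = base64.b64decode(body) if event.get("isBase64Encoded") else body.encode()

    # ... store or process the raw bytes here, e.g. upload them to S3 ...
    return {"statusCode": 200, "body": f"received {len(data)} bytes"}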