AWS Lambda: body size is too long with raw file

I have an AWS Lambda function that returns an image file. When I try to return the raw bytes of a 3 MB image, I get the error message "body size is too long".
However, when I encode the image to base64, the Lambda works properly. I don't understand this behavior, since the base64 image is bigger than the raw image:
Raw image: 3116012 bytes
Base64-encoded image: 4154684 bytes
Why do I get this error?
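For context, the usual pattern when a Lambda sits behind an API Gateway proxy integration is to base64-encode binary bodies explicitly, since the response travels as a JSON document that cannot hold raw bytes. A minimal sketch, assuming a proxy integration and a placeholder file path:

```python
import base64

def make_image_response(raw_bytes, content_type="image/jpeg"):
    # The proxy-integration response is a JSON document, which cannot carry
    # raw binary, so the body is base64-encoded and flagged as such.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": content_type},
        "isBase64Encoded": True,
        "body": base64.b64encode(raw_bytes).decode("ascii"),
    }

def lambda_handler(event, context):
    # "/tmp/image.jpg" is a placeholder path for this sketch.
    with open("/tmp/image.jpg", "rb") as f:
        return make_image_response(f.read())
```

API Gateway decodes the body back to binary on the way out when the content type is registered as a binary media type.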

Related

Convert Array of Base64 Strings to Audio file

I am using an ESP32 with an INMP441 microphone to send audio via Bluetooth. The ESP sends float values converted to std::strings so they can be read as Base64 over Bluetooth. I am able to receive the Base64 values, and I store them in an array like so:
base64Array = ['LTMwMi4xMjUwMDA=','MzkyLjM3NTAwMA==','MTY3My4zNzUwMDA=', ...]
Here is my question: to create an audio file like MP3 or WAV, do I need to decode every Base64 string in the array back to ASCII, concatenate the results, and re-encode them into one long Base64 string? Or can I write every Base64 value in the array to a file (creating a new line for each value) and save that as an audio file?
I am using React Native to create an app and need the Bluetooth device recording functionality.
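Neither writing the Base64 lines to a file nor concatenating them produces playable audio: each entry decodes to an ASCII float string, not to audio bytes, so the values must be parsed back into numbers and packed into a real audio container. A minimal sketch in Python (the sample rate and 16-bit packing are assumptions, not something stated in the question):

```python
import base64
import struct
import wave

base64Array = ['LTMwMi4xMjUwMDA=', 'MzkyLjM3NTAwMA==', 'MTY3My4zNzUwMDA=']

# Each entry decodes to an ASCII string such as "-302.125000", not to raw
# audio bytes, so decode and parse each one back into a number first.
samples = [float(base64.b64decode(s).decode("ascii")) for s in base64Array]

# Pack the samples as mono 16-bit PCM. The 16 kHz rate is an assumption;
# use whatever rate the INMP441 capture actually ran at, and clamp values
# to the int16 range in real code.
with wave.open("out.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(16000)
    w.writeframes(b"".join(struct.pack("<h", int(v)) for v in samples))
```

The same logic ports to a JavaScript/React Native audio library; the key point is parse-then-pack, not concatenating Base64 text.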

Vertex AI: ResourceExhausted 429 received trailing metadata size exceeds limit

I am using Google Vertex AI online prediction.
In order to send an image, it has to be in a JSON file in uint8 format, which has to be less than 1.5 MB; when converting my image to uint8 it definitely exceeds 1.5 MB.
To get around this issue we can encode the uint8 data to b64, which brings the JSON file down to KBs.
When running the prediction I get ResourceExhausted: 429 received trailing metadata size exceeds limit. Does anyone know what the problem is?
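The size difference the question describes is easy to reproduce: serializing uint8 values as a JSON number array costs up to four characters per byte ("255," and so on), while base64 costs roughly 1.33 characters per byte. A sketch with stand-in data (the instance key names in a real Vertex request depend on the model's input schema and are not shown here):

```python
import base64
import json

image_bytes = bytes(range(256)) * 100  # stand-in for real image data

# A uint8 array serialized as JSON numbers costs up to 4 characters per
# byte, while base64 costs only ~1.33 characters per byte.
as_numbers = json.dumps(list(image_bytes))
as_b64 = json.dumps({"b64": base64.b64encode(image_bytes).decode("ascii")})

print(len(as_numbers), len(as_b64))
```

This explains why the b64 payload fits comfortably under the request limit even when the number-array form does not.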

AWS Lambda - Body Size is Too Large Error, but Body Size is Under Limit

I am using a Lambda function to serve a REST API. For one endpoint I am getting "body size is too long" printed to the CloudWatch log.
The response I get from the function is status code 502 with response body { "message": "Internal server error" }. If I call the same endpoint with a filter applied, the response body size is 2.26 MB and it works. This rules out that I am hitting the asynchronous response body limit.
The response body size when it errors out is 5,622,338 bytes (5.36 MB).
This is how I am calculating the response size (Python 2.7; note that attribute assignment on a dict raises AttributeError, so the fields are stored as keys):
import urllib2
...
out = {}
resp = urllib2.urlopen(req)
out["statusCode"] = resp.getcode()
out["body"] = resp.read()
print("num bytes: " + str(len(bytearray(out["body"], 'utf-8'))))
The advertised max response body size for synchronous invocations is 6 MB. From what I understand, I should not be receiving the error.
https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html
Other information:
Duration: 22809.11 ms Billed Duration: 22810 ms Memory Size: 138 MB Max Memory Used: 129 MB Init Duration: 1322.71 ms
Any help would be appreciated.
Update 4/22/21
After further research I found that the lambda function errors out if the size of the response is 1,048,574 bytes (0.999998 MB) or more.
If the response is 1,048,573 bytes (0.999997 MB) or less it works.
This is how I am returning responses. I hard code the view function to return a bytearray of a specific size.
Ex.
return bytearray(1048573)
I turned on logging for the API Gateway stage that I am using, and the following error is written to the log. It implies that the function itself is erroring out, not the invocation of the function:
Lambda execution failed with status 200 due to customer function error: body size is too long.
It's my understanding that AWS Lambda functions have a max response size of 6 MB and API Gateway has a max response size of 10 MB.
AWS Lambda: Invocation payload response
https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html
API Gateway: API Gateway quotas for configuring and running REST API -> Payload size
https://docs.aws.amazon.com/apigateway/latest/developerguide/limits.html
Am I misunderstanding the limits?
I created a new Lambda function for Python 3.7 and found that it returns a more descriptive error. I was able to determine that around 1 MB is added to the size of the response after it leaves the handler code, which explains why it was erroring out in the Lambda function but not in the endpoint code. In one case it added 0.82 MB, and in another it added 0.98 MB; the overhead seems to scale with the size of the response. I suspect the response is getting base64-encoded, URL-encoded, or something similar. I did not find documentation that could answer this, and the responses were not encoded in any way on the receiving end.
Error message returned by the Lambda function built with Python 3.7:
Response payload size (8198440 bytes) exceeded maximum allowed payload size (6291556 bytes).
Lambda Function Code:
import requests, base64, json

def lambda_handler(event, context):
    headers = {...}
    url = ...
    response = requests.get(url, headers=headers)
    print(len(response.content))
    print(len(response.content.decode("utf-8")))
    return response.content.decode('UTF-8')
Below I have the size of the responses when printed inside the lambda function and the size of the response as determined by lambda in the error message.
I used these two print statements to determine the size of the response, and they always returned the same length. response.content is a bytes object, so my thought process is that taking its length returns the number of bytes in the response:
print(len(response.content))
print(len(response.content.decode("utf-8")))
Ex 1.
Size when printed inside the lambda function:
7,165,488 (6.8335 MB)
Size as defined in the error message:
8,198,440 (7.8186 MB)
Extra:
1,032,952 (0.9851 MB)
Error Message:
Response payload size (8198440 bytes) exceeded maximum allowed payload size (6291556 bytes).
Ex 2.
Size when printed inside the lambda function:
5,622,338 (5.3619 MB)
Size as defined in the error message:
6,482,232 (6.1819 MB)
Extra:
859,894 (0.820059 MB)
Error Message:
Response payload size (6482232 bytes) exceeded maximum allowed payload size (6291556 bytes).
I have decided to lower the soft limit in the endpoint code by another 1 MB (4.7MB) to prevent hitting the limit in the lambda function.
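One way to test the encoding suspicion above: the Python runtime JSON-serializes the handler's return value, and json.dumps (with its default ensure_ascii=True) escapes every non-ASCII character as a six-character \uXXXX sequence, which inflates payloads containing multi-byte text while leaving the client-side decoded body unchanged. Whether this is exactly what Lambda does internally is an assumption, but the effect is easy to measure:

```python
import json

# A response containing non-ASCII text inflates when JSON-serialized,
# because json.dumps escapes each such character as "\u00e9" etc.
body = "é" * 1_000_000          # 1,000,000 characters
serialized = json.dumps(body)   # 2 quotes + 6 chars per "é"

print(len(body), len(serialized))

# A conservative pre-flight check before returning from the handler:
LIMIT = 6 * 1024 * 1024
if len(serialized.encode("utf-8")) > LIMIT:
    raise ValueError("response would exceed the Lambda payload limit")
```

This would also explain why the measured overhead varies between responses: it depends on how many characters need escaping.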
I tried the same in Go. events.APIGatewayV2HTTPResponse.Body can be ~5,300,000 bytes (I didn't test the exact limit). The value was measured inside the Lambda function (len(resp.Body)).

AWS SNS publishing compressed payload

There is a limit of 256 KB as the max size of a message which can be published to AWS SNS. Can we compress a message using GZIP and publish the compressed message to overcome the size limit?
You can gzip the message body -- however -- SNS message bodies only support UTF-8 character data. Gzipped data is binary, so that is not directly compatible with SNS because not every possible sequence of bytes is also a valid sequence of UTF-8 characters.
So, after gzipping your payload, you need to encode that binary data using a scheme such as base-64. Base-64 encodes arbitrary binary data (8 bits per byte) using only 64 symbols (2^6, so each output character carries 6 bits), and so the byte count inflates by a factor of 8/6 (133%) as a result of this encoding. This means 192 KB of binary data encodes to 256 KB of base-64-encoded data, so the maximum allowable size of your message after gzip becomes 192 KB (since the SNS limit is 256 KB). But all the base-64 symbols are valid single-byte UTF-8 characters, which is a significant reason why this encoding is so commonly used despite its size increase. That, and the fact that gzip typically has a compression ratio far superior to 1.33:1 (the break-even point for gzip + base-64).
But if your messages will gzip to 192K or lower, this definitely does work with SNS (as well as SQS, which has the same character set and size limits).
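The gzip-then-base64 round trip described above can be sketched in a few lines of Python; the function names are illustrative, not an SNS API:

```python
import base64
import gzip

def compress_for_sns(message: str) -> str:
    # gzip the UTF-8 bytes, then base64 the binary result so the payload
    # is valid UTF-8 text, as SNS message bodies require.
    compressed = gzip.compress(message.encode("utf-8"))
    return base64.b64encode(compressed).decode("ascii")

def decompress_from_sns(payload: str) -> str:
    # Reverse the two steps on the consumer side.
    return gzip.decompress(base64.b64decode(payload)).decode("utf-8")
```

Before publishing, check that len(compress_for_sns(msg)) stays under the 256 KB limit, which per the arithmetic above means the gzipped form must fit in about 192 KB.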
Have you already taken a look at this? https://docs.aws.amazon.com/sns/latest/dg/sns-large-payload-raw-message-delivery.html
If you think that the file can grow over time, I suggest another approach.
Put the file in an S3 bucket and attach an S3 Event Notification to the SNS topic, so all consumers are notified when a new file is ready to be processed.
In other words, the SNS message will be the location of the file and not the file itself.
Think about it.
You can also use the SNS/SQS extended client library for large message payloads.
https://aws.amazon.com/about-aws/whats-new/2020/08/amazon-sns-launches-client-library-supporting-message-payloads-of-up-to-2-gb

Receiving JPEG images via http GET request

I want to receive images from an IP camera over HTTP via GET request. I have written a program that creates TCP socket connection with the camera and sends the following GET request to the camera:
GET /mjpeg?res=full HTTP/1.1\r\nHost: 143.205.116.14\r\n\r\n
After that, I receive images with the following function in a while loop:
while((tmpres = recv(sock, (void *) buf, SIZE, 0)) > 0 && check < 10)
.....
where SIZE represents the size of the buffer. I, in fact, don't know what size to define here. I am receiving a color image of size 2940x1920, so I define SIZE = 2940*1920*3. After receiving the MJPEG image, I decode it with ffmpeg. But I observe that ffmpeg only partially/incorrectly decodes the image, and I see just half (or even less) of the image. I assume it could be a size problem.
Any help in this regard would be highly appreciated.
Regards,
Khan
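A likely cause, independent of the buffer size: recv() returns however many bytes have arrived so far, not the whole image, so a single call (or too few calls — note the check < 10 cap) yields a truncated JPEG. The fix is to keep reading until the response is complete. A minimal sketch of the idea in Python, with the camera's host and path taken from the question:

```python
import socket

def split_body(response_bytes):
    # With "Connection: close", the body is simply everything after the
    # blank line that terminates the headers.
    headers, _, body = response_bytes.partition(b"\r\n\r\n")
    return body

def fetch_image(host, path, port=80):
    # recv() returns whatever bytes are available, not the whole response,
    # so loop until the server closes the connection.
    sock = socket.create_connection((host, port))
    request = "GET {} HTTP/1.1\r\nHost: {}\r\nConnection: close\r\n\r\n".format(path, host)
    sock.sendall(request.encode("ascii"))
    data = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        data += chunk
    sock.close()
    return split_body(data)

# Usage (not run here): fetch_image("143.205.116.14", "/mjpeg?res=full")
```

A production version would also parse Content-Length (or the MJPEG part boundaries) instead of relying on connection close, which is what an HTTP client library does for you.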
Why reinvent the wheel (for the umpteenth time)? Use a ready-made HTTP client library, such as libcurl.
For that matter, perhaps you can even just write your entire solution as a shell script using the curl command line program:
#!/bin/sh
# Save under the remote name:
curl -O "http://143.205.116.14/mjpeg?res=full" || echo "Error."
# Or save to a specific file:
curl -o myfile.jpg "http://143.205.116.14/mjpeg?res=full" || echo "Error."
# ffmpeg ...
Save the bytes received as a binary file and analyze it. It may be an incomplete image (the image can be encoded as progressive JPEG, i.e. interlaced, which means a truncated file shows only horizontal bands), or it can be an ffmpeg decoding issue. Or something different. What is the check < 10 condition?