I'm writing tests to make sure my objects are cached by CloudFront. If the object is not yet cached, I expect that the second time I make the same request it will be served from the cache. However, I noticed that some objects are only served from the cache on the 3rd request.
Does anyone know why this happens? In production this is not an issue, because who cares, but I'd like to know why, and I also have to adjust my test code.
A test looks like this. 10% of my requests break the loop at i=1, the rest at i=0:
const client = axios.create({ baseURL: myUrl })
let response = await client.get(myPath)
expect(response.headers['x-cache']).toMatch(/Miss/)
for (let i = 0; i < 20; i++) {
    response = await client.get(myPath)
    if (response.headers['x-cache'] && /Hit/.exec(response.headers['x-cache'])) {
        console.log('number of tests ', i + 1, myPath)
        break
    }
}
expect(response.headers['x-cache']).toMatch(/Hit/)
Here I read that requests to different edge locations are cached separately. However, the response header "x-amz-cf-pop" is always the same.
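For what it's worth, here is a rough sketch (reusing the client and myPath from the test above) of how I confirm that the repeated requests really do land on the same POP:
const pops = new Set()
for (let i = 0; i < 5; i++) {
    const r = await client.get(myPath)
    pops.add(r.headers['x-amz-cf-pop'])
}
console.log('distinct POPs seen:', [...pops]) // a single entry means every request hit the same edge location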
I am working on an ASP.NET Core 2.0 Web API. One of my endpoints returns a JSON object that includes a text field that can be fairly large. When this field gets to around 10 MB in size, the controller just stops until the timeout is hit. When I debug, I can see that the JSON object is created in my business logic and passed to the endpoint controller, but the controller just stops right after it receives the JSON object, with no error, and doesn't return to the caller until the request finally times out. I increased my requestTimeout to 20 minutes even though the business logic generates the JSON object in less than 2 minutes; it just hangs until the 20-minute timeout is hit.
Here is my controller action;
[EXAMPLE 1]
[HttpGet(Name = "GetFile")]
public async Task<FileResponseDto> GetFile([FromRoute] int companyId, [FromRoute] int siteId, [FromRoute] int fileId,
    [FromHeader(Name = "Accept")] string mediaType, CancellationToken cancellationToken)
{
    var fileResponseDto = _fileBll.GetFile(companyId, siteId, fileId, HttpContext);
    // This is the point where the controller appears to hang
    return await Task.Factory.StartNew(() => fileResponseDto, cancellationToken);
}
and my DTO object;
public class FileResponseDto
{
    public string ReferenceId { get; set; }
    public string Filename { get; set; }
    public string ProcessingFile { get; set; }
}
The property that is the large string is the ProcessingFile property in the FileResponseDto class.
This works fine until my ProcessingFile property gets to around 30K lines (about 10 MB), and then the controller just hangs after it completes the line;
var fileResponseDto = _fileBll.GetFile(companyId, siteId, fileId, HttpContext);
At this point, my assumption was that I had hit some limitation on the size of the JSON object. So, to test, I changed my controller so that it returns a file instead, as shown below;
[EXAMPLE 2]
[HttpGet(Name = "GetFile")]
public async Task<FileContentResult> GetFile([FromRoute] int companyId, [FromRoute] int siteId, [FromRoute] int fileId,
    [FromHeader(Name = "Accept")] string mediaType, CancellationToken cancellationToken)
{
    var fileResponseDto = _fileBll.GetFile(companyId, siteId, fileId, HttpContext);
    var outputFile = Encoding.ASCII.GetBytes(fileResponseDto.ProcessingFile);
    return await Task.Factory.StartNew(() =>
        new FileContentResult(outputFile, new MediaTypeHeaderValue(MediaTypeNames.Application.Octet))
        {
            FileDownloadName = fileResponseDto.Filename
        }, cancellationToken);
}
Making this change works and I can receive a file download dialog popup and a successful file if I select "Send and Download" in Postman.
So, this leads me to believe that there is something size-related about the JSON object being transferred in the first example.
However, web searches have not turned up anything useful on this issue, which makes me think that perhaps I am missing something here.
I did find this link on StackOverflow and tried it by using...
var outfileJson = JsonConvert.SerializeObject(fileResponseDto);
outfileJson.MaxJsonLength = Int32.MaxValue;
but outfileJson did not have a MaxJsonLength property.
So.. any ideas?
EDIT 6/8/18
After 2 days, 22 views, and no actual responses, I figured something must be wrong with my approach. I realized that I did not mention that I was performing these tests in Postman, which is where I was seeing the problem. After further digging, I found a post on GitHub that seemed to be related to what I was experiencing in Postman (the hang on a large response payload). It seems that Postman has a limit on the number of "rows" it displays in the response. The GitHub post was a feature request to increase the number of rows.
I am not sure how to handle this StackOverflow question now. Since I didn't mention Postman in the original post, I don't feel right just answering my own question. So, I guess I will leave it as is for a couple of days to see if anyone chimes in with their thoughts before I do that.
As it turns out, there was, in fact, an issue with Postman and the size of the response payload it currently supports. If, instead of selecting Send, I select Send and Download in Postman, it will download the JSON object and pop up a dialog box to allow me to save it to my local drive. When I examine the file, I can see the JSON object is correctly formatted and transferred.
I confirmed that it was only a Postman issue and not a .NET HttpResponse issue by performing the API call in a .NET client application, which was able to receive the JSON object without error.
I have a Collection that has three endpoints. The first one creates an asset, the second one adds a file to the asset, and the third one lists all the assets.
How can I run the second one, the one that adds a file to the asset, more than once in each iteration of the Runner?
I'd like the test to create an asset and add multiple files to it for each iteration.
Any suggestions? I know I can duplicate the endpoint, but I was wondering if there was a programmatic way to do it.
Create 2 environment variables:
"Counter" (Number of times you want the request to run)
"RequestNumber" = 1 (To track the current request number)
Add this code to the test section of the request you want to run multiple times:
const counter = Number(pm.environment.get("Counter"));
let requestNumber = Number(pm.environment.get("RequestNumber")) || 1;
if (requestNumber < counter) {
    postman.setNextRequest("RequestName");
    requestNumber++;
    pm.environment.set("RequestNumber", requestNumber);
} else {
    pm.environment.set("RequestNumber", 1);
}
Instead of using postman.setNextRequest(), a bit cleaner way to hit the same endpoint is to use pm.sendRequest().
In the Tests or Pre-request Script section, you can create a request object that describes the request you want to send (URL, HTTP method, headers, body, etc.) and pass it to the pm.sendRequest() function.
Consider:
const requestObject = {
    url: 'https://postman-echo.com/post',
    method: 'POST',
    header: 'headername1:value1',
    body: {
        mode: 'raw',
        raw: JSON.stringify({ key: "this is json" })
    }
}
pm.sendRequest(requestObject, (err, res) => {
    console.log(res);
});
To run the same request multiple times, just put the call in a for/for..in/for..of/forEach loop.
Consider:
for (let iteration = 0; iteration < 5; iteration++) {
    pm.sendRequest(requestObject, (err, res) => {
        console.log(res);
    });
}
If you want you can modify the requestObject inside your loop.
Check out the Postman Documentation for more details.
So far, there is no straightforward way in Postman to configure several runs of the same request within a folder/collection.
Nevertheless, you can write some code in the Pre-request Script section: add a counter with the number of runs you want and call the postman.setNextRequest("request_name") method (read more about it here) with your current request.
Outside the Postman app, you can export your collection (as a JSON file) and write some JavaScript code using newman, which is a command-line companion utility for Postman (more about newman here). Its run method takes iteration count and data options that would help you (for example, put your second request in a folder and iterate through it).
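For example, a minimal newman sketch (assuming the collection has been exported to a file named collection.json; the file name and the iteration count of 3 are just placeholders) could look like this:
const newman = require('newman');
newman.run({
    collection: require('./collection.json'), // exported collection file
    iterationCount: 3, // run the collection (or a specific folder) three times
    reporters: 'cli'
}, function (err) {
    if (err) { throw err; }
    console.log('collection run complete');
});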
Hope that helps!
Background
I have a service A that is accessible via HTTP requests, and I have other services that want to invoke its APIs.
Problem
When I test service A's APIs with POSTMAN, every request works fine. But when I use Python's requests library to make these requests, there is one PUT method that just won't work. For some reason, the PUT handler being called cannot receive the data (HTTP body) at all, though it can receive headers. On the other hand, the POST method called in the same manner receives the data perfectly.
I managed to achieve my goal simply by using the httplib library instead, but I am still quite baffled by what exactly happened here.
The Crime Scene
Route 1:
#app.route("/private/serviceA", methods = ['POST'])
#app.route("/private/serviceA/", methods = ['POST'])
def A_create():
# request.data contains correct data that can be read with request.get_json()
Route 2:
#app.route("/private/serviceA/<id>", methods = ['PUT'])
#app.route("/private/serviceA/<id>/", methods = ['PUT'])
def A_update(id):
# request.data is empty, though request.headers contains headers I passed in
# This happens when sending the request with Python requests library, but not when sending with httplib library or with POSTMAN
# Also, data comes in fine when all other routes are commented out
# Unless all other routes are commented out, this happens even when the function body has only one line printing request.data
Route 3:
#app.route("/private/serviceA/schema", methods = ['PUT'])
def schema_update_column():
# This one again works perfectly fine
Using POSTMAN:
Using requests library from another service:
#app.route("/public/serviceA/<id>", methods = ['PUT'])
def A_update(id):
content = request.get_json()
headers = {'content-type': 'application/json'}
response = requests.put('%s:%s' % (router_config.HOST, serviceA_instance_id) + '/private/serviceA/' + str(id), data=json.dumps(content), headers = headers)
return Response(response.content, mimetype='application/json', status=response.status_code)
Using httplib library from another service:
@app.route('/public/serviceA/<id>', methods=['PUT'])
def update_course(id):
    content = request.get_json()
    headers = {'content-type': 'application/json'}
    conn = httplib.HTTPConnection('%s:%s' % (router_config.HOST, serviceA_instance_id))
    conn.request("PUT", "/private/serviceA/%s/" % id, json.dumps(content), headers)
    return str(conn.getresponse().read())
Questions
1. What am I doing wrong with route 2?
2. For route 2, the handler doesn't seem to be executed when either handler is commented out, which also confuses me. Is there something important about Flask that I'm not aware of?
Code Repo
Just in case some nice people are interested enough to look at the messy, undocumented code...
https://github.com/fantastic4ever/project1
The serviceA corresponds to course service (course_flask.py), and the service calling it corresponds to router service (router.py).
The version that was still using requests library is 747e69a11ed746c9e8400a8c1e86048322f4ec39.
In your use of the requests library, you are using requests.post, which is sending a POST request. If you use requests.put then you would send a PUT request. That could be the issue.
Requests documentation
I'm trying to post a new object via a React form using a Reflux action. That part is working fine. The object is posting as expected, however, when I try to GET that object programmatically, it doesn't seem to be accessible unless I log out, restart my local server, or sometimes even simply visit the api page manually.
I can't seem to get consistent behavior as to when I can get the object and when I can't. It does seem that viewing the api page and then returning to my app page has some kind of effect, but I'm at a loss as to why. Perhaps someone can shed a little light onto this for me.
One thing that's for sure is that the POST request is working properly, as the object is always there when I check for it manually.
Also, if you notice in the code below, I check to see what the last object on the api page is, and the console responds with what previously was the last item. So I'm able to access the api page programmatically, but the object I created is not there (though it is if I visit the page manually). Note, too, that refreshing produces the same results.
Any ideas where the issue could be or why this might happen?
Here's the action:
MainActions.getUserProfile.listen(function(user) {
    request.get('/api/page/').accept('application/json').end( (err, res) => {
        if (res.ok) {
            var profiles = res.body;
            var filteredData = profiles.filter(function (profile) {
                if (profile) {
                    return profile.user === user
                }
                else {
                    console.log('No Profile yet.')
                }
            });
            if (filteredData[0]) {
                var data = {
                    user: filteredData[0].user,
                    ...
                };
                ... // other actions
            } else {
                console.log(profiles[profiles.length - 1].user)
            }
        } else {
            console.log(res.text);
        }
    });
});
The problem ended up being with the Cache-Control header of the response having max-age=600. Changing that to max-age=0 solved the issue. In this situation, it doesn't make much sense to provide a cached response, so I added this to my serializer's ViewSet:
def finalize_response(self, request, *args, **kwargs):
    response = super(MyApiViewSet, self).finalize_response(request, *args, **kwargs)
    response['Cache-Control'] = 'max-age=0'
    return response
IE heavily caches XHR requests. To overcome this problem, users have suggested adding a random number to the URL, for example here.
While this will work, I'm looking for a way to add a random number globally / disable IE caching globally; caching also has to be disabled for resource XHR GET calls. I'm guessing that I could achieve this goal by using one of the following approaches, but I'm not sure what exactly I have to do:
a) Using $httpProvider.defaults.transformRequest
b) Using request/response promise chaining, which was added in v1.1.4
This is an old issue with IE. Not sure how to muck with Angular's $httpProvider (or that I want to), but here is a function that adds a random value to any url, whether or not it already uses a query string:
function noCacheUrl(url) {
    // parse the url with an anchor element so we can see whether it already has a query string
    var urlParser = document.createElement('a');
    urlParser.href = url;
    var q = "nc" + Math.random() + "=1";
    q = urlParser.search ? urlParser.search + "&" + q : "?" + q;
    // keep the query string before the hash so the server actually sees it
    return urlParser.protocol + "//" + urlParser.host + urlParser.pathname + q + urlParser.hash;
}
Note that the name of the added query param is random, not the value, making it less likely to collide with any existing name/value pairs. Example: &nc0.8578296909108758=1
http://jsfiddle.net/LgjLe/
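To wire this up globally, as the question asks, one option (just a sketch, assuming AngularJS 1.1.4+ and that app is your module variable) is to register a request interceptor that runs every GET URL through noCacheUrl:
app.config(['$httpProvider', function ($httpProvider) {
    $httpProvider.interceptors.push(function () {
        return {
            request: function (config) {
                // append the cache-busting param to GET requests only
                if (config.method === 'GET') {
                    config.url = noCacheUrl(config.url);
                }
                return config;
            }
        };
    });
}]);
Depending on your app, you may want to skip requests that set config.cache (for example template loads) so they keep hitting the template cache.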
I solved the caching issue by adding no-cache headers to the response on the server, using Django and a custom middleware.
class NeverCacheXhrMiddleware(object):
    """
    sets no-cache headers for all xhr requests
    xhr requests must send the HTTP_X_REQUESTED_WITH header to
    be identified correctly as a xhr request using .is_ajax()
    see: http://stackoverflow.com/questions/49547/making-sure-a-web-page-is-not-cached-across-all-browsers
    """
    def process_response(self, request, response):
        if request.is_ajax():
            response['Cache-Control'] = 'no-cache, no-store, must-revalidate'
            response['Pragma'] = 'no-cache'
            response['Expires'] = '0'
        return response