What's the best way to format/customize the remote response? - loopbackjs

Sometimes we need to modify the response JSON data before it is sent to the client. For example:
//model definition
{
    "name": "File",
    "base": "PersistedModel",
    "properties": {
        "filename": {
            "type": "string",
            "required": true
        },
        "filepath": {
            "type": "string"
        }
    },
    "protected": ["filepath"]
}
I want to include a url property in the GET /files/:id response, so I defined a url getter on the prototype.
//file.js
module.exports = function(File){
    var baseUrl = 'http://example.com/uploads/files/';
    File.prototype.__defineGetter__('url', function(){
        return baseUrl + this.id.toString() + '/' + this.filename;
    });
}
My question is: how do I expose the url property in the remote response when I make a request like the following?
GET /files/123456
I expect a response like:
{
    id: '123456',
    filename: 'myfile.ext',
    url: 'http://example.com/uploads/files/123456/myfile.ext'
}
Thanks a lot!

Use a remote method/hook and customize your response accordingly. See https://github.com/strongloop/loopback-example-app-logic/blob/master/common/models/car.js.
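For the specific GET /files/:id case above, a remote hook might look like this rough sketch (the baseUrl, the slash handling, and the toJSON() step are assumptions based on the expected response in the question, not code from the linked example):
// common/models/file.js -- minimal sketch of a remote hook on the File model
module.exports = function(File) {
    var baseUrl = 'http://example.com/uploads/files/';

    // Runs after the built-in findById remote method, i.e. GET /files/:id
    File.afterRemote('findById', function(ctx, file, next) {
        if (ctx.result) {
            // Convert to a plain object so the extra property survives serialization
            var data = ctx.result.toJSON();
            data.url = baseUrl + data.id + '/' + data.filename;
            ctx.result = data;
        }
        next();
    });
};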

You can use Operation Hooks to intercept CRUD actions independently of the specific method that invokes them.
The code below adds the url property to a File object whenever one is loaded.
File.observe('loaded', function(ctx, next) {
    var baseUrl = 'http://example.com/uploads/files/';
    ctx.data.url = baseUrl + ctx.data.id + '/' + ctx.data.filename;
    next();
});
This will get called when any of the methods below are invoked, either directly in your JS or indirectly via the HTTP API.
find()
findOne()
findById()
exists()
count()
create()
upsert() (same as updateOrCreate())
findOrCreate()
prototype.save()
prototype.updateAttributes()
Other Operation Hooks include:
access
before save
after save
before delete
after delete
loaded
persist

Related

Postman JSON Schema validation fails if an Object.prototype function is declared prior to the validation

I have a schema validation test in my Postman collection, which validates whether the response adheres to the schema. This is how I do it:
var schema =
{
    "type": "object",
    "properties": {
        "data": {
            "type": "object",
            "properties": {....
}
pm.test("Schema Validation - TC001", function(){
    pm.response.to.have.jsonSchema(schema);
});
When I execute just this script, it validates the schema of the response successfully.
However, in my Postman collection I have declared a global function on Object.prototype prior to the schema validation, and I'm calling the function as _.funcABC("a","b","c")
Object.prototype.funcABC = function (var1, var2, var3) {
console.log("test");
}
My schema validation then fails when I run the entire collection.
While troubleshooting, I came across this, which indicates that extending Object.prototype can interfere with JSON Schema validation.
Is there a way to overcome this interference of Object.prototype with JSON Schema validation? So far, I couldn't find a workable solution.
Thanks.
What stops you from doing this:
pm.test('validate schema', function () {
    let temp = Object.prototype.function1
    delete Object.prototype.function1
    pm.expect(ajv.validate(schema_response, response)).to.be.true;
    Object.prototype.function1 = temp
})
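Another approach (not part of the answer above, just a sketch) is to define the helper as non-enumerable in the first place, so it never shows up when objects are traversed during validation; the funcABC name is taken from the question:
// Define the helper as non-enumerable so it does not appear in for...in
// traversal, which is what tends to trip up schema validators.
Object.defineProperty(Object.prototype, 'funcABC', {
    value: function (var1, var2, var3) {
        console.log("test");
    },
    enumerable: false,
    writable: true,
    configurable: true
});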

Is it theoretically possible to use `putRecord` in Kinesis via a single HTTP POST request?

I've been having trouble with AWS Kinesis: I have a stream set up, and I want to use a standard HTTP POST request to invoke a Kinesis PutRecord call on that stream. I'm doing this because the bundle size of my resulting JavaScript application matters, and I'd rather not import the aws-sdk to accomplish something that should (on paper) be possible.
Just so you know, I've looked at this other Stack Overflow question about the same thing, and it was... sort of informational.
Now, I already have a method to SigV4-sign a request using an access key, secret key, and session token, but when I finally get the result of signing the request and send it using the in-browser fetch API, the service fails with an internal error (or with a JSON object citing the same thing, depending on my Content-Type header, I guess) as the result.
Here's the code I'm working with
// There is a global function "sign" that does sigv4 signing
// ...
var payload = {
    Data: { task: "Get something working in kinesis" },
    PartitionKey: "1",
    StreamName: "MyKinesisStream"
}
var credentials = {
    "accessKeyId": "<access.key>",
    "secretAccessKey": "<secret.key>",
    "sessionToken": "<session.token>",
    "expiration": 1528922673000
}
function signer({ url, method, data }) {
    // Wrapping with URL for piecemeal picking of parsed pieces
    const parsed = new URL(url);
    const [ service, region ] = parsed.host.split(".");
    const signed = sign({
        method,
        service,
        region,
        url,
        // Hardcoded
        headers : {
            Host : parsed.host,
            "Content-Type" : "application/json; charset=UTF-8",
            "X-Amz-Target" : "Kinesis_20131202.PutRecord"
        },
        body : JSON.stringify(data),
    }, credentials);
    return signed;
}
// Specify method, url, data body
var signed = signer({
    method: "POST",
    url: "https://kinesis.us-west-2.amazonaws.com",
    data : JSON.stringify(payload)
});
var request = fetch(signed.url, signed);
When I look at the result of request, I get this:
{
    Output: {
        __type: "com.amazon.coral.service#InternalFailure"
    },
    Version: "1.0"
}
Now I'm unsure whether Kinesis is actually failing here or whether my input is malformed.
Here's what the signed request looks like:
{
    "method": "POST",
    "service": "kinesis",
    "region": "us-west-2",
    "url": "https://kinesis.us-west-2.amazonaws.com",
    "headers": {
        "Host": "kinesis.us-west-2.amazonaws.com",
        "Content-Type": "application/json; charset=UTF-8",
        "X-Amz-Target": "Kinesis_20131202.PutRecord",
        "X-Amz-Date": "20180613T203123Z",
        "X-Amz-Security-Token": "<session.token>",
        "Authorization": "AWS4-HMAC-SHA256 Credential=<access.key>/20180613/us-west-2/kinesis/aws4_request, SignedHeaders=content-type;host;x-amz-target, Signature=ba20abb21763e5c8e913527c95a0c7efba590cf5ff1df3b770d4d9b945a10481"
    },
    "body": "\"{\\\"Data\\\":{\\\"task\\\":\\\"Get something working in kinesis\\\"},\\\"PartitionKey\\\":\\\"1\\\",\\\"StreamName\\\":\\\"MyKinesisStream\\\"}\"",
    "test": {
        "canonical": "POST\n/\n\ncontent-type:application/json; charset=UTF-8\nhost:kinesis.us-west-2.amazonaws.com\nx-amz-target:Kinesis_20131202.PutRecord\n\ncontent-type;host;x-amz-target\n508d2454044bffc25250f554c7b4c8f2e0c87c2d194676c8787867662633652a",
        "sts": "AWS4-HMAC-SHA256\n20180613T203123Z\n20180613/us-west-2/kinesis/aws4_request\n46a252f4eef52991c4a0903ab63bca86ec1aba09d4275dd8f5eb6fcc8d761211",
        "auth": "AWS4-HMAC-SHA256 Credential=<access.key>/20180613/us-west-2/kinesis/aws4_request, SignedHeaders=content-type;host;x-amz-target, Signature=ba20abb21763e5c8e913527c95a0c7efba590cf5ff1df3b770d4d9b945a10481"
    }
}
(the test key is used by the library that generates the signature, so ignore that)
(Also there are probably extra slashes in the body because I pretty printed the response object using JSON.stringify).
My question: Is there something I'm missing? Does Kinesis require headers a, b, and c while I'm only generating two of them? Or is this internal error an actual service-side failure? I'm lost because the response suggests nothing I can do on my end.
I appreciate any help!
Edit: As a secondary question, am I using the X-Amz-Target header correctly? This is how you reference calling a service function so long as you're hitting that service endpoint, no?
Update: Following Michael's comments, I've gotten somewhere, but I still haven't solved the problem. Here's what I did:
I made sure that in my payload I'm only running JSON.stringify on the Data property.
I also modified the Content-Type header to be "Content-Type" : "application/x-amz-json-1.1" and as such, I'm getting slightly more useful error messages back.
Now, my payload is still mostly the same:
var payload = {
    Data: JSON.stringify({ task: "Get something working in kinesis" }),
    PartitionKey: "1",
    StreamName: "MyKinesisStream"
}
and my signer function body looks like this:
function signer({ url, method, data }) {
    // Wrapping with URL for piecemeal picking of parsed pieces
    const parsed = new URL(url);
    const [ service, region ] = parsed.host.split(".");
    const signed = sign({
        method,
        service,
        region,
        url,
        // Hardcoded
        headers : {
            Host : parsed.host,
            "Content-Type" : "application/json; charset=UTF-8",
            "X-Amz-Target" : "Kinesis_20131202.PutRecord"
        },
        body : data,
    }, credentials);
    return signed;
}
So I'm passing in an object that is partially serialized (at least Data is) and when I send this to the service, I get a response of:
{"__type":"SerializationException"}
which is at least marginally helpful because it tells me that my input is technically incorrect. However, I've done a few things in an attempt to correct this:
I've run JSON.stringify on the entire payload
I've changed my Data key to just be a string value to see if it would go through
I've tried running JSON.stringify on Data and then running btoa because I read on another post that that worked for someone.
But I'm still getting the same error. I feel like I'm so close. Can you spot anything I might be missing or something I haven't tried? I've gotten sporadic UnknownOperationException errors, but right now this SerializationException has me stumped.
Edit 2:
As it turns out, Kinesis will only accept a base64-encoded string for Data. This is probably a nicety that the aws-sdk handles for you, but essentially all it took was Data: btoa(JSON.stringify({ task: "data" })) in the payload to get it working.
While I'm not certain this is the only issue, it seems like you are sending a request body that contains an incorrectly serialized (double-encoded) payload.
var obj = { foo: 'bar'};
JSON.stringify(obj) returns a string...
'{"foo": "bar"}' // the ' are not part of the string, I'm using them to illustrate that this is a thing of type string.
...and when parsed with a JSON parser, this returns an object.
{ foo: 'bar' }
However, JSON.stringify(JSON.stringify(obj)) returns a different string...
'"{\"foo\": \"bar\"}"'
...but when parsed, this returns a string.
'{"foo": "bar"}'
The service endpoint expects to parse the body and get an object, not a string... so, parsing the request body (from the service's perspective) doesn't return the correct type. The error seems to be a failure of the service to parse your request at a very low level.
In your code, body: JSON.stringify(data) should just be body: data, because earlier you already created a JSON string with data: JSON.stringify(payload).
As written, you are effectively setting body to JSON.stringify(JSON.stringify(payload)).
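Putting this together with the base64 detail from the question's second edit, a corrected request might look roughly like the sketch below; it reuses the sign() helper and credentials assumed in the question, and the stream name is illustrative:
// Sketch only: the whole payload is stringified exactly once, and Data is a
// base64-encoded string, which is what the Kinesis JSON API expects.
var payload = {
    Data: btoa(JSON.stringify({ task: "Get something working in kinesis" })),
    PartitionKey: "1",
    StreamName: "MyKinesisStream"
};

var signed = sign({
    method: "POST",
    service: "kinesis",
    region: "us-west-2",
    url: "https://kinesis.us-west-2.amazonaws.com",
    headers: {
        Host: "kinesis.us-west-2.amazonaws.com",
        "Content-Type": "application/x-amz-json-1.1",
        "X-Amz-Target": "Kinesis_20131202.PutRecord"
    },
    body: JSON.stringify(payload)   // one level of serialization only
}, credentials);

fetch(signed.url, signed);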
Not sure if you ever figured this out, but this question pops up on Google when searching for how to do this. The one piece I think you are missing is that the Record Data field must be base64 encoded. Here's a chunk of NodeJS code that will do this (using PutRecords).
And for anyone asking, why not just use the SDK? I currently must stream data from a cluster that cannot be updated to a NodeJS version that the SDK requires due to other dependencies. Yay.
const https = require('https')
const aws4 = require('aws4')

// Fire the signed request and stream the response body to stdout
const request = function(o) { https.request(o, function(res) { res.pipe(process.stdout) }).end(o.body || '') }

const _publish_kinesis = function(logs) {
    // Each record's Data blob must be a base64-encoded string
    const kin_logs = logs.map(function (l) {
        let blob = JSON.stringify(l) + '\n'
        let buff = Buffer.from(blob, 'binary');
        let base64data = buff.toString('base64');
        return {
            Data: base64data,
            PartitionKey: '0000'
        }
    })

    // Send the records in batches of 250 per PutRecords call
    while (kin_logs.length > 0) {
        let data = JSON.stringify({
            Records: kin_logs.splice(0, 250),
            StreamName: 'your-streamname'
        })

        // aws4 signs the request with the supplied credentials
        let _request = aws4.sign({
            hostname: 'kinesis.us-west-2.amazonaws.com',
            method: 'POST',
            body: data,
            path: '/?Action=PutRecords',
            headers: {
                'Content-Type': 'application/x-amz-json-1.1',
                'X-Amz-Target': 'Kinesis_20131202.PutRecords'
            },
        }, {
            secretAccessKey: "****",
            accessKeyId: "****"
            // sessionToken: "<your-session-token>"
        })

        request(_request)
    }
}
var logs = [{
    'timeStamp': new Date().toISOString(),
    'value': 'test02',
}, {
    'timeStamp': new Date().toISOString(),
    'value': 'test01',
}]

_publish_kinesis(logs)

Can't read query parameter in AWS

I want to pass a query parameter from API Gateway into AWS Lambda, but I always receive null values.
Here's my Lambda function, which should simply return the value of name from http://foo.bar?name=Dan:
'use strict';
exports.handle = (context, event, callback) => {
    callback(null, event.name);
}
In API Gateway I have done the following:
Create a Resource
Create a Method (GET)
Selected the correct Lambda function
Selected my GET method and clicked on Integration Request
Selected Body Mapping Templates
Set Content-Type to application/json
Added {"name": "$input.params('name')" }
Save and deploy!
However, when I load up my API the value of event.name is always null. Accessing the API is done via ...amazonaws.com/beta/user?name=dan
Edit: I've tried the accepted answer here but after simply returning the event in the callback, I only receive this data:
{
    "callbackWaitsForEmptyEventLoop": true,
    "logGroupName": "",
    "logStreamName": "",
    "functionName": "",
    "memoryLimitInMB": "",
    "functionVersion": "",
    "invokeid": "",
    "awsRequestId": "",
    "invokedFunctionArn": ""
}
I have omitted the values.
The context and event arguments are in the wrong order. Change their placement as below:
'use strict';
exports.handle = (event, context, callback) => {
    callback(null, event.name);
}
I had the same issue before and modified the body mapping template as below. Please try it out.
#set($inputRoot = $input.path('$'))
{
    "name" : "$input.params('$.name')"
}
If you are using a path parameter, then please try the below:
#set($inputRoot = $input.path('$'))
{
    "name" : "$input.path('$.name')"
}

How to Model.fetch(<object>) when the returned data is a single object

I want to make an API call for searching that looks like this:
https://myapi.com/search/<query>/<token>
where query is the search term and token (optional) is an alphanumeric set of characters which identifies the position of my latest batch of results, which is used for infinite scrolling.
This call returns the following JSON response:
{
    "meta": { ... },
    "results": {
        "token": "125fwegg3t32",
        "content": [
            {
                "id": "125125122778",
                "text": "Lorem ipsum...",
                ...
            },
            {
                "id": "125125122778",
                "text": "Dolor sit amet...",
                ...
            },
            ...
        ]
    }
}
content is an array of (embedded) items that I'm displaying as search results. My models look like this:
App.Content = Em.Model.extend({
    id: Em.attr(),
    text: Em.attr(),
    ...
});
App.Results = Em.Model.extend({
    token: Em.attr(),
    content: Em.hasMany('App.Content', {
        key: 'content',
        embedded: true
    })
});
In order to make that API call, I figured I have to do something like this:
App.Results.reopenClass({
    adapter: Em.RESTAdapter.create({
        findQuery: function(klass, records, params) {
            var self = this,
                url = this.buildURL(klass) + '/' + params.query;
            if (params.token) {
                url += '/' + params.token;
            }
            return this.ajax(url).then(function(data) {
                self.didFindQuery(klass, records, params, data);
                return records;
            });
        }
    }),
    url: 'https://myapi.com/search',
});
then somewhere in my routes do this:
App.Results.fetch({query: 'query', token: '12kgkj398512j'}).then(function(data) {
    // do something
    return data;
})
but because the API returns a single object and Em.RESTAdapter.findQuery expects an array, an error occurs when Ember Model tries to materialize the data. So how do I do this properly? I'm using the latest build of Ember Model.
By the way, I'm aware that it would be much more convenient if the API were designed so that I could just call App.Content.fetch(<object>), which would return a similar JSON response; I could then set the collectionKey option to content and my data would be properly materialized.
You simply need to override your model's load() method to adjust the payload hash to what Ember.Model wants. There are no serializers in Ember.Model. There is both a class-level load for handling collections and an instance-level load for loading the JSON specific to a single model. You want to override the instance-level load method to wrap the content key value in an array if it's not one already.
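As a rough sketch (the load(id, hash) signature is an assumption about the Ember Model build in use, so adjust as needed), the override could look something like this:
// Sketch only: wrap a single embedded content value in an array before
// Ember Model materializes it, as described above.
App.Results.reopen({
    load: function(id, hash) {
        if (hash && hash.content && !Ember.isArray(hash.content)) {
            hash.content = [hash.content];
        }
        return this._super(id, hash);
    }
});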
I have been using Ember.Model quite heavily and enhanced it for a number of my use cases, and I submitted PRs for both fixes and enhancements. Those PRs have been sitting there for a while with no response from the maintainers. I have now moved to Ember Data, which has been 'rebooted' so to speak, and I'm having much better results with it now.
I would strongly suggest walking away from Ember.Model as it appears dead with the new pragmatic direction Ember Data has taken and because the project maintainer doesn't appear to have any interest in it anymore.

How to get the return data of a save with Ember Data?

I'm using Ember.js with ember-data to make a GUI for a self-made API.
I've been following this tutorial to handle authentication but I want to use ember-data instead of custom jQuery requests.
One thing I have to do is call the API to create a new session by sending an email and password, and the API sends me back an API key object.
Here is my LoginController handling the loginUser action (it's CoffeeScript):
App.LoginController = Ember.ObjectController.extend
  actions:
    loginUser: ->
      session = @store.createRecord 'session',
        email: @get 'email'
        password: @get 'password'
      session.save()
Here is the result I get when creating a session:
{
    "users": [
        {
            "id": "525fa0286c696c0b14040000",
            "email": "john.doe@mydomain.com",
            "first_name": "John",
            "surname": "Doe"
        }
    ],
    "api_key": {
        "id": "526e464c6c696c07d2000000",
        "type": "session",
        "key": "6b824d6a-a065-4b6f-bb28-5c19389760f8",
        "expires_at": "2013-10-28T11:41:08+00:00",
        "user_id": "525fa0286c696c0b14040000"
    }
}
I have Session, ApiKey and User models. I can create the session, but the thing I don't understand is how to access the return value of the save() method.
I know that my ApiKey and User are loaded somewhere, because I get an error after save() if their respective Ember models don't exist, but I don't know how to access them.
I've tried to use save() callbacks like then() or the didCreate event, but there's a lack of documentation about the arguments passed to these callbacks and how to use them.
Ember.js 1.1.2
Ember Data 1.0.0.beta.3
EDIT:
I've tried to create an actual Session model on my API, resulting in this JSON output:
{
    "api_keys": [
        {
            "id": "526f69526c696c07d2110000",
            "type": "session",
            "key": "4c26af37-2b21-49c2-aef5-5850a396da0b",
            "expires_at": "2013-10-29T08:22:50+00:00",
            "user_id": "525fa0286c696c0b14040000"
        }
    ],
    "users": [
        {
            "id": "525fa0286c696c0b14040000",
            "email": "john.doe@coreye.fr",
            "first_name": "John",
            "surname": "Doe"
        }
    ],
    "session": {
        "id": "526f6e666c696c18c0010000",
        "api_key_id": "526f69526c696c07d2110000"
    }
}
(note the root element is now session)
It doesn't work any better, because now my save action leads to the following error (not in the console; then() goes to the error callback):
Object function () { [...] } has no method 'eachTransformedAttribute'
I get this error whether or not the relation between Session and ApiKey is declared in my Ember Data models...
Your second example JSON looks better: since you are saving a Session, I would expect a session node in the response and other models to be side-loaded. You can access the session after it's saved by using a promise callback:
session.save().then (savedSession) =>
  console.log savedSession.api_key.key
Since you have _id relationship keys in your JSON, I assume you are using the ActiveModel adapter and its default serializer:
App.ApplicationAdapter = DS.ActiveModelAdapter.extend()
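If the Session model also declares its relationship to the API key (the apiKey name below is an assumption; it isn't shown in the question), the side-loaded record can then be read with get() inside the promise callback; a rough JavaScript sketch:
// Assumes App.Session declares apiKey: DS.belongsTo('apiKey') and that the
// ActiveModel serializer maps the api_key_id key onto that relationship.
session.save().then(function (savedSession) {
    var apiKey = savedSession.get('apiKey');   // side-loaded ApiKey record
    console.log(apiKey.get('key'));
});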