Design a history-tracking database using django-rest-framework

I am using the django-rest-framework api for the first time. Here's my question:
I need to design a database in which there are two tables:
Server => To save the server information such as ip address and server name
id: INT, name: VARCHAR, ip_address: VARCHAR
Deploy => The deploys on the server including the deploy date and a comment message
id: INT, server_id: FK to Server, deploy_date: DATETIME, message: VARCHAR
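For concreteness, the two tables above would map to Django models roughly like this (field lengths, the related_name, and on_delete are my own assumptions, not part of the spec):
from django.db import models

class Server(models.Model):
    name = models.CharField(max_length=100)       # e.g. "qa-001"
    ip_address = models.CharField(max_length=45)  # VARCHAR; length is an assumption

class Deploy(models.Model):
    # related_name='deploys' is an assumed name, reused in the serializer sketch further down
    server = models.ForeignKey(Server, related_name='deploys', on_delete=models.CASCADE)
    deploy_date = models.DateTimeField()
    message = models.CharField(max_length=255)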
I am asked to keep track of the deploy information and design the following APIs:
GET /servers/ => get all the server information with the latest deploy on that server
Example:
[
    {
        "id" : 1,
        "name" : "qa-001",
        "ip_address" : "192.168.1.1",
        "deploy" : {
            "id" : 1,
            "deploy_date" : "2013-09-09 12:00:00",
            "message" : "test new api"
        }
    },
    {
        "id" : 2,
        "name" : "qa-002",
        "ip_address" : "192.168.1.2",
        "deploy" : {
            "id" : 2,
            "deploy_date" : "2013-09-10 12:00:00",
            "message" : "test second message"
        }
    }
]
GET /deploys/ => get all the deploy information
Example:
[
    {
        "id" : 1,
        "deploy_date" : "2013-09-09 12:00:00",
        "message" : "test new api",
        "server_id" : 1
    },
    {
        "id" : 2,
        "deploy_date" : "2013-09-10 12:00:00",
        "message" : "test second message",
        "server_id" : 2
    },
    {
        "id" : 3,
        "deploy_date" : "2013-09-08 12:00:00",
        "message" : "test new api",
        "server_id" : 1
    }
]
// Deploy 3 does not show up in the GET /servers/ response
// because deploy 1 was made to the same server at a later time than deploy 3.
POST /deploys/ => To insert a new deploy
POST /servers/ => To insert a new server
...
I have played around with the django-rest-framework tutorial code and read through various API documentation, but I couldn't figure out how to implement the features listed above.
Please let me know if you have any ideas on how to implement this or if you think another database design would fit this requirement more.
Thanks!

I found this question, which is very similar to mine:
How can I apply a filter to a nested resource in Django REST framework?
I solved my problem by using a SerializerMethodField.
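For anyone who lands here later, a minimal sketch of that approach (current DRF syntax; it assumes the models sketched in the question, including related_name='deploys' on the foreign key, and is not the asker's exact code):
from rest_framework import routers, serializers, viewsets

class DeploySerializer(serializers.ModelSerializer):
    class Meta:
        model = Deploy
        # "server" is rendered as the related Server's primary key
        fields = ('id', 'deploy_date', 'message', 'server')

class ServerSerializer(serializers.ModelSerializer):
    # Nest only the most recent deploy for each server
    deploy = serializers.SerializerMethodField()

    class Meta:
        model = Server
        fields = ('id', 'name', 'ip_address', 'deploy')

    def get_deploy(self, obj):
        # relies on related_name='deploys' on Deploy.server
        latest = obj.deploys.order_by('-deploy_date').first()
        return DeploySerializer(latest).data if latest else None

class ServerViewSet(viewsets.ModelViewSet):
    queryset = Server.objects.all()
    serializer_class = ServerSerializer

class DeployViewSet(viewsets.ModelViewSet):
    queryset = Deploy.objects.all()
    serializer_class = DeploySerializer

# urls.py: a DefaultRouter gives GET and POST on /servers/ and /deploys/
router = routers.DefaultRouter()
router.register(r'servers', ServerViewSet)
router.register(r'deploys', DeployViewSet)
With this, GET /servers/ returns each server with its latest deploy nested under "deploy", and the POST endpoints come for free from the viewsets.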

Related

AWS AppSync enhanced subscription filter not working

Implementing subscriptions for AWS AppSync, I use the enhanced filter capability to filter out tasks that do not belong to a specific user.
To distinguish between users, an ID is carried in the claims part of the verified JWT and read from the $context object in the VTL response mapping.
But subscribers always receive all created objects; the filter never takes effect.
Our GraphQL schema (simplified) looks like this:
type Mutation {
    createTask(
        done: Boolean!,
        due: String!,
        id: String!,
        identityId: String!,
        read: Boolean!,
        note: String!,
    ): Task
}

type Subscription {
    create: Task
        @aws_subscribe(mutations: ["createTask"])
}

type Task @aws_iam @aws_oidc {
    identityId: String!
    done: Boolean
    due: String
    id: String
    read: Boolean
    note: String
}
The datasource for the subscription resolver is a NONE datasource and the request and response mappings are the following:
Request:
{
"version": "2017-02-28"
}
Response:
$extensions.setSubscriptionFilter({
    "filterGroup": [
        {
            "filters" : [
                {
                    "fieldName" : "identityId",
                    "operator" : "eq",
                    "value" : $context.identity.claims.identityId
                }
            ]
        }
    ]
})
$util.toJson($context.result)
With this enhanced filter I expect AppSync to filter out all tasks where the identityId does not match the one in the token... but for some reason that does not work.
What am I missing?
After a long search and almost giving up, I found the solution myself.
It's all about the correct composition of the payload attribute in the request mapping.
Without the payload object one could not access the claims in the identity part of the context object. Or at least the filtering doesn't seem to work.
Finally my request mapping looks like this:
{
    "version" : "2017-02-28",
    "payload" : {
        "resultList" : $util.toJson($context.result),
        "idnId" : "$context.identity.claims.identityId"
    }
}
And in the response mapping
$extensions.setSubscriptionFilter({
    "filterGroup": [{
        "filters" : [{
            "fieldName" : "identityId",
            "operator" : "eq",
            "value" : $context.result.idnId
        }]
    }]
})
$util.toJson($context.result.resultList)
I can then access the two objects.
So the filtering now works as expected.

Postman random data for each run of a request

I'm currently using Postman to build up a few collections of calls to our custom API. I'm using the built-in faker library, and in the body of a request I have, for example:
"profile": {
    "firstName": "{{$randomFirstName}}",
    "lastName": "{{$randomLastName}}",
    "email": "{{$randomEmail}}",
    "login": "{{$randomEmail}}"
}
What I expect is that new data is sent with each request, i.e.:
run 1 : "firstname" : "adam"
run 2 : "firstname" : "peter"
but I'm not seeing this. Each time it's the same:
run 1 : "firstname" : "adam"
run 2 : "firstname" : "adam"
Any idea why I'm not getting new fake data each time?

AWS API Gateway create API key from lambda for unique user : Not Working

I want to create a unique API key whenever a user logs into my website.
I want to do this from AWS Lambda, based on a trigger, and I plan to do it through the REST API via axios.
I believe I have misunderstood the docs; it's just killing me as my submission is just around the corner.
Inside the function:
var x = await axios.post('/apikeys', {
    name : "aaaaaa",
    description : "Strinaaaag",
    enabled : true,
    generateDistinctId : true,
    value : "sasdasdasdasdsd",
    stageKeys : [ {
        restApiId : "Strissng",
        stageName : "prod"
    } ],
})
and this is not working; neither is the one below:
params = {
    name : "aaaaaa",
    description : "Strinaaaag",
    enabled : true,
    generateDistinctId : true,
    value : "sasdasdasdasdsd",
    stageKeys : [ {
        restApiId : "Strissng",
        stageName : "prod"
    } ],
}
AWS.APIGateway().createApiKey(params, function(err, data) {
    ........
})

Get notified of successful/failed builds on Amazon's CodeDeploy

I'd like to build a tool that notifies a user each time there is a successful or failed build on CodeDeploy, through any communication medium (email, Slack, etc.). I've gone through their documentation, and nothing except long-polling comes to mind. Is there some webhook option where I can register a URL and be notified?
Update on 2016-04-27
AWS officially announced this in February 2016:
You can now create triggers that send Amazon SNS notifications before, during, and after the deployment process for your applications. Triggers can be set for the deployment as a whole or for the individual instances targeted by the deployment, and are sent on both successes and failures.
Original answer
Not yet.
In this AWS forum thread, it was requested that CodeDeploy emit events so you can use Lambda to process them instead of polling for details.
The answer by AWS staff (emphasis mine):
We here on CodeDeploy agree.
Unfortunately, I can't give you an exact release date but keep an eye on our announcements, it's coming soon.
Here is a gist of an AWS Lambda function that posts a formatted CodeDeploy notification to Slack
https://gist.github.com/MrRoyce/097edc0de2fe001288be2e8633f4b22a
var services = '/services/...';  // Update this with your Slack service...
var channel = "#aws-deployments"; // And this with the Slack channel

var https = require('https');
var util = require('util');

var formatFields = function(string) {
    var
        message = JSON.parse(string),
        fields = [],
        deploymentOverview;

    // Make sure we have a valid response
    if (message) {
        fields = [
            {
                "title" : "Task",
                "value" : message.eventTriggerName,
                "short" : true
            },
            {
                "title" : "Status",
                "value" : message.status,
                "short" : true
            },
            {
                "title" : "Application",
                "value" : message.applicationName,
                "short" : true
            },
            {
                "title" : "Deployment Group",
                "value" : message.deploymentGroupName,
                "short" : true
            },
            {
                "title" : "Region",
                "value" : message.region,
                "short" : true
            },
            {
                "title" : "Deployment Id",
                "value" : message.deploymentId,
                "short" : true
            },
            {
                "title" : "Create Time",
                "value" : message.createTime,
                "short" : true
            },
            {
                "title" : "Complete Time",
                "value" : ((message.completeTime) ? message.completeTime : ''),
                "short" : true
            }
        ];

        if (message.deploymentOverview) {
            deploymentOverview = JSON.parse(message.deploymentOverview);

            fields.push(
                {
                    "title" : "Succeeded",
                    "value" : deploymentOverview.Succeeded,
                    "short" : true
                },
                {
                    "title" : "Failed",
                    "value" : deploymentOverview.Failed,
                    "short" : true
                },
                {
                    "title" : "Skipped",
                    "value" : deploymentOverview.Skipped,
                    "short" : true
                },
                {
                    "title" : "In Progress",
                    "value" : deploymentOverview.InProgress,
                    "short" : true
                },
                {
                    "title" : "Pending",
                    "value" : deploymentOverview.Pending,
                    "short" : true
                }
            );
        }
    }

    return fields;
}

exports.handler = function(event, context) {
    var postData = {
        "channel": channel,
        "username": "AWS SNS via Lamda :: CodeDeploy Status",
        "text": "*" + event.Records[0].Sns.Subject + "*",
        "icon_emoji": ":aws:"
    };

    var fields = formatFields(event.Records[0].Sns.Message);
    var message = event.Records[0].Sns.Message;
    var severity = "good";

    var dangerMessages = [
        " but with errors",
        " to RED",
        "During an aborted deployment",
        "FAILED",
        "Failed to deploy application",
        "Failed to deploy configuration",
        "has a dependent object",
        "is not authorized to perform",
        "Pending to Degraded",
        "Stack deletion failed",
        "Unsuccessful command execution",
        "You do not have permission",
        "Your quota allows for 0 more running instance"
    ];

    var warningMessages = [
        " aborted operation.",
        " to YELLOW",
        "Adding instance ",
        "Degraded to Info",
        "Deleting SNS topic",
        "is currently running under desired capacity",
        "Ok to Info",
        "Ok to Warning",
        "Pending Initialization",
        "Removed instance ",
        "Rollback of environment"
    ];

    for (var dangerMessagesItem in dangerMessages) {
        if (message.indexOf(dangerMessages[dangerMessagesItem]) != -1) {
            severity = "danger";
            break;
        }
    }

    // Only check for warning messages if necessary
    if (severity == "good") {
        for (var warningMessagesItem in warningMessages) {
            if (message.indexOf(warningMessages[warningMessagesItem]) != -1) {
                severity = "warning";
                break;
            }
        }
    }

    postData.attachments = [
        {
            "color": severity,
            "fields": fields
        }
    ];

    var options = {
        method: 'POST',
        hostname: 'hooks.slack.com',
        port: 443,
        path: services // Defined above
    };

    var req = https.request(options, function(res) {
        res.setEncoding('utf8');
        res.on('data', function (chunk) {
            context.done(null);
        });
    });

    req.on('error', function(e) {
        console.log('problem with request: ' + e.message);
    });

    req.write(util.format("%j", postData));
    req.end();
};
At a high level you need to:
Setup an SNS topic
Create a CodeDeploy Trigger
Add sns:Publish permissions to the CodeDeploy IAM Role
Configure Slack with an Incoming Webhook
Write and configure a Lambda function to process CodeDeploy's SNS messages, construct Slack messages and send them to Slack at the Incoming Webhook
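For the SNS topic and trigger steps above, a rough boto3 sketch looks like the following (the application name, deployment group name, topic name, and Lambda ARN are placeholders; the IAM permissions from the list still have to be set up separately):
import boto3

sns = boto3.client('sns')
codedeploy = boto3.client('codedeploy')

# Create (or look up) the SNS topic that CodeDeploy will publish to
topic_arn = sns.create_topic(Name='codedeploy-notifications')['TopicArn']

# Attach a trigger to an existing deployment group
codedeploy.update_deployment_group(
    applicationName='my-app',                      # placeholder
    currentDeploymentGroupName='my-deploy-group',  # placeholder
    triggerConfigurations=[{
        'triggerName': 'notify-on-deploy',
        'triggerTargetArn': topic_arn,
        'triggerEvents': ['DeploymentSuccess', 'DeploymentFailure'],
    }],
)

# Subscribe the Slack-posting Lambda (from the code above) to the topic
sns.subscribe(
    TopicArn=topic_arn,
    Protocol='lambda',
    Endpoint='arn:aws:lambda:us-east-1:123456789012:function:codedeploy-to-slack',  # placeholder ARN
)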
I used code similar to the gist above to setup a Lambda function for Slack notifications for CodeDeploy events. I documented the entire procedure, including screenshots over here.
I couldn't find a similar, end-to-end guide elsewhere so I hope this helps someone else who stumbles on this question.
Although there is no native solution, there is a workaround you can employ to accomplish this: use Lambda to react to these events. The AWS blog shows how to trigger CodeDeploy via Lambda when you upload a file to S3 (https://blogs.aws.amazon.com/application-management/post/Tx3TPMTH0EVGA64/Automatically-Deploy-from-Amazon-S3-using-AWS-CodeDeploy). Using the same concept, you can have your Lambda function listen to an error/success bucket, modify your CodeDeploy package to upload a file to S3, and use that as an event trigger to send an email via SES (https://peekandpoke.wordpress.com/2015/02/26/dancing-the-lambada-with-aws-lambda-or-sending-emails-on-s3-events/) or to contact a web service/page that does what you want. It might be a little hokey, but it gets the job done.

How to query JSON file in emberjs

I've never worked with JSON before, but I have worked with XML, PHP, and MySQL. I have a populated database on the server and would like to develop a web application with ember.js to interact with this data (CRUD).
Where should I start? I know ember-data has most of the things I would need, but I'm unsure of how to start.
Since the database holds different tables, is it possible to keep this information in one JSON file? Is that the appropriate way to do it? How do I automatically produce this JSON file from the server?
You can start with a read query:
Upload a sample file with JSON syntax to your server (or use your JSON service if it's already available) to get going quickly. Test that you can access it in your browser, for example:
[ {"id": 1, "desc": "hmarchadour"}, {"id": 2, "desc": "moderator"} ]
Now take/create a view in your Ember.js app and use jQuery to call this file/service:
Ember.View.create({
    templateName: "templateName", // your template name
    stuff: [],

    didInsertElement: function() {
        var view = this; // keep a reference; "this" inside the AJAX callback is not the view

        $.ajax({
            type: "GET",
            url: "<url-of-your-sample>",
            dataType: "json",
            success: function(result) {
                // json2Stuff is your own helper that converts the raw JSON
                // into whatever objects the view expects
                var tmpStuff = json2Stuff(result);
                view.set('stuff', tmpStuff);
            },
            error: function() { /* handle errors */ }
        });
    }
});
Regards,