API monitoring tool - Django

I want to monitor all the APIs that I created in one of my Docker containers. That container runs its services on Django REST Framework, and I am running it on Azure. I want to monitor whether the API is up, get an alert when there are too many requests, and see metrics such as requests per second.
We are using Sysdig to monitor our containers, but I don't think it can monitor the individual APIs of our Django REST Framework application.

To monitor your API's performance and downtime, you could write custom scripts that ping your API and alert you when it's down, or you could use a third-party service to monitor it remotely. The latter is the simpler option, as it doesn't require writing and maintaining code; a minimal example of the script approach is sketched below.
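If you do want to roll your own, a minimal health-check script could look like the following sketch (it assumes the requests package and a Slack incoming webhook; both URLs are placeholders, not real endpoints):

import requests

API_URL = "https://example.com/api/health/"  # hypothetical health endpoint
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder webhook

def check_api(timeout=5):
    """Ping the API once and send a Slack alert if it is down or erroring."""
    try:
        resp = requests.get(API_URL, timeout=timeout)
        detail = "status=%s, latency=%.2fs" % (resp.status_code, resp.elapsed.total_seconds())
        ok = resp.status_code == 200
    except requests.RequestException as exc:
        ok, detail = False, str(exc)
    if not ok:
        requests.post(SLACK_WEBHOOK, json={"text": "API check failed: " + detail})
    return ok

if __name__ == "__main__":
    check_api()  # run this from cron, a Kubernetes CronJob, or a sidecar container

Run it every minute or so from a scheduler; requests-per-second style metrics would still need something server-side (see the Application Insights answer further down).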
One third-party service you could use is mine, https://assertible.com. It provides frequent health checks (every 1/5/15 minutes), deep data validation, integrations with other services like Slack and GitHub, and a nice way to view and manage test failures.
If you want to integrate with your own code or scripts, you can use Trigger URLs and/or the Deployments API to programmatically run your tests whenever and wherever:
$ curl 'https://assertible.com/apis/{API_ID}/run?api_token=ABC'
[{
  "runId": "test_fjdmbd",
  "result": "TestPass",
  "assertions": {
    "passed": [{...}],
    "failed": [{...}]
  },
  ...
}]
Hope it helps!

You can use the monitoring functionality from Postman. For more information check out the following link [1].
[1] https://learning.getpostman.com/docs/postman/monitors/intro_monitors/

Since you're running on Azure, you should take a look at Application Insights:
Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your live web application. It will automatically detect performance anomalies. It includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It's designed to help you continuously improve performance and usability. It works for apps on a wide variety of platforms including .NET, Node.js and J2EE, hosted on-premises or in the cloud. It integrates with your DevOps process, and has connection points to a variety of development tools. (Source)
API monitoring with Application Insights is described in the Azure documentation.
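For a Django REST Framework app, one way to feed request telemetry into Application Insights is a small middleware around the Python SDK. The sketch below assumes the applicationinsights package and its TelemetryClient.track_request call; treat the package name and argument names as assumptions and check the current SDK docs before using it:

import time
from applicationinsights import TelemetryClient  # assumed package name

tc = TelemetryClient("<your instrumentation key>")  # placeholder key

class AppInsightsMiddleware:
    """Django middleware that reports every request to Application Insights."""
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        start = time.time()
        response = self.get_response(request)
        duration_ms = int((time.time() - start) * 1000)
        # track_request arguments are my reading of the SDK: name, url, success,
        # plus keyword duration/response_code/http_method - verify before use.
        tc.track_request(
            request.path,
            request.build_absolute_uri(),
            response.status_code < 500,
            duration=duration_ms,
            response_code=response.status_code,
            http_method=request.method,
        )
        tc.flush()  # fine for a sketch; batched/async flushing is better in production
        return response

Add the class to MIDDLEWARE in settings.py; Application Insights can then chart request rate, duration and failure counts, and you can define alert rules on those metrics in the Azure portal.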

Related

Can we deploy QlikSense/QlikView on serverless architecture?

We are currently using a monolithic architecture. Is there any other way to move to serverless?
While I am not familiar with Qlik's products, it is unlikely they would be suitable for serverless architecture.
Companies generally offer their products either as:
Downloadable products that you run on your own server (which could be a virtual server in the cloud), or
Software-as-a-Service, where you access their website directly and no server is required (e.g. Salesforce)
"Serverless architecture" is a design decision that can be made when designing a software product. It means the application is broken down into small components ('microservices') that can be run on services like AWS Lambda, with no actual server to manage.
However, such an architecture would normally only be used for applications that you create yourself. If another company has designed their system to be 'serverless', they would normally run it on a cloud platform (e.g. AWS) and offer it to users as Software-as-a-Service. It would be highly unusual to have a 'download' product that runs on a serverless architecture.
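To make "no actual server" concrete, a single microservice endpoint on AWS Lambda (behind an API Gateway proxy integration) is roughly this much Python; the handler name and response shape follow the usual Lambda/API Gateway convention, and the body is purely illustrative:

import json

def lambda_handler(event, context):
    """AWS invokes this on demand - there is no server process for you to manage."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": "hello " + name})}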
I notice that Qlik has product offerings that run on AWS (AWS Marketplace: Qlik), but these run on Amazon EC2 instances rather than serverless.
If you look into the Qlik Core product, then yes, Qlik can be deployed in an elastic, containerised environment. But then, as I understand it, you don't get the standard objects and visualisations, user management, etc. So you have to code your own stuff that ties into the Qlik Data Analytics Engine via APIs.
From https://core.qlik.com/why-qlik-core/
So, how do you get it? Licensing info here but let’s talk components
Linux-based Associative Engine – provided as a Docker image with built-in support for Amazon Web Services, Microsoft Azure and Google Cloud Platform
Supporting APIs – these ingest your data into the Qlik Associative Engine through connectors
Supporting Open Source Libraries – these various libraries by Qlik expose the engine to help you build solutions faster
It’s all language agnostic but JavaScript lovers will find it easier to work with given the number of our open source tools available in JavaScript. Other top languages and tools used include R, Go, Shell, C#, Python, Java and D3. Qlik Core can also be managed with the orchestration tool of your choice for implementing, scaling and managing containerized applications.
It really depends on what exactly you are building. As The Budac mentioned, you can use Qlik Core.
If you just want to invoke the Qlik APIs (for example for some automation jobs), then serverless functions make sense; a minimal sketch follows at the end of this answer.
Qlik Sense (both the Enterprise and Kubernetes versions) exposes a lot more APIs, which can be called from practically anywhere.
QlikView, on the other hand, is more... conservative. QV is older software and its APIs/integrations are more limited. For example, to call the Management API you have to be on the same domain as QV. Personally, I've only connected to the QV Management API with C#, and I'm pretty sure you can't use JS/Node.
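As an illustration of the automation-job case, here is a hedged Python sketch that calls the Qlik Sense Repository (QRS) API with certificate authentication; the host, certificate paths and user are placeholders, and the exact QRS requirements (port, xrfkey, headers) should be checked against Qlik's documentation:

import requests

QLIK_HOST = "https://qlik-server.example.com:4242"  # hypothetical QRS endpoint
XRFKEY = "0123456789abcdef"                         # any 16-character value, sent twice

def qrs_about():
    """Example automation call: ask the repository service which version it runs."""
    resp = requests.get(
        QLIK_HOST + "/qrs/about?xrfkey=" + XRFKEY,
        headers={
            "X-Qlik-Xrfkey": XRFKEY,
            "X-Qlik-User": "UserDirectory=INTERNAL; UserId=sa_api",  # placeholder user
        },
        cert=("client.pem", "client_key.pem"),  # certificates exported from Qlik Sense
        verify="root.pem",                      # Qlik root certificate
    )
    resp.raise_for_status()
    return resp.json()

The same pattern works from an Azure Function or AWS Lambda, since it is just an HTTPS call.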

Best option to schedule payments: Azure Scheduler, WebJobs, Azure Functions, or a Worker Role?

I've hosted my website on Azure and now I want to schedule payments on a monthly basis. I am using Authorize.net for payments, but I cannot use their recurring billing feature because it gives very little control. I have to perform checks in the database, make payments, and update records. What should I use: Azure Scheduler, an Azure WebJob, Azure Functions, or a Worker Role?
Definitely not a Worker Role. They are very heavyweight and generally not worth the effort for a single, simple job like this.
WebJobs might be a good solution. They run in the context of your web app, so you can use them at no additional cost. But you'll need to do some development here - you have to create an app that calls Authorize.net.
If you only need to fire a single HTTP request, then using Azure Scheduler to schedule this HTTP action might be a good choice. You can configure the request itself (headers, payload) and it has error handling as well. But you might have to store sensitive information in the Azure portal, in the configuration of the scheduled job.
So I'd say forget about the Worker Role, then weigh simplicity against flexibility and development effort. That being said, I would probably try it with the Scheduler first, and then move on to a WebJob if I encounter something that is not feasible with the Scheduler.
Edit:
Azure Functions can also be a good option - I'd say it's sort of a middle ground between a WebJob and the simple scheduled request. It is part of the App Service feature set, so it can run in the same App Service plan as the web app, at no extra cost. You still have to code the HTTP request to Authorize.net yourself, but Azure Functions is a lot more lightweight than WebJobs - you don't have to create an exe (or a PowerShell script or whatever); you can just write the HTTP call in a script editor inside the Azure portal. It is also a bit more flexible than the simple scheduled option, which is something to consider when it comes to error handling.
So this is a good middle ground, but I think it's still a lot of work given the complexity of the task (which is to fire a single HTTP request).
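For reference, a timer-triggered Azure Function in Python is roughly this much code. This is a hedged sketch using the azure.functions programming model, and the three helper functions are placeholders for your own database and Authorize.net logic:

import logging
import azure.functions as func

def load_due_payments():
    """Placeholder: query your database for subscriptions due this month."""
    return []

def charge_with_authorize_net(payment):
    """Placeholder: call the Authorize.net API for a single charge."""

def mark_as_paid(payment):
    """Placeholder: update the payment record in your database."""

def main(mytimer: func.TimerRequest) -> None:
    # Bound in function.json with a schedule such as "0 0 2 1 * *"
    # (run at 02:00 on the 1st of every month).
    payments = load_due_payments()
    for payment in payments:
        charge_with_authorize_net(payment)
        mark_as_paid(payment)
    logging.info("Processed %d scheduled payments", len(payments))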
To get it working quickly, Logic Apps is a good choice. With Logic Apps, you can trigger the workflow with a timer based on a schedule you define, and use the out-of-the-box SQL/DocumentDB connectors (depending on your exact scenario) to connect to your database. Although there's currently no Authorize.net connector available, you should be able to use the generic HTTP action to talk to its RESTful APIs. Most likely you'll get this working very quickly. I'd also recommend submitting a suggestion at aka.ms/logicapps-wish so we can track the request for an Authorize.net connector, which, when available, will make this even easier.

What are the pros and cons of developing a web app using Parse vs. AWS?

From what I know, Parse offers convenient communication stacks for various platforms such as iOS, so it is easy to build clients that use your web app.
But Parse also seems to be tightly integrated with Facebook. If you were to build a web app that does not need Facebook, but that may integrate with Facebook in the long term, is Parse the clear winner over deploying directly to AWS, or are there important disadvantages to consider?
As far as I understand their page, Parse is a PaaS (Platform as a Service) provider like Heroku and others, while AWS is an IaaS (Infrastructure as a Service) provider.
Pros for PaaS:
They take care of the infrastructure
You build your app on an existing platform
To start, you don't need "ops guys", as you don't do ops yourself
You can use their knowledge and prebuilt tools to your advantage
Pros for IaaS:
You have full control over the underlying infrastructure
You can start with a greenfield and build whatever you want
You can use tools like Puppet / Chef / ... to control your servers
You don't have to pay for the additional stuff you get when using PaaS (but you have to pay your own people for it)
So there is no winner in this "battle"; you have to decide whether you want to use prebuilt tooling and give up some independence for it, or whether you want absolute control over everything (nearly, as you still can't touch the hardware) and invest time and manpower into building your own tooling.
"Better, Faster, Cheaper.."
If you are pursuing a mobile-first strategy, Parse is a great tool for bootstrapping a mature, full web presence from nothing more than an original beta app.
I don't have direct experience with AWS.
I have used Heroku/Parse to integrate (very quickly) a stand-alone mobile app with a back end, where the back end needs to cover the following:
DB / persistence / NoSQL
Workflow - async tasks
REST API interface over HTTP
Once the mobile app existed with only stubbed local data, Parse allowed a single engineer to build out ALL of the infrastructure mentioned above very quickly, taking the app from single-user to multi-user with a full DB and workflow that backs client-side events with considerable server-side and cloud-side business logic and process. Scaling-related startup work that used to take weeks took only days.
The compression (time & money) when scaling up an app stack is really something. The Parse API did almost everything I needed, with one small exception (remuxing UGC media).
Personally, I abandoned the Parse Android SDK in favor of the more robust REST API (client-side threading and heavy HTTP activity).
Developers used to curl/REST dev stacks will take to Parse quickly.
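For a feel of that REST surface, here is a hedged Python sketch against the classic hosted Parse API (GameScore is Parse's own documentation example class; the credentials are placeholders, and a self-hosted Parse Server would use your own server URL and keys instead):

import requests

headers = {
    "X-Parse-Application-Id": "<app id>",    # placeholder credentials
    "X-Parse-REST-API-Key": "<rest api key>",
    "Content-Type": "application/json",
}

# Create an object in the example GameScore class.
resp = requests.post(
    "https://api.parse.com/1/classes/GameScore",
    headers=headers,
    json={"score": 1337, "playerName": "Sean Plott"},
)
print(resp.status_code, resp.json())  # a successful create returns 201 with objectId/createdAt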

Advantages of WSO2 AS over other application servers

Why would anyone use WSO2 Application Server instead of other application servers?
I have rather encountered only problems with it, mainly due to class loading issues, so I would appreciate it if someone could point out the advantages, or the use cases where using WSO2 AS really makes a difference.
I can see the benefits of other standalone WSO2 products, but as far as the AS is concerned, I would rather rely on more lightweight servers and just package the libraries I need.
There are a number of advantages to WSO2 Application Server.
1.) It provides built-in support for multi-tenancy. If you have isolated departments in your organization, there is no real need to run a number of server instances - you can simply create a new tenant for each.
2.) Automatic lazy-loading support for tenants, web applications and web services. In a production system, a particular tenant/web application/web service can be idle for some time, and it is a waste to keep allocating hardware resources to such idle applications, especially if you use IaaS. WSO2 Application Server can detect an idle tenant/web application/web service and release its resources; the tenant/web application/web service is loaded again when a new request is dispatched to it.
3.) A wide range of deployment options: on-premise, public or private IaaS, and public or private PaaS such as Apache Stratos. As an example, one can deploy an application to the WSO2 App Cloud (http://wso2.com/cloud/app-cloud/) instantly, without downloading anything, and later get the same experience on one of the above platforms.
4.) The deployment synchronization feature. In a clustered environment you may have a very large number of nodes, and upgrading application versions and configuration across the cluster can be a headache. With deployment synchronization you modify only one node, labeled as the manager node, and the changes are synchronized across the cluster automatically and consistently.
5.) When developing applications on WSO2 Application Server you can leverage Carbon platform-level features such as identity, registry, logging, distributed caching, multi-tenancy, etc. As an example, one can use the identity features provided by the platform to manage users, roles and permissions, and for authentication and authorization, without writing anything of your own.
6.) Built-in support for security standards such as SSO across other WSO2 products.
7.) Built-in monitoring capability for web services and web applications through WSO2 BAM.
8.) An enhanced, rich dashboard for applications and services, which provides basic statistics, application management, security wizards, code generation, Try-It tools, runtime logging configuration, etc.
9.) An enhanced classloading mechanism (starting from AS 5.1.0): within one Application Server instance you can have a number of virtual server environments at the application level. As an example, one can specify that an application runs in minimal Tomcat mode, or assign it to run in Carbon mode (Tomcat + the Carbon platform).
When it comes to your specific issue, if you can specify your Application Server version and elaborate on your classloading issue, I can provide a more specific answer.
Having said the above, I want to mention that I'm from WSO2.

Kinvey server setup

Kinvey is Backend as a Service | Mobile Cloud Backend as a Service
Is Kinvey (http://www.kinvey.com/) good, or is using a custom Java server with a database a better idea?
I am a member of the Kinvey engineering team, and can talk a bit about BaaS in general. While creating your own backend gives you a lot of flexibility and control, it is also a lot of work.
Back-end as a Service providers like Kinvey offer a platform to speed up app development and have already done a lot of the work for you. Tasks like managing a database server and a web service front end, managing the storage and streaming of files, providing cross-platform push notifications, providing a centralized user and authentication store, integrating with social networks, business logic, and more are easily implemented with SDKs for each platform.
If I were to list the three main advantages of BaaS, they are:
Ease of implementation
Ready-made back-end platform for cross-platform apps
Automatic scalability if your app becomes successful
As far as disadvantages, your back-end feature set becomes dependent on the vendor, and you certainly get more flexibility with a custom solution, but that can often be overcome with business logic. In my own (admittedly biased) opinion, the flexibility and cost savings make it worth at least giving BaaS a try and seeing if the feature sets meet your specific needs.