I have an application built on a proprietary cloud platform that accesses a MongoDB database (hosted at MongoHQ) via web services.
Since I cannot install anything on the proprietary cloud platform, I cannot run any native MongoDB driver there; I am forced to use web services.
Everything was going swimmingly until it came time to compute basic summaries/averages of the data.
Below is an example document, based on MongoDB's posted best practices for time series data. These data points are performance metrics collected within a single hour (dt = top of the hour; the vals keys are in minutes/seconds format, e.g. m0214 means 2 minutes 14 seconds past the hour).
{
    _id: ObjectId("531fb241406eb30d07260d61"),
    dt: 1394586000000,
    inst: "my_instance_key",
    vals: {
        m0014: 78,
        m0214: 94,
        m0614: 63,
        m0814: 94,
        m1014: 78,
        m1214: 78,
        m1414: 109,
        m1614: 250,
        m1814: 78,
        m2014: 125,
        m2214: 94,
        m2414: 63,
        m2614: 78,
        m2814: 63,
        m3014: 78,
        m3214: 78,
        m3414: 63
    }
}
What I want to do is add a summarized "hourly" value to the document, an average of all the minute values. However, there doesn't appear to be a way to do this via a web services call, since the aggregation framework seems to be absent from MongoHQ's and MongoDBLab's web services APIs.
I guess my questions are these:
1) Is there a way to do this using the published web services APIs from MongoHQ or MongoDBLab (one that does not involve downloading all the data points to my app and doing the math there)?
2) Is there any hosted provider of managed MongoDB that allows access to the aggregation framework, or provides an elegant way to accomplish this?
Thank you very much for your help!
You don't really explain why you are using a service API as opposed to a driver connection; we can only presume you are using some legacy language such as classic ASP or something similar.
If you are using ASP, then using the C# driver (or really just the .NET driver) may be a possibility for you by following these instructions.
For anything else, I would say these services are not likely to supply much more than basic CRUD operations. And you are not asking people to recommend one, are you? Because you already know that is "off-topic".
So if you are really in a bind with the language you are working with, why not create your own RESTful API? Surely there is another language you are comfortable in where you could implement this and use the native driver for that language:
An approach like this with Node.js and Express might serve as an example:
var express = require('express');
var MongoClient = require('mongodb').MongoClient;
var app = express();
app.use(express.json()); // parse the JSON pipeline from the request body

// Connect once at startup; point this at your MongoDB hosting provider
MongoClient.connect('mongodb://localhost:27017', function (err, mongoclient) {
    app.post('/api/db/:varDb/collection/:varCollection/aggregate',
        function (req, res) {
            // Route parameters select the target database and collection
            var db = mongoclient.db(req.params.varDb);
            db.collection(req.params.varCollection)
                .aggregate(req.body.pipeline)
                .toArray(function (err, response) {
                    if (err) return res.status(500).json({ error: err.message });
                    res.json(response); // send the JSON response to the client
                });
        });
    app.listen(3000);
});
This would just be a matter of deploying that service somewhere your legacy application is able to access it. Your API would then connect to your MongoDB hosting provider, and your application talks to your API.
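For example, the hourly average you ask about could then be computed by POSTing a pipeline to that endpoint. This is only a sketch: the host, database, and collection names are placeholders, and the $objectToArray operator it relies on needs MongoDB 3.4.4 or later (persisting the result back into the document would take a separate update, or a $merge stage on 4.2+):
$ curl -X POST 'http://localhost:3000/api/db/metrics/collection/perf/aggregate' \
    -H 'Content-Type: application/json' \
    -d '{ "pipeline": [
        { "$match": { "inst": "my_instance_key" } },
        { "$addFields": { "hourly": { "$avg": {
            "$map": { "input": { "$objectToArray": "$vals" },
                      "as": "kv", "in": "$$kv.v" } } } } }
    ] }'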
These are probably the simplest types of services to build. Go on and try.
I am familiar with the Firebase platform, but I am relatively new to the Google Cloud Platform as a whole.
I am working on a project built with a microservices architecture, and I have many questions for which I cannot find an answer, or rather, for which I cannot find any examples.
Unfortunately, all the examples I am able to find are way too simple to extrapolate a viable answer to my issues from.
I adopted the new Cloud Run offering, and I decided to play with the fully managed version (not Kubernetes). I built a few microservices (each service is built using Express for Node or Flask for Python, depending on what the service does). Each microservice exposes its own endpoint and has its own API for calling its methods, and I use a service account to allow the application to perform the internal calls.
I now want to expose the application externally (specifically to my client built with Vue.js), and I was trying to leverage another Google product to create and expose an API: Cloud Endpoints.
My question (specifically about the Cloud Run setup) is what I need to do to create an API endpoint, for communicating with the client app, that internally calls multiple services and combines their responses into one.
Just to be clear, let's take an example:
Cloud Run service 1 -> CRUD user API
Cloud Run service 2 -> CRUD product API
Cloud Endpoints externally visible API -> get the user from service 1, then get the products from service 2, and return the combined response: all green products for user Jane Doe.
How can I aggregate the responses directly in the endpoint gateway, check for failures, and, if everything goes smoothly, send the aggregated response to the client?
Do I need to build the aggregation endpoint in something else, like a Cloud Function? Or can I do it directly in the Google Endpoints gateway?
Note that for Cloud Run, the Google Endpoints gateway is another Cloud Run container.
Thanks for any help; I'm running pretty much out of options here.
As per my understanding, an API gateway should just work as a proxy, presenting all the microservices as a single endpoint. For this scenario, I think you can take either of the following two approaches:
1: Implement a new microservice (or extend an existing one) which will do the invocations and the aggregation of the responses.
2: The client (e.g. the UI) can invoke the services and do the aggregation on its side as well.
I feel it is not a good idea to do it at the API gateway.
In my opinion, from an architectural point of view, the best option for you is to create a new microservice which will take the responses from the other two and aggregate them (a minimal sketch of such a service is shown below).
I understand that you want to aggregate the responses in an API gateway and are not able to find code examples for it. Here I was able to find a guide on what you are wanting to implement; the full code implementation can be found in this repository.
Keep in mind, though, that this style of implementation is not a best practice.
It is OK only if the two services being combined are independent, meaning there is no functional/business relation between them and no concurrency or inconsistency problems will occur in the process of aggregating.
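To make the aggregator-microservice option concrete, here is a minimal sketch in Node.js/Express in the spirit of the Jane Doe example above. The service URLs and routes are hypothetical placeholders, authentication between the Cloud Run services is omitted for brevity, and the global fetch assumes Node 18+:
const express = require('express');
const app = express();

// Hypothetical internal Cloud Run URLs; substitute your real service endpoints
const USER_SVC = 'https://user-service-xxxxx.a.run.app';
const PRODUCT_SVC = 'https://product-service-xxxxx.a.run.app';

app.get('/users/:id/green-products', async (req, res) => {
    try {
        // Call service 1 for the user, then service 2 for the products
        const userRes = await fetch(USER_SVC + '/users/' + req.params.id);
        if (!userRes.ok) return res.status(502).send('user service failed');
        const user = await userRes.json();

        const prodRes = await fetch(PRODUCT_SVC + '/products?color=green&userId=' + req.params.id);
        if (!prodRes.ok) return res.status(502).send('product service failed');
        const products = await prodRes.json();

        // One combined payload goes back to the Vue.js client
        res.json({ user, products });
    } catch (err) {
        res.status(500).send(err.message);
    }
});

app.listen(process.env.PORT || 8080);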
This is about a Reporting Server solution.
I need some advice on choosing a product which will host a SQL Database Server, a Web Service app (one that will make a call to a stored procedure and run an SSIS package; not much processing there), and SSRS. I'm not familiar with this. It needs to be available 24/7; as I said, there isn't much processing, just synchronizing data (a few hundred thousand records). What do you suggest?
Requirements:
SQL Server Enterprise 2017: this will hold the database and execute the SSIS package.
We have an SSIS package that will be executed from a .NET Web Service app, which will execute a stored procedure on user demand.
The Server needs to run Reporting Services (SSRS).
Considerations:
Storage: Database will hold around 750K records (all text).
Bandwidth: There will be synchronization (data retrieval or updates only) with an external system.
Use: the client has asked to consider a dedicated instance since they will use it at their own discretion.
Now the only issue is that, as far as I know, we can't call a stored procedure from an outside system (outside the server), or at least I have not found a way to do that. That's why I want to host both solutions in one place, so the Web Service app can call the stored procedure locally.
So now I'm wondering, what should I do? Should I go with a full VM? How much would that cost?
If you want to do PaaS and not have to manage infrastructure, take a look at the Azure App Service Environment, an Azure App Service feature that provides a fully isolated and dedicated environment for securely running App Service apps at high scale. This capability can host your:
Windows web apps
Linux web apps
Docker containers
Mobile apps
Functions
For SQL you can use Azure SQL Database Managed Instance, a new deployment option of Azure SQL Database. It provides near 100% compatibility with the latest on-premises SQL Server (Enterprise Edition) database engine, a native virtual network (VNet) implementation that addresses common security concerns, and a business model favorable to on-premises SQL Server customers. It is a fully isolated instance of SQL Server.
I suggest you host a static site on Blob storage, an Azure Function on the consumption model to make the calls to the SQL database, and a SQL database. Of course, there are alternative architectures you could use; it all depends on the detailed requirements.
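As a sketch of the Azure Function piece, an HTTP-triggered Node.js function using the mssql package could execute the stored procedure on demand. The connection string setting and the procedure name dbo.usp_RunSync are hypothetical placeholders:
const sql = require('mssql');

// Reuse one connection pool across invocations;
// the connection string comes from the Function App settings
const poolPromise = sql.connect(process.env.SQL_CONNECTION_STRING);

module.exports = async function (context, req) {
    const pool = await poolPromise;
    // Execute the stored procedure that kicks off the SSIS synchronization
    const result = await pool.request().execute('dbo.usp_RunSync');
    context.res = { status: 200, body: result.recordset };
};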
In Google Cloud Platform I want to write an application that will take an HTTP request, hit APIs in a chain, and then show a template based on the responses received from the APIs, populating it with the data they return. There are many templates.
What is the best way to design this on GCP, considering the below?
1. The application will receive huge traffic.
2. Some APIs will return dynamic URLs that the templates need.
I was thinking of writing it in Java and putting it on Kubernetes, which will manage the traffic. But what should be the choice of database?
The data is mostly key-value pairs and should be highly available; in case it goes down, some backup should be there.
Yes, Kubernetes is one option. Something else you may want to consider for handling huge app traffic is Google App Engine (GAE); since you mentioned Java development, you can use the GAE standard environment, which is easy to build and deploy to and runs reliably even under heavy load (fully managed).
You may want to consider using Cloud Datastore since, based on your description, it is the best fit for the application's needs (a NoSQL database that automatically handles sharding and replication). You can also use this diagram to choose the best storage option.
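As an illustration of the Datastore suggestion, a minimal key-value access layer might look like the sketch below. It is shown in Node.js to match the other examples on this page (equivalent Java client libraries exist); the TemplateData kind and field names are hypothetical, and it assumes the @google-cloud/datastore client with default credentials:
const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

// Store one value under a string key; Datastore replicates automatically
async function put(key, value) {
    await datastore.save({
        key: datastore.key(['TemplateData', key]),
        data: { value: value },
    });
}

// Look a value up by key; returns undefined when the entity is missing
async function get(key) {
    const [entity] = await datastore.get(datastore.key(['TemplateData', key]));
    return entity && entity.value;
}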
What I already have:
An ASP.NET Core on .NET Framework project which uses DocumentDB as its storage
An Azure WebJob which listens to a queue that my web project writes messages to, for e-mail sending and other processing
Successfully deployed and running on Azure
This all works fine. In addition to the web project there is a Model and Data class library to separate the application into layers.
Currently, the web application invokes a web service and saves the result (a quite large XML document) in the cache, keeping it there for 24 hours. This is not ideal, as the first request takes a long time. What I want instead is a nightly batch job which invokes this web service and then stores (overwrites) the response in persistent storage, which the web application will then use instead.
I'm confused about which Azure service to use for this. What I have started on so far is another WebJob, and the idea is to use the same DocumentDB storage to persist the web service response every night. However, I already have all the database repository code set up in the web application (the Data class library); is it OK to just reference this project from the WebJob instead of having to rewrite some of the same code in the WebJob?
Is it better to use one of the other Azure storage options for this WebJob instead, like Table Storage, Blob Storage, etc.? Basically, the structure of the data received from the web service is very simple. For each item I just need to store a URL, a title, a description, and a unique product ID. Obviously the web application needs to access this storage too, simply looking up items by product ID and never writing to this storage.
Also, I'm not entirely sure if there is a better alternative than Azure WebJobs for this task, but it seems like the right approach.
Any feedback is appreciated. I'm generally just confused/overwhelmed by all the different services that Azure provides.
I'll answer some of your questions...
A WebJob works fine for this task. If you have a web service that is always on, adding another WebJob seems like a good idea. If your web service isn't always on, you could have a look at Azure Functions; Azure Functions is sometimes called WebJobs 2.0.
As for storage, DocumentDB has a document size limit of 2MB (give or take), so you'll have to find another solution there. I think Azure Tables also has limitations on entity size, so you'd have to split the file into smaller chunks there too. So the recommended solution is to go with Azure Blobs.
You'll find some good reading in this answer regarding Blobs vs Tables vs SQL
- Getting started with Azure storage: Blobs vs Tables vs SQL Azure
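A sketch of the nightly job against Blob storage, using the @azure/storage-blob package; the feed URL, container, and blob names are placeholders, and the global fetch assumes Node 18+:
const { BlobServiceClient } = require('@azure/storage-blob');

// Nightly job: fetch the web service response and overwrite a single blob
async function refreshProductCache() {
    const service = BlobServiceClient.fromConnectionString(
        process.env.AZURE_STORAGE_CONNECTION_STRING);
    const container = service.getContainerClient('product-cache');
    await container.createIfNotExists();

    const res = await fetch('https://example.com/products.xml'); // placeholder URL
    const xml = await res.text();

    // A block blob upload simply replaces last night's copy
    const blob = container.getBlockBlobClient('products.xml');
    await blob.upload(xml, Buffer.byteLength(xml));
}

refreshProductCache().catch(console.error);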
I wish to monitor all the APIs I created in one of my Docker containers. That Docker container uses Django REST Framework for its services, and I am running it on Azure. I want to monitor my API to know whether it is working, to get an alert if there are too many requests, to see its requests per second, something like that.
We are using Sysdig for monitoring our containers, but I don't think it has the capability to monitor all the APIs of our Django REST Framework app.
To monitor your API's performance and downtime, you could create custom scripts that ping your API and alert you if there's downtime (a minimal sketch is shown below), or you could use a third-party service to monitor it remotely. The latter is the simpler option, as it doesn't require writing and maintaining code.
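A minimal version of such a ping script in Node.js; the API URL and the alerting webhook are hypothetical placeholders, and the global fetch assumes Node 18+:
// Hypothetical endpoints; replace with your API and alerting webhook
const API_URL = 'https://myapp.azurewebsites.net/api/health/';
const ALERT_WEBHOOK = 'https://hooks.slack.com/services/T000/B000/XXXX';

async function check() {
    try {
        const started = Date.now();
        const res = await fetch(API_URL);
        if (!res.ok) throw new Error('status ' + res.status);
        console.log('OK in ' + (Date.now() - started) + ' ms');
    } catch (err) {
        // Post an alert to the webhook whenever the check fails
        await fetch(ALERT_WEBHOOK, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ text: 'API check failed: ' + err.message }),
        });
    }
}

check();
setInterval(check, 5 * 60 * 1000); // repeat every five minutes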
One third-party service you could use is mine, https://assertible.com. It provides frequent health checks (every 1/5/15 minutes), deep data validation, integrations with other services like Slack and GitHub, and a nice way to view/manage test failures.
If you want to integrate with your own code or scripts, you can use Trigger URLs and/or the Deployments API to programmatically run your tests whenever and wherever:
$ curl 'https://assertible.com/apis/{API_ID}/run?api_token=ABC'
[{
"runId": "test_fjdmbd",
"result": "TestPass",
"assertions": {
"passed": [{...}],
"failed": [{...}]
},
...
}]
Hope it helps!
You can use the monitoring functionality from Postman. For more information check out the following link [1].
[1] https://learning.getpostman.com/docs/postman/monitors/intro_monitors/
Since you're running on Azure, you should take a look at Application Insights:
Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your live web application. It will automatically detect performance anomalies. It includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It's designed to help you continuously improve performance and usability. It works for apps on a wide variety of platforms including .NET, Node.js and J2EE, hosted on-premises or in the cloud. It integrates with your devOps process, and has connection points to a variety of development tools. Source
API monitoring is described here.
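Getting started is typically only a couple of lines with an SDK. The snippet below is a sketch using the applicationinsights Node.js package, to stay consistent with the other examples on this page (Azure also ships Application Insights SDKs for Python/Django and other platforms); the instrumentation key is read from an environment variable:
const appInsights = require('applicationinsights');

// Reads the instrumentation key and starts auto-collecting incoming
// HTTP requests, dependencies, and exceptions
appInsights.setup(process.env.APPINSIGHTS_INSTRUMENTATIONKEY).start();

// Custom events (e.g. per-API counters) can also be tracked explicitly
appInsights.defaultClient.trackEvent({ name: 'apiHealthCheck' });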