How can I export a SQLAlchemy schema for Breeze? [closed]

I have implemented a collection of APIs using a Flask + SQLAlchemy backend. I want to use BreezeJS on the frontend. How can I export my db schema in a form that BreezeJS can understand?

Davidism is correct on both counts.
Breeze needs metadata for the JSON at the server boundary. That may map directly to your db schema but it doesn't have to.
You can define the metadata entirely in code and that may be easier than trying to generate it from your db schema. It's really not hard to write metadata by hand.
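To make that concrete, here is a minimal sketch of hand-written metadata served from a Flask endpoint, matching the question's stack. The Todo entity, the route, and the exact property spellings are illustrative assumptions; consult Breeze's native metadata format documentation for the precise schema your client version expects.

from flask import Flask, jsonify

app = Flask(__name__)

# Hand-written metadata for one hypothetical "Todo" entity. The structure
# (structuralTypes / dataProperties) follows Breeze's native metadata format;
# verify the exact field names against the Breeze docs.
BREEZE_METADATA = {
    "metadataVersion": "1.0.5",
    "structuralTypes": [
        {
            "shortName": "Todo",
            "namespace": "MyApp.Models",
            "defaultResourceName": "todos",
            "dataProperties": [
                {"name": "id", "dataType": "Int32", "isPartOfKey": True},
                {"name": "title", "dataType": "String", "isNullable": False},
                {"name": "isDone", "dataType": "Boolean"},
            ],
        }
    ],
}

@app.route("/api/Metadata")
def metadata():
    # The Breeze client fetches this when its EntityManager needs metadata.
    return jsonify(BREEZE_METADATA)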
It's up to you to decide how much of the Breeze query URI (OData query syntax) you want to support server side. You don't have to support querying at all; plenty of folks find plenty of value in Breeze without it, simply for how it helps manage data on the client.

Why use Redis with PostgreSQL, why not just one of them? [closed]

I have seen all over the web that people are configuring PostgreSQL alongside Redis. My question is: why would someone want to use an in-memory storage system like Redis when they have already configured a permanent storage system like PostgreSQL?
I get that Redis works with RAM and that it is much faster, but is that the only reason?
There could be a lot of reasons why people use that stack, but it is not necessary for all sites. Redis might be used, for example, for counting the most visited pages, and it is also a good broker for async task queues like Celery. But yes, in my opinion the main reason to use it is speed.
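To make the page-counting example concrete, here is a minimal Python sketch, assuming a local Redis server and the redis-py client; the durable records would stay in PostgreSQL while the hot counters live in Redis.

import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def record_visit(page):
    # INCR is atomic, so concurrent requests cannot lose counts.
    return r.incr("pageviews:" + page)

def top_pages(pages):
    # Return the given pages sorted by visit count, busiest first.
    counts = [int(r.get("pageviews:" + p) or 0) for p in pages]
    return sorted(zip(pages, counts), key=lambda pc: pc[1], reverse=True)

record_visit("home")
record_visit("home")
record_visit("about")
print(top_pages(["home", "about"]))  # e.g. [('home', 2), ('about', 1)]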

How to use Google DLP API to delete sensitive content from data stored in Google Big Query? [closed]

I have a certain table in Google BigQuery which has some sensitive fields. I have read and understood about inspection of data, but I cannot find a way to redact the data using the DLP API directly in the BigQuery database.
Two questions:
Is it possible to do it just using DLP API?
If not, what is the best way to fix data in a table which runs into Terabytes?
The API does not yet support de-identifying BigQuery tables directly.
You can, however, write a Dataflow pipeline that leverages content.deidentify. If you batch your rows using Table objects (https://cloud.google.com/dlp/docs/reference/rest/v2/ContentItem#Table), this can work pretty efficiently.
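As a rough sketch of that per-batch step, using the Python client (google-cloud-dlp); the project ID, info types, and transformation below are illustrative assumptions, not a definitive pipeline:

import google.cloud.dlp_v2

dlp = google.cloud.dlp_v2.DlpServiceClient()
parent = "projects/my-project"  # hypothetical project ID

# One batch of rows, shaped like the BigQuery table (headers + rows).
table = {
    "headers": [{"name": "name"}, {"name": "email"}],
    "rows": [
        {"values": [{"string_value": "Alice"}, {"string_value": "alice@example.com"}]},
        {"values": [{"string_value": "Bob"}, {"string_value": "bob@example.com"}]},
    ],
}

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    # Replace each finding with its info type, e.g. [EMAIL_ADDRESS].
                    {"primitive_transformation": {"replace_with_info_type_config": {}}}
                ]
            }
        },
        "item": {"table": table},
    }
)
print(response.item.table)  # redacted rows, ready to be written back out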

Make my Django app into a rentable service for companies [closed]

I have a webapp running with Django. I would like to make an Enterprise Edition of this website: in exchange for a yearly fee, companies could host the webapp on their own server and benefit from unique features.
The problem is that I have no idea how to proceed. How can I execute this without sending my whole codebase over to their computers and launching the Django app there?
Is there something like generating an .exe that would run the Django app without shipping my code? How do companies usually proceed to make their tools available on another company's intranet?
Well, I haven't heard of anything like that, but what you can do is put your code under a license. There are many types of licenses, from open source to trade secret; read more about them.

How to write a web server (service) in OCaml? [closed]

I wish to write a web service server (over HTTP) in OCaml.
As a simple example, I want to write an HTTP server. A user can access it and provide parameters like http://myservice.com?i=1&j=2. My server then gets the request and parameters, calculates i+j, and returns the result.
Of course, my service will be more complicated in the calculation part. Instead of this simple example of calculation, what I really need to do is:
access the database (MongoDB) to get some data
access another 3rd party web service to get more data
calculate all data to get a result and return to the user.
So I also need to consider parallelism / multi-threading, although I want to start with a simple case first.
My questions are:
Which library should I use to set up such an HTTP server? I have looked into Ocamlnet 3 and think it might be a good candidate, but it lacks a good tutorial and I still don't know how to use nethttpd, netplex, etc.
How should I design the architecture of my web application? I know OCaml is not good at parallelism, so how can I keep each service instance from blocking?

Collecting data from website without API [closed]

I am looking to build a webapp to improve the user experience of booking railway tickets in India. The API is impossible to get due to the hefty charge to procure it, yet I have seen many apps that provide details of the trains etc.
My question is: how are they scraping data from the website? In general, how can I legally get the data shown to users (I don't want payment and other features that are impossible without the API) on any website? How do people scrape such data? Any tools/methods?
Bear with me if the question is naive; I'm pretty new to this stuff.
They can get the train schedule information using any one of several programming languages, though it is most likely done with ordinary PHP and any good webserver host. For example, all Indian train schedules can be found on the indianrail.gov website.
Sending a specially built URL to
http://www.indianrail.gov.in/cgi_bin/inet_trnnum_cgi.cgi?lccp_trnname=1123
using the POST method of sending form data should give you all the details for train number 1123. After that it becomes just a simple task of tidying up the results for storage in a database.
Update: it's a well-armoured site; it checks both the user agent and the referer of inbound requests.
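A minimal Python sketch of that request using the requests library; the header values are illustrative, included only because of the user-agent and referer checks mentioned above, and the form field name comes from the URL in this answer.

import requests

URL = "http://www.indianrail.gov.in/cgi_bin/inet_trnnum_cgi.cgi"
headers = {
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64)",  # illustrative value
    "Referer": "http://www.indianrail.gov.in/",
}

# POST the form data, as described above.
resp = requests.post(URL, data={"lccp_trnname": "1123"}, headers=headers, timeout=30)
resp.raise_for_status()
print(resp.text)  # raw HTML with the details for train 1123; parse and store it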
Addendum: the indianrail.gov site is changing to http://www.trainenquiry.com/; I will have to take another look.