I have a model schema and I want to read it and validate API responses against it in a Postman test script.
Could you please share your suggestions?
I've built a simple form on a web page built with Flask, and I'm looking to take the user input, query a Snowflake database with it (using the Snowflake connector), and print the output for the user.
I've set up a .py file with snowflake.connector that works, but I'm a little new to all this, so any help would be much appreciated!
FORM ---USER INPUT---SNOWFLAKE---OUTPUT
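Not an authoritative pattern, just a minimal sketch of that flow under some assumptions: a single form field named query_value, a hypothetical CUSTOMERS table, a results.html template, and connection details copied from your working .py.

```python
# app.py -- minimal sketch; table/column names, the form field name and the
# credentials are assumptions, not taken from the question.
from flask import Flask, render_template, request
import snowflake.connector

app = Flask(__name__)


def run_query(user_value):
    # Reuse the connection settings from your working snowflake.connector script
    conn = snowflake.connector.connect(
        user="YOUR_USER",
        password="YOUR_PASSWORD",
        account="YOUR_ACCOUNT",
        warehouse="YOUR_WAREHOUSE",
        database="YOUR_DATABASE",
        schema="YOUR_SCHEMA",
    )
    try:
        cur = conn.cursor()
        # Bind the user input instead of formatting it into the SQL string
        cur.execute("SELECT * FROM CUSTOMERS WHERE NAME = %s", (user_value,))
        return cur.fetchall()
    finally:
        conn.close()


@app.route("/", methods=["GET", "POST"])
def form_view():
    rows = None
    if request.method == "POST":
        rows = run_query(request.form["query_value"])
    # results.html is a hypothetical template that loops over `rows` and prints them
    return render_template("results.html", rows=rows)
```

Binding the value with %s (rather than string-formatting it into the SQL) lets the connector escape the user input for you.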
I am using newman with htmlextra to run a Postman collection and generate an HTML report.
I have used a table within the Postman documentation section, which is in Markdown format, something like this:
| CODE | STATUS | RESULT |
| --- | --- | --- |
| 200 | success | |
| 404 | no_data_found | No saved report found for the logged-in user |
It renders properly in Postman.
However, when I open the htmlextra report, it looks something like this:
(screenshot: htmlextra report)
Has anyone come across this issue before?
Any ideas would be helpful.
Thanks in advance.
I need to implement a Django RESTful service where I can run an algorithm and get the result.
On the server I need to have a CSV file with the records for the algorithm. A service, let's say /train, will allow me to train a Random Forest with the data in the CSV file, and another service, /predict, will receive the parameters and send me the result. The problem is that I have this running as a script on my computer, and I don't know how to structure it for a web application.
I have already done RESTful APIs in Django, but I think this problem is different.
Do I even need models?
What about serializers?
My idea is to send a GET request to /predict with the parameters needed for the Random Forest and return the algorithm result.
Any suggestions? Or a public repo with a similar problem?
Let's say you have:
train_view for /train, handling POST requests.
result_view for /predict, handling GET requests.
Do you need models?
I think you do, since in the /predict request you are going to apply logic to the data you provided in the /train request, so create a model.
Do you need serializers?
Since you have a model, you can write a ModelSerializer.
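As a rough illustration only, not a definitive implementation: the sketch below wires up train_view and result_view as Django REST Framework function-based views. It assumes the records live in data.csv with a "target" column, that the features are numeric query parameters, and it persists the fitted Random Forest to disk with joblib instead of a Django model, so adapt it if you go the model/ModelSerializer route suggested above.

```python
# views.py -- a minimal sketch; the CSV path, the "target" column and the
# query-string feature names are assumptions, not part of the question.
import os

import joblib
import pandas as pd
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response
from sklearn.ensemble import RandomForestClassifier

CSV_PATH = "data.csv"            # training records kept on the server
MODEL_PATH = "rf_model.joblib"   # where the fitted model is persisted


@api_view(["POST"])
def train_view(request):
    """Fit a Random Forest on the CSV file and persist it to disk."""
    df = pd.read_csv(CSV_PATH)
    X, y = df.drop(columns=["target"]), df["target"]
    model = RandomForestClassifier(n_estimators=100)
    model.fit(X, y)
    joblib.dump(model, MODEL_PATH)
    return Response({"trained_on_rows": len(df)}, status=status.HTTP_200_OK)


@api_view(["GET"])
def result_view(request):
    """Predict from query-string parameters, e.g. /predict?sepal_length=5.1&petal_width=0.2"""
    if not os.path.exists(MODEL_PATH):
        return Response({"error": "model has not been trained yet"},
                        status=status.HTTP_400_BAD_REQUEST)
    model = joblib.load(MODEL_PATH)
    # Assumes every feature is numeric and passed as a query parameter
    features = pd.DataFrame([request.query_params.dict()]).astype(float)
    # Align column order with the one used at fit time
    features = features.reindex(columns=model.feature_names_in_)
    prediction = model.predict(features).tolist()[0]
    return Response({"prediction": prediction})
```

You would then route the two views in urls.py, e.g. path("train/", train_view) and path("predict/", result_view).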
I am currently using Tastypie to provide a programmatic interface to my Django database. One problem that I've run into a couple of times is that when client code uploads data for a field that doesn't exist, Tastypie ignores it. This means that the client code has no idea that some of the data it tried to upload was ignored. I'd like to tell the client that it tried to upload an unknown field, possibly with a status code 406 (Not Acceptable).
I have two related questions:
Is it appropriate for RESTful design to reject this extra data?
If so, is there a tidy way to do this through Tastypie?
As an example of my concern, consider this toy Tastypie API:
from tastypie import resources, fields

class DemoResource(resources.ModelResource):
    name = fields.CharField()
    optional = fields.CharField(blank=True)
If client code uploaded the JSON data {"name": "new data", "optioanl": "this field is misspelled"}, the misspelled optional field would be ignored. My current plan is to use a Tastypie validator to compare the bundle data against the bundle object, but this seems really non-DRY.
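For illustration only, here is one way such a check might look: a mixin (StrictFieldsMixin is a made-up name) that hooks into hydrate and compares the incoming bundle.data keys against the resource's declared fields, returning a 400 via Tastypie's HttpBadRequest (swap in a 406 response class if you prefer).

```python
from tastypie import fields, http, resources
from tastypie.exceptions import ImmediateHttpResponse


class StrictFieldsMixin(object):
    """Reject payloads containing keys that are not declared on the resource."""

    def hydrate(self, bundle):
        # self.fields holds the declared resource fields; resource_uri is
        # added automatically, so allow it through as well.
        allowed = set(self.fields.keys()) | {"resource_uri"}
        unknown = set(bundle.data.keys()) - allowed
        if unknown:
            raise ImmediateHttpResponse(
                http.HttpBadRequest(
                    "Unknown field(s): %s" % ", ".join(sorted(unknown))
                )
            )
        return bundle
```

DemoResource would then inherit from both StrictFieldsMixin and resources.ModelResource, so every resource that mixes this in gets the check without repeating it, which at least keeps it DRY.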
I need to populate sample data for Django Image/File fields in automated python code.
I want to initialize Django ImageFields and FileFields with some sample images and data files that are stored in my "site_media" directory of my Django project.
What would the code look like? This is not for testing; I want to auto-populate sample data into my Django website users' accounts (including this sample Image/File media).
This should be done in Python code without using fixtures.
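A minimal sketch of how that can look in plain Python, assuming a hypothetical UserProfile model with an ImageField named avatar, that MEDIA_ROOT points at your site_media directory, and a sample image stored under it (adjust the names and paths to your project):

```python
import os

from django.conf import settings
from django.core.files import File

from myapp.models import UserProfile  # hypothetical model with an ImageField "avatar"

# Sample file kept under site_media; the exact path is an assumption
SAMPLE_IMAGE = os.path.join(settings.MEDIA_ROOT, "samples", "sample.jpg")


def populate_sample_avatar(profile):
    """Attach the sample image to a single user's profile."""
    with open(SAMPLE_IMAGE, "rb") as f:
        # FieldFile.save() copies the file into the field's upload_to location
        # and saves the model instance when save=True.
        profile.avatar.save("sample.jpg", File(f), save=True)


# Example: give every existing profile the sample image
for profile in UserProfile.objects.all():
    populate_sample_avatar(profile)
```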
If I understand you correctly, you want to use fixtures: basically a JSON-formatted file that holds data which will be put into the project's database. They can be loaded through the django-admin command like:
django-admin.py loaddata mydata.json
see http://docs.djangoproject.com/en/1.2/ref/django-admin/#loaddata-fixture-fixture
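For illustration, such a fixture file might look like the following (the app, model and field names here are placeholders, not from the question):

```json
[
  {
    "model": "myapp.userprofile",
    "pk": 1,
    "fields": {
      "name": "Sample user",
      "avatar": "samples/sample.jpg"
    }
  }
]
```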