GeoDjango: retrieve last inserted primary key from LayerMapping - django

I am building an application with GeoDjango and I have the following problem:
I need to read track data from a GPX file and store it in a model's MultiLineStringField field.
This should happen in the admin interface, where the user uploads a GPX file.
The goal is that the data parsed from the file is assigned to the MultiLineStringField, while the other fields get their values from the form.
My model is:
class GPXTrack(models.Model):
    nome = models.CharField("Nome", blank=False, max_length=255)
    slug = models.SlugField("Slug", blank=True)
    # sport natura arte/cultura
    tipo = models.CharField("Tipologia", blank=False, max_length=2, choices=TIPOLOGIA_CHOICES)
    descrizione = models.TextField("Descrizione", blank=True)
    gpx_file = models.FileField(upload_to='uploads/gpx/')
    track = models.MultiLineStringField(blank=True)
    objects = models.GeoManager()
    published = models.BooleanField("Pubblicato")
    rel_files = generic.GenericRelation(MyFiles)
    #publish_on = models.DateTimeField("Pubblicare il", auto_now_add = True)
    created = models.DateTimeField("Created", auto_now_add=True)
    updated = models.DateTimeField("Updated", auto_now=True)

    class Meta:
        #verbose_name = "struttura'"
        #verbose_name_plural = "strutture"
        ordering = ['-created']

    def __str__(self):
        return str(self.nome)

    def __unicode__(self):
        return '%s' % (self.nome)

    def put(self):
        self.slug = sluggy(self.nome)
        key = super(GPXTrack, self).put()
        # do something after save
        return key
In the admin.py file I have overridden the save_model method as follows:
from django.contrib.gis import admin
from trails.models import GPXPoint, GPXTrack
from django.contrib.contenttypes import generic
from django.contrib.gis.gdal import DataSource
#from gpx_mapping import GPXMapping
from django.contrib.gis.utils import LayerMapping
from django.template import RequestContext
import tempfile
import os
import pprint
class GPXTrackAdmin(admin.OSMGeoAdmin):
    list_filter = ('tipo', 'published')
    search_fields = ['nome']
    list_display = ('nome', 'tipo', 'published', 'gpx_file')
    inlines = [TrackImagesInline, TrackFilesInline]
    prepopulated_fields = {"slug": ("nome",)}

    def save_model(self, request, obj, form, change):
        """When creating a new object, set the creator field.
        """
        if 'gpx_file' in request.FILES:
            # Get
            gpxFile = request.FILES['gpx_file']
            # Save
            targetPath = tempfile.mkstemp()[1]
            destination = open(targetPath, 'wt')
            for chunk in gpxFile.chunks():
                destination.write(chunk)
            destination.close()
            # define fields of interest for LayerMapping
            track_point_mapping = {'timestamp': 'time',
                                   'point': 'POINT',
                                   }
            track_mapping = {'track': 'MULTILINESTRING'}
            gpx_file = DataSource(targetPath)
            mytrack = LayerMapping(GPXTrack, gpx_file, track_mapping, layer='tracks')
            mytrack.save()
            # remove the temp file saved
            os.remove(targetPath)
            orig = GPXTrack.objects.get(pk=mytrack.pk)
            # assign the parsed values from LayerMapping to the appropriate field
            obj.track = orig.track
            obj.save()
As far as I know:
- LayerMapping cannot be used to update a field, only to save a new one;
- I cannot access a specific field of the LayerMapping object (i.e., in the code above, mytrack.track) and assign its value to a model field (i.e., obj.track) in the save_model method;
- I cannot retrieve the primary key of the last saved LayerMapping object (i.e., in the code above, mytrack.pk) in order to update it with the values passed in the form for the fields not mapped in LayerMapping.mapping.
What can I do then?

I sorted it out by subclassing LayerMapping and adding a get_values() method that, instead of saving the retrieved data, returns it for further use or manipulation. get_values() is a copy of the LayerMapping.save() method that returns the values instead of saving them.
I am using Django 1.5.
import os
import sys

from django.contrib.gis.utils import LayerMapping, LayerMapError
from django.contrib.gis.gdal import OGRGeometry
from django.core.exceptions import ObjectDoesNotExist
from django.db import transaction


class MyMapping(LayerMapping):
    def get_values(self, verbose=False, fid_range=False, step=False,
                   progress=False, silent=False, stream=sys.stdout, strict=False):
        """
        Returns the contents from the OGR DataSource Layer
        according to the mapping dictionary given at initialization.

        Keyword Parameters:
         verbose:
           If set, information will be printed subsequent to each model save
           executed on the database.

         fid_range:
           May be set with a slice or tuple of (begin, end) feature IDs to map
           from the data source. In other words, this keyword enables the user
           to selectively import a subset range of features in the geographic
           data source.

         step:
           If set with an integer, transactions will occur at every step
           interval. For example, if step=1000, a commit would occur after
           the 1,000th feature, the 2,000th feature, etc.

         progress:
           When this keyword is set, status information will be printed giving
           the number of features processed and successfully saved. By default,
           progress information will be printed every 1000 features processed,
           however, this default may be overridden by setting this keyword with an
           integer for the desired interval.

         stream:
           Status information will be written to this file handle. Defaults to
           using `sys.stdout`, but any object with a `write` method is supported.

         silent:
           By default, non-fatal error notifications are printed to stdout, but
           this keyword may be set to disable these notifications.

         strict:
           Execution of the model mapping will cease upon the first error
           encountered. The default behavior is to attempt to continue.
        """
        # Getting the default Feature ID range.
        default_range = self.check_fid_range(fid_range)

        # Setting the progress interval, if requested.
        if progress:
            if progress is True or not isinstance(progress, int):
                progress_interval = 1000
            else:
                progress_interval = progress

        # Defining the 'real' worker function. Unlike LayerMapping.save(),
        # nothing is written to the database, so the transaction decorator
        # created during initialization is not applied here.
        def _get_values(feat_range=default_range, num_feat=0, num_saved=0):
            if feat_range:
                layer_iter = self.layer[feat_range]
            else:
                layer_iter = self.layer

            for feat in layer_iter:
                num_feat += 1
                # Getting the keyword arguments
                try:
                    kwargs = self.feature_kwargs(feat)
                except LayerMapError, msg:
                    # Something borked the validation
                    if strict:
                        raise
                    elif not silent:
                        stream.write('Ignoring Feature ID %s because: %s\n' % (feat.fid, msg))
                else:
                    # Constructing the model using the keyword args
                    is_update = False
                    if self.unique:
                        # If we want unique models on a particular field, handle the
                        # geometry appropriately.
                        try:
                            # Getting the keyword arguments and retrieving
                            # the unique model.
                            u_kwargs = self.unique_kwargs(kwargs)
                            m = self.model.objects.using(self.using).get(**u_kwargs)
                            is_update = True

                            # Getting the geometry (in OGR form), creating
                            # one from the kwargs WKT, adding in additional
                            # geometries, and update the attribute with the
                            # just-updated geometry WKT.
                            geom = getattr(m, self.geom_field).ogr
                            new = OGRGeometry(kwargs[self.geom_field])
                            for g in new:
                                geom.add(g)
                            setattr(m, self.geom_field, geom.wkt)
                        except ObjectDoesNotExist:
                            # No unique model exists yet, create.
                            m = self.model(**kwargs)
                    else:
                        m = self.model(**kwargs)

                    try:
                        # Instead of saving, keep the keyword arguments of the
                        # feature so they can be returned to the caller.
                        pippo = kwargs
                        num_saved += 1
                        if verbose:
                            stream.write('%s: %s\n' % (is_update and 'Updated' or 'Saved', m))
                    except SystemExit:
                        raise
                    except Exception, msg:
                        if self.transaction_mode == 'autocommit':
                            # Rolling back the transaction so that other model saves
                            # will work.
                            transaction.rollback_unless_managed()
                        if strict:
                            # Bailing out if the `strict` keyword is set.
                            if not silent:
                                stream.write('Failed to save the feature (id: %s) into the model with the keyword arguments:\n' % feat.fid)
                                stream.write('%s\n' % kwargs)
                            raise
                        elif not silent:
                            stream.write('Failed to save %s:\n %s\nContinuing\n' % (kwargs, msg))

                # Printing progress information, if requested.
                if progress and num_feat % progress_interval == 0:
                    stream.write('Processed %d features, saved %d ...\n' % (num_feat, num_saved))

            # Only used for status output purposes -- incremental saving uses the
            # values returned here.
            return pippo

        nfeat = self.layer.num_feat
        if step and isinstance(step, int) and step < nfeat:
            # Incremental processing is requested at the given interval (step)
            if default_range:
                raise LayerMapError('The `step` keyword may not be used in conjunction with the `fid_range` keyword.')
            beg, num_feat, num_saved = (0, 0, 0)
            indices = range(step, nfeat, step)
            n_i = len(indices)

            for i, end in enumerate(indices):
                # Constructing the slice to use for this step; the last slice is
                # special (e.g, [100:] instead of [90:100]).
                if i + 1 == n_i:
                    step_slice = slice(beg, None)
                else:
                    step_slice = slice(beg, end)

                try:
                    pippo = _get_values(step_slice, num_feat, num_saved)
                    beg = end
                except:
                    stream.write('%s\nFailed to save slice: %s\n' % ('=-' * 20, step_slice))
                    raise

            # Return the values gathered from the last processed slice.
            return pippo
        else:
            # Otherwise, just calling the previously defined _get_values() function.
            return _get_values()
In a custom save or save_model method you can then use:
track_mapping = {'nome': 'name',
                 'track': 'MULTILINESTRING'}
targetPath = "/my/gpx/file/path.gpx"
gpx_file = DataSource(targetPath)
mytrack = MyMapping(GPXTrack, gpx_file, track_mapping, layer='tracks')
pippo = mytrack.get_values()
obj.track = pippo['track']
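For context, here is a rough sketch of how get_values() could be wired back into the admin's save_model from the question. This is my own illustration, not part of the original answer: it assumes the same GPXTrack model, the temporary-file handling shown earlier, and that the GPX OGR driver exposes the 'tracks' layer.
def save_model(self, request, obj, form, change):
    # Sketch: parse the uploaded GPX file with MyMapping and copy the
    # geometry onto the object before saving it exactly once.
    if 'gpx_file' in request.FILES:
        gpx_upload = request.FILES['gpx_file']
        target_path = tempfile.mkstemp(suffix='.gpx')[1]
        with open(target_path, 'wb') as destination:
            for chunk in gpx_upload.chunks():
                destination.write(chunk)
        try:
            data_source = DataSource(target_path)
            mapping = MyMapping(GPXTrack, data_source,
                                {'track': 'MULTILINESTRING'}, layer='tracks')
            values = mapping.get_values()
            obj.track = values['track']  # geometry parsed from the file
        finally:
            os.remove(target_path)  # clean up the temporary file
    obj.save()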

Related

Flask app-builder how to make REST API with file items

I'm making a REST API where files can be uploaded, based on a Model/View pair in flask-appbuilder like this.
But I don't know how to call the REST API (POST /File).
I tried several different ways, but none of them worked.
Please let me know the correct way, or an alternative.
[client code]
file = {'file':open('test.txt', 'rb'),'description':'test'}
requests.post(url, headers=headers, files=file)
==> Failed
model.py
class Files(Model):
    __tablename__ = "project_files"
    id = Column(Integer, primary_key=True)
    file = Column(FileColumn, nullable=False)
    description = Column(String(150))

    def download(self):
        return Markup(
            '<a href="'
            + url_for("ProjectFilesModelView.download", filename=str(self.file))
            + '">Download</a>'
        )

    def file_name(self):
        return get_file_original_name(str(self.file))
view.py
class FileApi(ModelRestApi):
    resource_name = "File"
    datamodel = SQLAInterface(Files)
    allow_browser_login = True

appbuilder.add_api(FileApi)
FileColumn is only a string field that saves the file name in the database. The actual file is saved to config['UPLOAD_FOLDER'].
This is taken care of by flask_appbuilder.filemanager.FileManager.
Furthermore, ModelRestApi assumes that you are POSTing JSON data. In order to upload files, I followed Flask's documentation, which suggests sending a multipart/form-data request. Because of this, one needs to override ModelRestApi.post_headless().
This is my solution, where I also make sure that when a Files database row is deleted, the corresponding file is removed from the filesystem.
from flask_appbuilder.models.sqla.interface import SQLAInterface
from flask_appbuilder.api import ModelRestApi
from flask_appbuilder.const import API_RESULT_RES_KEY
from flask_appbuilder.filemanager import FileManager
from flask import current_app, request
from marshmallow import ValidationError
from sqlalchemy.exc import IntegrityError

from app.models import Files


class FileApi(ModelRestApi):
    resource_name = "file"
    datamodel = SQLAInterface(Files)

    def post_headless(self):
        if not request.form or not request.files:
            msg = "No data"
            current_app.logger.error(msg)
            return self.response_400(message=msg)

        file_obj = request.files.getlist('file')
        if len(file_obj) != 1:
            msg = ("More than one file provided.\n"
                   "Please upload exactly one file at a time")
            current_app.logger.error(msg)
            return self.response_422(message=msg)
        else:
            file_obj = file_obj[0]

        fm = FileManager()
        uuid_filename = fm.generate_name(file_obj.filename, file_obj)
        form = request.form.to_dict(flat=True)
        # Add the unique filename provided by FileManager, which will
        # be saved to the database. The original filename can be
        # retrieved using
        # flask_appbuilder.filemanager.get_file_original_name()
        form['file'] = uuid_filename

        try:
            item = self.add_model_schema.load(
                form,
                session=self.datamodel.session)
        except ValidationError as err:
            current_app.logger.error(err)
            return self.response_422(message=err.messages)

        # Save file to filesystem
        fm.save_file(file_obj, item.file)

        try:
            self.datamodel.add(item, raise_exception=True)
            return self.response(
                201,
                **{API_RESULT_RES_KEY: self.add_model_schema.dump(
                    item, many=False),
                   "id": self.datamodel.get_pk_value(item),
                   },
            )
        except IntegrityError as e:
            # Delete file from filesystem if the db record cannot be
            # created
            fm.delete_file(item.file)
            current_app.logger.error(e)
            return self.response_422(message=str(e.orig))

    def pre_delete(self, item):
        """
        Delete file from filesystem before removing the record from the
        database
        """
        fm = FileManager()
        current_app.logger.info(f"Deleting {item.file} from filesystem")
        fm.delete_file(item.file)
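For completeness, here is a rough sketch (not from the original answer) of how a client could call this endpoint with the requests library. The URL prefix and the authentication header are assumptions and depend on your deployment:
import requests

# Assumed endpoint: with the default ModelRestApi routing, the resource
# is typically exposed under /api/v1/<resource_name>/
url = "http://localhost:8080/api/v1/file/"
headers = {"Authorization": "Bearer <access_token>"}  # assumption: JWT auth is enabled

with open("test.txt", "rb") as fh:
    response = requests.post(
        url,
        headers=headers,
        files={"file": fh},            # ends up in request.files
        data={"description": "test"},  # ends up in request.form
    )

print(response.status_code, response.json())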
You can use this.
from app.models import Project, ProjectFiles

class DataFilesModelView(ModelView):
    datamodel = SQLAInterface(ProjectFiles)
    label_columns = {"file_name": "File Name", "download": "Download"}
    add_columns = ["file", "description", "project"]
    edit_columns = ["file", "description", "project"]
    list_columns = ["file_name", "download"]
    show_columns = ["file_name", "download"]
Lastly, add the view to the menu.
appbuilder.add_view(DataFilesModelView, "File View")

How to choose a default option out of a few options while creating a document?

I am new to flask-pymongo. I want to design my database such that a field has a few specific options, one of which is chosen as the default value. How do I do that?
I did not find any option to do that.
Example:
For the field Status, multiple options would be:
Active
Inactive
Locked
The default value to be chosen would be Active.
If you use classes with enumerations, this will aid your goal. The following works in Python 3.7. The nice thing is you can add to the Options list easily without having to rework any code.
from typing import Optional
from enum import Enum
from time import sleep
from pymongo import MongoClient

connection = MongoClient('localhost', 27017)
db = connection['yourdatabase']

# Define the enumerated list of options
class Options(Enum):
    ACTIVE = 'Active'
    INACTIVE = 'Inactive'
    LOCKED = 'Locked'

# Define the class for the object
class StockItem:
    def __init__(self, stock_item, status=None) -> None:
        self.stock_item: str = stock_item
        self.status: Optional[Options] = status
        # Check if the status is set; if not, set it to the default (Active)
        if self.status is None:
            self.status = Options.ACTIVE
        # Check the status is valid
        if self.status not in Options:
            raise ValueError(f'"{str(status)}" is not a valid Status')

    # The to_dict allows us to manipulate the output going to the DB
    def to_dict(self) -> dict:
        return {
            "StockItem": self.stock_item,
            "Status": self.status.value  # Use status.value to get the string value to store in the DB
        }

    # The insert is now easy as we've done all the hard work earlier
    def insert(self, db) -> None:
        db.stockitem.insert_one(self.to_dict())

# Note item 2 does not have a specific status set; this will default to Active
item1 = StockItem('Apples', Options.ACTIVE)
item1.insert(db)
item2 = StockItem('Bananas')
item2.insert(db)
item3 = StockItem('Cheese', Options.INACTIVE)
item3.insert(db)
item4 = StockItem('Dog Food', Options.LOCKED)
item4.insert(db)

for record in db.stockitem.find({}, {'_id': 0}):
    print(record)

# The final item will fail as the status is invalid
sleep(5)
item5 = StockItem('Eggs', 'Invalid Status')
item5.insert(db)

get() in Google Datastore doesn't work as intended

I'm building a basic blog from the Web Development course by Steve Hoffman on Udacity. This is my code -
import os
import webapp2
import jinja2
from google.appengine.ext import db

template_dir = os.path.join(os.path.dirname(__file__), 'templates')
jinja_env = jinja2.Environment(loader=jinja2.FileSystemLoader(template_dir), autoescape=True)

def datetimeformat(value, format='%H:%M / %d-%m-%Y'):
    return value.strftime(format)

jinja_env.filters['datetimeformat'] = datetimeformat

def render_str(template, **params):
    t = jinja_env.get_template(template)
    return t.render(params)

class Entries(db.Model):
    title = db.StringProperty(required=True)
    body = db.TextProperty(required=True)
    created = db.DateTimeProperty(auto_now_add=True)

class MainPage(webapp2.RequestHandler):
    def get(self):
        entries = db.GqlQuery('select * from Entries order by created desc limit 10')
        self.response.write(render_str('mainpage.html', entries=entries))

class NewPost(webapp2.RequestHandler):
    def get(self):
        self.response.write(render_str('newpost.html', error=""))

    def post(self):
        title = self.request.get('title')
        body = self.request.get('body')
        if title and body:
            e = Entries(title=title, body=body)
            length = db.GqlQuery('select * from Entries order by created desc').count()
            e.put()
            self.redirect('/newpost/' + str(length+1))
        else:
            self.response.write(render_str('newpost.html', error="Please type in a title and some content"))

class Permalink(webapp2.RequestHandler):
    def get(self, id):
        e = db.GqlQuery('select * from Entries order by created desc').get()
        self.response.write(render_str('permalink.html', id=id, entry=e))

app = webapp2.WSGIApplication([('/', MainPage),
                               ('/newpost', NewPost),
                               ('/newpost/(\d+)', Permalink)
                               ], debug=True)
In the Permalink class, I'm using the get() method on a query that returns all records in descending order of creation, so it should return the most recently added record. But when I add a new record, permalink.html (just a page that shows the title, body, and creation date of the new entry) shows the SECOND most recently added one. For example, I already had three records, and when I added a fourth, instead of showing the details of the fourth record, permalink.html showed me the details of the third. Am I doing something wrong?
I don't think my question is a duplicate of this - Read delay in App Engine Datastore after put(). That question is about read delay of put(), while I'm using get(). The accepted answer also states that get() doesn't cause any delay.
This is because of eventual consistency used by default for GQL queries.
You need to read:
https://cloud.google.com/appengine/docs/python/datastore/data-consistency
https://cloud.google.com/appengine/docs/python/datastore/structuring_for_strong_consistency
https://cloud.google.com/datastore/docs/articles/balancing-strong-and-eventual-consistency-with-google-cloud-datastore/
and also search & read on SO and other sources about strong & eventual consistency in Google Cloud Datastore.
You can specify read_policy=STRONG_CONSISTENCY for your query but it has associated costs that you should be aware of and take into account.
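For the Permalink handler in the question, one way around the eventually consistent query is to avoid it altogether: redirect using the id of the key returned by put() and fetch the entity by key, which is a strongly consistent lookup. A sketch against the question's code (keeping the same URL scheme; this is my illustration, not part of the original answer):
class NewPost(webapp2.RequestHandler):
    def post(self):
        title = self.request.get('title')
        body = self.request.get('body')
        if title and body:
            e = Entries(title=title, body=body)
            key = e.put()  # put() returns the entity's key
            self.redirect('/newpost/%d' % key.id())
        else:
            self.response.write(render_str('newpost.html',
                                           error="Please type in a title and some content"))

class Permalink(webapp2.RequestHandler):
    def get(self, id):
        # Lookup by key id: strongly consistent, unlike the GQL query
        e = Entries.get_by_id(int(id))
        self.response.write(render_str('permalink.html', id=id, entry=e))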

Set default value for dynamic choice field

I have a form that asks the user to enter their zip code. Once they do, it sends them to another form with a field called 'pickup_date'. That form gets the zip value from the previous step and loads all of the available pickup_dates matching that zip code into a ChoiceField. I set all of this up within the __init__ of the model form.
def __init__(self, *args, **kwargs):
    super(ExternalDonateForm, self).__init__(*args, **kwargs)
    if kwargs:
        zip = kwargs['initial']['zip']
        self.fields['pickup_date'] = forms.ChoiceField(choices=self.get_dates(zip))
    elif self.errors:
        zip = self.data['zip']
        self.fields['pickup_date'] = forms.ChoiceField(choices=self.get_dates(zip))
The problem I have is when there are other errors on the form. I use the elif self.errors branch to regenerate the possible choices, but it doesn't default to the originally selected option; it falls back to the first choice. How can I make its default option on form errors be what was originally posted?
Change self.fields['pickup_date'] to self.fields['pickup_date'].initial and see if that helps.
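If it helps to see that suggestion spelled out, here is a sketch of the question's __init__ with the field's initial seeded from the posted value in the error branch. Names come from the question's code; whether the re-rendered select actually picks this up depends on how the bound form is rendered:
def __init__(self, *args, **kwargs):
    super(ExternalDonateForm, self).__init__(*args, **kwargs)
    if kwargs:
        zip = kwargs['initial']['zip']
        self.fields['pickup_date'] = forms.ChoiceField(choices=self.get_dates(zip))
    elif self.errors:
        zip = self.data['zip']
        self.fields['pickup_date'] = forms.ChoiceField(choices=self.get_dates(zip))
        # Seed the default with what the user originally posted so the
        # re-displayed form does not fall back to the first choice.
        self.fields['pickup_date'].initial = self.data.get('pickup_date')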
I got it to work after playing around for a while. Above, I was setting all the dynamic choices with a get_dates() function that returned a tuple. Instead, I now return a field object, using a customized ModelChoiceField instead of a regular ChoiceField:
class MyModelChoiceField(ModelChoiceField):
    def label_from_instance(self, obj):
        return obj.date.strftime('%a %b %d, %Y')
The dates function:
def get_dates(self, zip):
    routes = Route.objects.filter(zip=zip).values_list('route', flat=True)
    pickups = self.MyModelChoiceField(queryset=PickupSchedule.objects.filter(
        current_count__lt=F('specials'),
        route__in=routes,
    ).order_by('date'))
    if not pickups:
        pickups = (('----', 'No Pickups Available At This Time'),)
    return pickups
In __init__ I set the value for self.fields['pickup_date'] like so:
self.fields['pickup_date'] = self.get_dates(zip)

Django formset unit test

I can't get a unit test involving a formset to pass.
Here is my test:
class NewClientTestCase(TestCase):
    def setUp(self):
        self.c = Client()

    def test_0_create_individual_with_same_adress(self):
        post_data = {
            'ctype': User.CONTACT_INDIVIDUAL,
            'username': 'dupond.f',
            'email': 'new@gmail.com',
            'password': 'pwd',
            'password2': 'pwd',
            'civility': User.CIVILITY_MISTER,
            'first_name': 'François',
            'last_name': 'DUPOND',
            'phone': '+33 1 34 12 52 30',
            'gsm': '+33 6 34 12 52 30',
            'fax': '+33 1 34 12 52 30',
            'form-0-address1': '33 avenue Gambetta',
            'form-0-address2': 'apt 50',
            'form-0-zip_code': '75020',
            'form-0-city': 'Paris',
            'form-0-country': 'FRA',
            'same_for_billing': True,
        }
        response = self.c.post(reverse('client:full_account'), post_data, follow=True)
        self.assertRedirects(response, '%s?created=1' % reverse('client:dashboard'))
and I have this error:
ValidationError: [u'ManagementForm data is missing or has been tampered with']
My view:
def full_account(request, url_redirect=''):
    from forms import NewUserFullForm, AddressForm, BaseArticleFormSet
    fields_required = []
    fields_notrequired = []
    AddressFormSet = formset_factory(AddressForm, extra=2, formset=BaseArticleFormSet)
    if request.method == 'POST':
        form = NewUserFullForm(request.POST)
        objforms = AddressFormSet(request.POST)
        if objforms.is_valid() and form.is_valid():
            user = form.save()
            address = objforms.forms[0].save()
            if url_redirect == '':
                url_redirect = '%s?created=1' % reverse('client:dashboard')
            logon(request, form.instance)
            return HttpResponseRedirect(url_redirect)
    else:
        form = NewUserFullForm()
        objforms = AddressFormSet()
    return direct_to_template(request, 'clients/full_account.html', {
        'form': form,
        'formset': objforms,
        'tld_fr': False,
    })
and my forms file:
class BaseArticleFormSet(BaseFormSet):
    def clean(self):
        msg_err = _('Ce champ est obligatoire.')
        non_errors = True
        if 'same_for_billing' in self.data and self.data['same_for_billing'] == 'on':
            same_for_billing = True
        else:
            same_for_billing = False
        for i in [0, 1]:
            form = self.forms[i]
            for field in form.fields:
                name_field = 'form-%d-%s' % (i, field)
                value_field = self.data[name_field].strip()
                if i == 0 and self.forms[0].fields[field].required and value_field == '':
                    form.errors[field] = msg_err
                    non_errors = False
                elif i == 1 and not same_for_billing and self.forms[1].fields[field].required and value_field == '':
                    form.errors[field] = msg_err
                    non_errors = False
        return non_errors

class AddressForm(forms.ModelForm):
    class Meta:
        model = Address

    address1 = forms.CharField()
    address2 = forms.CharField(required=False)
    zip_code = forms.CharField()
    city = forms.CharField()
    country = forms.ChoiceField(choices=CountryField.COUNTRIES, initial='FRA')
In particular, I've found that the ManagementForm validator is looking for the following items to be POSTed:
form_data = {
    'form-TOTAL_FORMS': 1,
    'form-INITIAL_FORMS': 0
}
Every Django formset comes with a management form that needs to be included in the POST. The official docs explain it pretty well. To use it within your unit test, you either need to write it out yourself (the link I provided shows an example), or call formset.management_form, which outputs the data.
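Applied to the failing test above, that means merging the management form keys (and, since the view builds the formset with extra=2, the empty second address form) into post_data before posting. A sketch, assuming the field names from the question:
post_data.update({
    # Management form: the view creates the formset with extra=2
    'form-TOTAL_FORMS': '2',
    'form-INITIAL_FORMS': '0',
    'form-MAX_NUM_FORMS': '',  # not validated by Django, but commonly included
    # Second (billing) address form left empty because same_for_billing is on
    'form-1-address1': '',
    'form-1-address2': '',
    'form-1-zip_code': '',
    'form-1-city': '',
    'form-1-country': '',
})
response = self.c.post(reverse('client:full_account'), post_data, follow=True)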
It is in fact easy to reproduce whatever is in the formset by inspecting the context of the response.
Consider the code below (with self.client being a regular test client):
url = "some_url"
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
# data will receive all the forms field names
# key will be the field name (as "formx-fieldname"), value will be the string representation.
data = {}
# global information, some additional fields may go there
data['csrf_token'] = response.context['csrf_token']
# management form information, needed because of the formset
management_form = response.context['form'].management_form
for i in 'TOTAL_FORMS', 'INITIAL_FORMS', 'MIN_NUM_FORMS', 'MAX_NUM_FORMS':
data['%s-%s' % (management_form.prefix, i)] = management_form[i].value()
for i in range(response.context['form'].total_form_count()):
# get form index 'i'
current_form = response.context['form'].forms[i]
# retrieve all the fields
for field_name in current_form.fields:
value = current_form[field_name].value()
data['%s-%s' % (current_form.prefix, field_name)] = value if value is not None else ''
# flush out to stdout
print '#' * 30
for i in sorted(data.keys()):
print i, '\t:', data[i]
# post the request without any change
response = self.client.post(url, data)
Important note
If you modify data prior to calling self.client.post, you are likely mutating the DB. As a consequence, a subsequent call to self.client.get might not yield the same data, in particular for the management form and the order of the forms in the formset (because they can be ordered differently, depending on the underlying queryset). This means that:
- if you modify data['form-3-somefield'] and call self.client.get, this same field might appear in, say, data['form-8-somefield'];
- if you modify data prior to a self.client.post, you cannot call self.client.post again with the same data: you have to call self.client.get and reconstruct data again, as illustrated below.
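A minimal illustration of that last point, with build_data_from_context standing in (hypothetically) for the context-walking loop shown above:
# first round trip: GET, rebuild the payload, tweak it, POST
response = self.client.get(url)
data = build_data_from_context(response)  # hypothetical helper wrapping the loop above
data['form-0-somefield'] = 'new value'
self.client.post(url, data)

# second round trip: re-GET and rebuild instead of reusing the old dict,
# because the POST above may have changed the formset's contents and order
response = self.client.get(url)
data = build_data_from_context(response)
data['form-0-somefield'] = 'another value'
self.client.post(url, data)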
You can add the following test helper methods to your test class (Python 3 code):
def build_formset_form_data(self, form_number, **data):
    form = {}
    for key, value in data.items():
        form_key = f"form-{form_number}-{key}"
        form[form_key] = value
    return form

def build_formset_data(self, forms, **common_data):
    formset_dict = {
        "form-TOTAL_FORMS": f"{len(forms)}",
        "form-MAX_NUM_FORMS": "1000",
        "form-INITIAL_FORMS": "1"
    }
    formset_dict.update(common_data)
    for i, form_data in enumerate(forms):
        form_dict = self.build_formset_form_data(form_number=i, **form_data)
        formset_dict.update(form_dict)
    return formset_dict
And use them in a test:
def test_django_formset_post(self):
    forms = [{"key1": "value1", "key2": "value2"}, {"key100": "value100"}]
    payload = self.build_formset_data(forms=forms, global_param=100)
    print(payload)
    # self.client.post(url=url, data=payload)
You will get a correct payload that makes Django's ManagementForm happy:
{
    "form-INITIAL_FORMS": "1",
    "form-TOTAL_FORMS": "2",
    "form-MAX_NUM_FORMS": "1000",
    "global_param": 100,
    "form-0-key1": "value1",
    "form-0-key2": "value2",
    "form-1-key100": "value100",
}
Profit
There are several very useful answers here, e.g. pymen's and Raffi's, that show how to construct a properly formatted payload for a formset post using the test client.
However, all of them still require at least some hand-coding of prefixes, dealing with existing objects, etc., which is not ideal.
As an alternative, we could create the payload for a post() using the response obtained from a get() request:
def create_formset_post_data(response, new_form_data=None):
    if new_form_data is None:
        new_form_data = []
    csrf_token = response.context['csrf_token']
    formset = response.context['formset']
    prefix_template = formset.empty_form.prefix  # default is 'form-__prefix__'
    # extract initial formset data
    management_form_data = formset.management_form.initial
    form_data_list = formset.initial  # this is a list of dict objects
    # add new form data and update management form data
    form_data_list.extend(new_form_data)
    management_form_data['TOTAL_FORMS'] = len(form_data_list)
    # initialize the post data dict...
    post_data = dict(csrf_token=csrf_token)
    # add properly prefixed management form fields
    for key, value in management_form_data.items():
        prefix = prefix_template.replace('__prefix__', '')
        post_data[prefix + key] = value
    # add properly prefixed data form fields
    for index, form_data in enumerate(form_data_list):
        for key, value in form_data.items():
            prefix = prefix_template.replace('__prefix__', f'{index}-')
            post_data[prefix + key] = value
    return post_data
The output (post_data) will also include form fields for any existing objects.
Here's how you might use this in a Django TestCase:
def test_post_formset_data(self):
    url_path = '/my/post/url/'
    user = User.objects.create()
    self.client.force_login(user)

    # first GET the form content
    response = self.client.get(url_path)
    self.assertEqual(HTTPStatus.OK, response.status_code)

    # specify form data for test
    test_data = [
        dict(first_name='someone', email='someone@email.com', ...),
        ...
    ]

    # convert test_data to properly formatted dict
    post_data = create_formset_post_data(response, new_form_data=test_data)

    # now POST the data
    response = self.client.post(url_path, data=post_data, follow=True)

    # some assertions here
    ...
Some notes:
Instead of using the 'TOTAL_FORMS' string literal, we could import TOTAL_FORM_COUNT from django.forms.formsets, but that does not seem to be public (at least in Django 2.2).
Also note that the formset adds a 'DELETE' field to each form if can_delete is True. To test deletion of existing items, you can do something like this in your test:
...
post_data = create_formset_post_data(response)
post_data['form-0-DELETE'] = True
# then POST, etc.
...
From the source, we can see that there is no need to include MIN_NUM_FORM_COUNT and MAX_NUM_FORM_COUNT in our test data:
MIN_NUM_FORM_COUNT and MAX_NUM_FORM_COUNT are output with the rest of the management form, but only for the convenience of client-side code. The POST value of them returned from the client is not checked.
This doesn't seem to be a formset at all. Formsets will always have some sort of prefix on every POSTed value, as well as the ManagementForm that Bartek mentions. It might have helped if you posted the code of the view you're trying to test, and the form/formset it uses.
My case may be an outlier, but some instances were actually missing a field set in the stock "contrib" admin form/template, leading to the error "ManagementForm data is missing or has been tampered with" when saved.
The issue was with the __unicode__ method (SomeModel: [Bad Unicode data]), which I found by investigating the inlines that were missing.
The lesson learned is to not use the MS Character Map, I guess. My issue was with vulgar fractions (¼, ½, ¾), but I'd assume it could occur in many different ways. For special characters, copying/pasting from the w3 UTF-8 page fixed it.