Pyomo: Unable to print set list

I am just trying to learn and experiment with Pyomo. I have the following piece of code from the book that I am trying to run:
model.A = Set(initialize=[1,2,3])
print(len(model.a))
I should get 3. However, I get this:
WARNING: Implicitly replacing the Component attribute A (type=<class 'pyomo.core.base.sets.SimpleSet'>) on block unknown with a new Component
(type=<class 'pyomo.core.base.sets.SimpleSet'>). This is usually
indicative of a modelling error. To avoid this warning, use
block.del_component() and block.add_component().
0

Try len(model.A); Pyomo is case sensitive.

I realized that my initial model was set up as an AbstractModel and so because the model data and model framework are separated, it would of course show that the set was empty. However, if you change your model to a ConcreteModel and repeat, you'll find that the function will work. Of course, make sure things are case sensitive too!
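A minimal sketch of the difference (assuming a Pyomo release of the same era as the warning above): on an AbstractModel the component stays empty until the model is instantiated, while on a ConcreteModel it is constructed immediately.

from pyomo.environ import AbstractModel, ConcreteModel, Set

abstract = AbstractModel()
abstract.A = Set(initialize=[1, 2, 3])
print(len(abstract.A))       # 0 -- not constructed yet, matching the output in the question

instance = abstract.create_instance()
print(len(instance.A))       # 3 -- create_instance() builds the component with its data

concrete = ConcreteModel()
concrete.A = Set(initialize=[1, 2, 3])
print(len(concrete.A))       # 3 -- concrete components are constructed on assignment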

Related

Prevent normalizing of the model name by overriding 'normalizeModelName' in Ember Data

I would like to prevent normalizing (dasherizing, by convention) of the model name and instead use the original name. I want to override the function 'normalizeModelName', as the page http://emberjs.com/api/data/#method_normalizeModelName suggests this should be possible, but I'm not able to do so.
Simply assigning a new function to DS.normalizeModelName returns an error: Cannot assign to read only property 'normalizeModelName' of object '[object Object]'.
How could I prevent normalizing of the (internal) model name?
Small warning: this relates to 2.10.0. I will not guarantee compatibility above or below this version; you need to dig through the corresponding files yourself.
Since I had a similar problem, I dug through the source code of some ember-data internals and came to the following conclusion:
You need to extend DS.JSONAPIAdapter with pathForType(name), which essentially takes the name, camelizes and pluralizes it, and returns it as a plain string.
This usually converts model names from foo-bar to fooBars: https://github.com/emberjs/data/blob/v2.10.0/addon/adapters/json-api.js#L134-L137
Now for the opposite direction:
You need to extend DS.JSONAPISerializer with keyForRelationship(key, typeClass, method), where key is essentially the model name acquired from the relationships in your models, e.g. fooBar.
This usually translates to foo-bar by simply doing return dasherize(key); https://github.com/emberjs/data/blob/v2.10.0/addon/serializers/json-api.js#L453-L455
You might also want to dig through some other methods inside the serializer: modelNameFromPayloadKey, modelNameFromPayloadType, payloadKeyFromModelName, payloadTypeFromModelName.
Just throw in some debugger; lines to see what's going through.

Yii dynamic model id

So I'm working on some unit tests and relational fixtures.
I'm creating a model dynamically like:
$model = CActiveRecord::model('Post');
$post = $model->findByPk(1);
But after that I cannot for some reason get $post->id. I traced the problem to CActiveRecord class:
public function __get($name)
{
    if(isset($this->_attributes[$name]))
        return $this->_attributes[$name];
    ...
Where $name = "id". It says that $this->_attributes[$name] does not exist! As a matter of fact _attributes is empty.
My Post class does not define id (or any other properties) as a public property and I don't want to do so either. I just let the AR map it to table columns for me.
What am I missing?
Edit 1
My fixtures are regular Yii fixtures - nothing really special about them.
What differs is the way I load them. I extended CDbFixtureManager to be able to specify the order in which the fixtures should be loaded by overriding the load() method. The only thing of interest that actually fails: in the fixtures that have foreign keys, I use the following:
'comment1' => array('post_id' => $this->getRecord('Post', 'post1')->id);
That's where it fails. getRecord returns the actual Post record (since I know the Post fixture has already been successfully loaded and exists in DB), but on the ->id part I get an exception about that attribute not existing.
If I go into the Post model and add public $id; to it, then everything works! But I'm not sure it's good practice to go around declaring all properties public like that.
If you look at this page carefully:
http://www.yiiframework.com/doc/guide/1.1/en/test.unit
you'll see that they use an array form for retrieving fixtures:
$this->posts['sample1']['id']
There is an alias defined in their fixture array for each record, and the fixture items aren't really loaded as models...
Does that help? If not, it would be helpful to see your fixture file :-)
I think I found the root cause of this issue for me. While my FixtureManager was using the testdb DBConnection, the models still used the regular one.
For whatever reason, my debugger was giving me misleading errors like the one described in my original post.
Once I was able to set the DBConnection of all Models in the unit test the puzzle snapped into place and everything is now working smoothly!

Changing a QuerySet object on the fly in Django

Can or should I ever do this in a view?
a = SomeTable.objects.all()
for r in a:
    if r.some_column == 'foo':
        r.some_column = 'bar'
It worked like a champ, but I tried a similar thing somewhere else and I was getting strange results, implying that QuerySet objects don't like to be trifled with. And, I didn't see anything in the docs good or bad for this sort of trick.
I know there are other ways to do this, but I'm specifically wanting to know if this is a bad idea, why it's bad, and if it is indeed bad, what the 'best' most django/pythonic way to change values on the fly would be.
This is fine as long as you don't do anything later that will cause the queryset to be re-evaluated - for example, slicing it. That will make another query to the database, and all your modified objects will be replaced with fresh ones.
A way to protect yourself against that would be to convert to a list first:
a = list(SomeTable.objects.all())
This way, further slicing etc won't cause a fresh db call, and any modifications will be preserved.
Yup. See docs here
SomeTable.objects.filter(some_column='foo').update(some_column='bar')
I would go with Django's idiom. It executes a single SQL statement with WHERE and UPDATE rather than sending multiple statements like your code would, which saves time. Check Django's connection (connection.queries) to compare the SQL time.
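As a rough check (a sketch only, assuming DEBUG = True so Django records queries, and reusing the SomeTable/some_column names from the question), you can count the statements each approach issues via connection.queries:

from django.db import connection, reset_queries

reset_queries()
rows = list(SomeTable.objects.all())                # one SELECT
for r in rows:
    if r.some_column == 'foo':
        r.some_column = 'bar'
        r.save()                                    # one UPDATE per changed row (added here to persist)
print(len(connection.queries))

reset_queries()
SomeTable.objects.filter(some_column='foo').update(some_column='bar')
print(len(connection.queries))                      # 1 -- a single UPDATE ... WHERE

Note that the original loop in the question never called save(), so it only issued the SELECT; the save() calls above are where the extra statements come from once you actually persist the changes.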

Move a python / django object from a parent model to a child (subclass)

I am subclassing an existing model. I want many of the members of the parent class to now, instead, be members of the child class.
For example, I have a model Swallow. Now I am making EuropeanSwallow(Swallow) and AfricanSwallow(Swallow). I want to take some, but not all, Swallow objects and make them either EuropeanSwallow or AfricanSwallow, depending on whether they are migratory.
How can I move them?
It's a bit of a hack, but this works:
swallow = Swallow.objects.get(id=1)
swallow.__class__ = AfricanSwallow
# set any required AfricanSwallow fields here
swallow.save()
I know this is much later, but I needed to do something similar and couldn't find much. I found the answer buried in some source code here, but also wrote an example class-method that would suffice.
class AfricanSwallow(Swallow):

    @classmethod
    def save_child_from_parent(cls, swallow, new_attrs):
        """
        Inputs:
        - swallow: instance of Swallow we want to turn into an AfricanSwallow
        - new_attrs: dictionary of new attributes for AfricanSwallow

        Adapted from:
        https://github.com/lsaffre/lino/blob/master/lino/utils/mti.py
        """
        parent_link_field = AfricanSwallow._meta.parents.get(swallow.__class__, None)
        new_attrs[parent_link_field.name] = swallow
        for field in swallow._meta.fields:
            new_attrs[field.name] = getattr(swallow, field.name)
        s = AfricanSwallow(**new_attrs)
        s.save()
        return s
However, I couldn't figure out how to get my form validation to work with this method, so it could certainly be improved further; that probably means a database refactoring might be the best long-term solution...
Depends on what kind of model inheritance you'll use. See
http://docs.djangoproject.com/en/dev/topics/db/models/#model-inheritance
for the three classic kinds. Since it sounds like you want actual Swallow objects, that rules out an abstract base class.
If you want to store different information in the db for Swallow vs. AfricanSwallow vs. EuropeanSwallow, then you'll want to use MTI (multi-table inheritance). The biggest problem with MTI as the official Django docs present it is that polymorphism doesn't work properly. That is, if you fetch a Swallow object from the DB which is actually an AfricanSwallow, you won't get an instance of AfricanSwallow. (See this question.) Something like django-model-utils' InheritanceManager can help overcome that.
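For instance, a minimal sketch of the InheritanceManager approach (assumes django-model-utils is installed; the name and airspeed fields are purely illustrative):

from django.db import models
from model_utils.managers import InheritanceManager

class Swallow(models.Model):
    name = models.CharField(max_length=50)
    objects = InheritanceManager()

class AfricanSwallow(Swallow):
    airspeed = models.IntegerField(default=11)

# select_subclasses() returns AfricanSwallow instances where a child row exists,
# instead of plain Swallow objects -- the polymorphism gap described above.
for bird in Swallow.objects.select_subclasses():
    print(type(bird).__name__, bird.name)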
If you have actual data you need to preserve through this change, use South migrations. Make two migrations -- first one that changes the schema and another that copies the appropriate objects' data into subclasses.
I suggest using django-model-utils's InheritanceCastModel. This is one implementation I like. You can find many more in djangosnippets and some blogs, but after going through them all I chose this one. Hope it helps.
Another (outdated) approach: if you don't mind keeping the parent's id, you can just create brand-new child instances from the parent's attributes. This is what I did:
ids = [s.pk for s in Swallow.objects.all()]
# I get the ids list to avoid a memory leak with long lists
for i in ids:
    p = Swallow.objects.get(pk=i)
    c = AfricanSwallow(att1=p.att1, att2=p.att2, ...)  # ...and so on for the remaining fields
    p.delete()
    c.save()
Once this runs, a new AfricanSwallow instance will have been created in place of each initial Swallow instance.
Maybe this will help someone :)

Set the maximum recursion depth while serializing a Django model with Foreign key to JSON

I have Django models created for Google's App Engine:

class A(db.Model):
    propA = db.ReferenceProperty(B)

class B(db.Model):
    propB = db.ReferenceProperty(C)

class C(db.Model):
    propC = db.ReferenceProperty(B)
I have written a custom Django serializer which fetches the data for the ReferenceProperty fields and serializes it along with the initial model.
The problem occurs when I try to serialize an instance of Model A. My custom serializer will try to fetch propA, which references Model B; Model B references Model C, which in turn references Model B again, so the recursion goes on and on. Is there any way to stop the recursion after a depth of, say, 2?
My serializer is a customized version of link text
P.S: I am willing to publish my code if that seems to be needed. I have not currently attached it since I am not at my development machine.
Thanks,
Arun Shanker Prasad.
Just modify your functions to take a 'depth' argument. Any time you follow a ReferenceProperty, call the function with depth one less than the depth that was passed in. If a function is called with depth==0, return None, or whatever other placeholder value is suitable in your case.
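A sketch of that depth-limited idea (illustrative only, not the asker's actual serializer; it assumes the old google.appengine.ext.db API shown in the question):

from google.appengine.ext import db

def serialize(entity, depth=2):
    # Serialize a db.Model instance, following references at most `depth` levels deep.
    if entity is None:
        return None
    data = {}
    for name, prop in entity.properties().items():
        value = getattr(entity, name)
        if isinstance(prop, db.ReferenceProperty):
            # Spend one unit of the depth budget on each reference we follow;
            # once it reaches zero, emit a placeholder instead of recursing.
            data[name] = serialize(value, depth - 1) if depth > 0 else None
        else:
            data[name] = value
    return data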
Why don't you just do recursion properly? Any recursive operation must have a base case, otherwise it will continue forever, as your problem indicates.
I'm trying to find a serializer that works with Google App Engine and follows relationships. Would it be possible for you to post the modified code you used to do this?