I recently switched from CakePHP 2.2.5 to 2.3, and auto-truncation of test tables no longer works in 2.3.
What I did in 2.2.5 was just test the framework using some small tables with no relations (i.e. no foreign key constraints), and fixture import and auto-truncation worked flawlessly.
Then, once I had confirmed that I wanted to use CakePHP and started developing my application with CakeTestCase again, it stopped working. Data remains in the test database after the fixtures are imported the first time, so on the second run it fails to load the fixtures because rows with the same IDs already exist.
So I started to suspect that it was mainly because of the foreign key constraints in my current tables.
I searched the web and noticed that quite a lot of people have the same issue, but there isn't a real solution to it yet.
The only one that seems to be a solution is here:
http://cakephp.lighthouseapp.com/projects/42648/tickets/2905-tests-fixture-table-ar-not-truncate-when-droptable-false
However, I tried setting $dropTables = true and it didn't work. I also tried modifying CakeFixtureManager.php as suggested, and that didn't work either.
Does anybody know how to fix this issue?
So I am developing a django project in an AWS virtual env. To use a package, I need a newer version of Django, but I already have a lot of important data stored in Django's database.
My question is: Will updating the Django version mid-development compromise the data I already have in the database?
I apologize if the question seems stupid; I just really don't want to mess anything up.
Thanks in advance
The database Django uses is a separate thing (e.g. PostgreSQL, MySQL...), independent from it. Django only interacts with it to write and read data.
Updating Django to a new version might break something in your code if it uses old Django features that have been removed, but it won't affect your database.
Nevertheless, it's always a good idea to backup everything before crucial updates.
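If your development database is Django's default SQLite file, the stdlib sqlite3 module can take a consistent copy before the upgrade. This is only a sketch; the file names below are assumptions based on Django's default settings, so adjust them to your project.

```python
import sqlite3

# Copy the development database before upgrading Django.
# "db.sqlite3" is Django's default SQLite file name; adjust to your settings.
src = sqlite3.connect("db.sqlite3")
dst = sqlite3.connect("db.sqlite3.backup")
with dst:
    src.backup(dst)  # page-by-page snapshot, safe even while the source is open
src.close()
dst.close()
```

For non-SQLite backends, use the server's native dump tool (pg_dump, mysqldump) instead, or Django's database-agnostic `manage.py dumpdata`.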
My Django tests all pass when I use the built-in SQLite as my database. However, when I switch to the built-in MySQL and PostgreSQL, they fail. (All local database servers.)
The connections to the DB are live enough -- the database is created and migrations applied. But the tests aren't even hitting views.py before the test server returns a 500 (and no additional explanation beyond "Server Error", even in debug mode).
I'm using Django with Django REST Framework, and testing with REST Framework's RequestsClient class. My test fixtures are generated with Factory Boy.
Any ideas? I have a vague feeling it's to do with Factory Boy but nothing concrete. The fact that it fails with multiple databases, and successfully builds the tables, makes me suspect it's not the database itself.
Versions tested:
Django 1.11.6 and 1.11.9
Django REST Framework 3.7.1
Postgres 10.1
MySQL 5.7.21
Factory Boy 2.9.2
I had a similar problem (not sure if it's the same) going from SQLite to PostgreSQL. Are you using a setUp method to create your initial database objects?
Example:

    def setUp(self):
        user = User.objects.create_user(username='john', password='abbeyRd', email='john@thebeatles.com')
        user.save()
I found that this would work fine on SQLite, but on PostgreSQL I was getting database connection failures (it would still build the database, as you said, but couldn't run the tests without a connection error).
Turns out, the fix is to run your setup method as follows:
    @classmethod
    def setUpTestData(cls):
        user = User.objects.create_user(username='john', password='abbeyRd', email='john@thebeatles.com')
        user.save()
I don't remember the exact reason why this works and the other one didn't, but it had something to do with the database connection: with setUp(self) the setup code runs before every single test, while setUpTestData(cls) runs only once for the whole test class. I might not have the details perfect; all I really know is that it works!
Hope that helps, not sure if it addresses your exact issue, but it took me a good few hours to figure out, so I hope it helps you and maybe others in the future!
It turned out to be because of the testing class I was using. Apparently, although it is not currently stated in the documentation, Django REST Framework's RequestsClient class can only be used with DRF's APILiveServerTestCase. That it worked at all with SQLite is a weird fluke.
Link: https://github.com/encode/django-rest-framework/issues/5801
Edited to add, here's the reason for the "weird fluke", per the Django documentation:
When using an in-memory SQLite database to run the tests, the same database connection will be shared by two threads in parallel: the thread in which the live server is run and the thread in which the test case is run.
I don't have the exact answer to this question, but I had a similar problem where PostgreSQL complained about not finding a foreign key with id=1. This happened because, in some tests, I had hardcoded the object under test to be the one with id=1. That worked fine on SQLite because the IDs were assigned sequentially starting from 1; this is not guaranteed on PostgreSQL. I solved it by using Model.objects.last().pk or Model.objects.first().pk, or choosing another instance in between, whenever I need a primary key.
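The pitfall can be shown without Django at all: with stdlib sqlite3, auto-assigned IDs do not restart at 1 once rows have been deleted, so a hardcoded id=1 lookup can come up empty. The table and data here are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE song (id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT)")
conn.execute("INSERT INTO song (title) VALUES ('Help!')")
conn.execute("INSERT INTO song (title) VALUES ('Yesterday')")
conn.execute("DELETE FROM song")  # e.g. an earlier test wiped the table
conn.execute("INSERT INTO song (title) VALUES ('Let It Be')")

# A hardcoded id=1 finds nothing, because the sequence kept counting...
missing = conn.execute("SELECT title FROM song WHERE id = 1").fetchone()
# ...but asking for an actual row (Model.objects.first() in Django terms) works.
first = conn.execute("SELECT id, title FROM song ORDER BY id LIMIT 1").fetchone()
# missing is None; first is (3, 'Let It Be')
```

PostgreSQL sequences behave the same way, which is why tests should look rows up by a pk they actually hold rather than by a literal id.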
I have recently tried working with xDB in Sitecore 8 and am now looking for a way to clear the current stats from xDB without re-installing Sitecore. I deleted the data files for Mongo (as was suggested) but still see figures in Analytics in Sitecore; an iisreset did not help either. What am I doing wrong? (I am new to Sitecore, so I might be missing something.)
Have you tried to clean up only the MongoDB files, without the Reporting database?
If so, I think that is the point of your confusion. The way xDB works is that all tracking analytics data is written into Mongo and then, on session end, processed and saved into the Reporting database, which is a SQL database, the same way as previously in DMS. So you need to clean that database as well.
If you have access to SQL, the quickest way is the __DeleteAllReportingData stored procedure.
A more correct approach, which also works for instances with no direct database access, is the admin tool located at /sitecore/admin/RebuildReportingDB.aspx. There was also previously a module called Analytics Database Manager; however, I do not know its current state.
Reference: Walkthrough: Rebuilding the reporting database (from official documentation)
Some of my migrations should run only when certain conditions are met, mostly because of the buggy nature of using South to migrate a django.contrib app, which is needed when converting to our own user model. But since those migrations should run automatically, I can't count on "--fake": sometimes I need to run them and sometimes not, depending on whether a relation (auth_permissions, for example) exists.
Can I use the South/Django ORM in a migration's forwards section to check for an existing relation and run the migration inside an if clause?
I tried using try/except in the migration, but it seemed to cause an error (I can't currently reproduce it; I no longer have that code).
How can I achieve that?
Thanks for the help!
Using Django 1.6.4 and south 0.8.4
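I can't verify the South internals here, but Django's connection.introspection.table_names() can report which tables exist, and a forwards method could branch on that. The idea, sketched against stdlib sqlite3 (the table name is just the example from the question, and table_exists is a hypothetical helper):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE auth_permission (id INTEGER PRIMARY KEY)")

def table_exists(conn, name):
    # sqlite_master lists every table in a SQLite database; Django's
    # connection.introspection.table_names() plays the same role.
    row = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name=?", (name,)
    ).fetchone()
    return row is not None

# Branch the migration on whether the relation is already there.
if table_exists(conn, "auth_permission"):
    action = "run migration"
else:
    action = "skip migration"
```

An explicit existence check like this avoids the try/except approach, which can leave the transaction in an aborted state on some backends.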
I recently added a new field to one of my models and forgot to add the corresponding column to the table in the database. I have test cases that add a new instance of this model and change an existing instance. Neither of these test cases failed. Yet when I try to change an instance on the live site, I get
DatabaseError no such column
I have made some attempts to detect this error from within a TestCase but no such luck.
Any help is greatly appreciated.
The problem is that you (I really, really hope) don't use your production database for testing, so all you could detect is that the column doesn't exist in your test database, which is (presumably) recreated from scratch based on your model definitions; you can't detect that the column is missing from your production database.
A better approach to this problem is to use a migration tool like South, and automate the deployment process so that migrations are run as new code is deployed.
This will only work for small(ish) sites - you might find naively running migrations causes pain if you've got a high-traffic site. If you're in that situation, you may find David Cramer's write-up on schema changes informative.
Unfortunately, running the syncdb command is not enough to update an existing table.
You can insert the fields manually with your database management system or use a solution like South.
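Adding the column by hand amounts to a single ALTER TABLE statement. Here is a sketch using stdlib sqlite3; the table and column names are made up for illustration, and the same ALTER TABLE would be issued through your database's own client:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The table as originally created by syncdb, before the model grew a field.
conn.execute("CREATE TABLE app_article (id INTEGER PRIMARY KEY, title TEXT)")

# Add the column the model now expects; syncdb will not do this for you.
conn.execute("ALTER TABLE app_article ADD COLUMN subtitle TEXT")

# PRAGMA table_info confirms the schema now matches the model.
columns = [row[1] for row in conn.execute("PRAGMA table_info(app_article)")]
# columns is now ['id', 'title', 'subtitle']
```

A migration tool such as South generates and records exactly this kind of statement for you, which is why it is the safer option once a project has real data.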
The following link explains this problem very well:
http://amionrails.wordpress.com/2013/11/17/django-databaseerror-no-such-column-error/