There's a problem with my Neo4j test setup environment and org.neo4j.test.ImpermanentGraphDatabase...
I have a class, TestGraphPopulator, for setting up some dummy data for my unit tests.
Because my tests perform add, delete, and update operations, I repopulate the graph for every test case in an init method annotated with @Before.
But there is some really strange behavior sometimes.
Some of my tests fail because there are more or fewer entities than expected. On a second or third run, everything is fine... or not.
AND, in my project's /target directory, there is a folder target\test-data\impermanent-db with all the Neo4j database data...
I'm not sure what my problem results from, but shouldn't ImpermanentGraphDatabase reside only in memory?
To me it looks like a bug. Could anyone share some experience? This seems very important to me, and maybe to others...
Thanks a lot in advance!
Markus
P.S.:
Neo4j 1.8, Spring Data Neo4j 2.1
This is indeed what happened to me, too. It is a confirmed problem and was recently fixed; I have verified that no files are created when I use 1.9-SNAPSHOT.
When I go to SpecRunner.html in my browser, the unit tests run fine. The issue I am having is that if I change one of the tests, or the code that it tests, and refresh the page, it doesn't load the new tests or change at all. I thought this would be a cache issue, but I have 'Disable cache' selected in the Chrome dev tools.
What am I doing wrong?
Thanks
EDIT: tried restarting my computer, nothing. Clearing cache, nothing. I don't understand why this would be happening.
EDIT2: tried a force reload, nothing... changed the file name and the reference to it, and it still loaded the old code...
This might not answer your specific problem, as there are a million reasons this could happen.
For me, it was because I had deleted some files and then re-added them as a git submodule. After doing this, ls was showing me the OLD files, and I didn't realize that I had to go up a directory and then back in to interact with the new files I had just pulled down.
Old question, but I ended up here when looking for an answer, so here's my solution in case it helps anyone: I was having a similar problem with jasmine-rails, and removing jasmine-specs.js from public/assets did it for me. I'm using the jasmine-rails gem from https://github.com/searls/jasmine-rails.
Context
I'm working on a project where we use the JUNG libraries. My custom Java code and the JUNG libraries are all placed under wwwroot/WEB-INF/lib.
Note that certain JUNG classes depend on Apache commons-collections.jar, which is also present in the same folder (provided by default by CF).
Problem
We plan to move from CF7 to CF10, so I'm doing pilot testing of all the code in the project, and I found that certain code (the parts that use JUNG) fails with a 'class not found' exception.
While debugging and checking the methods in the commons-collections-2.1.jar shipped with CF 10, I found that it is different from the one present in CF 7. If I update the JAR to the newer commons-collections-3.2.1, the code works fine, as it finds all the classes it needs.
I could change my code base to suit the default commons-collections-2.1.jar, but that would mean at least 2-3 months of a dev-test cycle. I want to keep that as a last option and avoid it if at all possible.
Now, my questions are:
What problems could arise if I replace (update) the 2.1 JAR with 3.2.1?
Which parts of core CF use this JAR?
Why does CF use this (in my opinion) outdated 2.1 version of the JAR?
Homework I did
Testing: To avoid affecting the rest of the applications, I created a separate instance, changed the JAR to the latest 3.2.1, hosted my application on it, and did some testing. Everything seems fine, as I have not encountered any problems. But I worry that there may be areas in CF that depend on this JAR and might break.
cfusion\lib: I found that commons-collections-3.2.1 already exists in the main lib folder (cfusion\lib). So maybe changing the JAR in WEB-INF/lib does not make any difference.
cfusion\wwwroot\WEB-INF\lib: I have read that any JAR files placed in this folder are used only by applications and not by CF itself. I may be wrong or have completely misunderstood; correct me!
Answers to this question would improve my understanding of CF as well as solve the problem.
I think I have found the answer.
It was a misunderstanding on my part that Adobe had changed the commons-collections.jar in WEB-INF/lib when it made CF10. It did not, and even if it had, that was in no way related to my problem.
After reading the comments from @Miguel-F and @Leigh, I checked through the previous installations and compared the lib changes. During that, I found that one of the previous developers had purposely changed the commons-collections JAR from the original provided by CF to a custom one with which JUNG works properly. I confirmed this from the JAR compilation signatures and a small note in them describing the change.
Now I have copied that same file into the new CF 10 installation, and my reports are all working fine.
Takeaways:
Changing a JAR in /WEB-INF/lib/ does not affect your CF installation as a whole, but ONLY if it is used by your custom code and, importantly, you know what you are doing.
If at all possible, please document such changes as part of the application development logs, so that when you leave, the next developer learns about these customizations without much hassle.
Talk about your problem with the community; in the process of making them understand the problem, you may find an answer. :)
Thank you
I want to thank @Miguel-F and @Brian for their input on system configuration. And more thanks to @Leigh for his 'Different how?' question; it made me think again and find my answer.
I recently got PHPUnit working with Xdebug for testing my Zend Framework applications. The fact that I use ZF shouldn't make any difference to this question; I just mentioned it for completeness.
Anyway, it all works fine, but now I want to set up an in-memory database using PDO SQLite. I have done this successfully and have created tables and inserted data. However, this task seemed to take ages: the syntax from my export did not match SQLite's needs, so I had to play around for a while.
Also, SQLite does not support constraints (is that right?), which my application does use, so the whole process seems a waste of time if I cannot test my constraints.
Is using SQLite the right solution to my problem? Does anyone have better ways of using it, or any other DB solutions for unit testing?
The idea of unit tests is to test smaller parts, so one approach is to work with small amounts of (static) sample data, for example as described in http://www.phpunit.de/manual/3.4/en/database.html
If you really need to test against a full database with all its constraints, I think there is no way around using the specific database of your application, for example MySQL.
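For the small-sample-data route, an in-memory SQLite database can still be useful. Here is a minimal sketch of what that might look like in a plain PHPUnit test case (the class name, table, and fixture values are made up for illustration). Note that SQLite does enforce NOT NULL and primary key constraints, and from version 3.6.19 it can also enforce foreign keys, though they have to be switched on per connection:

<?php
// Minimal sketch: an in-memory SQLite database inside a PHPUnit test.
// The class, table, and fixture values are illustrative only.
class ExampleDbTest extends PHPUnit_Framework_TestCase
{
    /** @var PDO */
    private $pdo;

    protected function setUp()
    {
        // ':memory:' gives a fresh, empty database for this connection.
        $this->pdo = new PDO('sqlite::memory:');
        $this->pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        // SQLite 3.6.19+ can enforce foreign key constraints,
        // but only if they are switched on explicitly.
        $this->pdo->exec('PRAGMA foreign_keys = ON');

        $this->pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)');
        $this->pdo->exec("INSERT INTO users (name) VALUES ('alice')");
    }

    public function testFixtureWasLoaded()
    {
        $count = $this->pdo->query('SELECT COUNT(*) FROM users')->fetchColumn();
        $this->assertEquals(1, $count);
    }
}

Because the database lives only in that connection, each test class starts from a clean slate without any teardown work.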
Not really a Ruby on Rails question, but that is the framework we are working in.
We are migrating data from a legacy system into our own system and have been testing the code that will do the data migrations. These tests live alongside the rest of the application's tests and so run against our build server on commits, etc.
Once we've migrated this data, these tests will seemingly be useless to us, since the code they test will never be run again. What's more, the tests will most likely get stale and might require maintenance, lest they break our build.
Should we just throw these tests out afterward? Tag them in some way so that they don't get run after we do things for real? Something else?
Get rid of them.*
*Which is to say, let them sit in source control if you ever need to refer to them.
If it were me, I would separate out the project that does the data migration along with its tests. That way the tests don't generate noise in your current build process, and you only have to modify them if you (for some reason) touch the migration project again.
If this isn't possible, then just rip all of it out once you are done. If you ever need to get it back it should be in source control... right!?!
I'm working with Symfony + Doctrine + PHPUnit in the NetBeans IDE. Here's my current approach to unit testing:
The setUp() function loads the test fixtures from .yml files.
The tearDown() function deletes all data from the models; this is done by looping through an array of all my models' names and calling something like Doctrine_Query::delete($modelName)->execute().
This seems to work, but I'm just curious whether this is the correct way to do it. I am essentially clearing all tables after each test function by specifying which models/tables to 'delete all' from.
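Roughly, the pair looks like this (simplified; the model names and fixture path here are placeholders, not my real ones):

<?php
// Simplified sketch of the setUp()/tearDown() described above.
protected function setUp()
{
    // Load the test fixtures from a .yml file.
    Doctrine_Core::loadData('test/fixtures/fixtures.yml');
}

protected function tearDown()
{
    // Wipe each model's table so the next test starts from a clean slate.
    $modelNames = array('User', 'Post', 'Comment');
    foreach ($modelNames as $modelName) {
        Doctrine_Query::create()
            ->delete()
            ->from($modelName)
            ->execute();
    }
}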
Q1: I am just wondering if this is the correct way...
Q2: This works nicely in the NetBeans IDE, but does not seem to work via "./symfony test:unit". Am I missing something, or does the CLI only work with lime?
./symfony test:unit runs symfony's own test suite, which uses lime as its test framework, not PHPUnit.
And NetBeans uses PHPUnit for its integrated test support. Hopefully NetBeans will add support for the symfony test suite in the upcoming symfony support in NetBeans 6.8.
If you want to use PHPUnit with symfony, check out the PHPUnit plugin. Just a note that this one only seems to work with 1.2.x; for 1.4.x, which is what I'm currently using at work, check out another PHPUnit plugin. That last one is in beta, but it works for 1.4.x according to the author. I'll be trying it out soon, so if I can remember, I'll come back here and throw up my findings. It's honestly not too hard to back out if you don't want to keep it installed, so trying it is easy.
If you happen to try it, please post your findings; I'd be really interested in hearing your thoughts. I'm finding lime to be lame (HAH!), as it just makes mocking a chore.
I'm trying it externally with PHPUnit, no plugin, and I'm using Doctrine. I am having quite a problem: if I run ONE PHPUnit test method (written by me), it's great. The second one, not so good. It may be the way I'm using Doctrine. It seems that even though I delete everything from the database between PHPUnit method calls and restore the fixtures file, all in setUp(), Doctrine remembers previous values.
It doesn't matter whether I flush the connection, unset the parent object that is being erroneously 'remembered', call refreshRelated(), etc.; I still get old values when I make the first assignment to a relationship:
$parent = new ParentType();
// set parent values...
$child = new ChildType();
// set child values...
$child['Parent'] = $parent;
$child->save();
The database reflects everything fine; it's Doctrine inside PHPUnit that's not working. I haven't tried it OUT of PHPUnit yet. After all, test before use, right? But I may have to do that and see whether it's Doctrine or PHPUnit.
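One thing I still plan to try is clearing Doctrine's identity map between test methods, since stale in-memory records would explain the 'remembered' values. If I'm reading the Doctrine 1.x API correctly, something like this in tearDown() should do it (an untested sketch; 'ParentType' is the model from my snippet above):

<?php
// Untested sketch: clear Doctrine 1.x identity maps between tests so
// records cached in memory don't leak into the next test method.
protected function tearDown()
{
    // Clear the identity maps of all tables on the current connection...
    Doctrine_Manager::connection()->clear();

    // ...or clear a single table's identity map, e.g. my ParentType model.
    Doctrine_Core::getTable('ParentType')->clear();
}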