Is there a way to process Karma - Angular unit tests in batches? - unit-testing

We are facing an issue where our tests start failing or running extremely slowly after a certain point. I have seen articles online where others are facing similar issues. The primary reason for those failures is the memory consumed by the browser while we deal with the DOM.
We are using a seed project that builds our application with SystemJS. Our current version of Angular is 2.2.3.
So I am thinking of a workaround where I either run our tests in parallel (i.e. multiple Karma servers; I did try that, but it starts to consume 100% CPU) or process them in batches: small test runs that ensure Karma is stopped and started again between batches.
Is there a way?
Also, if we are able to achieve that, how do we get consistent coverage? We are using Istanbul.
Please let me know if you have any more questions.
E.g. our service- and model-related tests run in 3 seconds (500+ tests), but our component tests (900+) take 15 minutes.

There is a pretty good plugin for Karma that shards tests and executes them in parallel: https://www.npmjs.com/package/karma-parallel
We have integrated it into both AngularJS and Angular 4 & 5 projects.
With a code base of more than 2000 tests it is a must.
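A minimal `karma.conf.js` sketch of wiring up karma-parallel; the `parallel` framework and `parallelOptions` keys come from the plugin's documentation, while the executor count, shard strategy, and browser choice here are illustrative assumptions:

```javascript
// karma.conf.js -- illustrative sketch, not a drop-in config.
module.exports = function (config) {
  config.set({
    // 'parallel' must be listed first so it can shard the other frameworks.
    frameworks: ['parallel', 'jasmine'],
    plugins: [
      require('karma-parallel'),
      require('karma-jasmine'),
      require('karma-chrome-launcher'),
    ],
    parallelOptions: {
      executors: 4,                 // number of browser instances to spawn
      shardStrategy: 'round-robin', // or 'description-length'
    },
    browsers: ['ChromeHeadless'],
  });
};
```

On the coverage question: each shard produces its own Istanbul output, so if you end up batching runs yourself, one possible approach is merging the per-run JSON reports into a single one, e.g. with `nyc merge coverage/ coverage/merged.json` (hedged: verify against your Istanbul/nyc version).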

Related

SOAP UI - Multiple tests transmissions

I hope you can help with my problem. As of now, I can only test transmissions via SOAP UI as single requests, and the thing is I have a lot of data to transmit; it would be too problematic and laborious to run the tests one by one.
Is there a way in SOAP UI to perform multiple requests at once? I appreciate your feedback and input as always. Thanks!
I would recommend using Ready! API / SoapUI NG Pro with a DataSource test step and a DataSource Loop. The Data Driven Sample Project (available on the starter page of Ready! API) contains a minimal working sample to get you up and running.
Disclaimer: I'm working for the company developing SoapUI and might be biased on the greatness of the Pro version.

sql 2014 express performance issues

I have a fairly large Windows application (about 10 years old, written in C++) which works with SQL Server 2000 Express (MSDE). It uses the database pretty extensively but doesn't have performance issues. Due to SQL 2000 MSDE compatibility issues with Windows 7/8, I want to migrate the application to SQL Server 2014 Express.
All database access code is written in T-SQL, so the application migrates to SQL 2014 without any code changes and all features work as expected. Except it's so badly slow that it makes no sense to use the application under SQL 2014: all select/update/insert queries take about 5 to 20 times longer to execute.
These are connection strings that I tried:
Provider=SQLOLEDB;Data Source=localhost\app;User ID=app_user;Password=password;
Provider=SQLOLEDB;Data Source=localhost\app;Trusted_Connection=yes;
I don't convert the SQL 2000 database to 2014, as the application creates a new database from scratch from scripts on its first run. Nothing fails, the default DB size is 12 MB, and the schema is pretty well optimised.
I also tried the same under SQL2008R2 Express - it's as slow as SQL2014 Express. Tried different PCs under Windows 7/8/8.1 - all the same.
The main detail I noticed is that when I run the application under SQL 2014, the most CPU-consuming process in Windows Task Manager is "Local Security Authority Process". This process doesn't appear in Task Manager at all when I run under SQL 2000 MSDE, and the application runs much faster. I guess LSA may be doing heavy processing on my "open connection" requests, but I don't know what to do about it.
The application is written in a way that it doesn't keep connections open, but creates them on demand and then releases them. I tried running the SQL 2014 service under different accounts; it made no difference.
"This process doesn't appear in Task Manager at all when I run under SQL 2000 MSDE, and the application runs much faster. I guess LSA may be doing heavy processing on my 'open connection' requests, but I don't know what to do about it."
Typically, lsass.exe (LSA) is used by IPSEC Services (PolicyAgent), Protected Storage, and Security Accounts Manager (SamSs).
Try disabling IPSEC Services (PolicyAgent).
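For reference, a sketch of stopping and disabling the IPsec Policy Agent service from an elevated command prompt. `PolicyAgent` is the standard service name on Windows 7/8, but verify it on your machine (e.g. with `sc query PolicyAgent`) before disabling anything:

```shell
:: Run from an elevated (Administrator) command prompt.
:: Stop the IPsec Policy Agent service:
net stop PolicyAgent

:: Prevent it from starting at boot (note the space after "start="):
sc config PolicyAgent start= disabled

:: To undo later:
::   sc config PolicyAgent start= demand
::   net start PolicyAgent
```

Re-test the application's query times after disabling; if nothing changes, re-enable the service.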

Strange behavior of unit test in CakePHP Test Suite 2.2

I am the only one in my development team having this problem with the unit tests.
Neither the debugger, nor Google/Stack Overflow, nor my colleagues could help me.
On my machine the unit tests work 10% of the time. The rest of the time they run for a very long time and eventually load a "site not found" page.
I have tried everything:
Disabled the entire cache
Tried it in two different browsers
Restarted Apache and MySQL
Logged in to the app
Debugged with Firebug
Nothing helps.
I have no clue what the problem is.
I'm running Windows 8 over Boot Camp for work, so there is not much installed (which could otherwise cause side effects).
If I try several times, it will occasionally work and pass all tests.
"The rest of the time they run for a very long time and eventually load a 'site not found' page."
Are you executing the tests in the browser? You should use the console instead of the web front end; running the tests in the browser can be problematic.
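In CakePHP 2.x the test runner is a console shell invoked from the app directory (it requires PHPUnit to be discoverable). A sketch; `Model/Post` is a placeholder for one of your own test cases:

```shell
# List the available test cases for the app:
./Console/cake test app

# Run a single test case from the command line instead of the browser:
./Console/cake test app Model/Post
```

Running from the console sidesteps the web server entirely, which also rules out Apache/browser timeouts as the cause of the hangs.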

Improve SOAP UI performance

I've started using SOAP UI recently to test web services and it's pretty cool, but it's a huge resource hog.
Is there any way to reduce the amount of resources it uses?
It shouldn't be a resource hog, although I've seen it behave that way before. I leave it running on my PC all week, while a co-worker with a similar machine (dual-core running XP) has to kill it every few hours because it keeps consuming CPU. I'd try uninstalling and re-installing. Currently, my instance has been up for 10 days, running a mock service that I've been hitting very hard (I've sent it thousands of requests). Total CPU time (over 10 days) is about an hour and a half, but the "right now" number is about 1%.
There are no popular alternatives, aside from writing your own client in the language of your choice.
If you're testing WCF services, you can run wcftestclient from the Visual Studio command line. It works for locally or remotely hosted services. It's no good for ASMX-style .NET 2.0 SOAP services, though.
If you want to test using only JSON, you could use one of the lightweight REST clients, e.g. the Mozilla REST plugin.
We test our SOAP APIs manually with SOAP UI and otherwise use JMeter for automated SOAP API testing. While having a GUI seems attractive at first, I find both applications quite user-unfriendly and time-consuming to work with.
As already suggested, you could do it in code using Java or maybe use a dynamic language like Ruby:
Testing SOAP Webservices with RSpec
SOAP web Services testing in RUBY
As user mitchnull mentions in his comment:
"Disabling the browser component (-Dsoapui.jxbrowser.disable=true) solved the 100% CPU usage issues for me. (When it was enabled, it periodically went to 100% CPU even when not running any tests/requests.)"

PHPUnit on a shared hosting plan?

PHPUnit works great; I love it, actually. The problem I'm having is that my hosting plan imposes a 30-second cap on request duration. If the PHPUnit tests take longer than that, the server closes the connection, and I never find out whether all my tests passed.
Is there an existing, automatic way of running an arbitrarily long test suite using AJAX to batch the unit tests so that they never hit the 30 s threshold? As long as each individual test takes less than 30 s, I think it should work.
Thanks
Why are you running the tests on the production server? Your tests are meant to run on your development machine, to make sure your code is good before it goes to production.
You may be able to change the default timeout with set_time_limit (link). Each call resets the remaining execution time.
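If the suite must run on that host, another option in the spirit of the batching idea is to invoke PHPUnit from the command line (if the host allows shell access) in several short runs, so no single invocation approaches the 30-second cap. A sketch, assuming a `tests/` tree split into subdirectories; the layout and the availability of `phpunit` are assumptions:

```shell
# Run each test subdirectory as its own short PHPUnit invocation.
# Each run stays well under the host's per-request/per-process limit.
for dir in tests/*/; do
  phpunit "$dir" || echo "FAILURES in $dir"
done
```

CLI runs are typically not subject to web-request timeouts at all, which is why this often works even when the AJAX route is fragile.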