I'm trying to figure out how to unit test a basic MS Bot Framework dialog and cannot get it to work the way everything on the internet says it should work.
Everything I find basically says follow this:
https://github.com/Microsoft/BotBuilder/blob/master/CSharp/Tests/Microsoft.Bot.Sample.Tests/EchoBotTests.cs
Well, here's the problem with that:
await Conversation.SendAsync(scope, toBot);
That call is defined as internal, so it is not accessible outside the Bot.Builder code. It is therefore useless unless you are writing tests for internal Bot.Builder functionality.
Is there a new way to get around this?
The Bot Framework is an open-source project, so you can download the code and modify it as you need; in your case, that means removing the internal keyword. Another option would be to create a new class that inherits from the class you are trying to use and set your own access levels on the methods you need to override. This blog post describes how to use the code locally.
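As a rough illustration of the subclassing idea, here is a minimal sketch that assumes the member you need is protected rather than strictly internal (SomeDialogBase and SendCoreAsync are made-up names, not real Bot Builder types):

using System.Threading.Tasks;

// Hypothetical library base class; assumes the member you need is protected.
public class SomeDialogBase
{
    protected virtual Task SendCoreAsync(string text)
    {
        return Task.FromResult(0);
    }
}

// Test-only subclass that widens the access level so tests can call the method.
public class TestableDialog : SomeDialogBase
{
    public Task SendForTestAsync(string text)
    {
        return SendCoreAsync(text);
    }
}

For a member that really is internal and has no InternalsVisibleTo attribute, building the framework from the modified source is the more reliable route.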
In my recent project I'd like to try out an Aurelia-frontend with a Django-backend.
I did some projects with Django and want to use Django REST API for my backend.
I'm new to Aurelia and read the documentation several times.
Now I'm wondering whether it would be good practice to explicitly define models (e.g. User with nickname, email, mobile, address, etc.) in the Aurelia frontend, because in Django I have already defined my models in models.py for the database. Since I fetch the data from my Django application via the API, I could perhaps omit them.
In the Aurelia "getting started" section of the documentation, the ToDo model is defined in a separate file, but that data wasn't attached to a database. Defining the models in both the backend and the frontend seems to me like doing the same work twice, which violates the DRY principle.
What would you think is good practice? Thanks for your recommendations!
Defining classes on the client side has its advantages. First, you can map the response data into a class instance, and work with the data that way. Though, working with a JSON object isn’t tough.
Second, serializing a class into JSON is easy. Plus, some backend frameworks expect a very specifically formatted JSON object, and sometimes a class is the only practical way of producing it.
Third, one thing you can do with a class that you cannot do with a JSON object (as far as I know) is add methods/functions. That extensibility alone can be worth the effort.
It certainly isn't unusual to have classes defined on both the back end and the front end. I have worked with Aurelia and Angular, and they both work nicely with classes. I have also done an Aurelia app without client-side classes, and what I really missed there was IntelliSense in the IDE (a fourth advantage), since nothing was exported/imported. BTW, I use VS Code.
DRY is nice, but showing intent can go a long way, especially if someone else picks up the code when you are done with it. Classes help there, which is a fifth advantage.
Finally, I am sure there are many more advantages.
Conclusion: I would recommend using client side classes. You will not regret it.
Hope this helps!
I know there are existing tools for testing a ColdFusion application (MXUnit, MockBox), but I'm creating a custom tool, so that it will require less configuration.
When I run a unit test file, it's done via a generic 'model' which retrieves all functions from the unit test file. Within each test function, I have to call assertEquals -- but these functions are in the model, so I cannot access them.
I tried passing the model itself to the unit test file so it can call the model's functions directly, but it doesn't work, and it adds logic to the test file, which I don't like.
I could also extend the model in the test file, but then I would have to call the test file directly, call super.init(this) so the model can fetch the test functions, and so on.
Is there a way to achieve this kind of process? What's the best option?
In answer to your question, it sounds like you want to inject variables / methods into the subject under test. You can do it like so:
myInstance["methodName"] = myFunction;
You can then call the injected method like so:
myInstance.myFunction();
Both MXUnit and TestBox use this technique.
Having said that, I don't quite understand why you want to reinvent the wheel. TestBox is an excellent, proven testing framework with a wealth of features which would take you an incredible amount of time to replicate. I'm not quite sure what your configuration issue could be; it really doesn't require very much setup. Maybe it would be worth asking how to set up and use TestBox rather than how to build your own testing solution :)
There is a good book on TestBox (available in a free version) which you can read here: http://testbox.ortusbooks.com/
Good luck!
I'm working on a project that has a client side and a server side, and I'm writing a "pre-check-in" tool that will validate a lot of the communication between client and server.
I already have unit tests on both sides; now I really want to test the integration between the two.
Like a real client connecting to the server, and vice versa.
I really want to use unit testing for this, but I'm having a really hard time figuring out how to initialize the MvvmCross framework and my view model classes.
In another thread I asked for help with my "console app" that runs the tests, but there it is also really hard to initialize the framework, and it makes me lose the convenience of unit testing with Visual Studio and ReSharper.
My view models use SQLite and HttpClient with async/await.
For instance: I can't find a way to instantiate a view model that needs these interfaces:
IChatService, IMvxMessenger, IDataService, ISettingsService
Some are from the framework, some from my own code.
I know, and I'm trying to register my own in a TestFixtureSetUp, but of course this fails, as the MvvmCross base subsystem (IoC?) is not set up yet.
Some of the above services, like IDataService for instance, also need ISQLiteConnectionFactory, IMvxMessenger and ISettingsService.
I know unit testing is supposed to be fast, but my idea is to put all these tests in a new Category that I would run only before my check-ins, and that my buddy, the server developer, would run before his.
What would be the best approach here?
Any hints, suggestions or things to investigate/study would help at this point, as I'm practically stuck on this one.
Sergio
For instance: I can't find a way to instantiate a view model that needs these interfaces:
IChatService, IMvxMessenger, IDataService, ISettingsService
Generally this would be done using Mock implementations and then calling the ViewModel constructor directly with those Mocks.
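For example, a minimal sketch using Moq (ChatViewModel and its constructor signature are assumptions based on the interfaces you listed):

using Moq;
using NUnit.Framework;

[TestFixture]
public class ChatViewModelTests
{
    [Test]
    public void CanConstructViewModelWithMockedDependencies()
    {
        // One mock per dependency the view model asks for in its constructor.
        var chatService = new Mock<IChatService>();
        var messenger = new Mock<IMvxMessenger>();
        var dataService = new Mock<IDataService>();
        var settingsService = new Mock<ISettingsService>();

        // Construct the view model directly - no MvvmCross framework start-up needed.
        var viewModel = new ChatViewModel(
            chatService.Object,
            messenger.Object,
            dataService.Object,
            settingsService.Object);

        // Replace this with assertions on the view model's properties,
        // or Verify(...) calls on the mocks, for the behaviour you care about.
        Assert.IsNotNull(viewModel);
    }
}

Because SQLite and HttpClient sit behind interfaces like IDataService, a test like this never touches a real database or the network.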
I'm having a really hard time figuring out how I can initialize the MvvmCross framework
In order to initialise the IoC part of the framework you can use the MvxIoCSupportingTest helper class: https://github.com/slodge/MvvmCross/blob/v3/Cirrious/Test/Cirrious.MvvmCross.Test.Core/MvxIoCSupportingTest.cs
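Roughly, a fixture built on that helper might look like the sketch below; the Setup and Ioc member names reflect my reading of that class and may differ between MvvmCross versions, and FakeSettingsService stands in for one of your own test doubles:

using Cirrious.MvvmCross.Test.Core;
using NUnit.Framework;

[TestFixture]
public class DataServiceTests : MvxIoCSupportingTest
{
    [SetUp]
    public void SetUpTest()
    {
        // Initialises a lightweight IoC container without starting the full framework.
        Setup();

        // Register fakes for anything that will be resolved from the container.
        Ioc.RegisterSingleton<ISettingsService>(new FakeSettingsService());
    }

    [Test]
    public void SettingsServiceCanBeResolved()
    {
        Assert.IsNotNull(Ioc.Resolve<ISettingsService>());
    }
}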
If you then need additional parts of the framework, you should generally mock these in some way. For example, see how navigation is mocked in these two articles:
http://blog.fire-development.com/2013/06/29/mvvmcross-enable-unit-testing/
http://slodge.blogspot.co.uk/2013/06/n29-testing-n1-days-of-mvvmcross.html
If it helps, an example of this type of test is https://github.com/slodge/MvvmCross-Tutorials/blob/master/Sample%20-%20TwitterSearch/TwitterSearch.Test/TwitterViewModelTest.cs#L21
If this answer doesn't have sufficient information, please provide a bit more example code about the tests you are trying to write.
I have tried searching around the web and on SO, but haven't seen much discussion about it (or maybe I'm just not using the right keywords).
What I would like to do is write a script (or use a utility that already exists) to verify that a class or set of classes has unit tests written for them in the test project.
I've got a release coming up, and I want to make sure that all public methods of my business layer have unit tests. I'm trying to get everyone on board with TDD, but it hasn't happened yet.
I've got a pretty basic idea of how I would write a script to check this (open a file, parse the method signatures into a list, open the corresponding test class file and check that each method in the list appears somewhere in the test file), but I wanted to see what other options are available.
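Roughly, I picture something like this reflection-based sketch (the assembly file names and the "Tests" suffix convention are just placeholders for whatever we actually use):

using System;
using System.Linq;
using System.Reflection;

class TestPresenceChecker
{
    static void Main()
    {
        // Assumed file names, plus the convention that FooTests covers Foo.
        var business = Assembly.LoadFrom("MyApp.Business.dll");
        var tests = Assembly.LoadFrom("MyApp.Business.Tests.dll");

        foreach (var type in business.GetTypes().Where(t => t.IsClass && t.IsPublic))
        {
            var testType = tests.GetTypes().FirstOrDefault(t => t.Name == type.Name + "Tests");
            if (testType == null)
            {
                Console.WriteLine("No test class found for " + type.FullName);
                continue;
            }

            var testMethodNames = testType.GetMethods().Select(m => m.Name).ToList();
            var publicMethods = type.GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly)
                                    .Where(m => !m.IsSpecialName);

            foreach (var method in publicMethods)
            {
                // Crude check: at least one test method mentions the business method's name.
                if (!testMethodNames.Any(n => n.Contains(method.Name)))
                {
                    Console.WriteLine(type.Name + "." + method.Name + " has no obvious test");
                }
            }
        }
    }
}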
.NET code coverage tools, such as NCover and dotCover, already exist. I would use one of those and read their reports.
How do I do TDD/BDD with ASP.NET 4.0 Web Forms? I have an existing website that is very large scale, and we don't have any tests; with each change to the database we have to guess what will break. I want to introduce unit testing. Please give me some pointers.
A lot will depend on how your existing codebase is structured.
It can take a lot of work to retrofit unit testing, and especially TDD, into a legacy project. Typically you'll find database and business logic residing in the code-behind files of the web pages.
Interface types are your friend here, as Visual Studio can generate interfaces automatically for you (place the cursor over a class name, right-click, select Refactor, then Extract Interface).
I would work towards separating your database code into a class library project of its own. You can then specify the public interface through which the business logic etc. can access the database. All other code should treat the database repository as a black box.
Create a factory to make your repository (based on that interface), and have it create either a test type or a live type. The live type will link into your current database code; the test type will just return hard-coded values. You can write tests using the live database, and then you can write tests for the "test" database in a TDD manner.
Once they both match (all tests pass), any new database functionality is added by writing the test that runs on the "test" database first and then on the live database.
Remember, all code should only use the interface to the database, not an instantiated live database class.
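As a minimal sketch of that shape (IProductRepository, the product data and the factory flag are all placeholder names):

using System;
using System.Collections.Generic;

// The interface the rest of the code programs against.
public interface IProductRepository
{
    string GetProductDescription(string productCode);
}

// Live implementation - wraps your existing database code.
public class SqlProductRepository : IProductRepository
{
    public string GetProductDescription(string productCode)
    {
        // Call into the real data access code here.
        throw new NotImplementedException("Wire this up to the existing database code.");
    }
}

// Test implementation - returns hard-coded values only.
public class FakeProductRepository : IProductRepository
{
    private readonly Dictionary<string, string> _data = new Dictionary<string, string>
    {
        { "WIDGET", "A widget" },
        { "GADGET", "A gadget" }
    };

    public string GetProductDescription(string productCode)
    {
        if (productCode == null)
        {
            throw new ArgumentNullException("productCode");
        }

        string description;
        return _data.TryGetValue(productCode, out description) ? description : null;
    }
}

// Simple factory so callers never new up a concrete repository themselves.
public static class RepositoryFactory
{
    public static bool UseTestRepository { get; set; }

    public static IProductRepository CreateProductRepository()
    {
        return UseTestRepository
            ? (IProductRepository)new FakeProductRepository()
            : new SqlProductRepository();
    }
}

The pages and business logic only ever ask the factory for an IProductRepository, so tests can flip UseTestRepository without touching the UI.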
Once you've got the hang of the process you can delve deeper (if you wish) within the database code, but I would say that following the same process in separating and testing business and UI logic is more practical on a legacy project.
You may find that a pragmatic approach is to only separate out functionality, following the process I've described, as you add new code. In other words, before you add new functionality, separate out the existing code as described, writing the tests that show it passes (live and test versions); then alter or add tests for the new functionality, using whether each test passes or fails to guide your coding.
If you're covering all bases, you want a test for the failure case, a test for the pass case, and maybe a test for an exception scenario.
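Using the repository sketch above, those three tests might look something like this with NUnit (the method names and expected values are placeholders):

using System;
using NUnit.Framework;

[TestFixture]
public class ProductRepositoryTests
{
    private IProductRepository _repository;

    [SetUp]
    public void SetUp()
    {
        RepositoryFactory.UseTestRepository = true;
        _repository = RepositoryFactory.CreateProductRepository();
    }

    [Test]
    public void KnownProductCode_ReturnsDescription()   // the "pass" case
    {
        Assert.AreEqual("A widget", _repository.GetProductDescription("WIDGET"));
    }

    [Test]
    public void UnknownProductCode_ReturnsNull()        // the "failure" case
    {
        Assert.IsNull(_repository.GetProductDescription("DOES-NOT-EXIST"));
    }

    [Test]
    public void NullProductCode_Throws()                // the exception scenario
    {
        Assert.Throws<ArgumentNullException>(() => _repository.GetProductDescription(null));
    }
}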
Good luck. It's not a job for the faint-hearted (I've had to do it many times in the past).