I've got a PresenterFactory that creates Presenter classes based on a Role parameter. Specifically, the Role parameter is an external class which I cannot control (i.e. it comes from a third party).
My factory looks something like this:
public class PresenterFactory {
    public Presenter CreatePresenter(Role role, ...) {
        if (role.IsUserA("Manager")) {
            return new ManagerPresenter(...);
        }
        if (role.IsUserA("Employee")) {
            return new EmployeePresenter(...);
        }
    }
}
I'm stuck on how to write the unit test for this since creating the Role object forces a database access. I thought that I could Mock this object. My test looked like this:
public void TestPresenterFactory()
{
    var mockRole = new Mock<Role>();
    mockRole.Setup(role => role.IsUserA("Manager"))
            .Returns(true)
            .AtMostOnce();
    PresenterFactory.CreatePresenter(mockRole.Object, ...);
    mockRole.VerifyAll();
}
However, I receive an ArgumentException:
Invalid setup on a non-overridable member: role=> role.IsUserA("Manager")
I'm not sure where to go and sure could use some course correction. What am I doing wrong?
You can create a wrapper object for Role that has all the same methods and properties, but is mockable; the default implementation simply delegates each call to the underlying Role.
Then your tests can use the wrapper Role to set up the desired behaviour.
This is often a way to get around concrete classes that really need mocking.
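For illustration, here is a minimal sketch of such a wrapper (the class name and the single IsUserA member are assumptions; you would mirror whichever Role members the factory actually uses):
public class RoleWrapper
{
    private readonly Role _role;

    // Parameterless constructor so Moq can create a proxy of this class.
    protected RoleWrapper() { }

    public RoleWrapper(Role role)
    {
        _role = role;
    }

    // Virtual so Moq can override it; by default it just delegates to the real Role.
    public virtual bool IsUserA(string roleName)
    {
        return _role.IsUserA(roleName);
    }
}
The factory could then take a RoleWrapper instead of a Role, and the test from the question can do new Mock<RoleWrapper>() and set up IsUserA without ever touching the database.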
What you want to do is mock the Role object and pass that mock into your CreatePresenter method. On the mock you would set whatever properties are required to determine what kind of user it is. If you still have dependencies on the database at this point, then you might look at refactoring your Role object.
Consider using a mocking framework that does not impose artificial constraints (such as requirements for methods to be virtual, for classes to not be sealed, etc) on how your code must be written to be mockable. The only example of such that I'm aware of in .NET context is TypeMock.
In Java, the EasyMock class extension would let you mock "real" (concrete) objects and methods; most likely there is an equivalent or alternative mocking framework in .NET that you can use for your purpose.
Related
I have been working with the .NET Core FirebaseAdminSdk. I want to write unit tests for my own services that use the FirebaseApp class.
FirebaseApp is a sealed class and there is no interface available to mock it with Moq.
Is there any way to mock FirebaseApp instance?
I need an interface so that I can write something like this:
private readonly Mock<IFirebaseApp> firebaseApp = new Mock<IFirebaseApp>();
It's generally not a good idea to try to mock sealed classes like FirebaseApp, because they are designed to be used in a specific way and mocking them can lead to unexpected behavior and make it difficult to test your code correctly.
Instead of trying to mock FirebaseApp, you can use a technique called "dependency injection" to make it easier to test your code. Here's how it works:
Create an interface that defines the methods and properties that you need from FirebaseApp. For example:
public interface IFirebaseApp
{
string Name { get; }
FirebaseAppOptions Options { get; }
Task<string> GetAccessTokenAsync(bool forceRefresh);
void Delete();
}
Modify your code to accept an instance of IFirebaseApp through its constructor or a property, rather than creating a new instance of FirebaseApp directly. This is called "dependency injection".
In your unit tests, create a mock implementation of IFirebaseApp using a mocking framework like Moq, then pass the mock to your code when you create an instance of your service.
This will allow you to easily control the behavior of FirebaseApp in your tests, and make it easier to test different scenarios.
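A rough sketch of how that can look (FirebaseTokenService is a made-up example service, and the test assumes Moq and NUnit; neither is required by the approach):
public class FirebaseTokenService
{
    private readonly IFirebaseApp _app;

    public FirebaseTokenService(IFirebaseApp app)
    {
        _app = app;
    }

    public Task<string> GetTokenAsync()
    {
        // Delegates to whatever IFirebaseApp implementation was injected.
        return _app.GetAccessTokenAsync(false);
    }
}

[Test]
public async Task GetTokenAsync_returns_token_from_injected_app()
{
    var firebaseApp = new Mock<IFirebaseApp>();
    firebaseApp.Setup(a => a.GetAccessTokenAsync(false))
               .ReturnsAsync("fake-token");

    var service = new FirebaseTokenService(firebaseApp.Object);

    Assert.AreEqual("fake-token", await service.GetTokenAsync());
}
In production code you would register a thin adapter that implements IFirebaseApp by delegating to the real FirebaseApp instance.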
I'm mocking my repository correctly, but in cases like show() it returns null, so the view ends up crashing the test because it calls a property on a null object.
I'm guessing I'm supposed to mock the eloquent model returned but I find 2 issues:
What's the point of implementing repository pattern if I'm gonna end up mocking eloquent model anyway
How do you mock them correctly? The code below gives me an error.
$this->mockRepository->shouldReceive('find')
    ->once()
    ->with(1)
    ->andReturn(Mockery::mock('MyNamespace\MyModel')
        // The view may call $book->title, so I'm guessing I have to mock
        // that call and its returned value, but this doesn't work as it says
        // 'Undefined property: Mockery\CompositeExpectation::$title'
        ->shouldReceive('getAttribute')
        ->andReturn('')
    );
Edit:
I'm trying to test the controller's actions as in:
$this->call('GET', 'books/1'); // will call Controller#show(1)
The thing is, at the end of the controller, it returns a view:
$book = Repo::find(1);
return view('books.show', compact('book'));
So the test case also runs the view method, and if no $book is mocked it is null, and the view crashes.
So you're trying to unit test your controller to make sure that the right methods are called with the expected arguments. The controller-method fetches a model from the repo and passes it to the view. So we have to make sure that
the find()-method is called on the repo
the repo returns a model
the returned model is passed to the view
But first things first:
What's the point of implementing repository pattern if I'm gonna end up mocking eloquent model anyway?
It has many purposes besides (testable) consistent data access rules across different sources, (testable) centralized cache strategies, etc. In this case, you're not testing the repository and you actually don't even care what's returned; you're just interested that certain methods are called. So in combination with the concept of dependency injection you now have a powerful tool: you can just swap the actual instance of the repo with the mock.
So let's say your controller looks like this:
class BookController extends Controller {
protected $repo;
public function __construct(MyNamespace\BookRepository $repo)
{
$this->repo = $repo;
}
public function show()
{
$book = $this->repo->find(1);
return View::make('books.show', compact('book'));
}
}
So now, within your test you just mock the repo and bind it to the container:
public function testShowBook()
{
// no need to mock this, just make sure you pass something
// to the view that is (or acts like) a book
$book = new MyNamespace\Book;
$bookRepoMock = Mockery::mock('MyNamespace\BookRepository');
// make sure the repo is queried with 1
// and you want it to return the book instantiated above
$bookRepoMock->shouldReceive('find')
->once()
->with(1)
->andReturn($book);
// bind your mock to the container, so whenever an instance of
// MyNamespace\BookRepository is needed (like in your controller),
// the mock will be loaded.
$this->app->instance('MyNamespace\BookRepository', $bookRepoMock);
// now trigger the controller method
$response = $this->call('GET', 'books/1');
$this->assertEquals(200, $response->getStatusCode());
// check if the controller passed what was returned from the repo
// to the view
$this->assertViewHas('book', $book);
}
EDIT in response to the comment:
Now, in the first line of your testShowBook() you instantiate a new Book, which I am assuming is a subclass of Eloquent\Model. Wouldn't that invalidate the whole deal of inversion of control[...]? since if you change ORM, you'd still have to change Book so that it wouldn't be class of Model
Well... yes and no. Yes, I've instantiated the model-class in the test directly, but model in this context doesn't necessarily mean instance of Eloquent\Model but more like the model in model-view-controller. Eloquent is only the ORM and has a class named Model that you inherit from, but the model-class as itself is just an entity of the business logic. It could extend Eloquent, it could extend Doctrine, or it could extend nothing at all.
In the end it's just a class that holds the data that you pull e.g. from a database; from an architecture point of view it is not aware of any ORM, it just contains data. A Book might have an author attribute, maybe even a getAuthor() method, but it doesn't really make sense for a book to have a save() or find() method. But it does if you're using Eloquent. And it's ok, because it's convenient, and in a small project there's nothing wrong with accessing it directly. But it's the repository's (or the controller's) job to deal with a specific ORM, not the model's. The actual model is sort of the outcome of an ORM interaction.
So yes, it might be a little confusing that the model seems so tightly bound to the ORM in Laravel, but, again, it's very convenient and perfectly fine for most projects. In fact, you won't even notice it unless you're using it directly in your application code (e.g. Book::where(...)->get();) and then decide to switch from Eloquent to something like Doctrine - this would obviously break your application. But if this is all encapsulated behind a repository, the rest of your application won't even notice when you switch between databases or even ORMs.
So, since you're working with repositories, only the Eloquent implementation of the repository should actually be aware that Book also extends Eloquent\Model and that it can call a save() method on it. The point is that it doesn't (= shouldn't) matter if Book extends Model or not; it should still be instantiable anywhere in your application, because within your business logic it's just a Book, i.e. a Plain Old PHP Object with some attributes and methods describing a book, not the strategies for how to find or persist the object. That's what repositories are for.
But yes, the absolute clean way is to have a BookInterface and then bind it to a specific implementation. So it could all look like this:
Interfaces:
interface BookInterface
{
    /**
     * Get the ISBN.
     *
     * @return string
     */
    public function getISBN();
}

interface BookRepositoryInterface
{
    /**
     * Find a book by the given Id.
     *
     * @return null|BookInterface
     */
    public function find($id);
}
Concrete implementations:
class Book extends Model implements BookInterface
{
public function getISBN()
{
return $this->isbn;
}
}
class EloquentBookRepository implements BookRepositoryInterface
{
protected $book;
public function __construct(Model $book)
{
$this->book = $book;
}
public function find($id)
{
return $this->book->find($id);
}
}
And then bind the interfaces to the desired implementations:
App::bind('BookInterface', function()
{
return new Book;
});
App::bind('BookRepositoryInterface', function()
{
return new EloquentBookRepository(new Book);
});
It doesn't matter if Book extends Model or anything else, as long as it implements the BookInterface, it is a Book. That's why I bravely instantiated a new Book in the test. Because it doesn't matter if you change the ORM, it only matters if you have several implementations of the BookInterface, but that's not very likely (sensible?), I guess. But just to play it safe, now that it's bound to the IoC-Container, you can instantiate it like this in the test:
$book = $this->app->make('BookInterface');
which will return an instance of whatever implementation of Book you're currently using.
So, for better testability
Code to interfaces rather than concrete classes
Use Laravel's IoC-Container to bind interfaces to concrete implementations (including mocks)
Use dependency injection
I hope that makes sense.
I have seen lots of posts (and debates!) about which way round UnitOfWork and Repository should sit. One of the repository patterns I favor is the typed generic repository pattern, but I fear this has led to some issues with clean code and testability. Take the following repository interface and generic class:
public interface IDataEntityRepository<T> : IDisposable where T : IDataEntity
{
// CRUD
int Create(T createObject);
// etc.
}
public class DataEntityRepository<T> : IDataEntityRepository<T> where T : class, IDataEntity
{
private IDbContext Context { get; set; }
public DataEntityRepository (IDbContext context)
{
Context = context;
}
private IDbSet<T> DbSet { get { return Context.Set<T>(); } }
public int Create(T createObject)
{
DbSet.Add(createObject);
}
// etc.
}
// where
public interface IDbContext
{
IDbSet<T> Set<T>() where T : class;
DbEntityEntry<T> Entry<T>(T readObject) where T : class;
int SaveChanges();
void Dispose();
}
So basically I am using the Context property in each pattern to gain access to the underlying context.
My problem is now this: when I create my unit of work, it will effectively be a wrapper of the context I need the repository to know about. So, if I have a Unit Of Work that declares the following:
public UserUnitOfWork(
IDataEntityRepository<User> userRepository,
IDataEntityRepository<Role> roleRepository)
{
_userRepository = userRepository;
_roleRepository = roleRepository;
}
private readonly IDataEntityRepository<User> _userRepository;
public IDataEntityRepository<User> UserRepository
{
get { return _userRepository; }
}
private readonly IDataEntityRepository<Role> _roleRepository;
public IDataEntityRepository<Role> RoleRepository
{
get { return _roleRepository; }
}
I have a problem with the fact that the two repositories I am passing in both need to be instantiated with the very Unit Of Work into which they are being passed. Obviously I could instantiate the repositories inside the constructor and pass in the "this" but that tightly couples my unit of work to a particular concrete instance of the repositories and makes unit testing that much harder.
I would be interested to know if anyone else has headed down this path and hit the same wall. Both these patterns are new to me so I could well be doing something fundamentally wrong. Any ideas would be much appreciated!
UPDATE (response to @MikeSW)
Hi Mike, many thanks for your input. I am working with EF Code First, but I wanted to abstract certain elements so I could switch to a different data source or ORM if required, and because I am (trying!) to push myself down a TDD route using mocking and IoC. I think I have realised the hard way that certain elements cannot be unit tested in a pure sense but can have integration tests!
I'd like to pick up your point about repositories working with business objects or viewmodels etc. Perhaps I have misunderstood, but if I have what I see as my core business objects (POCOs), and I then want to use an ORM such as EF Code First to wrap around those entities in order to create, and then interact with, the database (and it's possible I may re-use these entities within a ViewModel), I would expect a repository to handle these entities directly in the context of some set of CRUD operations. The entities know nothing about the persistence layer at all, and neither would any ViewModel. My unit of work simply instantiates and holds the required repositories, allowing a transaction commit to be performed across multiple repositories but the same context/session.
What I have done in my solution is to remove the injection of an IDataEntityRepository ... etc. from the UnitOfWork constructor, as this is a concrete class that must know about one and only one type of IDataEntityRepository it should be creating (in this case DataEntityRepository, which really should be better named as EFDataEntityRepository). I cannot unit test this per se, because the whole unit's logic is to establish the repositories with a context (itself) to some database; it simply needs an integration test. Hope that makes sense?!
To avoid dependency on each repository in your Unit of Work, you could use a provider based on this contract:
public interface IRepositoryProvider
{
DbContext DbContext { get; set; }
IDataEntityRepository<T> GetRepositoryForEntityType<T>() where T : class, IDataEntity;
T GetRepository<T>(Func<DbContext, object> factory = null) where T : class;
void SetRepository<T>(T repository);
}
then you could inject it into your UoW that would look like this:
public class UserUnitOfWork: IUserUnitOfWork
{
public UserUnitOfWork(IRepositoryProvider repositoryProvider)
{
RepositoryProvider = repositoryProvider;
}
protected IDataEntityRepository<T> GetRepo<T>() where T : class, IDataEntity
{
return RepositoryProvider.GetRepositoryForEntityType<T>();
}
public IDataEntityRepository<User> Users { get { return GetRepo<User>(); } }
public IDataEntityRepository<Role> Roles { get { return GetRepo<Role>(); } }
...
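With the provider in place, the unit of work itself becomes straightforward to unit test, because only IRepositoryProvider has to be mocked. A rough sketch (assuming Moq and NUnit):
[Test]
public void Users_repository_is_resolved_through_the_provider()
{
    var userRepo = new Mock<IDataEntityRepository<User>>();
    var provider = new Mock<IRepositoryProvider>();
    provider.Setup(p => p.GetRepositoryForEntityType<User>())
            .Returns(userRepo.Object);

    var uow = new UserUnitOfWork(provider.Object);

    Assert.AreSame(userRepo.Object, uow.Users);
    provider.Verify(p => p.GetRepositoryForEntityType<User>(), Times.Once());
}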
Apologies for the tardiness of my response - I have been trying out various approaches to this in the meantime. I have marked up the answers above because I agree with the comments made.
This is one of those questions where there is more than one answer and it's very much dependent upon the overall approach. Whilst I agree that EF effectively provides a ready-made unit of work pattern, my decision to create my own unit of work and repository layers was to be able to control access to the database entities.
Where I struggled was in the need to be able to inject a repository into a unit of work. What I realised though was that in the case of EF, my unit of work was effectively a thin wrapper around multiple repositories with a Commit (SaveChanges) method. It was not responsible for executing specific actions such as FindCustomer etc.
So I decided that a unit of work could be tightly coupled to its specific type of DataRepository pattern. To ensure I had a testable pattern, I introduced a service layer that provided the facade for executing particular actions such as CreateCustomer, FindCustomers etc. These services accepted an IUnitOfWork constructor parameter, which provided access to the repositories (as interfaces) as well as the Commit method.
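To give a flavour of the shape (the names here are illustrative, not the actual code), a service looks roughly like this, with the unit of work injected as an interface:
public class CustomerService
{
    private readonly IUnitOfWork _unitOfWork;

    public CustomerService(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }

    public void CreateCustomer(Customer customer)
    {
        // The repositories are exposed by the unit of work as interfaces,
        // so both the service and the unit of work can be faked in tests.
        _unitOfWork.Customers.Create(customer);
        _unitOfWork.Commit();
    }
}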
I was then able to create fakes of both unit of work and/ or repositories for testing purposes. This just left me with the decision of what could be unit tested with fakes and what needed to be integration tested with the concrete instances.
And this also gives me the opportunity to control what actions are performed on the database and how they are performed.
I'm sure there are many ways to skin this particular cat, but the goal of providing a clean interface that is testable has just about been met with this approach.
My thanks to g1ga and Mike for their input.
When using Entity Framework (EF) (which I assume you're using), you already have a generic repository: IDbSet. It's useless to add another layer on top just to call EF methods.
Also, a repository works with application objects (usually business objects, but they can be view models or object state). If you're just using db entities, you kinda defeat the purpose of the Repository pattern (to isolate the business objects from the database). The original pattern deals only with business objects, but it is a useful pattern outside the business layer too.
The point is that EF entities are persistence objects and have (or should have) no relation to your business objects. You want to use the repository pattern to 'translate' the business objects to persistence objects and vice versa.
Sometimes an application object (like a viewmodel) happens to be the same as a persistence entity (and in that case you can use the EF objects directly), but that's a coincidence.
About the Unit of Work (UoW), let's say that's tricky. Personally, I prefer the DDD (domain-driven design) approach and consider that any business object (BO) sent to the repository is a UoW, so it will be wrapped in a transaction.
If I need to update multiple BOs, I'll use a message-driven architecture to send commands to the relevant BOs. Of course, that's more complicated and requires being at ease with the concept of eventual consistency, but I'm not depending on a specific RDBMS.
If you know that you'll be using a specific RDBMS and that this will never change, you could start a transaction and pass the associated connection to each repository, with a commit at the end (that will be the UoW). If you're in a web setting, it's even easier: start the transaction when the request begins and commit when the request ends (you can use an ActionFilter for ASP.NET MVC).
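A very rough sketch of that per-request idea in ASP.NET MVC (the IUnitOfWork members and the use of DependencyResolver are assumptions; error handling is omitted):
public class UnitOfWorkAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        // Resolve the request-scoped unit of work and open the transaction.
        var uow = DependencyResolver.Current.GetService<IUnitOfWork>();
        uow.BeginTransaction();
    }

    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        var uow = DependencyResolver.Current.GetService<IUnitOfWork>();
        if (filterContext.Exception == null)
        {
            uow.Commit();
        }
        else
        {
            uow.Rollback();
        }
    }
}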
However, this solution is tied to one RDBMS, so it won't apply to NoSQL or any storage which doesn't support transactions. For those cases, the message-driven way is the best.
I've researched some information about techniques I could use to unit test a DbContext. I would like to add some in-memory data to the context so that my tests could run against it. I'm using Database-First approach.
The two articles I've found most useful were this and this.
That approach relies on creating an IContext interface that both MyContext and FakeContext will implement, allowing to Mock the context.
However, I'm trying to avoid using repositories to abstract EF, as pointed out by some people, since EF 4.1 already implements the repository and unit of work patterns through DbSet and DbContext, and I really would like to preserve all the features implemented by the EF team without having to maintain them myself with a generic repository, as I already did in another project (and it was kind of painful).
Working with an IContext would lead me down the same path (or wouldn't it?).
I thought about creating a FakeContext that inherits from my main MyContext and thus takes advantage of the DbContext underneath it to run my tests without hitting the database.
I couldn't find similar implementations, so I'm hoping someone can help me on this.
Am I doing something wrong, or could this lead me to some problems that I'm not anticipating?
Ask yourself a single question: What are you going to test?
You mentioned FakeContext and mocking the context - why use both? Those are just different ways to do the same thing: provide a test-only implementation of the context.
There is an even bigger problem - faking or mocking the context or set has only one result: you are not testing your real code any more.
Simple example:
public interface IContext : IDisposable
{
IDbSet<MyEntity> MyEntities { get; }
}
public class MyEntity
{
public int Id { get; set; }
public string Path { get; set; }
}
public class MyService
{
    private bool MyVerySpecialNetMethod(MyEntity e)
    {
        return File.Exists(e.Path);
    }

    public IEnumerable<MyEntity> GetMyEntities()
    {
        // CreateContext() stands in for however the context is obtained.
        using (IContext context = CreateContext())
        {
            return context.MyEntities
                          .Where(e => MyVerySpecialNetMethod(e))
                          .ToList();
        }
    }
}
Now imagine that you have this in your SUT (system under test - in the case of a unit test, the unit is usually a method). In the test code you provide a FakeContext and FakeSet and it will work - you will have a green test. Now in the production code you provide the real derived DbContext and DbSet and you will get an exception at runtime.
Why? Because by using FakeContext you have also changed the LINQ provider: instead of LINQ to Entities you are running LINQ to Objects, so calling local .NET methods which cannot be converted to SQL works, as do many other LINQ features which are not available in LINQ to Entities! There are other issues you can find with data modification as well - referential integrity, cascade deletes, etc. That is the reason why I believe that code dealing with the context / LINQ to Entities should be covered by integration tests and executed against the real database.
I am developing an open-source library to solve this problem.
http://effort.codeplex.com
A little teaser:
You don't have to add any boilerplate code, just simply call the appropriate API of the library, for example:
var context = Effort.ObjectContextFactory.CreateTransient<MyContext>();
At first this might seem to be magic, but the created ObjectContext object will communicate with an in-memory database and will not talk to the original real database at all. The term "transient" refers to the lifecycle of this database: it only lives while the created ObjectContext object exists. Concurrently created ObjectContext objects communicate with dedicated database instances; the data is not shared across them. This makes it easy to write automated tests.
The library provides various features to customize the creation: share data across instances, set the initial data of the database, create a fake database on different data layers... check out the project site for more info.
As of EF 4.3, you can unit test your code by injecting a fake DefaultConnectionFactory before creating the context.
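For example (a sketch assuming the SQL Server Compact provider is installed; any custom IDbConnectionFactory that hands out throwaway connections follows the same pattern):
[SetUp]
public void SetUp()
{
    // Every DbContext created after this line connects to a local
    // SQL CE database instead of the real SQL Server.
    Database.DefaultConnectionFactory =
        new SqlCeConnectionFactory("System.Data.SqlServerCe.4.0");
}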
Entity Framework 4.1 is close to being mockable in tests, but requires a little extra effort. The T4 template provides you with a DbContext-derived class that contains DbSet properties. The two things that I think you need to mock are the DbSet objects that these properties return, and the properties and methods you're using on the DbContext-derived class. Both can be achieved by modifying the T4 template.
Brent McKendrick has shown the types of modifications that need to be made in this post, but not the T4 template modifications that can achieve this. Roughly, these are:
Convert the DbSet properties on the DbContext derived class into IDbSet properties.
Add a section that generates an interface for the DbContext derived class containing the IDbSet properties and any other methods (such as SaveChanges) that you'll need to mock.
Implement the new interface in the DbContext derived class.
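After those changes, the generated code ends up looking roughly like this (hand-written here for illustration; Customer and Order are placeholder entity types):
public interface IMyContext : IDisposable
{
    IDbSet<Customer> Customers { get; }
    IDbSet<Order> Orders { get; }
    int SaveChanges();
}

public partial class MyContext : DbContext, IMyContext
{
    public IDbSet<Customer> Customers { get; set; }
    public IDbSet<Order> Orders { get; set; }
}
Your own code then depends on IMyContext, which a test can replace with a Moq mock whose IDbSet properties return in-memory fakes.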
I'm trying to create a unit test for a code similar to this:
foreach (string domainName in Directory.GetDirectories(server.Path))
{
HandleDomainDirectory(session, server, domainName);
}
The problem is that I'm using the System.IO.Directory class in my code.
How can I create a test method that won't be dependent on any folder I have on my hard disk?
In other words, How can I fake the response of "Directory.GetDirectories(server.Path)"?
(Please note, I do control the "server" object in my class, therefore I can give it any path I want.)
Thanks.
Rather than calling Directory.GetDirectories(server.Path) directly, you could create an interface like IDirectoryResolver with a single method that takes a path string and returns the list of directories. The class containing your code above would then need a property or field of type IDirectoryResolver, which can be injected through the constructor or a setter.
For your production code, you would then create a new class that implements the IDirectoryResolver interface. This class could use the Directory.GetDirectories method in its implementation of the interface method.
For unit testing, you could create a MockDirectoryResolver class which implements IDirectoryResolver (or use a mocking library to create a mock instance for the interface). The mock implementation can do whatever you need it to do.
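A minimal sketch of that arrangement (interface and class names follow the suggestion above; only GetDirectories is assumed to be needed):
public interface IDirectoryResolver
{
    string[] GetDirectories(string path);
}

// Production implementation: a thin pass-through to System.IO.
public class FileSystemDirectoryResolver : IDirectoryResolver
{
    public string[] GetDirectories(string path)
    {
        return Directory.GetDirectories(path);
    }
}
The loop from the question then becomes foreach (string domainName in _directoryResolver.GetDirectories(server.Path)), and a unit test can supply a mock IDirectoryResolver that returns whatever directory names it needs.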
You would inject a wrapper class.
public class DirectoryFetcher
{
public virtual List<string> GetDirectoriesIn(string directory)
{
return Directory.GetDirectories(directory);
}
}
And then inject that:
foreach(string directory in _directoryFetcher.GetDirectoriesIn(server.Path))
{
// Whatever
}
You can then Mock that guy at the injection point (this example uses Moq, and constructor injection):
Mock<DirectoryFetcher> mockFetcher = new Mock<DirectoryFetcher>();
mockFetcher.Setup(x => x.GetDirectoriesIn("SomeDirectory")).Returns(new List<string>
{
"SampleDirectory1",
"SampleDirectory2"
});
MyObjectToTest testObj = new MyObjectToTest(mockFetcher.Object);
// Do Test
When communicating with the outside world, such as the file system, databases, web services, etc., you should always consider using wrapper classes like the others before me suggested. Testability is one major argument, but an even bigger one is: the outside world changes, and you have no control over it. Folders move, user rights change, new disk drives appear and old ones are removed. You only want to care about stuff like that in one place. Hence the wrapper -- let's call it DirectoryResolver like Andy White suggested earlier.
So, wrap your file system calls, extract an interface, and inject that interface where you need to communicate with the file system.
The best solution I've found was to use Moles. The code is very specific and must do a very specific thing; wrapping it with a wrapper class would be redundant. The only reason I would need a wrapper class is to be able to write tests. Moles allows me to write the tests without any wrapper class :)