I've just recently started looking into unit testing with PHPUnit and was wondering if it is possible to mock my entire database for all my tests. My model classes (Table Row objects wrapped to provide an ActiveRecord implementation) are rooted to the database, and some models contain many levels of other model classes, so mocking all of this seems like it's going to be a pain in the rear end.
Is it possible to get PHPUnit to use data from a CSV file to act as my database and run my tests against that data without having to touch my DAOs? I've read the section on database testing in the PHPUnit documentation, but I'm not sure it's what I want: it's not so much that I want to test the database or the code interacting with it; it's more that my model classes are very tied to the database and it would be a pain having to mock things all the time. If I could give it a CSV file to act as my database, I could just put my data into the CSV file and carry on as normal.
Not sure if I'm making myself clear, so please ask for clarification. If it is possible to achieve this, that would be fantastic. Without it, unit testing this beast may not be practical, but I really want to bring unit testing into the project.
Thanks
Ziad
The way I see it, there are a few advantages to using DBUnit to test your DB-dependent code. These may or may not apply to everyone, I agree. I had not used DBUnit myself until very recently, as I found it was overkill for the small projects I am involved in.
Using DBUnit decouples your tests from the database. Your system under test should not depend on a DB server to run; you want as much isolation as possible. Admittedly, this is often not a big deal, as a lot of people use unit tests for integration testing, and for that you would prefer a database setup that matches your production environment as closely as possible.
Another reason DBUnit is favorable is the time it saves when writing tests. Writing tests is not the part of coding I enjoy most, and using mocks where possible saves a lot of time. Sure, creating a copy of your database is easy, but you still have to write scripts to load test data and to reset it between each test run. A lot of tests depend on specific datasets, and this gets harder to maintain as you add tests. Of course, all this setup needs to be kept in sync with your actual application as it evolves. Depending on the application, this can become very time-consuming. Some applications have thousands of tests, and it is heartbreaking when a simple change in the application causes 20 tests and their datasets to have to be rewritten.
All that being said, there is a learning curve to DBUnit, and the first time you play with it, it can be a bit time-consuming. Once it is in place for one test, it saves time for the subsequent ones. I don't use it everywhere, as I have a lot of tests that depend on an actual server, but for new code and tests I try to build on my initial DBUnit setup, and it clearly does save time in the long run.
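To make that setup concrete, here is roughly what a DBUnit test case looks like, using CSV files as the fixture, which is close to what the question asks for. The DSN, table name and file paths are placeholders for your own project, and the table itself has to exist already, since DBUnit only loads data:

```php
<?php
// Rough sketch only: the SQLite file, table name and CSV path are placeholders.
// The "guestbook" table must already exist; DBUnit inserts the fixture rows,
// it does not create the schema.
class GuestbookTest extends PHPUnit_Extensions_Database_TestCase
{
    protected function getConnection()
    {
        $pdo = new PDO('sqlite:' . __DIR__ . '/_files/test.db'); // any PDO driver works
        return $this->createDefaultDBConnection($pdo, 'test');
    }

    protected function getDataSet()
    {
        // One CSV file per table; the first line of the file names the columns.
        $dataSet = new PHPUnit_Extensions_Database_DataSet_CsvDataSet();
        $dataSet->addTable('guestbook', __DIR__ . '/_files/guestbook.csv');
        return $dataSet;
    }

    public function testFixtureRowsWereImported()
    {
        $this->assertGreaterThan(0, $this->getConnection()->getRowCount('guestbook'));
    }
}
```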
My 2 cents, good-day friend.
What about using your database but not committing changes? Or rolling back after the tests are run? We use a copy of our production site as a test environment on a separate server and do all of our testing there.
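For what it's worth, a rough sketch of the rollback variant with plain PDO (the DSN and credentials are placeholders); it only isolates changes made inside the transaction, so code that commits or runs DDL will still leak into the database:

```php
<?php
// Each test runs inside a transaction that is thrown away afterwards.
class TransactionalTestCase extends PHPUnit_Framework_TestCase
{
    protected $pdo;

    protected function setUp()
    {
        $this->pdo = new PDO('mysql:host=localhost;dbname=app_test', 'user', 'secret');
        $this->pdo->beginTransaction();
    }

    protected function tearDown()
    {
        // Undo whatever the test wrote so the next test sees the original data.
        $this->pdo->rollBack();
    }
}
```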
You could maybe create a copy of the DB, and simply connect to this copy in your test bootstraps?
I'm kind of in the same situation, and this is the option I'm currently working out... Not perfect, but at least it gets the first tests working.
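Something along these lines, assuming a bootstrap file wired up via the bootstrap attribute in phpunit.xml; the constant names are made up, so substitute whatever your application actually reads its connection settings from:

```php
<?php
// tests/bootstrap.php -- point the whole test suite at the copied database.
define('TEST_DB_DSN',  'mysql:host=localhost;dbname=myapp_test_copy');
define('TEST_DB_USER', 'test_user');
define('TEST_DB_PASS', 'secret');
```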
A technique that's worked for me before is to mock the database adapter directly and inject the mock into the dependency chain. This only works for very small units, however, as it's a big pain to make the mock adapter return the array structure you'd expect to get back from a specific query. If there are multiple queries involved in a single method call, this won't work at all.
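Roughly like this; the DbAdapter interface and UserFinder class are hypothetical stand-ins for whatever actually sits between your models and the database:

```php
<?php
interface DbAdapter
{
    public function fetchRow($sql, array $bind = array());
}

class UserFinder
{
    private $adapter;

    public function __construct(DbAdapter $adapter)
    {
        $this->adapter = $adapter; // injected, so a mock can take its place in tests
    }

    public function findName($id)
    {
        $row = $this->adapter->fetchRow('SELECT * FROM users WHERE id = ?', array($id));
        return $row['name'];
    }
}

class UserFinderTest extends PHPUnit_Framework_TestCase
{
    public function testFindNameReturnsNameColumn()
    {
        // The mock has to return exactly the array structure the real adapter
        // would produce for this one query -- the painful part mentioned above.
        $adapter = $this->getMock('DbAdapter');
        $adapter->expects($this->once())
                ->method('fetchRow')
                ->will($this->returnValue(array('id' => 1, 'name' => 'alice')));

        $finder = new UserFinder($adapter);
        $this->assertSame('alice', $finder->findName(1));
    }
}
```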
The other tactic I've had success with is using DBUnit just for setUp and tearDown of the database. Fixture data is stored in an XML or YAML file and imported into an empty database before every test, providing a known state. Given a schema and a minimal data set, an in-memory database will run the tests reasonably quickly by avoiding disk I/O.
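A sketch of that setup, assuming an in-memory SQLite database and a flat-XML fixture; the table, columns and fixture path are placeholders, and because DBUnit only loads data, the schema is created by hand first:

```php
<?php
class UsersFixtureTest extends PHPUnit_Extensions_Database_TestCase
{
    private $pdo;

    protected function getConnection()
    {
        if ($this->pdo === null) {
            // Fresh in-memory database per test; create the schema before
            // DBUnit imports the fixture rows.
            $this->pdo = new PDO('sqlite::memory:');
            $this->pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
        }
        return $this->createDefaultDBConnection($this->pdo, ':memory:');
    }

    protected function getDataSet()
    {
        // Re-imported before every test, e.g. <dataset><users id="1" name="alice"/></dataset>
        return $this->createFlatXMLDataSet(__DIR__ . '/fixtures/users.xml');
    }

    public function testSeededUserIsPresent()
    {
        $name = $this->pdo->query('SELECT name FROM users WHERE id = 1')->fetchColumn();
        $this->assertSame('alice', $name);
    }
}
```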
Finally, consider whether you even need to write tests for data persistence... If you're testing an ActiveRecord implementation, you probably already have tests that prove the basic CRUD operations at the library level. What you're really interested in testing isn't the database, but whether the in-memory objects behave as expected when given a known starting point. You can get that known starting point by creating the objects in memory, without loading them from or saving them to a database at all.
Just set up the expected state in memory by creating some objects, exercise the business logic, interrogate the final state, and never bother calling "save"... or whatever you call it. If all you need is an object that looks like a database connection to satisfy a dependency, you've got a mock for that...
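For instance, something like the following, where Order and its methods are hypothetical stand-ins for your own model classes; if a constructor insists on something connection-shaped, hand it a mock as in the adapter sketch above:

```php
<?php
class OrderBehaviourTest extends PHPUnit_Framework_TestCase
{
    public function testDiscountIsAppliedToTotal()
    {
        // Known starting point built by hand rather than loaded from a table.
        $order = new Order();
        $order->addLine('widget', 2, 1000);   // qty 2 at 1000 cents each
        $order->applyDiscountPercent(10);

        // Interrogate the final in-memory state; save() is never called.
        $this->assertSame(1800, $order->getTotalInCents());
    }
}
```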