So here is the situation, a quote from my boss: "[...] we need to focus on programming. [...] At the end of the day I want to write good software and not get bogged down with testing." This was said after three months of a daunting bug list, and after we recently designated a non-programmer to write web tests with the Selenium framework.
My boss is very much unit-test shy (he can't see the cost benefit when it slows down developers). What are your opinions out there on web tests and programmatic tests in general? Should they be written by the (or a) programmer, or does it matter? My thought is that writing tests is part of writing good software. He's a Microsoft ivory-tower kind of guy, so any resources put out by Microsoft (or good articles in general) in favor of testing by design would be helpful.
Here's what I did.
I wrote the tests anyway.
I wrote the code after writing the tests.
The code was rock-solid and (mostly) bug-free, to the limits of my abilities.
I never told anyone I was doing TDD, unless they asked.
It turns out that TDD is actually faster than messing around trying to design something, code it and hope it works.
A few projects include an extra step 0: a "technology spike" to see how things work. This is followed by some test development to exercise the as-yet-unwritten real software.
I'm a little behind schedule when it comes to starting design, since my design step is "design and write tests for that design" while some other folks' design step is "scratch around with some clever ideas but no real proof." Some folks can design well on paper. I can't. But I can design tests.
I'm generally pretty far ahead when it comes to finishing code, since, when I'm done coding, all the tests pass.
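The test-first loop described above can be sketched with Python's `unittest`. Everything here is illustrative; `slugify` is a hypothetical function invented for the example, not something from the question:

```python
import unittest

# Step 1 (test first): describe the behavior we want before writing it.
# slugify is a hypothetical example function, not from the question.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_extra_whitespace(self):
        self.assertEqual(slugify("  A   B  "), "a-b")

# Step 2 (code second): write just enough to make the tests pass.
def slugify(title):
    return "-".join(title.lower().split())

if __name__ == "__main__":
    unittest.main(exit=False)
```

The point is the ordering: the tests state the design, and "done" means they all pass.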
Code Complete is a book in the Microsoft Press collection. It advocates peer review and touches on unit testing as a concept. It doesn't go into much detail on unit tests, but it may warm him up to the idea, and you can explore the topic further from there.
Ultimately you need somebody who is a programmer directly involved in automating testing; that's true almost by definition.
Unit tests are most effectively written by the people most familiar with the subsystems they cover. When someone else is chosen to write them, it takes that person time to ramp up, and they may miss intent that isn't documented or clear in the code, which can result in worse coverage. On the flip side, the owner of a subsystem can be blind to certain deficiencies as well (but that is what peer code reviews are for!).
The rest of this is really just idle discussion about ethics, but it's important to consider.
Some people like to try and "sneak shit in" to the build when management makes silly decisions. This makes me not only uneasy, but also kind of wary about those programmers. I understand the motivation, I think we've all been there, but ultimately you should educate rather than participate in subterfuge.
Management plays an important role in scheduling, and they rely on you for both accurate estimates and a general understanding of the work being done. If you pad your estimates to sweep extra work under the rug, is that really a good thing? What was a simple lie becomes an elaborate hoax you're playing on the people directly involved in helping your career progress.
What was a problem with process and estimation for legitimate work has now become a sticky ethics issue.
I strongly recommend going about your planned approach of convincing your manager to see your point of view through reason, logic, and appealing to his love of Microsoft. ;)
Over the long term if you find yourself constantly fighting management on decisions about programming process (which really isn't their job to make decisions on) it would probably be best to polish up that resume and find a better job.
Part of a programmer's job is to educate the people involved who have less expertise. Explaining that to your manager may help break down some of the intellectual barriers he has on the subject and soften him up to accepting your advice on the matter.
I'm of the view that for something to be "done done" it needs to have been verified by at least two people. You don't always need a dedicated software tester on the team if everyone on the team believes that software quality is everyone's job.
If you want to be highly efficient, the developer writing the code should write the tests, and someone else reviews them along with the production code. If you want to be highly effective, pair with someone: they write the tests while you write the code, in a "paired TDD".
Remind the manager that the cost of a bug grows exponentially the later it is found.
I understand where your boss is coming from. After all, programmers should be able to churn out code that just "works". However, testing will happen no matter what, be it unit testing, integration testing at your company, or testing after you install at the customer's site. Errors at each of these stages have different costs and consequences, though...
You will often hear it's a good idea to have other people test your code, because they might interpret the spec differently. However, the benefit of writing tests for your own code is that you know where it is likely to be defective.
An efficient approach could be a mixture of both: the programmer writes his own unit tests, focusing on corner cases, while someone else does functional testing, focusing on the higher-level specification.
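As a concrete sketch of what "the author tests the corners they know are fragile" can look like, here is a hypothetical function with its author-written corner-case tests (all names invented for the example):

```python
import unittest

# Hypothetical function; its author knows the corners: values exactly
# on the boundaries, and an inverted range, are where bugs would hide.
def clamp(value, low, high):
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

class TestClampCorners(unittest.TestCase):
    def test_values_exactly_on_the_boundaries(self):
        self.assertEqual(clamp(5, 5, 10), 5)
        self.assertEqual(clamp(10, 5, 10), 10)

    def test_inverted_range_is_rejected(self):
        with self.assertRaises(ValueError):
            clamp(3, 10, 5)

if __name__ == "__main__":
    unittest.main(exit=False)
```

A black-box functional tester would likely check typical inputs against the spec; the author goes straight for the boundary conditions.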
Not all tests are created equal, so let's go over some:
- Unit tests / TDD. It's hard to argue against it, especially since it's the opposite of "he can't see the cost benefit when it slows down developers": it's faster. S. Lott goes over the details.
- Focused integration tests. Same as above: it's faster. There's no running through a series of steps in your fully integrated app just to find out whether the very thin layer of integration code you just wrote is working. That's worse when you consider the run is repeated as many times as you have to go back in there, and worse still when that code wrongly becomes tightly coupled to different processes.
- Full system tests. With the above in place, you just want to know whether all the pieces are hooked up correctly and whether the UI works as expected. For the former you need a tiny number of tests that are very quickly written; compare that to how many times people walk through the app manually (including yourself) and you have a winner :). For the latter, some things are hard to catch with automated tests, so you don't focus automation on those.
- Exploratory tests. These should still be done; hopefully enough thought is put in before the feature is built to avoid extra changes at this point.
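The focused integration tests above can be sketched with `unittest.mock`: stub out the external service so the thin glue layer is exercised alone. Every name here (`notify_owner`, the bug fields, the mailer) is hypothetical:

```python
from unittest.mock import Mock

# Hypothetical thin integration layer: format a message and hand it
# to a mail service. This is the glue we want to verify in isolation.
def notify_owner(bug, mailer):
    subject = "Bug #{}: {}".format(bug["id"], bug["title"])
    mailer.send(bug["owner"], subject)
    return subject

# The mail service is stubbed out, so this layer is exercised alone,
# with no clicking through the fully integrated app.
mailer = Mock()
subject = notify_owner(
    {"id": 7, "title": "login fails", "owner": "dev@example.com"}, mailer
)
mailer.send.assert_called_once_with("dev@example.com", "Bug #7: login fails")
```

Running this takes milliseconds, versus manually walking the integrated app every time the glue changes.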
Now, QA usually comes up with additional scenarios. This is good, but it's best when those scenarios are folded collaboratively into the automated tests.
One real issue can be the current team's skills. Unit testing is harder the more coupled the code is; this is usually worse with existing code that has no unit tests, but it also makes it harder initially for a developer who doesn't yet know how to design loosely coupled, highly cohesive code. It needs to be done, but be aware that the cost will go somewhere (to the team members or to the company).
Creating a good test suite takes quite a bit of effort, and it will take longer than expected.
In this respect, your boss is right.
However, if you plan to extend your product (add features) or your web site (add pages and JavaScript code), then you will need to create the suite anyway, just later.
You could point out that unit testing has been built into Visual Studio since VS 2005.