November 17, 2004 - Live blog from the Developer Testing Forum.

I am here at the 350-seat George E. Pake Auditorium at PARC in Palo Alto, waiting for the Forum to begin at 9am. We're expecting a good-sized crowd. Traffic was terrible, but already a lot of people are in the lobby...I hope the food holds out. The combination of Kent Beck plus talent from local companies such as Google, Wells Fargo, and Oracle has resulted in a lot of interest.

Forum Agenda

8:39am Waiting for Forum to begin at 9:00am.

9:00am The auditorium is filling up very quickly. Either the food ran out, or someone in the Lobby announced that the event was about to begin. Alberto Savoia (Agitar CTO and co-founder) is the "M.C." for the forum, and I see the A/V technician hooking him up with a microphone, so we must be starting shortly.

9:03 Forum is beginning with the SD Forum Chairwoman introducing Alberto Savoia. The 350 seat auditorium is almost full, with about a dozen people standing at the back (even though a few seats in the front row are open).

Developer Testing at Google
9:06 Alberto introduces Sriram Sankar, who received his Ph.D. at Stanford. Sriram is now at Google in charge of Developer Testing.

9:10 Sriram: Google is building very sophisticated products with complex, cutting-edge technologies: never-before-tried algorithms, optimizations, and heuristics. Their applications have significant scalability needs and have to deal with difficult issues such as spam, bots, and attacks.

9:14 Testing is very important because of all of these challenges, plus the software needs to be durable--it can't crash.

9:17 Google believes that great code comes from happy engineers. Their engineering structure is very, very flat. Engineers are largely self-managing and take on a lot of responsibility. There is a very strong peer-review culture, and engineers are empowered to set their own goals. This structure creates an organization of very motivated and productive engineers who feel ownership of building quality software.

9:20 Google has gone through tremendous growth in code base, users, and engineers. More systematic processes for testing and analysis have been added.

9:22 Google has focused a lot on the early part of the development process: quality via design and review. Design documents are required for all non-trivial projects, and a formal peer review is done. All changes to the code base require peer review. There are strict programming style guidelines and a formal initiation to those guidelines for all new engineers. Great code comes from a good early design and review process! The process moves a bit slower because of this, but quality and end results are better.

9:25 Goals of testing and analysis: a smooth development process without build breakage (unit testing and XP have made a big impact here), functional correctness and backward compatibility, robustness, scalability, and performance, and understanding user needs/improving functionality.

9:28 Standard Google practices include unit tests and functional tests, continuous builds, a last-known-good build, release branches with authorized bug-fix check-ins, focused teams for release engineering, production engineering, and QA, bug tracking, and logging of production runs.
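
The "last known good build" practice mentioned above can be sketched in a few lines. This is a minimal illustration with invented data, not Google's build system: the newest build whose test run passed is the one the rest of the team syncs to, so a broken check-in never blocks everyone else.

```python
# Hypothetical build history: (build id, did all tests pass?)
builds = [
    (101, True),
    (102, True),
    (103, False),  # this check-in broke the build
    (104, False),
]

def last_known_good(history):
    """Return the id of the newest build that passed all tests."""
    good = [build_id for build_id, passed in history if passed]
    return max(good) if good else None

print(last_known_good(builds))  # -> 102
```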

9:30 Google brings in XP consultants to educate engineers, employs extreme feedback mechanisms like monitors and ambient orbs for visual feedback. They have specialized test and analysis tools during production and prior to production. Sometimes, they have fix-it weeks for fixing bugs, writing tests, improving documentation, etc.

9:32 When Google introduced XP to improve quality and other metrics, they hired a team of XP consultants and paired consultants with engineers. They created short projects for the employees with testing/XP as a theme. The teams focused on understanding the code base, building good unit tests, other TDD aspects, and learning how to use infrastructure such as JUnit. XP was introduced at Google about 8 months ago. Engineers are not forced to use XP, but XP adoption is going well. Already, they have seen improvements in key metrics and in stability.

9:37 First steps were to build functional tests for existing code, develop tests that fail for bugs in the bug database, write unit tests for existing servlet handlers, use unit tests and TDD for new code, and hold fix-it weeks devoted to developing unit tests and functional tests.
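
The "develop tests that fail for bugs in the bug database" step works like this: write a test that reproduces the reported bug, watch it fail, fix the code, and keep the test so the bug can never silently return. A minimal sketch (the bug, the function, and the bug number are all invented for illustration):

```python
def parse_query(q):
    """Split a search query into terms.
    Bug report (hypothetical #1234): empty input used to crash."""
    if not q.strip():   # the fix: guard the empty-input case
        return []
    return q.split()

def test_bug_1234_empty_query():
    # Written first, against the buggy code, where it failed;
    # now it pins the fix in place.
    assert parse_query("") == []
    assert parse_query("developer testing") == ["developer", "testing"]

test_bug_1234_empty_query()
```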

9:40 Current status: very stable builds due to unit tests, and better backwards compatibility due to unit and functional tests. More TDD in the future, with many more tests. The goal is to get to a stage where XP and testing offer benefits beyond build stability and backwards compatibility: much more quality in production software.

9:45 Google logs production runs extensively and has tools and APIs to process the logs. Rule-based and stream-based tools feed off the logs and produce graphs, call pages, etc. Exception traces during production are extracted from the logs, and each stack trace is assigned to a particular engineer for further analysis (done with clever correlation with the SCM system to find the right engineer).
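
The trace-to-engineer assignment described above can be sketched roughly as follows. This is an illustration only (the trace format, blame map, and names are invented; Google's production code would not have been Python): pull the first recognizable in-house frame out of an exception trace, then look up that file's last committer in a map derived from the SCM system.

```python
import re

# Hypothetical "file -> last committer" map extracted from the SCM system.
BLAME = {
    "search/handler.py": "alice",
    "index/shard.py": "bob",
}

def assign_trace(trace):
    """Return the engineer responsible for the first recognized frame."""
    for line in trace.splitlines():
        m = re.search(r'File "([^"]+)", line \d+', line)
        if m and m.group(1) in BLAME:
            return BLAME[m.group(1)]
    return "unassigned"

trace = '''Traceback (most recent call last):
  File "search/handler.py", line 42, in handle
    shard.lookup(term)
IndexError: list index out of range'''
print(assign_trace(trace))  # -> alice
```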

9:47 Servers are packaged together with heap-inspection methods, which can be invoked through special commands to the server. This produces a full memory dump, which is then analyzed off-line by tools to locate problems. This is necessary because it is impossible to replicate the production system in its full complexity.
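
A common first pass in that kind of off-line dump analysis is counting live instances per class to spot suspicious growth. A toy sketch, assuming a dump format of one object per line as "ClassName@address" (the format and data are invented, not Google's):

```python
from collections import Counter

def top_classes(dump_lines, n=3):
    """Count live objects per class and return the n most numerous."""
    counts = Counter(line.split("@")[0] for line in dump_lines if line)
    return counts.most_common(n)

dump = ["Query@0x1", "Query@0x2", "ResultPage@0x3", "Query@0x4"]
print(top_classes(dump))  # -> [('Query', 3), ('ResultPage', 1)]
```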

9:49 Google has many databases and uses an O-R mapping layer to hide the details of access. Performance issues are often related to database access. Tools are used to identify the database queries resulting from different servlet handlers, and ratchet tests fail if database activity exceeds thresholds.
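
A ratchet test of the kind described can be sketched like this (an invented example, not Google's code): instrument the data-access layer to count queries issued by a handler, and fail if the count creeps above a recorded ceiling. When a change lowers the count, you lower the ceiling too, "ratcheting" it down.

```python
# Hypothetical recorded ceiling for each handler's query count.
QUERY_CEILING = {"search_handler": 5}

def count_queries(handler):
    """Stand-in for instrumentation of the O-R mapping layer:
    run the handler with a query function that records each call."""
    issued = []
    handler(lambda sql: issued.append(sql))
    return len(issued)

def search_handler(run_query):
    # In real code these queries would come from the O-R layer.
    run_query("SELECT ... FROM docs ...")
    run_query("SELECT ... FROM users ...")

def test_query_ratchet():
    n = count_queries(search_handler)
    assert n <= QUERY_CEILING["search_handler"], (
        f"search_handler now issues {n} queries, over the ceiling")

test_query_ratchet()
```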

9:51 Multi-threaded program behavior is a difficult area to test for, as these programs may demonstrate bad behavior (e.g. race conditions) only under certain circumstances that may be impossible to reproduce. Static analysis can't be used to check for this behavior. Google has a hybrid system that runs on the system being tested for a short time and then performs static analysis on the resulting execution trace.
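
One well-known way such a trace-then-analyze tool can work is a lockset check in the spirit of the Eraser algorithm: record which locks each thread holds at every shared-variable access, then flag variables whose accesses share no common lock. A toy sketch (the event format is invented; the talk did not describe Google's actual analysis):

```python
def find_races(events):
    """events: (thread, locks_held, variable) tuples from a recorded run.
    A variable whose accesses share no common lock is a potential race."""
    candidates = {}
    for thread, locks, var in events:
        held = set(locks)
        if var not in candidates:
            candidates[var] = held
        else:
            candidates[var] &= held  # intersect locksets across accesses
    return [v for v, locks in candidates.items() if not locks]

trace = [
    ("T1", ["mu"], "hits"),
    ("T2", ["mu"], "hits"),    # 'hits' always guarded by mu: fine
    ("T1", ["mu"], "errors"),
    ("T2", [],     "errors"),  # 'errors' accessed with no lock held
]
print(find_races(trace))  # -> ['errors']
```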

9:54 Key messages: self-motivated engineers, grass-roots adoption of XP and other new techniques, and productivity and quality that continue to improve, which is what helps Google keep up with their tremendous growth. Unit testing has helped them improve their infrastructure and work better together.

END OF GOOGLE PRESENTATION, although Sriram is being mobbed at the speakers podium by people with questions and that will delay the start of the next session.

Making Developer Testing Work

10:00am - 10:50am Panel discussion, Making Developer Testing Work: Myths and Realities. Panelists include Rob Mee (Pivotal), Russell Gold (Oracle and TDD author), Sri Muthu (Wells Fargo), and David Vydra (co-founder). Moderator is Alberto Savoia (Agitar).

10:08 Each panel speaker will have a few minutes to talk about Developer Testing and then Alberto will ask questions.

10:11 Rob Mee, president of Pivotal; XP consultant, programmer, and coach. Started with agile methodologies and Smalltalk in the '90s with Kent Beck.

10:13 Russell Gold. Started with the realization that he had no way of testing his own code and would have to redo it to make it testable. About 7 years ago, at a Martin Fowler lecture on refactoring, he had a revelation: unit testing was required. By using that method, he was pulled into the TDD world. At Oracle, he developed the web-testing tool HttpUnit.

10:15 Sri Muthu. Big managers THINK that testing is going on, when at most companies it really isn't. Now, the visibility is there. Business partners are asking for proof of quality. They are now taking previously outsourced code and are now having to test it. Wells Fargo is making Testing part of the Wells Fargo culture.

10:17 David Vydra. For his first 17 years, he didn't write any unit tests. His first project with unit tests had a great outcome, made him a tremendous fan of Developer Testing, and led him to start the test-driven online community.

Alberto: What do you think is the biggest myth (or reality) of Developer Testing?

10:21 Rob: One of the companies he works with uses Developer Testing to a great degree, and the VP there believes that people who don't are committing professional negligence. Rob thinks he has a point, in that you need to validate and verify the code. Conversely, the vast majority of developers don't do testing because they don't think that they are the right people to do it. That is a huge myth: a gap in perception of what is appropriate for a developer to do.

10:23 Russell: Myth is that developers think that Developer Testing is QA work, and that it is just something for them to do in addition to their real work. And this is totally not true. Another myth is that all this takes more time. Developer testing is unique in that you get a payback almost immediately from doing it. Because you don't have to do many things that you'd have to do otherwise--e.g. you don't spend nearly as much time in a debugger, don't have to worry about other people breaking your code.

10:25 Sri: The myth is that business people think it's happening. Developers don't know why they should be doing it; they think their code doesn't need it (the reality is that sometimes it does, and even if it doesn't, OTHER developers who use or inherit the code will need these tests). Developers don't know where they will find the time to do this. The reality is that it's easier to fix it now than to fix it later. Find the tools to make testing a reality, and make the business case for the tools to management.

10:27 David: Continuing the business theme. He has seen businesses spend a lot of money on many things, but not on developer testing. Unit tests are often worth more than the code itself, especially for 1.0 versions. Unit tests allow you to be agile and evolve your systems.

Alberto: next question--about 1/3 of people exposed to TDD become devotees. How do you get the other two-thirds hooked?

10:33 Rob: Sometimes the percentage is even lower. Some developers are just afraid to accept it. Someone with vision needs to lead the team. It's sort of like going to the gym: once you've started going, you're happy you've done it. And it helps to have a coach/trainer.

10:35 Russell: any time you ask someone to change the way they're working you send an implicit message that you don't like what they're doing now. Some people are just opposed to change. There is also a lack of knowledge. It's not always easy to make this change. It is the visionaries and the people who demonstrate success who will convert the people to testing.

10:37 Sri: A management mandate can only go so far. You need to reward folks who do testing; these people need to be given the best projects. So management support is critical. Sri has seen higher adoption of TDD among younger developers; they often get it instantly, and this influences the rest. Peer pressure plus management behavior are both required.

10:39 David: Added the observation that when surgeons started washing their hands before operations, there was resistance. Now everyone does it.


10:42 Audience: How do you make sure that requirements are testable? Rob: Start with the tests. Rob uses tests as executable requirements. As a result, the code is testable. Once the value is proven, you can go back to legacy code. Russell: Often, people think of acceptance testing in this fashion, rather than developer testing/unit testing. Rob: if you don't know how to write the test, how would you know how to write the code? Sri: Developer Testing results in a better product, earlier. They have started to put the test requirements in the design documents (e.g. generate test cases/assertions) while in the design phase. This requires updates and maintenance to requirements, which can be a cultural challenge. David: tests against existing code can be painful, but it should be done and usually can be done. Then move to a test-first system.

10:48 Audience: How do you know if you're doing testing right? Rob: very few are, but question is how to improve it. Many factors come into play. Russell: bottom line is productivity and quality. It's hard to look at a company and attribute it to a single technique, it's often a range of techniques. Sri: How many projects were completed, and how many people did it take? How happy are developers and customers? With TDD, can get work done with a lot less stress. And with more efficiency. Rob: you can really see the success of testing when companies have to redevelop a product because they hit a wall with a previous architecture. Alberto: Developer Testing is like flossing teeth. You won't lose all your teeth immediately if you stop, but unless you can come up with a really good reason NOT to floss, there's no excuse for not doing it.

10:55 Audience: a lot of concern that more testing = more time. What's the real story? Rob: The fastest developers he knows all do Developer Testing. Starting with people who know how to do testing, you win. There is a cost associated with training new people and retrofitting an older code base. Russell: He doesn't know anyone he's taught the technique to who has wanted to go back. The biggest productivity boost is getting rid of rework. Debugging means that you've done something wrong; Developer Testing gets rid of that, and more than pays for the extra work. Sri: It is not cheap to start the process. It requires time, money, training, and management commitment. Once you get people up the curve, you save. His products are not a one-time deal; they must be sustained and maintained. You need to think about the lifecycle of the product. After the first release, development is significantly faster: the first release of a new product was completed in 5 months, the next release in 1 month. New release cycles are now 4-6 weeks because of testing. David: He has seen some JUnit environments where tests become a maintenance burden; you need to be careful there.

END OF PANEL DISCUSSION. Short break. Kent Beck presentation to follow

The Future of Developer Testing

11:23 Kent Beck: The Future of Developer Testing

Accountability is different from blame. What is accountability? One of the words Kent has recently learned is health, which is different from quality. Quality is about bugs; health is the state of software over time. Quality is like the state of your body's vital signs, but health is more than that: how do you respond to stress or exertion? His goal is not quality software; it's healthy software, healthy software over time. Healthy software generates a kind of value that is more than just software with few bugs at release time.

11:27 Many aspects of software health. Individual health. Health of relationships (team, customers, managers, partners). Software health is everyone's responsibility.

11:31 But there are a lot of barriers to this. Cost (but compared to what? Throwing software over the wall to QA and not counting the number of times it comes back?). Better tools = better results.

11:34 People don't have a good sense of the benefits, of how good software can be. What would it be like to get one defect report a month, or a year? Some Developer Testing environments get that. You can trust that the software really works the way it's supposed to. But even if you brought out a spreadsheet and could prove without a doubt that the benefits were there, some people wouldn't adopt it. Cultural barriers exist. People often want plausible deniability. Others don't want to be told what to do.

11:38 Advantages of Developer Testing: if you want health, why is it a good place to start? It's not the only place to start, but there are some very specific advantages of starting here. You get immediate feedback. In the right circumstances, you can get an improvement from Developer Testing in an hour. Another big advantage is that it doesn't rely on anyone else to change their behavior.

11:42 Similarly, a team can adopt Developer Testing without changing other teams. And the improvement is substantial. In some extreme circumstances, defect rates can be driven so low as to eliminate the need for QA. Finally, Developer Testing enables many of the follow-on improvements for software health, like continuous integration.

11:45 Most of the responsibility for Developer Testing is on you. The tools themselves take you a certain part of the way: they can automate the tedium out of testing, but they can't test for you. You are responsible for the decision to do Developer Testing, for staying with it, and for paying attention to the results.

11:50 Where you focus your attention, and maintaining that balance, is a key to success. Part of developer testing is figuring out where to focus.

11:52 Two keys to making changes and making them stick: finding a community for support (inside or outside the company), and accountability. Accountability creates trust. This kind of visibility creates trust inside the team and between teams and especially with customers.

12:04 Audience question: Developer Testing vs. Unit Testing and Functional Testing--Kent thinks that Developer Testing can be either. Developer Testing involves writing whatever kinds of tests are required to make the code better. The distinction doesn't enter his mind.

12:07 Audience question: do developers have the right tools for legacy code? Look at things as opportunities. The barriers are design issues, not test issues. It can be a chicken-and-egg problem. Sometimes you need to do something courageous, like refactoring or even just changing the design a little bit. Once you're over the hump, things get better.

12:10 Audience question: how do I allocate budget for a project? Initially, a Developer Testing project would take just as long, as a first order approximation. Over time it improves.

12:14 Audience question: what about bugs introduced in the tests? Normally that really doesn't happen, as the code and the test tend to be checks for each other.

12:23 Kent Beck is now giving away signed copies of his latest book to some Forum attendees, and Agitar is serving lunch (but you need to be here!)

Posted by Mark DeVisser at November 17, 2004 08:37 AM
