Extreme Programming
Writing software is a big, fat mess, usually tangled up with
bureaucracy and human-collaboration issues. As development
progresses, software typically becomes more and more fragile and
buggy. As we've discussed, it's rare that a delivered system
actually does what the customer wants (let alone on time).
To fix this, people tried applying engineering principles to the
problem. That "analyze, design, implement, test" sequence is what
I was taught as an undergraduate in the early 1980s. That waterfall
method produces software that is inflexible, late, and, most
importantly, doesn't do what the user wants. Apparently it's not the
solution.
The problem is that users don't know what they want until you start
showing them something. Quoting Tom Burns:
"You know the least about a problem at the start of a project--that
is why it doesn't make any sense to do your 'final' design at the
beginning. You need to familiarize yourself with the problem space
by making something useful--by writing code. Letting people use it
and give feedback. Then you should upgrade your design (refactor)
as you discover what is truly important."
The single biggest lesson I learned, and a point emphasized by
Extreme Programming (XP), the subject of this lecture, was that
requirements never stay the same for even a few days. You had better
get used to it, or better yet, take advantage of it lest you drown.
Interestingly, there is a close relationship between what XP espouses
and what I ended up following at jGuru, most of which I
learned on the job or by listening to Tom Burns, our CEO.
This lecture summarizes the principles of XP, but does not delve into
the details of the solutions suggested by Kent Beck and crew. I
summarize "Extreme Programming Explained" here and pepper it with
experience I gained from building jGuru.com.
Agile Development
XP is a form of Agile development that
focuses on short development cycles and close interaction with
customers. Refactoring and incremental feature alteration/addition
are the name of the game. From the Agile manifesto, Agile development
values...
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
The Agile principles are
also useful.
Martin Fowler (of refactoring fame)
says:
- Agile methods are adaptive rather than predictive. Engineering methods
tend to try to plan out a large part of the software process in great
detail for a long span of time; this works well until things
change. So their nature is to resist change. The agile methods,
however, welcome change. They try to be processes that adapt and
thrive on change, even to the point of changing themselves.
- Agile methods are people-oriented rather than process-oriented. The
goal of engineering methods is to define a process that will work well
whoever happens to be using it. Agile methods assert that no process
will ever make up the skill of the development team, so the role of a
process is to support the development team in their work.
Agile programming has fallen short, conference told:
"McConnell noted what appears to be a contradiction in agile programming thus far. While intended to focus on individuals and interactions, agile seems to be mostly about processes and tools now, he said."
Ian MacFarland's XP slides. Ian gave a
great talk to a previous CS601 class and had a great idea about
determining cost and schedule for clients.
XP Principles
Erich Gamma's summary:
- Code is the key activity
- Just-in-time software culture
- Frequent release cycles
- Make change your friend
- Communication is done with code
- Life cycle and behavior is defined in test cases (i.e., code)
- Problem reports are accompanied by failed test cases
- Improve code with refactoring
This does not imply that you should just start "daredevil" hacking. You must be disciplined.
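Gamma's point that problem reports should arrive as failed test cases can be sketched. Everything below is hypothetical (a made-up capitalize() utility and a made-up bug), just a minimal illustration of the pattern: the bug report is a test that fails against the broken code, and the fix is whatever makes it pass.

```java
// Hypothetical example of a "bug report as failing test case."
// Bug report: capitalize("") blows up with StringIndexOutOfBoundsException.
// The reporter writes the failing test first; the fix makes it pass.
public class CapitalizeBugReport {
    // The utility under test, after the fix. The original version lacked
    // the empty-input guard, so the test below failed.
    static String capitalize(String s) {
        if (s == null || s.isEmpty()) return s;   // the fix: guard empty input
        return Character.toUpperCase(s.charAt(0)) + s.substring(1);
    }

    public static void main(String[] args) {
        // The test that accompanied the bug report:
        check(capitalize("").equals(""));
        // A regression check that the normal case still works:
        check(capitalize("hello").equals("Hello"));
        System.out.println("all checks passed");
    }

    static void check(boolean ok) {
        if (!ok) throw new AssertionError("test failed");
    }
}
```

The failing test stays in the suite forever, so the bug cannot silently reappear.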
Kent Beck's Summary:
- Review code all the time (pair programming)
- Everybody tests all the time (coders and users)
- Use simplest possible design that works
- Work to define and refine the architecture all the time
- Integrate and test several times a day
- Really fast iterations (releases): seconds, minutes, hours not weeks or months
"Extreme" implies "what is good, do to the extreme."
Designed for 2-10 programmers
What would you do if you had lots of time to build a project? You'd build lots of tests, you'd restructure a lot, and you'd talk with the customer and other programmers a lot. The normal mentality is that you never have enough time. XP says that if you operate in this counterintuitive manner, you'll get done faster and with better software.
What's the difference from other methodologies?
- Early and continuing feedback from short cycles
- Incremental planning; plan expected to evolve
- Flexibly schedules implementation to respond to business needs
- Relies on oral communication, tests, and code to describe system structure and intent
- Evolutionary design process that lasts as long as system lasts
- Close collaboration of programmers with ordinary skills
- Reliance on practices that work with both short-term programmer instincts and long-term project interests
Overcoming Software Problems
- Schedule slips. XP has short release cycles, so slip is limited. Lots of feedback during development with interim releases. Do the highest-priority item first, so that if you slip, you are missing just the low-priority stuff. TJP: project examples: some didn't do the required stuff first and some tried to build everything at the same time.
- Project cancellation. XP asks the customer to choose the smallest release that makes the most business sense--less to go wrong before going into production.
- System goes sour. XP has lots of tests that are run and re-run when you change things or add features. You program from confidence. TJP: ANTLR project example: no tests, and I am always worried I'll break something!
- High defect rate. XP again does unit tests and functional tests by users or from their perspective.
- Business misunderstood. XP says customer is integral part of team. Spec continuously refined as features appear. TJP: As you get a chance to actually toy with something you see its problems and limitations. Overdesign at first is a waste of brainpower.
- Business changes. With short release cycle, XP will be less likely to get caught mid release. Customer can change functionality for anything not yet done because programmers won't notice. They are just walking down the list.
- False feature rich. XP does only high priority stuff first. TJP: my first large project in high school: bookstore. I spent so much time doing cool features that didn't help functionality that I ran out of time and software never deployed.
- Staff turnover. XP gives programmers lots of responsibility for their own work and gives them lots of feedback. Less chance a programmer will become frustrated by being asked to do the impossible.
Important XP Philosophy
- Focus on quality. You can actually go faster in the long run if you build more reliable software. Making boatloads of tests gives you the confidence to write faster and with less stress, knowing that you are not going to break something. Everybody wants to work on a good system. If you don't focus on quality, the system will decay and no one will want to work on it.
- Focus on scope. Just as when you write software, having to care about less is a big help. By figuring out the minimum workable requirements and sticking to them, you will get better software, on time. "Customers can't tell us what they want. When we give them what they say they want, they don't like it." This is always true. Customers don't know what they want at first. Only by seeing some actual software can they refine and limit the scope, etc.
Use the softness of requirements as an opportunity, not a problem.
XP's Four Values
- Communication. Problems can often be traced back to poor communication (TJP: cite the class project that split and diverged because members didn't communicate). Programmers might neglect to tell others about an important design change, or might not talk to customers. A manager might not ask a programmer the right question. TJP: example of somebody at jGuru delivering bad news about the wrong version being sent to a customer.
- Simplicity. "What is the simplest thing that could possibly work?" It's very hard to implement something stupid and simple, especially when you are always looking ahead to the future--you want software that won't have to be changed later. But worrying about the future implies you are listening to the "changes later cost more than changes now" philosophy. The speed you gain from simplicity today can help fix speed problems later if you turn out to be horribly wrong 2% of the time. TJP: You have no idea what will survive even a few days. Elegant software mostly comes from refactoring after the code has lived a few "generations." Use an evolutionary, iteratively refined process. Use OO techniques and data-driven software as much as possible to isolate things that might change. Examples: URL->page-class map, config files, Java->RDBMS mapping, managers, etc.
- Feedback. Programmers use tests to get feedback about state of the system. Customer asks for a feature, get immediate feedback about difficulty (TJP: I remember Tom Burns asking me all the time "How hard would it be to implement ..."). Someone tracking project gives feedback to programmers and customers. Feedback also works on scale of weeks and months as customers write functional tests for features. Code and put into production most important features first so you can learn from it. In the old days, "production" meant you were done. In XP, it's always under production.
- Courage. When you find a serious design flaw, have the courage to go fix it; your project will surely die without it. Throw code away even if you've worked hard on it. Sometimes rebuilding results in much better code. Sometimes you have lots of choices for a design; have the courage to just try them out to see how they feel, then toss the losers and start over with the promising design.
It takes courage to fix a fundamental design flaw or make a huge simplification in an existing system. Try it out! TJP: Tom and I in France building the Karel language debugger. Listen to others and try out huge simplifications. I had to throw out my design, wasting all that meeting time.
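The "isolate things that might change" advice under Simplicity above (e.g., the URL-to-page-class map) can be sketched as data-driven dispatch. This is a hypothetical sketch, not jGuru's actual code--the class names and routes are invented. The point is that the URL-to-page mapping lives in a table, so adding a page means adding one map entry instead of editing dispatch logic.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Hypothetical data-driven URL->page dispatch: pages that might change
// are isolated behind a map rather than hardwired into if/else chains.
public class PageDispatcher {
    interface Page { String render(); }

    static class HomePage implements Page {
        public String render() { return "home"; }
    }
    static class FaqPage implements Page {
        public String render() { return "faq"; }
    }

    // The "data" part: routes are entries, not code.
    private final Map<String, Supplier<Page>> routes = new HashMap<>();

    PageDispatcher() {
        routes.put("/", HomePage::new);
        routes.put("/faq", FaqPage::new);
    }

    String handle(String url) {
        Supplier<Page> factory = routes.get(url);
        if (factory == null) return "404";   // unknown URL
        return factory.get().render();
    }

    public static void main(String[] args) {
        PageDispatcher d = new PageDispatcher();
        System.out.println(d.handle("/"));     // home
        System.out.println(d.handle("/faq"));  // faq
        System.out.println(d.handle("/x"));    // 404
    }
}
```

When requirements shift, the change is usually a new map entry or a new Page class; the dispatch code itself never needs to be touched.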
Some Terence Thoughts
Testing
It's easy to say "build lots of unit tests." It's another thing to know what a good test is. One of the things I noticed in student projects is that people test their code using the very code under test! Ack!
Consider testing forum message inserts.
public void testForumInsert() {
    boolean worked = db.insertForum(...);
    assertTrue(worked);
}
The problem is that I can implement db.insertForum() as return true; and get a "green signal."
Another less egregious problem is the following:
public void testForumInsert() {
    Message in = ...;
    int ID = db.insertForum(in);
    Message out = db.getForum(ID);
    assertTrue(in.equals(out));
}
This is a poor test because I could implement insertForum to store things in RAM (not even touching the database), and getForum would still yield the proper result.
You should test your code with "exterior code" and the rawest code you can find. For example, I would test insertForum by writing SQL via the shell or a Java program that looked at the physical db tables.
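To make that "exterior code" idea concrete without a live database, here is a self-contained sketch (Java 11+ for Files.writeString/readString). The MessageStore is hypothetical, a file-backed stand-in for the forum db: the test inserts through the store's API but verifies by reading the raw file directly--the analogue of running SQL against the physical tables instead of trusting getForum() to confirm insertForum().

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Sketch of testing with "exterior code": verify an insert through the
// rawest path available, not through the same API that did the writing.
public class ExteriorTest {
    // Hypothetical store under test: appends messages to a file.
    static class MessageStore {
        private final Path file;
        MessageStore(Path file) { this.file = file; }
        void insert(String msg) throws IOException {
            Files.writeString(file, msg + "\n",
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("messages", ".txt");
        MessageStore store = new MessageStore(file);
        store.insert("hello forum");

        // Exterior check: bypass the store entirely and inspect the raw
        // storage, just as you would run SQL against the physical tables.
        String raw = Files.readString(file);
        if (!raw.contains("hello forum")) {
            throw new AssertionError("insert did not reach physical storage");
        }
        System.out.println("exterior check passed");
        Files.delete(file);
    }
}
```

If insert() had only cached the message in RAM, the raw read would fail, which is exactly the failure mode the in-RAM getForum() version of the test cannot catch.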
Reproducibility
You must have an integration box that is identical to your development box, which is identical to your deployment box--all using the same OS and Java version. Otherwise you will not know if a threading problem or other weird system "feature" will bite you. You must know that if the tests pass on your integration box, the system will run on your deployment box. Conversely, if the deployment box has a problem, you need to be able to reproduce it on a non-live box.
Load testing tools and such are crucial to reproducing conditions found on the live site.
Paranoia and "caring"
Many programmers I have worked with just didn't seem to care as much as I did about the system. My attitude follows Yoda: "there is no try, only do!" Therefore, I tested and tested and never left "an enemy at my back." When I released something at jGuru I had great confidence in it. Our first system was so bad (getting phone calls at 3 AM to reboot a frozen server sucks! I should have made the employees get up to fix it--would have resulted in better software) that I swore the second system would be well done.
If you think something might go wrong, it will, of course. The only time the second version of jGuru has crashed due to a software error was right before I went on vacation, just after launch, naturally. In fact, when it crashed, I knew immediately where the problem was--the one bit of code I just threw together and didn't test for boundary conditions! We got an infinite loop. The system has crashed about 10 times: 2 power failures (induced by the moron ISP), 1 software crash, 1 disk overflow (oops), and a handful of crashes due to insufficient resources (memory or file descriptors needed by the Lucene search engine). Naturally, there have been bugs in functionality ;)
Communicating with yourself
You need to record all conversations with the customer so you can remember later what to build.
For each task, I make a list of things to do.
I also have an overall list of tasks to build and in what order (prioritized).
As I build something, I keep a lab notebook describing what I try and also asking/answering questions as if I'm talking to myself. Stream of consciousness style. I have often been diverted from a task for a long time and the notes help a lot when I come back to it months later to restart.
I also keep a bug list on a piece of paper. For some reason this works better for me than a web-based system. I check them off as I finish them etc...