I will slowly add my thoughts on how to combine "Earned Value Management" with FDD here to collect your feedback.
Why Earned Value Management (EVM)
Because many government contracts require Earned Value Management for tracking and auditing purposes.
How to Combine EVM with FDD
Integrated development team
Communication disconnects are the root cause of many project failures, and they are also a fact of life. Well-known disconnects exist between system engineers and software engineers, between software engineers and testers, and, most seriously, between customers and system engineers. Many proposals try to mitigate this problem by adding more process; they are rarely effective, and they are generally expensive as well.
The agile approach recognizes this fact of life. Rather than trying to eliminate disconnects, it uses shorter release iterations to surface them early so that you can address them early. Since each iteration produces a working system, demonstrating that system aligns everybody's understanding and exposes the disconnects.
There are several ways to further reduce communication disconnects among developers (requirement and software developers). For projects with a traditional team structure (SE IPT, SW IPT, and Test IPT), one way is to assign a technical person as Deputy Program Manager to ensure that system and software engineering work as a product-oriented team (as suggested at the 2007 Workshop of Integrating System and Software Engineering, USC). Another way is to construct an integrated system and software team to further reduce potential disconnects throughout software development.
To support much shorter release iterations, you shall have an integrated system and software team that lowers the communication barrier between system and software engineers.
Here is the breakdown of Integrated System and Software team structure:
The Integrated Feature Development Team consists of an Architecture Sub-Team, a Requirement Development Sub-Team, and one or two Service Development Sub-Teams.
Architecture Sub-Team
A small group of senior developers working on the infrastructure required by a feature. They are the key to maintaining a clean architecture skeleton throughout the software development cycle. They are also responsible for developing automated acceptance tests, which are the key to maintaining software quality.
Requirement Development Sub-Team
A small group who know the domain and interact with customers regularly. They are responsible for interpreting requirements and writing test procedures. This sub-team works very closely with the Architecture Sub-Team to keep the architecture clean.
Service Development Sub-Team
A small group working on the code required by a feature above the infrastructure layer.
Technology Team
A small group with deep knowledge of technologies such as databases, design patterns, and algorithms. Members of this team generally have dual duties. They are responsible for reviewing developers' code and tests before check-in.
Integration/Product release Team
A small group that develops and maintains the software development infrastructure, conducts internal product verification of expected functionality, and performs product releases. They are also responsible for the documentation (user manual, VDD, etc.) associated with a product release.
Feature identification in Inception
This phase shall start with a clear definition of the concept of operations, followed by a high-level walkthrough of the scope of the system and its context. You shall produce a high-level design and overall architecture for the system.
You shall derive features from the well-defined concept of operations and review them with your customers. The checklist for evaluating feature identification is:
* Necessary – Something that must be included, or an important element of the system will be missing for which other system components cannot compensate. In my past experience, I found that customers tended to give "requirements" that are really implementation approaches. In such cases, you need to work with your customers to reshape the requirement so that it describes the objective, not a solution.
* Testable – It must be possible to verify the requirement through testing. Other verification methods (inspection, analysis, demonstration) should be avoided as much as possible. At this stage, testability is not thoroughly elaborated yet, but you shall at least determine one verification method for each feature.
By the end of this phase, to get 100% progress, you shall produce:
* A high-level design and overall architecture. This up-front high-level design is critical for keeping the project focused instead of drifting around during refactoring. Refactoring should not replace your up-front design.
* A list of well-documented features and "long mission threads" (high-level use cases).
* A peer review of the design and feature identification with all stakeholders.
Initial feature assignment in Elaboration
This phase shall start with an initial elaboration of each feature as described in the developer notebook. You shall perform the following activities:
* Elaborate on the requirement definition and testability; this puts software developers, requirement developers, and SMEs on the same page.
* Elaborate on implementation analysis: perform an initial high-level complexity analysis, and if a feature is too complex, break it into several features so that each can be done in a few weeks.
* Ask your customer to prioritize the identified features.
* Work with your customers to assign features to each release iteration.
o A feature in one iteration shall NOT depend on features in future iterations.
o Closely related features shall be in the same iteration.
The duration of a release iteration ranges from several weeks to two months, depending on the nature of the project. The guidance here is to balance the payoff from regular releases against the overhead of the release process. Whatever duration you propose, you shall streamline your release process (e.g., automate integration tests and acceptance tests as much as possible).
By the end of this phase, to get 100% progress, you shall have:
* produced a baseline that has been reviewed with your customers.
* completed and peer-reviewed the definition elaboration, testability analysis, and implementation analysis in the developer notebook.
Execution in a release iteration
A feature development team will pick up one feature and start working on it. The milestones for feature development are:
* Domain Walkthrough: Led by the Requirement Development Sub-Team to align everybody's understanding of a feature. Typically it takes 1-3 days.
* Design: Led by the Architecture Sub-Team to conduct a joint design for the feature. You shall produce a detailed design in a lightweight fashion. The design shall include developing test procedures and automated acceptance tests for the feature. Those automated acceptance tests, along with unit tests, will be exercised continuously to prevent the introduction of bugs after the "Promote to Build" milestone.
* Design Inspection: Peer review your design and acceptance tests with stakeholders.
* Code: Development of the feature. The effort includes unit tests, code, and documentation.
* Code Review: Peer review code and acceptance tests.
* Promote to Build: The Integration/Product Release Team verifies the feature using the previously developed test procedures.
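The automated acceptance tests developed at the Design milestone can be as simple as plain test functions. Here is a minimal sketch in Python; `search_orders` and the test data are hypothetical illustrations, not part of any real feature:

```python
# A minimal automated acceptance test, runnable with pytest or plain asserts.
# search_orders is a hypothetical feature under test; the tests mirror the
# test procedure written during the Design milestone, so passing them is the
# objective evidence behind the "Promote to Build" milestone.

def search_orders(orders, customer_id):
    """Hypothetical feature: return the orders belonging to one customer."""
    return [o for o in orders if o["customer"] == customer_id]

def test_search_returns_only_matching_orders():
    orders = [
        {"id": 1, "customer": "A"},
        {"id": 2, "customer": "B"},
        {"id": 3, "customer": "A"},
    ]
    result = search_orders(orders, "A")
    assert [o["id"] for o in result] == [1, 3]

def test_search_with_unknown_customer_returns_empty():
    assert search_orders([], "Z") == []
```

Running such tests in every build (for example, under Continuous Integration) gives objective evidence that a verified feature still works after later changes.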
A feature development team could work on several features at the same time to maximize development parallelism.
At the end of the iteration, all verified features are demonstrated to stakeholders, and requests for changes are translated into new features, put into the backlog, prioritized by the customers, and assigned to a future iteration.
Move features in/out of a release iteration
The estimates (in hours) of the features in an iteration should be updated regularly to the best of your developers' knowledge. These are bottom-up estimates, reflecting what the engineers know the work entails at that time. The purpose is to allow you to adjust the scope of the current iteration.
So at some point, you might want to defer some features to the next iteration to make the schedule more manageable. However, you need to know the consequences.
* EV of a release iteration = the number of delivered features x the budgeted cost of a feature.
* PV of a release iteration = the number of planned features x the budgeted cost of a feature,
where the budgeted cost of a feature = the budgeted cost for all release iterations / the number of all planned features.
If you defer some features to the next iteration, you will have variances in the current iteration (using the standard definitions SV = EV - PV and CV = EV - AC):
* SV = -(the number of deferred features x the budgeted feature cost), i.e., you fall behind schedule by the budgeted cost of the deferred features.
* CV depends on actual cost: if the effort for the deferred features has not been spent, CV stays near zero; if the effort was spent without completing them, CV = SV.
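This EV/PV/variance arithmetic can be sketched in a few lines of Python; the budget, feature counts, and actual-cost figures below are hypothetical illustrations:

```python
# Minimal EVM arithmetic for one release iteration (illustrative numbers).
# Standard definitions: SV = EV - PV, CV = EV - AC.

def budgeted_feature_cost(total_budget_hours, total_planned_features):
    """Budgeted cost of one feature = total budget / number of planned features."""
    return total_budget_hours / total_planned_features

def iteration_metrics(planned, delivered, actual_cost, feature_cost):
    """Return (PV, EV, SV, CV) for a release iteration."""
    pv = planned * feature_cost    # Planned Value
    ev = delivered * feature_cost  # Earned Value
    sv = ev - pv                   # Schedule Variance (negative = behind)
    cv = ev - actual_cost          # Cost Variance (negative = over budget)
    return pv, ev, sv, cv

# Example: 1200 budgeted hours across 24 planned features -> 50 hours/feature.
cost = budgeted_feature_cost(1200, 24)
# The iteration planned 6 features, delivered 4 (2 deferred), and spent 300 hours.
pv, ev, sv, cv = iteration_metrics(6, 4, 300, cost)
# pv = 300, ev = 200, sv = -100 (behind by 2 features), cv = -100 (effort spent anyway)
```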
If customers want to add new features, some low-priority features will be pushed to the backlog, so the number of planned features should stay roughly the same. Customers could also increase the budget so that you can add another release iteration. Adding more release iterations is better than adding more developers because:
* Developers are not interchangeable resources that can be dropped in and become productive right away.
* Volatility in your staff can damage team morale.
Track iteration status and project overall progress
There are many tools, such as the "FDD Project Management Application" (FDDPMA), that could be used to manage project development.
For example, FDDPMA is a web-based application that manages software projects. It facilitates iterative development by reducing FDD management overhead, producing graphical progress reports, providing a workplace where all the FDD related documentation is collected.
FDDPMA is an open-source project. Java developers may download its source code, compile and install the application on their own servers. Those who do not want to deal with FDDPMA installation may use this site to manage their FDD projects.
Trend analysis for future release iterations
At the end of each iteration, the PM shall analyze the variances (CV and SV) and the actual cost per feature (in hours). For example, if the average actual cost of a feature is much higher than you budgeted, then you can be pretty sure that future iterations are in trouble. In this case, you need to work with your customers to create a recovery plan (e.g., re-baseline).
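One common way to quantify this trend (my suggestion; the text above does not name a specific formula) is the Cost Performance Index, CPI = EV / AC, and the Estimate At Completion, EAC = BAC / CPI. All the numbers below are hypothetical illustrations:

```python
# Forecasting future iterations from the cost trend (illustrative numbers).
# CPI = EV / AC: below 1.0 means each earned hour of features costs more than budgeted.
# A common Estimate At Completion: EAC = BAC / CPI.

def completion_forecast(ev, ac, bac):
    """Return (CPI, EAC) given Earned Value, Actual Cost, and Budget At Completion."""
    cpi = ev / ac
    eac = bac / cpi
    return cpi, eac

# Example: 300 budgeted hours of features earned at an actual cost of 400 hours,
# against a total budget (BAC) of 1200 hours.
cpi, eac = completion_forecast(300, 400, 1200)
# cpi = 0.75 -> features are costing a third more than budgeted
# eac = 1600.0 -> expect ~1600 hours to finish, 400 over budget: time for a recovery plan
```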
Having schedule and cost variances is natural. It's your responsibility to educate your customers on this fact. It's also your responsibility to get customers in the loop to work out a joint recovery plan if necessary.
You should let customers make informed decisions when they want to change the scope. They should know, in quantitative terms, the impact on schedule and cost of adding, deferring, or deleting certain features.
Key Points
* To use EVM, you must measure your technical performance quantitatively.
o Develop testable requirements
o Define clear and sharp exit criteria
* To combine EVM with Agile, you must measure your technical performance efficiently.
o Integrate System and Software engineering.
o Use effective agile techniques (Continuous Integration, Test-Driven Development) in software development.
o Develop acceptance tests first and automate acceptance tests as part of Continuous Integration.
o Use several shorter product release iterations to flatten development/integration complexity.
o Avoid unnecessary documentation and focus on effective communication.