Phases > Construction > Sample Iteration Plan

This illustration shows the relationship of the workflows in an early construction iteration. It is constructed from the Workflow Details as they would appear at that time. The intent is to indicate dependencies and show where workflows occur in parallel. The lengths of the bars in the chart (indicating duration) have no absolute significance. For example, it is not intended to convey that Plan the Integration and Plan Test must have the same duration. There is also no intention to suggest the application of a uniform level of effort across the duration of the workflows. An indication of the relative effort can be seen in the Process Overview. You can navigate to the corresponding Workflow Detail pages from each line of the chart by clicking on the Workflow Detail name. This illustration was created from a Microsoft Project Plan.


Note that there is significant continuing design work shown in this iteration, indicating that it is early in the construction cycle. In later construction iterations this will diminish as design work completes; the design work that remains will relate to Change Requests (defects and enhancements) that impact design. Requirements discovery and refinement are shown as complete at this stage; the remaining effort relates entirely to the management of change.

Sample Iteration Plan

Project Management: Plan the Iteration.

The project manager has updated the iteration plan based on what new functionality is to be added during the new iteration, factoring in the current level of product maturity, lessons learned from the previous iterations, and any risks that need to be mitigated in the upcoming iteration (see Artifact: Iteration Plan and Artifact: Risk List).

Environment: Prepare the environment for the iteration.

Based on the evaluation of process and tools in the previous iteration, the Role: Process Engineer further refines the development case, templates, and guidelines. The Role: Tool Specialist makes the necessary changes to the tools.

Implementation: Plan system-level integration.

Integration planning takes into account the order in which functional units are to be put together to form a working/testable configuration. The choice depends on the functionality already implemented, and what aspects of the system need to be in place to support the overall integration and test strategy. This is done by the system integrator (see Workflow Detail: Plan the Integration in the Implementation discipline), and the results are documented in the Artifact: Integration Build Plan. The Integration Build Plan defines the frequency of builds and when given 'build sets' will be required for ongoing development, integration, and test.

Test: Plan and design system-level test.

The test designer ensures that there will be an adequate number of test cases and procedures to verify the testable requirements (see Workflow Detail: Plan Test in the Test discipline). The test designer must identify and describe the test cases, and identify and structure the test procedures. In general, each test case will have at least one associated test procedure. The test designer should also review the accumulated body of tests from preceding iterations, which can be modified for re-use in regression testing of the current and future iteration builds.
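
As an illustration of the relationship between a test case and its test procedure, the following sketch shows a single test case documented as a comment and realized by one JUnit test method. The AccountService class, the use of JUnit, and all names and values are assumptions invented for the example, not part of the process description.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical unit under test, included only so the example is self-contained.
class AccountService {
    private double balance;
    AccountService(double openingBalance) { this.balance = openingBalance; }
    void withdraw(double amount) {
        if (amount > balance) {
            throw new IllegalArgumentException("insufficient funds");
        }
        balance -= amount;
    }
    double getBalance() { return balance; }
}

public class WithdrawCashTest {
    // Test case: "Withdraw within balance"
    //   Precondition: account balance is 100.00
    //   Input:        withdrawal request for 40.00
    //   Expected:     withdrawal succeeds and the balance becomes 60.00
    // Test procedure: the steps of the method below realize this one test case.
    @Test
    public void withdrawWithinBalanceReducesBalance() {
        AccountService account = new AccountService(100.00);
        account.withdraw(40.00);
        assertEquals(60.00, account.getBalance(), 0.001);
    }
}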

Analysis & Design: Refine Use-Case Realizations.

Designers refine the model elements identified in previous iterations by allocating responsibilities to specific model elements (classes or subsystems) and updating their relationships and attributes. New elements may also need to be added to support possible design and implementation constraints (see Workflow Detail: Design Components). Changes to elements may require changes in package and subsystem partitioning (see Activity: Incorporate Existing Design Elements). The results of this work should then be reviewed.

Test: Plan and design integration tests at the subsystem and system level.

Integration tests focus on how well the developed components interface and function together. The test designer needs to follow the test plan that describes the overall test strategy, required resources, schedule, and completion and success criteria. The designer identifies the functionality that will be tested together, and the stubs and drivers that will need to be developed to support the integration tests. The implementer develops the stubs and drivers based on the input from the test designer (see Workflow Detail: Implement Test in the Test discipline).
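
The following sketch illustrates one way a stub and a driver might look in code, assuming Java and JUnit; CreditCheckService, CreditCheckStub, OrderApproval, and the test scenario are hypothetical names invented for the example. The stub stands in for a subsystem that is not yet available, and the driver exercises the unit under test against it.

import org.junit.Test;
import static org.junit.Assert.assertTrue;

// Interface to a subsystem that is not yet available for integration testing.
interface CreditCheckService {
    boolean isCreditworthy(String customerId);
}

// Stub: stands in for the real credit-check subsystem and returns canned answers.
class CreditCheckStub implements CreditCheckService {
    public boolean isCreditworthy(String customerId) {
        return !"BAD_CUSTOMER".equals(customerId);  // canned behavior for the tests
    }
}

// Unit under integration test; in a real project this comes from another developer.
class OrderApproval {
    private final CreditCheckService creditCheck;
    OrderApproval(CreditCheckService creditCheck) { this.creditCheck = creditCheck; }
    boolean approve(String customerId) { return creditCheck.isCreditworthy(customerId); }
}

// Driver: exercises OrderApproval against the stub for one integration scenario.
public class OrderApprovalIntegrationDriver {
    @Test
    public void approvesOrderWhenCustomerIsCreditworthy() {
        OrderApproval approval = new OrderApproval(new CreditCheckStub());
        assertTrue(approval.approve("GOOD_CUSTOMER"));
    }
}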

Implementation: Develop Code and Test Unit.

Implementers develop code, in accordance with the project's programming guidelines, to implement the Artifact: Components in the implementation model. They fix defects and provide any feedback that may lead to design changes based on discoveries made in implementation (see Workflow Detail: Implement Components in the Implementation discipline).

Implementation: Plan and Implement Unit Tests.

The implementer needs to design unit tests so that they address both what the unit does (black-box) and how it does it (white-box). Under black-box (specification) testing, the implementer needs to be sure that the unit, in its various states, performs to its specification, and can correctly accept and produce a range of valid and invalid data. Under white-box (structure) testing, the challenge for the implementer is to ensure that the design has been correctly implemented, and that the unit can be successfully traversed through each of its decision paths (see Workflow Detail: Implement Components in the Implementation discipline).
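
To illustrate the distinction, the sketch below shows a hypothetical ShippingCalculator unit with a black-box test written against its stated specification and white-box tests that traverse each of its decision paths. Java and JUnit are assumed; the class, its specification, and all values are invented for the example.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical unit: the specification says orders of 100.00 or more ship free,
// smaller orders cost 5.00, and negative order totals are rejected.
class ShippingCalculator {
    double shippingCost(double orderTotal) {
        if (orderTotal < 0) {                       // decision path 1: invalid input
            throw new IllegalArgumentException("order total must not be negative");
        }
        if (orderTotal >= 100.00) {                 // decision path 2: free shipping
            return 0.00;
        }
        return 5.00;                                // decision path 3: standard rate
    }
}

public class ShippingCalculatorUnitTest {

    // Black-box (specification) test: checks behavior against the stated
    // specification for invalid data, without looking at the implementation.
    @Test(expected = IllegalArgumentException.class)
    public void rejectsNegativeOrderTotal() {
        new ShippingCalculator().shippingCost(-1.00);
    }

    // White-box (structure) tests: one test per remaining decision path,
    // so every branch of the implementation is traversed.
    @Test
    public void freeShippingBranch() {
        assertEquals(0.00, new ShippingCalculator().shippingCost(150.00), 0.001);
    }

    @Test
    public void standardRateBranch() {
        assertEquals(5.00, new ShippingCalculator().shippingCost(20.00), 0.001);
    }
}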

Implementation: Test Unit within Subsystem.

Unit test focuses on verifying the smallest testable components of the software. Unit tests are designed, implemented, and executed by the implementer of the unit. The emphasis of unit test is to ensure that white-box testing produces the expected results, and that the unit conforms to the project's adopted quality and development standards.

Implementation and Test: Integrate Subsystem.

The purpose of subsystem integration is to combine units, which may come from many different developers within the subsystem (part of the implementation model), into an executable 'build set'. The implementer, in accordance with the plan, integrates the subsystem by bringing together the completed and stubbed classes that constitute a build (see Workflow Detail: Integrate Each Subsystem in the Implementation discipline). The implementer integrates the subsystem incrementally, from the bottom up, based on the compilation-dependency hierarchy.
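
A minimal sketch of one such bottom-up integration step follows, assuming Java and a hypothetical order-management subsystem; all class names are invented for the example. A completed leaf class enters the build first, a stubbed class stands in for a unit that has not yet been delivered, and a class that depends on both is then compiled into the build set.

// Step 1: completed leaf class with no dependencies on other units in the subsystem.
class TaxCalculator {
    double taxFor(double amount) { return amount * 0.20; }
}

// Not yet delivered by its developer, so a stubbed version is included in the build.
class DiscountPolicyStub {
    double discountFor(double amount) { return 0.0; }   // canned "no discount" answer
}

// Step 2: class that depends on the units above; it can now be compiled and added
// to the build set, combining a completed class with a stubbed one.
class InvoiceBuilder {
    private final TaxCalculator tax = new TaxCalculator();
    private final DiscountPolicyStub discounts = new DiscountPolicyStub();

    double totalFor(double netAmount) {
        return netAmount + tax.taxFor(netAmount) - discounts.discountFor(netAmount);
    }
}

// Simple driver so the integrated build set can be exercised directly.
public class SubsystemIntegrationDemo {
    public static void main(String[] args) {
        System.out.println(new InvoiceBuilder().totalFor(100.00));  // prints 120.0
    }
}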

Implementation: Test Subsystem.

Testers execute the test procedures developed during the test planning and design activities described above (see Workflow Detail: Execute Tests in Integration Test Stage in the Test discipline). If there are any unexpected test results, the testers log the defects so that a decision can be made on when they are to be fixed.

Implementation: Release Subsystem.

Once the subsystem has been sufficiently tested and is ready for integration at the system level, the implementer 'releases' the tested version of the subsystem from the team integration area into an area where it becomes visible and usable for system-level integration.

Implementation: Integrate System.

The purpose of system integration is to combine the currently available implementation model functionality into a build. The system integrator incrementally adds subsystems, and creates a build that is handed over to testers for overall integration testing (see Workflow Detail: Integrate the System in the Implementation discipline).

Test: Test Integration.

Testers execute the test procedures developed during the test planning and design activities described above. The testers execute the integration tests and review the results. If there are any unexpected results, the testers log the defects (see Workflow Detail: Execute Tests in Integration Test Stage in the Test discipline).

Test: Test System.

Once the whole system (as defined by the goal of this iteration) has been integrated, it is subjected to system testing (see Workflow Detail: Execute Tests in System Test Stage in the Test discipline). The test designer will then analyze the results of the test to make sure the testing goals have been reached (see Workflow Detail: Evaluate Test in the Test discipline).

Project Management: Assess the iteration itself.

Lastly, the project manager compares the iteration's actual cost, schedule, and content with the iteration plan; determines whether rework needs to be done and, if so, assigns it to future iterations; updates the risk list (see Artifact: Risk List); updates the project plan (see Artifact: Software Development Plan); and prepares the iteration plan for the next iteration (see Artifact: Iteration Plan). Productivity figures, code size, and database size may also be worth considering here.

The project manager, in cooperation with the process engineer and the tool specialist, evaluates the process and the use of tools. These lessons learned will be used when preparing the environment for the following iteration.

Result

The main result of an iteration in the construction phase is that more functionality is added, which yields an increasingly complete system. The results of the current iteration are made visible to developers to form the basis of development for the subsequent iteration.
