FastWorks Project for MS Project 2007 Professional: Best Practices & Tips

FastWorks is a lean, iterative approach for delivering products and projects faster by validating assumptions early, shortening feedback loops, and reducing waste. Applying FastWorks principles inside Microsoft Project 2007 Professional can help teams plan effectively while staying flexible enough to respond to new information. This article covers practical best practices, tips, and step‑by‑step guidance for combining FastWorks with MS Project 2007 to produce realistic, adaptable project plans.
1. Understand the core FastWorks mindset before planning
Before you create schedules and tasks, align the team around FastWorks principles:
- Focus on validated learning over exhaustive upfront planning.
- Frame work as experiments (hypotheses) with clear success criteria.
- Prioritize the smallest deliverable that provides useful feedback (Minimum Viable Product, MVP).
- Emphasize frequent, short feedback loops with customers or stakeholders.
Make sure stakeholders agree that the plan will change as the team learns. That mindset reduces resistance when you later re-sequence or re-scope tasks.
2. Set up MS Project 2007 for iterative work
MS Project 2007 defaults to waterfall-style planning. Configure it for iterative approaches:
- Use a single high-level project file and create summary tasks for each iteration/sprint (e.g., Iteration 1 — 2 weeks).
- Define a custom calendar for iteration cadence if it differs from standard working weeks.
- Add custom fields to tag tasks with FastWorks metadata: Hypothesis, MVP, Learning Objective, Experiment Owner, and Validation Status. (Use Tools → Customize → Fields.)
- Use the Notes field to capture experiment descriptions and acceptance criteria — treat notes as the repository for what you intend to learn.
- Consider using multiple baseline snapshots: save a baseline at the start of each iteration (Tools → Tracking → Set Baseline; Project 2007 supports Baseline plus Baseline1 through Baseline10). That way you can measure how the plan evolved.
3. Model work as experiments, not fixed deliverables
Translate FastWorks artifacts into MS Project concepts:
- Hypotheses → create a parent summary task named after the hypothesis. Under it, add child tasks for experiment setup, execution, data collection, and analysis.
- MVP → treat the MVP as the deliverable tied to experiments. Create tasks for building, testing, and demonstrating the MVP.
- Learning milestones → add explicit milestones that mark validation events (e.g., “Customer validation completed”).
- Timebox experiments — set short durations and use constraints sparingly. Prefer “As Soon As Possible” starts and let dependencies drive sequencing.
Example structure:
- Iteration 1 (summary)
  - Hypothesis A (summary)
    - Build MVP feature A (task)
    - Run user test sessions (task)
    - Analyze feedback and decide (task, milestone)
  - Hypothesis B (summary)
4. Prioritize ruthlessly and keep the plan small
FastWorks thrives on limiting scope:
- Use a simple scoring system (e.g., RICE — Reach, Impact, Confidence, Effort) and add a custom numeric field in Project to hold priority scores.
- Filter and group tasks by priority so only the top experiments appear in the current iteration.
- Resist long task lists in a single iteration — if a task grows, break it into smaller experiments.
Create a “Backlog” summary task with lower-priority experiments. Move items into active iteration summaries only when ready to start.
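A RICE score that could populate the custom priority field (Number1 in the examples below) is easy to compute outside Project before filtering the plan. Here is a minimal Python sketch; the experiment names and numbers are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Experiment:
    name: str
    reach: float       # customers/stakeholders affected per period
    impact: float      # 0.25 (minimal) to 3 (massive)
    confidence: float  # 0.0 to 1.0
    effort: float      # person-weeks

    def rice_score(self) -> float:
        # RICE = (Reach * Impact * Confidence) / Effort
        return (self.reach * self.impact * self.confidence) / self.effort

# Hypothetical backlog items; the scores would be typed into Number1 in Project.
backlog = [
    Experiment("Hypothesis A", reach=500, impact=2.0, confidence=0.8, effort=2.0),
    Experiment("Hypothesis B", reach=2000, impact=1.0, confidence=0.5, effort=4.0),
]

# Only the top-scoring experiments enter the current iteration.
ranked = sorted(backlog, key=lambda e: e.rice_score(), reverse=True)
for exp in ranked:
    print(f"{exp.name}: {exp.rice_score():.0f}")
```

Sorting descending by score and taking the top few items mirrors the filter/group step in Project: everything else stays in the backlog.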
5. Track learning and validation explicitly
Recording outcomes is as important as tracking time:
- Use a custom Text or Flag field for Validation Status (Not Started / In Progress / Validated / Invalidated).
- After an experiment completes, update the Notes with outcomes and link to any artifacts (reports, videos, test data) stored externally.
- Add a “Decision” task or milestone after each experiment that forces a recorded outcome: pivot, persevere, or kill.
Example custom fields:
- Text1 = Hypothesis
- Flag1 = MVP ready?
- Number1 = Priority score
- Text2 = Learning outcome summary
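To make the status-and-decision workflow above concrete, here is a minimal sketch (in Python, since inside Project 2007 these are just plain custom fields) of the record an experiment carries, including the rule that a completed experiment must end in an explicit pivot/persevere/kill decision. Field names mirror the examples above; everything else is illustrative:

```python
from dataclasses import dataclass

VALID_STATUSES = ("Not Started", "In Progress", "Validated", "Invalidated")
DECISIONS = ("pivot", "persevere", "kill")

@dataclass
class ExperimentRecord:
    hypothesis: str              # Text1
    mvp_ready: bool              # Flag1
    priority_score: float        # Number1
    outcome_summary: str = ""    # Text2
    status: str = "Not Started"  # Validation Status field
    decision: str = ""           # recorded at the post-experiment milestone

def close_experiment(rec: ExperimentRecord, validated: bool,
                     outcome: str, decision: str) -> None:
    """Record an outcome; refuse to close without an explicit decision."""
    if decision not in DECISIONS:
        raise ValueError(f"decision must be one of {DECISIONS}")
    rec.status = "Validated" if validated else "Invalidated"
    rec.outcome_summary = outcome
    rec.decision = decision

rec = ExperimentRecord("Users adopt feature A in one session",
                       mvp_ready=True, priority_score=400)
close_experiment(rec, validated=True,
                 outcome="8 of 10 test users completed the flow",
                 decision="persevere")
```

The point of `close_experiment` is the same as the "Decision" milestone: an experiment cannot be marked done without a recorded outcome and a decision.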
6. Keep dependencies realistic and avoid over-constraint
FastWorks requires flexibility:
- Use finish-to-start (FS) dependencies as the default, and switch to start-to-start (SS) links with lag to indicate parallel experimentation when appropriate.
- Avoid hard date constraints (Must start on / Must finish on). Use constraints only for external deadlines.
- Use effort-driven scheduling carefully: when multiple resources share work, ensure task types and assignments reflect real team behavior.
7. Use resources and assignments to represent real teams, not roles only
Resource setup matters for accurate velocity:
- Define resources as people or small cross-functional teams rather than broad roles.
- Set realistic calendars and availability. If a team member is part-time, set their % units correctly on the assignment.
- For experiments needing rapid feedback, allocate a dedicated small team rather than scattering responsibilities across many people.
Track actuals: encourage the team to update Actual Work regularly so the schedule reflects reality, enabling better decision-making.
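The effect of assignment % units on iteration capacity is simple arithmetic; this hypothetical three-person team shows why part-time units must be set correctly:

```python
# Hypothetical team: name -> assignment units (1.0 = full-time, 0.5 = half-time).
team = {"Developer": 1.0, "Designer": 0.5, "Analyst": 0.25}

working_days = 10   # a 2-week iteration
hours_per_day = 8

# Available hours this iteration, per resource and for the team in total.
capacity = {name: working_days * hours_per_day * units
            for name, units in team.items()}
total_hours = sum(capacity.values())
print(capacity, total_hours)  # Developer 80h, Designer 40h, Analyst 20h => 140h
```

If the Designer were entered at 100% units, the plan would silently assume 40 hours of effort that do not exist, which is exactly the kind of error that makes actuals diverge from the schedule.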
8. Measure what matters: learning velocity, not just earned value
Traditional metrics (cost variance, schedule variance) are useful but incomplete for FastWorks:
- Create metrics for experiments completed, hypotheses validated, and time-to-validated-learning.
- Use iteration-level baselines to measure change in scope and velocity (number of validated experiments per iteration).
- Continue to track burn rate and resource utilization, but interpret them in light of learning outcomes.
Example dashboard items:
- Iteration: experiments started / experiments validated
- Cumulative validated hypotheses
- Average time per validation
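These dashboard items can be derived directly from the experiment log; a short sketch with made-up dates and outcomes:

```python
from datetime import date

# (hypothesis, started, decision date, validated?) -- illustrative data only.
log = [
    ("Hypothesis A", date(2007, 3, 5),  date(2007, 3, 16), True),
    ("Hypothesis B", date(2007, 3, 5),  date(2007, 3, 23), False),
    ("Hypothesis C", date(2007, 3, 19), date(2007, 3, 30), True),
]

experiments_started = len(log)
validated = [e for e in log if e[3]]
cumulative_validated = len(validated)

# Time-to-validated-learning: average calendar days from start to decision,
# counted only over experiments that ended in a validation.
avg_days_to_validation = (
    sum((decided - started).days for _, started, decided, _ in validated)
    / len(validated)
)
```

The same numbers can of course be produced in Excel from an exported task list; the point is that they are computed from validation events, not from % complete.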
9. Implement short cadence reviews and adapt the plan
Run frequent ceremonies and use MS Project for quick updates:
- Hold iteration planning at the start of each iteration and update the MS Project file then.
- Use mid-iteration checkpoints to surface blocked experiments and reassign capacity.
- After each iteration, run a retrospective focused on learning quality: were hypotheses well-formed? Were validation criteria clear? Update how experiments are planned accordingly.
Keep the Project file lightweight; use it for scheduling and tracking, not as the sole source of truth for qualitative feedback and artifacts.
10. Integrate external tools where MS Project is weak
MS Project 2007 is strong on scheduling but weak on collaboration and lightweight backlog management:
- Use a simple external backlog tool (Trello, Excel, or a wiki) to capture experiment ideas, notes, and artifacts, and link to them from MS Project notes.
- For team-level daily work and rapid updates, complement Project with a shared board or Kanban system and synchronize key changes to MS Project at iteration boundaries.
- Use the built-in reports (Report → Reports) or Visual Reports (Report → Visual Reports) to provide stakeholders with concise status focused on validated learning.
11. Reporting templates and examples
Create a few standard views and reports:
- Iteration Summary View: grouped by iteration summary tasks, showing priority, validation status, percent complete, and milestones.
- Experiment Log Report: list of hypotheses with outcomes and links to artifacts.
- Baseline Comparison: use saved baselines per iteration to show how scope shifted and what was validated.
Examples of useful fields in reports:
- Task Name, Start, Finish, Duration, Resource Names, Priority Score (Number1), Validation Status (a dedicated text field, e.g., Text3), Outcome Summary (Text2), and Outcome Notes (Notes).
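Project 2007 has no built-in "Experiment Log" report, so one option is to export these fields and assemble the log externally. A minimal CSV sketch, with one hypothetical row:

```python
import csv
import io

FIELDS = ["Task Name", "Start", "Finish", "Priority Score (Number1)",
          "Validation Status", "Outcome Notes"]

# Hypothetical data, as it would be exported from the custom fields.
rows = [
    {"Task Name": "Hypothesis A", "Start": "2007-03-05", "Finish": "2007-03-16",
     "Priority Score (Number1)": 400, "Validation Status": "Validated",
     "Outcome Notes": "8 of 10 test users completed the flow"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
report = buf.getvalue()
```

The resulting CSV opens directly in Excel, which is usually where stakeholders want to see the experiment log anyway.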
12. Common pitfalls and how to avoid them
- Treating MS Project as a fixed contract: re-emphasize FastWorks mindset and plan for change.
- Over-detailing tasks early: keep early iterations coarse for speed; refine tasks only when validated.
- Ignoring qualitative outcomes: require outcome notes and decisions after each experiment.
- Using too many custom fields: pick 4–6 meaningful fields to prevent clutter.
- Not updating actuals: enforce brief daily or weekly actuals updates to keep data useful.
13. Example iteration setup (concise walkthrough)
- Create Iteration 1 summary (2 weeks).
- Under it, add Hypothesis A summary with tasks: Build MVP A (3 days), User tests (2 days), Analyze results (1 day) and a milestone “Validation decision.”
- Set dependencies: Build MVP A → User tests → Analyze results → Validation decision (milestone).
- Assign a small cross-functional team and set realistic % units.
- Save baseline for iteration start.
- After tests, update Validation Status, paste outcome into Notes, set milestone complete, and decide pivot/persevere.
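The finish-to-start chain in this walkthrough can be sanity-checked with a simplified forward-scheduling sketch (weekends-off calendar only; the dates are hypothetical and Project's own date arithmetic differs in detail):

```python
from datetime import date, timedelta

def add_workdays(start: date, days: int) -> date:
    """Advance a number of working days, skipping weekends."""
    d = start
    while days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Mon-Fri
            days -= 1
    return d

# (task, duration in working days); 0 days marks the milestone.
chain = [("Build MVP A", 3), ("User tests", 2),
         ("Analyze results", 1), ("Validation decision", 0)]

start = date(2007, 3, 5)  # assumed iteration start (a Monday)
schedule = []
for name, dur in chain:
    finish = add_workdays(start, dur) if dur else start
    schedule.append((name, start, finish))
    start = finish  # finish-to-start: the next task begins here
```

With these durations the chain consumes six working days, comfortably inside the two-week timebox, leaving slack for a second hypothesis or for re-running a test.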
14. Quick tips — cheatsheet
- Use summary tasks per iteration.
- Model experiments as task groups with clear acceptance/validation criteria.
- Add custom fields: Hypothesis, Priority score, Validation Status, Outcome.
- Save baselines at iteration start.
- Keep tasks small and timeboxed.
- Capture outcomes in Notes and a post-experiment milestone.
- Complement MS Project with a lightweight backlog/collaboration tool.
Applying FastWorks in MS Project 2007 Professional means combining the rigor of scheduling with the flexibility of iterative learning. Treat the project plan as a living experiment: keep scope small, measure validated learning, and make decisions based on evidence rather than assumptions.