I’m often asked how much time it will take to estimate a project. The answer depends on the size of the project in person-hours. The estimation effort in hours is roughly 0.85 × PH^0.4, where PH is the approximate person-hours of project effort. That estimation-related effort is then split roughly 65% to the estimator; 30% to the subject matter expert(s); and 5% for administrative (primarily meeting) support.
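For example, here is a minimal sketch of this rule of thumb in Python. The formula and the 65/30/5 split come straight from the paragraph above; the function name and the sample project size are illustrative.

```python
# Rule of thumb from above: estimation hours ~= 0.85 * PH**0.4,
# split 65% / 30% / 5% across the three roles.

def estimation_hours(project_person_hours: float) -> dict:
    total = 0.85 * project_person_hours ** 0.4
    return {
        "total": round(total, 1),
        "estimator": round(0.65 * total, 1),               # 65%
        "subject_matter_experts": round(0.30 * total, 1),  # 30%
        "administrative": round(0.05 * total, 1),          # 5%
    }

# A 10,000 person-hour project needs roughly 34 hours of estimation work.
print(estimation_hours(10_000))
```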
Level 4 received an estimation-related contract with the California Health Benefits Exchange.
Free WebEx Demos: Estimating with ExcelerPlan
All times are Pacific Time.
9/10, 11:00 AM-12:30 PM
9/24, 11:00 AM-12:30 PM
10/8, 11:00 AM-12:30 PM
10/22, 11:00 AM-12:30 PM
11/12, 11:00 AM-12:30 PM
12/3, 11:00 AM-12:30 PM
To register for demos, email:
Jeff@portal.level4ventures.com
Three-Day Estimation Training
We’ll be offering a three-day estimation training class taught by William Roetzheim via WebEx on 11/17, 11/18, and 11/19. Class runs from 9 AM to 1 PM Pacific Time each day. Training is free and includes a 30-day trial license for new customers.
To inquire about registration, email:
Jeff@portal.level4ventures.com
Pay it Forward
Hopefully you’re finding ITCN a useful newsletter, providing estimation-related value each month along with a bit of humor and minimal selling. If so, please use the link below to Forward to a Friend; or, if this copy was forwarded to you, to subscribe yourself.
Edward
Director of Sales and Marketing

Recovery/Turnaround Estimation. Depending on your source, information technology projects fail at a rate somewhere between 25% and 33%. The GAO has found that the federal government spent “…billions of dollars on failed and poorly performing IT investments which often suffered from ineffective management, such as project planning, requirements definition, and program oversight and governance…” Recent examples of project failures include Healthcare.gov, the Department of Homeland Security’s failed $1 billion Secure Border Initiative, abandoned in 2011, the Department of Veterans Affairs’ failed $609 million Financial and Logistics Integrated Technology Enterprise program, and the Office of Personnel Management’s canceled $231 million Retirement Systems Modernization program. When an information technology project is in trouble, estimation becomes a critical input to the tough management decisions about next steps.
The first and most critical question is whether to shut the project down or to attempt recovery. Whenever a project is in serious trouble, this option must be on the table. In my experience, there are two situations in which it is best to cancel a project outright. The first is one where the solution as implemented to date is fundamentally flawed from an architecture or quality perspective. Quality problems will typically show up as defect rates significantly above benchmark rates. Measures such as McCabe’s Cyclomatic Complexity can also pinpoint quality issues. Architectural problems will typically show up as an inability of the application to meet technical requirements in the areas of performance or functionality. The second situation leading to cancellation is one where the mission needs driving the requirements for the project are no longer compelling. For example, perhaps revenue in this market segment has not grown as fast as expected; or an alternate approach using commercial off-the-shelf software is now viable; or the legacy solution no longer looks so bad.
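As a rough illustration of the defect-rate check described above (a sketch under assumed values; the 1.5× threshold and the KLOC-based density are illustrative, not a Level 4 benchmark):

```python
# Illustrative sketch: flag a project whose defect density is significantly
# above a benchmark rate. The 1.5x "significantly above" threshold is assumed.

def quality_red_flag(defects_found: int, size_kloc: float,
                     benchmark_defects_per_kloc: float,
                     threshold: float = 1.5) -> bool:
    """True when defect density significantly exceeds the benchmark."""
    density = defects_found / size_kloc
    return density > threshold * benchmark_defects_per_kloc

# Example: 480 defects in 40 KLOC is 12 defects/KLOC, double a benchmark of 6,
# so the quality flag is raised.
print(quality_red_flag(480, 40, 6))  # True
```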
Assuming that the project does not have core issues in the areas of quality or architecture, and the mission need is still there, then we need to develop a new, achievable baseline for the current project. Some general guidelines include:
1. Accept no small slips. In other words, if you are going to re-baseline a project, the new baseline should be conservative and achievable with a high degree of confidence. If your original estimate was at the 50% confidence level (the peak of the probability curve), you might want to generate the new estimate at the 80% confidence level (see the sketch after this list).
2. Ignore sunk costs. When estimating, ignore all costs spent to date and focus on preparing a “to complete” estimate of cost, effort, and schedule. Similarly, management decisions about the wisdom of proceeding with the project (or not) should ignore sunk costs and rest purely on “to complete” costs.
3. Verify all complete and in-process work against benchmarks. In particular, compare the scope of engineering artifacts (e.g., test cases, documents) against the model predictions. If the models predict a 500-page Software Requirements Specification and the “complete” document is 100 pages long, you know there is a problem.
4. Review people issues, look for problems, and then either factor those into your baseline estimate or obtain and document corrective-action commitments from management. Specifically, look at planned and current labor loading by role and month; average relevant experience by role; turnover; and stakeholder interactions.
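As a sketch of guidelines 1 and 2, the fragment below shifts a 50%-confidence “to complete” estimate to the 80% confidence level. It assumes a lognormal effort distribution with an assumed coefficient of variation; both are common estimation assumptions used here for illustration, not ExcelerPlan’s internal model.

```python
# Minimal sketch, assuming effort is lognormally distributed. Sunk costs are
# excluded: the input is already a "to complete" figure (guideline 2).
from math import exp, log, sqrt
from statistics import NormalDist

def rebaseline(p50_to_complete_hours: float, cv: float = 0.4,
               confidence: float = 0.80) -> float:
    """Return the to-complete estimate at the requested confidence level."""
    sigma = sqrt(log(1 + cv ** 2))        # lognormal shape from the CV
    z = NormalDist().inv_cdf(confidence)  # ~0.84 for the 80% level
    return p50_to_complete_hours * exp(sigma * z)

# Example: a 10,000-hour 50%-confidence estimate becomes ~13,800 hours at 80%.
print(round(rebaseline(10_000)))
```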
Of course, the best recovery strategy is to avoid project failures in the first place. Next month, we’ll discuss 10 best practices that help ensure successful projects.
William@portal.level4ventures.com
ExcelerSize: This month we continue our discussion of ExcelerSize, a Level 4 proprietary high-level object (HLO) catalog set designed to size IT projects, excluding purchased other direct charges (ODCs) such as hardware and software licenses (which are covered by separate models). You’ll recall that the catalog elements are grouped into five major categories:
- Project level sizing components that apply to the entire project.
- Application software sizing components.
- Data conversion sizing components.
- Data warehouse sizing components.
- Application support sizing components.
This month we’ll discuss the ExcelerSize object modifiers: Quantity, Area, Complexity, Work, and Uncertainty. Of course, you may not always know the right selection for all of these values for every component of your estimate. ExcelerPlan allows you to select “Unknown,” in which case it assumes Average but sets the internal variance for that item quite large.
Quantity is the quantity of catalog items (e.g., the quantity of reports). It may be a negative number, for example if you are estimating the impact of a change request that removes scope from a project. On very large projects, you may add supporting worksheets to ExcelerPlan and use formulas to calculate the quantity, often using Excel functions such as SUM, SUMIF, or SUMIFS. It’s also possible to use PERT-style supporting worksheets with values for Best, Expected, and Worst case scenarios. The entered quantity would then be the mean, calculated as:
(Best + (4 * Expected) + Worst) / 6.
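For example, here is a minimal sketch of such a PERT-style supporting worksheet in Python; in practice this would be an Excel worksheet feeding the Quantity field, and the row labels and sample values are illustrative.

```python
# Each row holds Best / Expected / Worst counts for one group of catalog
# items; the entered Quantity is the sum of the PERT means.

def pert_mean(best: float, expected: float, worst: float) -> float:
    return (best + 4 * expected + worst) / 6

rows = [            # (best, expected, worst) -- sample values
    (8, 10, 18),    # e.g., reports
    (3, 5, 10),     # e.g., interfaces
]
quantity = sum(pert_mean(b, e, w) for b, e, w in rows)
print(quantity)  # 16.5 -> entered as the Quantity for this component
```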
Next month we’ll discuss the use of Area as a modifier.
[This guest column reprinted with permission from: “Keys to Success: Software Measurement, Software Estimating, Software Quality,” Capers Jones, April 2014. Full article available at http://Namcookanalytics.com]
[Editor: This column extracts and summarizes points from the above presentation.]
Software measurement, estimation, and quality form a “tripod.” Attempting to improve one leg without improving the other two leads to imbalance and the structure will topple. Good measurements lead to good estimation. Good measurements and good estimation lead to good quality. Good quality leads to lower costs, shorter schedules, and more accurate estimates.
Automated estimation tools have been available since at least 1973. Manual estimates work well below 250 function points, while automated estimates are more accurate above 250 function points; for projects above 5,000 function points, manual estimates are dangerous. Automated estimates are usually accurate or conservative, whereas manual estimates grow more optimistic as size increases.
Capers
Dear Tabby:
My boyfriend always wants me to go faster and faster. I keep telling him that some things are impossible, but he just can’t seem to wait.
signed, Faster in Fresno
Dear Faster:
There is significant empirical evidence that trying to accelerate the pace of delivery for information technology projects to less than 75% of the optimum duration doesn’t work. In fact, researchers call the region beyond that 75% line the “impossible region,” because project success rates there drop to zero. For example, if the optimum schedule for your project is 12 months, don’t commit to anything under nine. There are ways to accelerate projects beyond this limit, but they all involve fundamental changes to the project itself, not simply throwing more resources at the effort. For example, you can reduce scope, increase reuse, or split the project into multiple smaller independent projects.
signed, Tabby
Task Overlap
Version 7.3 (shipping October) adds an environmental factor called “Overlap” that allows you to add either 10% or 25% overlap to dependent tasks in your WBS. Crashing the project in this way reduces delivery time at the expense of lower efficiency (primarily because of rework) and somewhat higher defect levels. The actual amount of schedule compression will vary based on the particular activities in your plan, up to the specified overlap percentage.
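As a rough illustration of how overlap compresses a chain of dependent tasks (a simplified sketch of the general technique, with assumed durations; it is not ExcelerPlan’s internal scheduling algorithm):

```python
# Simplified sketch: in a finish-to-start chain, an overlap fraction lets each
# successor start before its predecessor finishes.

def chain_duration(durations: list[float], overlap: float = 0.0) -> float:
    """Total duration when each successor starts overlap * predecessor early."""
    total = durations[0]
    for prev, cur in zip(durations, durations[1:]):
        total += cur - overlap * prev  # successor starts early by the overlap
    return total

tasks = [20, 30, 25, 15]            # working days, sample values
print(chain_duration(tasks))        # 90 days with no overlap
print(chain_duration(tasks, 0.25))  # 71.25 days with 25% overlap
```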