How Large an Estimation Effort: There are many situations where it’s useful to estimate the time required to prepare an estimate. Perhaps you need to budget the time you’ll spend preparing an estimate, or to estimate the staff required to perform estimation activities for the organization. The input to this calculation is the project’s size in dollars. Divide by a constant (I use 75); raise the result to a power (I use 0.4); then multiply by a second constant (I use 0.85). This simple power function gives the estimated total hours. Of those hours, 65% will be spent by the estimator, 30% by the subject matter expert(s), and 5% by administrative support. To determine the total effort required for staffing purposes, apply this formula to the expected volume of estimates, segmented by size.
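If you want to drop this calculation into a script or spreadsheet, here is a minimal Python sketch of the power function exactly as described above (the function and parameter names are my own, not part of any product):

    def estimation_effort_hours(project_size_dollars, divisor=75.0,
                                exponent=0.4, multiplier=0.85):
        """Hours needed to prepare an estimate, per the power function above."""
        total = multiplier * (project_size_dollars / divisor) ** exponent
        return {
            "total": total,
            "estimator": total * 0.65,               # 65% estimator
            "subject_matter_experts": total * 0.30,  # 30% SME(s)
            "administrative": total * 0.05,          # 5% admin support
        }

    # Example: a $2,000,000 project works out to roughly 50 hours total.
    print(estimation_effort_hours(2_000_000))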
Version 7.1 of ExcelerPlan shipped on schedule during the first week of April. This version adds an Expert Assistance capability with an integrated estimation workflow, an Input Review process that automatically detects estimate problems, improved algorithms and data, much more detailed mappings between roles and the WBS, requirements, and other project elements, plus additional capabilities.
All times are Pacific Time!
Free WebEx
Estimating with ExcelerPlan
4/16, 11:00 AM-12:30 PM
4/20, 11:00 AM-12:30 PM
4/30, 11:00 AM-12:30 PM
5/14, 11:00 AM-12:30 PM
5/28, 11:00 AM-12:30 PM
6/11, 11:00 AM-12:30 PM
6/25, 11:00 AM-12:30 PM
To register for demos, email:
Edward@portal.level4ventures.com
Three Day Estimation Training: We’ll be offering a three-day estimation training class taught by William Roetzheim via WebEx on 5/5, 5/6, and 5/7. Class runs from 9 AM – 1 PM Pacific Time each day, and the $1,495 fee includes a 6-month license to ExcelerPlan to use as a learning tool. (New customers only.)
To inquire about registration, email:
Edward@portal.level4ventures.com
ExcelerPlan Next Release
We just released version 7.1 of ExcelerPlan, and we’re already hard at work on version 7.2 (scheduled for release in July). Version 7.2 will add a flexible export capability that supports electronic export to Clarity, Microsoft Project, XML, CSV, and Excel. If you’re interested in beta testing version 7.2, let me know. We’re especially interested in beta testers for the interfaces to Clarity and Microsoft Project.
Edward
Director of Sales and Marketing
Sign up for the three-day live WebEx class taught by William Roetzheim, 5/5-5/7, and receive a free 6-month license to ExcelerPlan with your $1,495 class fee. (New customers only.)
Traceability Applied to Cost Estimation: Suppose you go to the store with a grocery list, but there are no prices on the items on the shelf. You fill your cart with what you need, the cashier scans everything without any prices showing up, and you are then told the total cost. At that point you can either accept that total or reject it and repeat the whole process at a different store. How long would you put up with this approach to shopping? Yet isn’t that exactly what happens with your information technology estimation process? You define a set of requirements (the items in your cart), and you’re then told the total cost; your options are to take it or leave it. You may have been told that this is your only alternative, but I assure you it is not. It is completely possible to provide traceability between the elements of project cost and the requirements driving that cost. You can then examine the cost of each individual requirement and determine whether that cost is justified by the business value generated. To give some indication of the resulting value: while working with a Fortune 50 company and implementing this estimation traceability concept, we found that managers reduced project costs by 30% with no encouragement beyond being given the information needed to take the value proposition analysis down to the requirement level.
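The mechanics are straightforward once each requirement carries its own cost. Here is a hypothetical Python illustration (the requirements, costs, and values are invented for the example):

    # Invented example data: each requirement carries a traceable cost
    # and an estimate of the business value it generates.
    requirements = [
        {"id": "REQ-001", "cost": 40_000, "business_value": 120_000},
        {"id": "REQ-002", "cost": 75_000, "business_value": 30_000},
        {"id": "REQ-003", "cost": 15_000, "business_value": 15_000},
    ]

    for req in requirements:
        verdict = "keep" if req["business_value"] > req["cost"] else "reconsider"
        print(f"{req['id']}: cost ${req['cost']:,}, "
              f"value ${req['business_value']:,} -> {verdict}")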
William@portal.level4ventures.com
ExcelerSize: Last month we began discussing ExcelerSize, a Level 4 proprietary high level object (HLO) catalog set designed to size IT projects excluding purchased other direct charges (ODCs) such as hardware and software licenses (which are covered using separate models). You’ll recall that the catalog elements are grouped into five major categories:
- Project level sizing components that apply to the entire project.
- Application software sizing components.
- Data conversion sizing components.
- Data warehouse sizing components.
- Application support sizing components.
This month we’ll discuss Commercial Off-The-Shelf (COTS) software and Frameworks. It’s relatively rare today for a large IT project to involve complete custom development starting from scratch. A much more common scenario is to purchase existing software that is close to what you need, then spend the effort required to configure and extend that software to fully meet your requirements. Extensions are covered as part of the application software estimation components discussed next month, but the core, purchased software is covered using one of COTS Application, COTS Module, or Framework (discussed below). This software estimation component is then treated as one line item with a purchase cost (ODC), a maintenance and support cost (ODC Maintenance), and reuse adjustments to represent configuration and testing.
A COTS Application is a complete, stand-alone purchased application. A COTS Module is a fully functional specialized component (e.g., a Human Resources (HR) module). A Framework is a collection of styles, capabilities, or components that are then used to build and deploy an application. COTS Applications and COTS Modules are normally purchased, configured, enhanced/extended, and tested. Frameworks may be purchased or built.
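To make the line-item treatment concrete, here is a rough sketch of how such a component might be represented in code (the class and field names are illustrative, not ExcelerSize terminology):

    from dataclasses import dataclass

    @dataclass
    class PurchasedSoftwareItem:
        """One COTS Application, COTS Module, or Framework line item."""
        name: str
        kind: str                  # "COTS Application", "COTS Module", or "Framework"
        purchase_cost: float       # ODC
        annual_maintenance: float  # ODC Maintenance
        reuse_adjustment: float    # share of new-build effort remaining for
                                   # configuration and testing

    hr = PurchasedSoftwareItem("HR Module", "COTS Module",
                               purchase_cost=250_000,
                               annual_maintenance=50_000,
                               reuse_adjustment=0.25)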
Next month we’ll continue this discussion, talking about application software sizing components.
[This guest column is reprinted with permission from: “MINIMIZING THE RISK OF LITIGATION: PROBLEMS NOTED IN BREACH OF CONTRACT LITIGATION,” Capers Jones, July 2014. The full article is available at http://Namcookanalytics.com]
[Editor: This column continues the discussion of litigation risk factors from last month’s edition.]
Problem 4: Poor Quality Control
It is dismaying to observe that one of the most effective technologies in all of software is never used on projects that turn out to be disasters and end up in court. Formal design and code inspections have a 50-year history of successful deployment on large and complex software systems. All “best in class” software producers utilize software inspections. The measured defect removal efficiency of inspections is more than twice that of most forms of software testing (about 65% for inspections versus 30% for most kinds of testing).
Effective software quality control is the single most important factor separating successful projects from delays and disasters. The reason is that finding and fixing bugs is the most expensive cost element for large systems, and takes more time than any other activity.
Successful quality control involves defect prevention, defect removal, and defect measurement activities. The phrase “defect prevention” includes all activities that minimize the probability of creating an error or defect in the first place. Examples of defect prevention activities include the Six-Sigma approach, joint application design (JAD) for gathering requirements, using formal design methods, usage of structured coding techniques, and usage of libraries of proven reusable material.
The phrase “defect removal” includes all activities that can find errors or defects in any kind of deliverable. Examples of defect removal activities include requirements inspections, design inspections, document inspections, code inspections, automated static analysis of code, complexity analysis, and all kinds of testing.
Some methods can operate in both the defect prevention and defect removal domains simultaneously. The most notable example of a method that is effective in both roles is formal design and code inspections. Inspections are the top-ranked defect removal method in terms of efficiency, and participation in formal inspections is also one of the top methods for defect prevention: after taking part in several design and code inspections, participants spontaneously avoid the kinds of problems they encountered. The net effect of inspections in terms of defect prevention is a reduction of about 50% in potential defects…
Unsuccessful projects typically omit design and code inspections and static analysis, and depend purely on testing. The omission of up-front inspections causes four serious problems: 1) the large number of defects still present when testing begins slows the project to a standstill; 2) the “bad fix” injection rate for projects without inspections is alarmingly high; 3) the overall defect removal efficiency associated with testing alone is not sufficient to achieve defect removal rates higher than about 85%; 4) applications that bypass both inspections and static analysis have a strong tendency to include error-prone modules.
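[Editor: The arithmetic behind those percentages is easy to check. If we assume, simplistically, that each removal stage acts independently on the defects surviving the previous stages, cumulative efficiency is 1 minus the product of each stage’s miss rate, as this Python sketch shows:]

    def cumulative_removal_efficiency(stage_efficiencies):
        """Overall fraction of defects removed by a series of stages,
        assuming each stage acts independently on surviving defects."""
        remaining = 1.0
        for e in stage_efficiencies:
            remaining *= 1.0 - e
        return 1.0 - remaining

    # Testing only: six test stages at ~30% each remove about 88%.
    print(cumulative_removal_efficiency([0.30] * 6))                 # ~0.88

    # Add design and code inspections at ~65% each before the same testing.
    print(cumulative_removal_efficiency([0.65, 0.65] + [0.30] * 6))  # ~0.99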
Next month: Poor Software Milestone Tracking
Capers
Dear Tabby:
I was struck a glancing blow by a meteor, and now I have no memory of things before that time. I’d like to make a career as an estimator, but I don’t have any history. Is it possible to estimate without any history?
signed, Amnesiac in Atlanta
Dear AA:
A lot of people think that formal estimation requires good, solid historical data. If that were the case, no company would ever embark on an estimation process improvement initiative. Any estimation model, when delivered, has a built-in error (bias). That bias may be big or small, but it will always be present. If you use an off-the-shelf solution without configuration and calibration, you can expect the bias to be large. If you configure and calibrate against benchmark data gathered from organizations similar to your own, the bias will be small. If you calibrate against solid historical data from your own organization, the bias will be very small. But whether the bias is large or small, as you gather actual data going forward you can adjust the calibration to remove it based on observed results.
signed, Tabby
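[Editor: For readers wondering what “removing the bias” can look like in practice, here is a minimal sketch using a simple multiplicative correction. This is one common approach, not the only one, and the project numbers are invented:]

    from statistics import geometric_mean

    def calibration_factor(estimated_hours, actual_hours):
        """Multiplicative bias correction: the geometric mean of the
        actual/estimated ratios across completed projects."""
        ratios = [a / e for e, a in zip(estimated_hours, actual_hours)]
        return geometric_mean(ratios)

    # Invented history: three completed projects.
    factor = calibration_factor(estimated_hours=[1000, 2500, 400],
                                actual_hours=[1300, 2900, 520])
    print(f"Multiply future estimates by {factor:.2f}")  # ~1.25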
Input Review:
In February we discussed the ExcelerCost wizard, which works much like TurboTax, using a combination of interview questions and tailored data collection forms to gather the data necessary to create your estimate automatically. Another parallel with TurboTax is the Input Review capability built into ExcelerPlan: an expert-level analysis of your estimate that generates a sorted list of errors and warnings. This information serves as one more quality check to ensure that the estimate you are creating is as accurate as possible.
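To give a feel for the kind of checks involved, here is a hypothetical sketch of rule-based input review; the rules below are invented for illustration and are not ExcelerPlan’s actual rule set:

    def review_estimate(inputs):
        """Apply simple validation rules, returning findings with errors
        sorted ahead of warnings. The rules are invented examples."""
        findings = []
        if not inputs.get("requirements_count"):
            findings.append(("error", "No requirements entered; size cannot be derived"))
        if inputs.get("schedule_months", 0) < 1:
            findings.append(("error", "Schedule missing or implausibly short"))
        if inputs.get("productivity_factor", 1.0) > 2.0:
            findings.append(("warning", "Productivity factor unusually high"))
        return sorted(findings, key=lambda f: 0 if f[0] == "error" else 1)

    for severity, message in review_estimate({"productivity_factor": 2.5}):
        print(severity.upper(), "-", message)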