11 August 2015

Capture Requirements with Use Cases

The Use Case Principle

The main idea behind use cases for requirements capture is that all useful system functions have an impact at the interface between the system and the outside world. If no one will ever notice whether a function is present or missing, there's no need to implement it.

Every component, every circuit, every line of code must be able to justify its existence by serving some aspect of at least one interaction with the outside world; that is, it must help to realize a use case.

A use case model treats the system as a "black box"; it's completely described by its externally evident behaviour. Whether the product is a rail line or a spreadsheet, a clear understanding of the use cases tells you what needs to be achieved.
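
To make the black-box idea concrete, here's a minimal sketch in Python (the structure and the ATM example are mine, not any formal notation): a use case recorded purely as externally visible interaction, with nothing about the system's internals.

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    actor: str          # who interacts with the system
    goal: str           # the outcome the actor cares about
    steps: list[str] = field(default_factory=list)  # observable exchanges only

withdraw_cash = UseCase(
    actor="Account holder",
    goal="Walk away with cash in hand",
    steps=[
        "Actor inserts card and enters PIN",
        "System offers withdrawal amounts",
        "Actor picks an amount",
        "System dispenses cash and returns the card",
    ],
)

# The justification test from the principle above: every component, circuit,
# and line of code must serve at least one step of at least one use case.
```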

27 July 2015

The Value of Project Manager Certification

A question I've put to a number of PMI and Prince2 people has gone unanswered. 

Is there any quantitative evidence that using a certified PM improves the likelihood that a project's actual performance will meet or beat the original plan?

28 August 2013

SIP math and Monte Carlo Simulation

Thinking about the contrast between SIP math and MCS.

SIPmath has its roots in Monte Carlo Simulation, but the implementation is different. MCS munges generation, data, and use together. SIPmath extracts the data part and puts it in a SIP where, being pure data, it can be cataloged and passed around as easily as you attach a picture to an email. How it's generated and how it's used are separate concerns.

Under ideal conditions, where there's lots of data, each value in a SIP is valid because it has actually happened, and the frequencies in the SIP match reality. That is, a well-formed SIP is correct by construction. Since the rest is simple arithmetic, avoiding implementation errors is straightforward, and so is independent validation.

MCS with stratified sampling and SIPmath are the same except for where in the workflow the samples are taken.

On the other hand, MCS, which generates random values from a curve that approximates the data, is approximate by construction. We can only hope to get close to the fidelity that comes effortlessly in a SIP composed from history.
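
A rough sketch of the contrast in Python with NumPy (the numbers are invented and the variable names are mine): a SIP is just an array of values that actually happened, SIP math is trial-by-trial arithmetic on it, and MCS instead fits a curve to the data and draws fresh random values from the fit.

```python
import numpy as np

rng = np.random.default_rng(42)

# SIP style: the distribution IS the data. Two SIPs of task costs taken
# straight from history; each index is one coherent possible future.
cost_a = np.array([10, 12, 11, 15, 10, 13, 11, 18, 12, 14], dtype=float)
cost_b = np.array([5, 7, 6, 9, 5, 8, 6, 12, 7, 9], dtype=float)

total = cost_a + cost_b                # SIP math: plain elementwise arithmetic
print(total.mean(), np.percentile(total, 90))

# MCS style: fit a curve that approximates the data, then generate from it.
mu, sigma = np.log(cost_a).mean(), np.log(cost_a).std()
approx_a = rng.lognormal(mu, sigma, size=10_000)  # approximate by construction
```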

07 July 2013

ProbabilityManagement.org News

ProbabilityManagement.org has been building some serious big-name support. The list includes Chevron Corporation, Computer Law LLC, Foundation for Creativity in Dispute Resolution, General Electric, Lockheed Martin, Ortec Consulting Group, and Wells Fargo Bank. We've also been hard at work building demos and tools.

PM.org had an inaugural meeting in San Diego, hosted by Harry Markowitz, and started the ball rolling on an XML standard for SLURPs and SIPs. I mention this because I'm chairman of the standards committee. If you would like to be involved, let me know. Right now we've got an internal review in progress, but it will soon be released for public feedback.

There are now a lot of good tutorials at sipmath.org, and the list of extra goodies available to members is growing nicely.

To make sure we stay busy, we're starting up a consultancy in Toronto. To go with it, we've fired up a Google+ Community, Applied Probability Management, focused on the implementation side of the probability management discipline. Come see us at http://goo.gl/GIkMz.

If you haven't done it already, I urge you to go to sipmath.org, sign up as a member, and get involved. Feedback on and reviews of the tools we've been building would be appreciated.

08 June 2013

The Flaw of Expected Values

No matter how well-managed they are, projects tend to finish late and over budget. We keep doing things to correct this problem, but project failure rates have remained constant for decades.

It turns out that one of the reasons, perhaps the principal reason, is that the math we use for estimating project cost and duration is fatally flawed; it gives us consistently optimistic estimates.

The fatal flaw is the Flaw of Averages, eloquently described in Sam Savage's book of the same name.

In project planning and estimating terms, that's the Flaw of Expected Values.

09 March 2013

The Expected Finish Isn't

Conventional planning tools produce one or more expected values -- expected finish, expected cost.

"Expected value" is also known as the average or mean. But, an average over what? An average assumes a bunch of things whose values can be added up. Average time or average cost implies a large number of activities whose cost and duration can be averaged.

It also implies that the activities that finish early and below budget will provide the savings to underwrite the activities that finish late and over budget.

More generally, if the calculation of expected value is a valid calculation, the sum of the actual costs of a large number of activities should be close to the sum of their expected costs. Is this what happens in the real world?

Silly question -- it doesn't. Relative to expected values, task and project finishes range from a little early to a lot late, slightly under-budget to major overrun. The sum of the actuals is inevitably greater than the sum of the averages.
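
One mechanism behind this is easy to demonstrate. Here's a small Python simulation (my illustration, with made-up numbers): at a merge point the project waits for the slowest of several parallel paths, and the average of a maximum is reliably bigger than the maximum of the averages.

```python
import numpy as np

rng = np.random.default_rng(7)
trials = 100_000

# Three parallel task durations, right-skewed: a little early, a lot late.
paths = rng.lognormal(mean=np.log(20), sigma=0.4, size=(trials, 3))

plan_from_means = paths.mean(axis=0).max()  # the plan's "expected" finish
actual_mean = paths.max(axis=1).mean()      # the mean of the actual finishes

print(f"finish from expected durations: {plan_from_means:.1f} days")
print(f"mean of actual finishes:        {actual_mean:.1f} days")  # reliably larger
```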

A sure sign of insanity is doing the same thing over and over again expecting a different result.

15 December 2012

SDXL Version 1.0

I've wrapped up a new release of SDXL. There's a bunch of graphics-related goodies in it (including my preferred histogram), as well as a bit of a cleanup. And a new Reference Manual.

I talk about it in the Google+ SDXL Community discussion, and the files are, as usual, downloadable from smpro.ca.

24 November 2012

Calculating Uncertainty

It took me a lot longer than I thought it would to write this paper. I wanted a gentle introduction to simulation with SIPs, and it turned out that gentle is not easy. So it's 20 pages with lots of examples and charts, and two Excel workbooks to go along with the text. The workbooks aren't necessary, but they help.

In many ways, the essence of Probability Management is how to do probabilities by counting stuff, and having a computer do the counting. This monograph focuses on that.
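
As a taste of what the monograph does at length, here's the counting idea in a few lines of Python (the SIP here is generated, purely for illustration): the chance of blowing the budget is the count of trials that blow it, divided by the number of trials.

```python
import numpy as np

# A SIP of total project costs, one value per trial (invented for the demo).
rng = np.random.default_rng(1)
cost_sip = rng.lognormal(mean=np.log(100), sigma=0.3, size=10_000)

budget = 120.0
p_over = np.count_nonzero(cost_sip > budget) / cost_sip.size
print(f"P(cost > {budget:.0f}) = {p_over:.1%}")  # a probability, by counting
```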

Now available as a paperback and as a Kindle e-book.

PDF format and Excel workbooks:

20 November 2012

Risk = PxI is wrong

You're estimating a project.

Let's say you have a risk event with a 25% chance of happening. If it does, it will add $100,000 to the cost of a particular task. You'll resist the temptation to just add $25,000 to the task cost, because that's not what happens in the real world. It's one project, not a million transactions, so the average is invalid. In each possible future, it's $100,000 or nothing.

It's possible that downstream events would be triggered by the $100,000 while $25,000 would fly under the radar. And looking at the range of possible project costs, the $25,000 version makes the high outcomes $75,000 too low and the low outcomes $25,000 too high.
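
The fix is to carry the risk as an event in every possible future, not as a $25,000 smear. A sketch in Python (the $400,000 base cost is my invention; the risk numbers are from above):

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 10_000

base_cost = np.full(trials, 400_000.0)  # hypothetical base task cost
event = rng.random(trials) < 0.25       # the event fires in 25% of futures

with_event = base_cost + np.where(event, 100_000.0, 0.0)
with_pxi = base_cost + 25_000.0         # the Probability x Impact shortcut

for name, cost in (("event model", with_event), ("P x I model", with_pxi)):
    print(name, np.percentile(cost, [10, 90]))
# The P x I model says 425,000 everywhere: $25,000 high at the low end and
# $75,000 low at the high end, exactly the distortion described above.
```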

So don’t use Probability x Impact. Ever.

08 November 2012

The Art of the SIP

Sam Savage has put another brick in the wall with Distribution Processing and the Arithmetic of Uncertainty, an article in the ORMS Analytics Magazine (2012 Nov-Dec).

The article expands on the concept of SIPs (Stochastic Information Packets) as packaged uncertainty. It shows how to use SIP math and raw Excel to do Monte Carlo Simulation "without the Monte Carlo."

He also introduces SIPmath, an Excel add-in that simplifies building models that use SIP math. Once the model is built, the add-in is no longer needed and the simulation can run without it.

Probability Management is on a roll. Read the article and then go to sipmath.com to learn more.

10 September 2012

The Underestimation Double-Whammy

The main thing we're trying to fix with Probability Management for projects is that conventional tools and techniques give us wrong estimates, and the errors are all one-sided; they consistently underestimate project cost and duration.

Underestimating resources makes it more likely that a project will be approved, and makes it more likely that it will fail. That's a double-whammy that results in more failed projects.

04 September 2012

Just Fix The Math

You see, there's this mystery: spend a few minutes with Google and you can get a long list of the things that cause projects to fail; we know what they are and how to deal with them. To that easily accessed tradecraft, add the fact that institutions like PMI are certifying over 50,000 project managers a year. Project failure rates should be plummeting. But, for any given industry, failure rates have remained unchanged for decades. This leaves one thing to fix: the math.

Sam Savage has shown us what the problem is, and pointed us at the solution. The Art of the Plan includes my attempt at fixing the problem in project planning.

31 August 2012

The Only Good Risk Register Is An Empty Risk Register

By Mark Powell

Have you ever seen a risk register with 500 or more risks on it? It seems that these days a lot of projects have huge risk registers. How does this happen?

Most people believe that this is natural for a large and complex project.

A good friend recently described a proposal for the California High Speed Train that would go from San Diego to San Francisco and Sacramento. His pre-project draft risk register covered everything from track, signals, routes, and station interchanges to software, train sets, health and safety, and environment, and it was huge. Well, that, of course, is no surprise; it is one big, complex project!

18 August 2012

The Book is Done

It's taken way longer than I thought it would, but I've finally got The Art of the Plan written and published. The e-book version is available from Smashwords in all the useful formats.

The printed version is still in process. I'm guessing early September for release.

The book covers most of the topics I've been writing about in The Art of the Plan blog – from identifying crystal-clear objectives and requirements through to modeling and simulation using Probability Management techniques to produce realistic project plans. There's an Excel workbook loaded with examples to go with it and, of course, it uses SDXL.

13 July 2012

Benefits Realization

Benefits realization – building on (un)safe foundations or planning for success?

Here's a really good article on closing the gap between project predictions and realization. Jenner covers the well-known sources of error and misrepresentation. Unlike other writers on this topic, he doesn't just wring his hands but responds with well-thought-out prescriptions.

His prescriptions include effective planning (start with benefits and requirements, design the solution later), science (seek disconfirming evidence), Reference Class Analysis, and Probability Management (distributions rather than point forecasts).

In short, this is an article I wish I had written.