Friday, December 12, 2008

Can a project perform PPQA on itself?

Is it possible to use the project team to achieve PPQA SG1?
We have an idea to use the project team to assure the process using checklists and peer reviews, so QA just audits work products AND assures that the project team does the checklists or peer reviews.
Is it enough for PPQA SG1?
If you've been following my posts you'll know that I'm not one to say that anything is absolute within the CMMI - local tailoring to accommodate culture and local norms is very important. So, I never say never . . . but . . . .
The PPQA PA specifically calls out the need to "objectively" evaluate processes AND the use of those processes.  It doesn't say "independent," though it would be uncommon for an organization to be mature and open enough, when first implementing PPQA, to perform this objectively on itself.  Many Lead Appraisers insist on "independence" to prove objectivity.  This isn't always correct though.
The other problem is that PPQA is the "eyes and ears" of the process deployment effort.  Its job is NOT to be the "process police;" it's to uncover problems with the use of the process and make sure that information gets into the hands of the right people and is used appropriately.
One idea I advocate is a "rotating" PPQA responsibility.  PMs and Engineers from other projects serve for a period of time in a PPQA role (say 25% of their time for six months).  This has the hidden benefit of gaining "buy-in" from people who may not be fully engaged, and creates process evangelists. 

What is the minimum number of projects that need to be appraised?

Is there a minimum requirement for the number of projects that must be appraised for a L3 rating?  I was told that at a minimum 3 projects from the organization must be appraised for all SPs and all the SPs should have been implemented across all the 3 projects. Is that true?
Not exactly.  This seems like an easy question to answer, but like everything else related to the CMMI, it's more complex than it appears.  The SCAMPI MDD (and LA's guide for conducting appraisals) says that:
"In appraisals where the reference model scope includes any project-related PA, the organizational scope must include at least one focus project. If the organizational unit includes more than 3 projects, then the organizational scope must include sufficient focus projects and non-focus projects to generate at least 3 instances of each practice in each project-related PA in the model scope of the appraisal."
It then goes on to say:
- Focus projects must provide objective evidence for every PA within the model scope of the appraisal which addresses model practices applicable to those projects (this includes not only the "SPs" but the applicable "GPs" as well!)
- Non-focus projects must provide objective evidence for one or more PAs within the model scope of the appraisal which address practices performed on projects. 
- Support functions must provide objective evidence for their functional areas (for instance, the process audit function, SEPG, management, etc.).
So, my reading of the MDD is that if you have three or more active projects, then you must come up with evidence for AT LEAST three instantiations of each practice across the focus and non-focus projects and at least ONE of them has to be a focus project (providing evidence for all PAs).  In this example, you could have a minimum of three focus projects, or you could have one focus project and ten (for example) non-focus projects, as long as you have three examples of evidence for each practice.
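To make that reading of the MDD concrete, here is a minimal sketch of the sampling check. The project names and PA list are hypothetical, and it treats PA coverage as a simple proxy for per-practice coverage, so it's an illustration of the rule, not an appraisal tool:

```python
# Sketch of the SCAMPI MDD sampling rule quoted above: when the OU has
# more than 3 projects, each project-related PA needs at least 3
# instances across the sample, and at least one focus project must
# cover every PA in scope. Names and the PA list are hypothetical.

def sample_is_sufficient(projects, project_pa_scope, ou_project_count):
    """projects: dict of name -> {"focus": bool, "pas": set of PAs evidenced}."""
    # At least one focus project is always required
    if not any(p["focus"] for p in projects.values()):
        return False
    # Focus projects must provide evidence for every PA in scope
    for p in projects.values():
        if p["focus"] and not project_pa_scope <= p["pas"]:
            return False
    if ou_project_count <= 3:
        return True
    # Larger OU: each PA needs evidence from at least 3 projects
    for pa in project_pa_scope:
        instances = sum(1 for p in projects.values() if pa in p["pas"])
        if instances < 3:
            return False
    return True

scope = {"PP", "PMC", "REQM"}
sample = {
    "Alpha": {"focus": True,  "pas": {"PP", "PMC", "REQM"}},  # focus project
    "Beta":  {"focus": False, "pas": {"PP", "PMC"}},          # non-focus
    "Gamma": {"focus": False, "pas": {"PP", "PMC", "REQM"}},  # non-focus
}
print(sample_is_sufficient(sample, scope, ou_project_count=10))  # False: only 2 REQM instances
```

Adding one more non-focus project that evidences REQM would bring the sample up to three instances of every PA and satisfy the check.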
The MDD doesn't directly speak to what happens if you have only ONE project (I have several clients in this situation).  My interpretation of the rules here is that if you have only one active project, then that one project is a "focus" project and all of the evidence has to come from that single instantiation.
This scenario is unlikely in ML3 though, because you'll also need to add in "support" groups such as SEPG, QA audit, process management, et al and you will end up with more than one "project" to appraise then.
One more wrinkle.  The sample to be appraised is something that the Lead Appraiser and the Sponsor need to agree upon.  I received an RFP once from a DOD contractor that said they had "already determined the appraisal projects" and that there was no opportunity to discuss it.  That's a no-no.  The LA needs to buy into the sample in order to maintain integrity of the appraisal.

Thursday, December 11, 2008

What is the simplest way to satisfy RD SP3.1, "Establish Operational Concepts and Scenarios?"

I have a problem with how to advise SMEs on the simplest, value-added way to implement RD SP3.1, Establish Operational Concepts and Scenarios. In which specific work products can we see the development of operational concepts? Is it OK to say that you can see the operational concepts reflected in the following work products as they are refined: non-functional requirements, design restrictions, system architecture document, installation manual, deployment diagram, package diagram, component diagram?

First let's take a look at why RD.SP3.1 is even in the model.

Sometimes it's difficult to visualize the feasibility or reasonableness of a requirement without putting it in context. This is one reason prototypes are so popular - they literally paint a picture of what the implementation might look like. When they see this, users often say "no, that's not what I meant" or "what would happen if we tried this?" This is a form of requirements validation.

RD.SP3.1 is asking you to validate requirements by putting them into some context.

The simplest and most useful way to do this is with a Use Case, User Story, or Storyboard. They require only a marker and a whiteboard, or software like Visio, to implement. They're extremely valuable at putting requirements in context, and they're low cost, low effort.

Developing a prototype for a more complex requirement, or a proof-of-concept (a "spike") is another way, albeit more costly, to drive out pesky requirements defects.

The other artifacts you've listed seem more like they belong in the Technical Data Package (design documents from TS) rather than in RD, with the exception of non-functional requirements, which need to have RD SP3.1 performed against them to be completed.

Wednesday, December 10, 2008

Should process improvement tools be represented in our software process improvement plan?

Our software group is currently revising the Software Process Improvement Plan to cover CY2009 process activities. For the last few years the Software Process Improvement Plan (SPIP) only concentrated on process definition work; however, for next year the team has identified tools and applications that will work together with our current processes to improve productivity. The million-dollar question is: can we still label this document the "Software Process Improvement Plan" (SPIP) even though it contains a lot of tool improvement information? The team prefers to maintain a single document rather than having to maintain multiple documents. What is the industry norm for managing these types of elements (Process + Tool)?

Anyone who has worked with me knows that I scoff at the idea of implementing tools when a process improvement program starts up. Too many companies see a tool as their salvation and end up creating a monster that helps them do the wrong thing – only faster.


We had this experience in the Broadsword Labs once when our sales staff, without consulting us, decided they needed a new “tool” for expense processing, went out and bought one, slammed it in, and it wreaked havoc with the client base because . . . you guessed it, they didn’t have a solid process behind it. The clients started seeing all these new forms and reports, no one knew how to use the system, there were errors in the expense reports, it was confusing what the roles were . . . sounds like some GPs, doesn't it?

It’s natural for a company to start to see opportunities where a tool could indeed be helpful after they've deployed some stable processes. Early in the "process improvement" cycle these tend to be tools related to REQM and CM, then later MA and estimating, and then finally life-cycle management tools like Borland’s CALIBER product line.

I would expect a ML3 company to implement some tools, and to manage that implementation in the SPIP. This is an absolute MUST for ML5 (OID) where piloting and deploying “innovations” is spelled out in the SPs.

However you do it, the deployment of tools should be governed in your SPIP and under the guidelines spelled out in OPF.

Friday, November 28, 2008

We can't seem to get a clear description of what "Agile" is. Can you help?

We see a lot of discussion on Agile development, and sometimes fanatics get involved and add a lot of fog to the conversations. Other than a vague (and contradictory) 'Agile manifesto', is there a definition that can be used to base opinions on, or will this 'tag' remain a myth to most of us? I, for one, believe that 'Agile' is a qualitative word that should not be used to represent a methodology.

There isn't enough space to write an entire treatise on this, but here is the nitty-gritty.

"Agile" is a generic term that refers to a loose set of methodologies that include Scrum, XP, Feature Driven Development, and other iterative AND incremental methods. You are correct in that it is not a methodology in itself.

You often hear very loose descriptors like "trust," "iterative," "low documentation," etc. used to describe "Agile," but these are pretty nebulous. To be more precise:

1) Agile methods are iterative, where the complete lifecycle (plan, requirements, design, test, build) happens in short durations, often 30 days or less. These activities are usually not linear, but empirical, and do not always occur in a specific sequence.

2) Agile methods are incremental, where a small set of features is developed, followed by refinement or another set of features. The methods don't espouse the "big bang"; they deliver small bits at a time.

3) Agile methods negotiate scope, not time and budget. A "waterfall project" usually has a fixed scope and a fixed budget. If a major change is requested, the timeline and budget are negotiated. Agile methods EXPECT change, but negotiate each release (or iteration) to contain a certain amount of "features" (or functionality). If they don't get it all done, they execute another release.

4) Agile methods MUST have buy-in from the end-user or customer, because this type of negotiation is not IT or engineering driven, it's driven by the business, so it's not for everybody.

The different Agile methods have other specifics (XP uses Pair Programming as a form of real-time validation; Scrum has "stand-up meetings" and "sprints," for instance) but these are ornamental - not core to the concept of "Agile."

For more information (and for a discussion of CMMI and Agile methods) you can download an SEI Technical Note ("CMMI or Agile: Why Not Embrace Both!") which I co-authored with several other very talented individuals.

Good luck!

Friday, November 21, 2008

How can we be Agile and still satisfy estimation requirements for ML5?

I work in an ML5 organization. We're experimenting with scrum to see what benefits can be gained over our traditional waterfall model. But I'm stuck on the estimating part! When I've done scrum in the past we've done some planning poker and called it a day. But the head of my SEPG tells me that wideband delphi estimation is frowned upon at the high maturity levels because it is akin to guessing. Can you recommend how we can perform agile estimation in a way that's compatible with HM organizations?

So what's wrong with guessing?

I'm not sure your problem is as much about "Scrum" as it is about the inflexibility and lack of experience within your SEPG! The CMMI does not speak to specific estimating methods, and while "planning poker and calling it a day" may not be enough to satisfy the CMMI, there isn't a requirement to perform more "traditional" methods such as SLOC-based estimating, COCOMO, or the like either.

There have been whole books written on high maturity and estimating, so now is not the right time to get into all of the glorious and fun-filled detail but I'll touch on some of the concepts.

As to Wideband Delphi being "akin to guessing," I would say "it depends" (I'm not saying it's a "high-maturity practice" either). To say that is akin to saying that a design that has been put together by an expert software architect, then peer reviewed by other experts, and then prototyped is akin to guessing at a design. It isn't. We call all of those things TS, VER, and VAL, don't we?

But, on to the meat of it.....

The difference between a ML2/3 organization and a ML5 organization isn't in how they do things; it's in the data used to make the decision on how to do things. The essence of ML5 is trying new and innovative techniques (like Scrum, for instance) to improve performance based on data gathered through the execution of the process.

Someone in your organization should be gathering appropriate data, developing baselines (determining the natural bounds of process performance), and performance models using various statistical methods. That data should be used to estimate and plan projects as well as select potential innovations (such as the use of Scrum) for piloting and eventual deployment.
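As a rough illustration of "determining the natural bounds of process performance," here is a minimal XmR (individuals and moving range) chart calculation, one common statistical method for building a baseline. The defect-density numbers are invented for illustration:

```python
# Minimal XmR ("individuals and moving range") baseline sketch: computes
# the natural bounds of process performance from a series of observations.
# The defect-density values below are invented, for illustration only.

def xmr_limits(data):
    """Return (center line, lower natural limit, upper natural limit)."""
    mean = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant for individuals charts
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

defect_density = [4.1, 3.8, 4.5, 4.0, 3.6, 4.2, 3.9, 4.4]  # one value per iteration
center, lnl, unl = xmr_limits(defect_density)
print(f"baseline center {center:.2f}, natural bounds [{lnl:.2f}, {unl:.2f}]")
```

A new observation falling outside those natural bounds would signal a special cause worth investigating; the bounds themselves become the performance baseline fed into estimating and into selecting innovations.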

So, if I were to use historical performance baselines as input into a wide-band delphi process that was intended to "feed" a scrum project, and I was doing so because I had identified Scrum as an innovation that could improve performance (supported by data that told me that phase and cost overruns were a problem), then I would be behaving exactly the way a High Maturity organization is supposed to perform.

Of course, that's a hypothetical situation. You'd have to determine your own thread for getting value from Scrum and ML5, but if you read our latest SEI Technical Note, "CMMI or Agile: Why Not Embrace Both!", you'll learn a lot more about it.

Wednesday, November 12, 2008

The CMMI-Agile Technical Report has been released!

CMMI or Agile: Why Not Embrace Both! has been released by the Software Engineering Institute.

A new SEI technical note has been published. Please visit the following page to download the PDF version of the report.



CMMI® or Agile: Why Not Embrace Both!

Hillel Glazer (Entinex, Inc.)
Jeff Dalton (Broadsword Solutions Corporation)
David Anderson (David J. Anderson & Associates, Inc.)
Mike Konrad
Sandy Shrum




Agile development methods and CMMI (Capability Maturity Model® Integration) best practices are often perceived to be at odds with each other. This report clarifies why the discord need not exist and proposes that CMMI and Agile champions work toward deriving benefit from using both and exploit synergies that have the potential to dramatically improve business performance.


keywords: Agile, Agile methods, CMMI


cover: November 2008

publication date: November 2008

distribution: unlimited

editor: Sandy Shrum

Thursday, October 23, 2008

Can we "self-assess" and declare ourselves ML2?

We are currently working through the ISO 9000 registration process and are confronted with the additional need for CMMI Level 2.  I’m told by some of my peers that all I have to do is “self assess” and declare myself CMMI Level 2.  That seems sort of absurd.  I’ve been told by another friend that we need to self assess and have a SCAMPI from a certified assessor who in turn can issue a certificate.  What is the correct path?
Sure!  I declared myself a millionaire last week . . . .  as long as it's my own private fantasy that's no problem, but if others knew about it they'd start to question my sanity!
A CMMI SCAMPI Appraisal is a rigorous event that can only be conducted by an external licensed SCAMPI Lead Appraiser who is authorized by the SEI.  The appraisal is conducted with a team of people made up of both internal and external resources who must complete both the "Introduction to CMMI" class and the SCAMPI Appraisal Team Training class.
Typically organizations conduct a SCAMPI "C" appraisal to identify the gaps.  This is a "non-binding" event that does not become part of the public record.  Then a SCAMPI "A" is conducted.  The results of the SCAMPI A become public record regardless of the result.
Your ISO9000 prep will help a little - especially in the area of documented processes and policies.  But a CMMI SCAMPI appraisal is much more detailed than an ISO9000 audit, as we seek to verify actual process performance.
Your best bet is to start by contacting a licensed Lead Appraiser and engaging him/her for a consultation to help develop a plan for succeeding.
Good luck!

We think GP2.8 is about reviewing and discussing the process. Do you agree?

We've had a disagreement with our Lead Appraiser on GP2.8.  She says every instance of GP2.8 needs to have a metric.  We think "monitor and control the x process" means day-to-day review and discussion of the process in general.  What say you?

I don’t think I agree with you.  That said, I don't think I agree with your LA either.  I started my CMMI journey about the way you’ve described . . . that GP2.8 is about discussing and reviewing the process.  My thinking on GP2.8 has evolved over the last several years to believe that GP2.8 represents potential “process levers” that can be tuned to improve performance.  I have inquired about this several times with folks at the SEI and the clear and consistent message I have received is YES, measures are appropriate here (but not the only thing that might be used). 
As a proponent and advocate of agile methods I’m not implying that I favor anything “heavy,” but I am saying that a healthy dose of process measures is very useful for any organization. That said, asking for a measure for every one is unreasonable, and it's not required by the CMMI.
I’m not clear on how I would “monitor AND control” something as complex as an engineering process by only  having “day to day” discussions about it.  This can be appropriate for some things, but across the 18 process areas in ML3, there are many opportunities to measure process performance related to productivity and quality (among other things) in these PAs. 
Furthermore, if you delve into OPP (or any of the HM PAs) you realize quickly that it is these very measures that give you the most valuable information about process performance – enabling you to make course-corrections to your process to improve performance.  Lacking that, there is no way to understand what needs to be improved!
M&A is the PA that you would use to manage and execute those GP2.8 measures.  M&A is simply an infrastructure PA, which is why it is described in the model as a “support” PA.  It does not speak at all to “what” to measure, only how to build and execute a measurement infrastructure.  M&A must be “fed” by measures that are the result of process execution, so the other process areas are M&A’s “customers.”
 So, IMHO, I stand by my original assertion – GP2.8 exists to help us understand process performance, and measurement is an effective way to do this.  The trick, and I fully advocate this, is to keep it light and useful.

We are a ML3 company. Can we go directly for Maturity Level Five?

Our company achieved ML3 two years ago.  Can we go directly to Maturity Level Five, and skip ML4?

Let's differentiate between SCAMPI "A" Appraisals and the process improvement work that needs to take place to be "performing at a level of maturity."

You can always skip an appraisal, so the short answer is "you could" go directly to a maturity level five SCAMPI A appraisal.

The longer answer lies in the realization that all of the work in the ML4 PA's (OPP and QPM) will STILL have to happen because CAR and OID (ML5 PAs) depend upon a statistical understanding of process performance that includes Process Performance Baselines and Process Performance Models.

There are examples of companies that have developed robust statistical process control processes, including PPBs and PPMs, and have used them to quantitatively manage projects, but have chosen not to conduct a ML4 SCAMPI Appraisal against those processes.  Then they develop and implement processes for OID and CAR, and eventually conduct a ML5 SCAMPI "A" appraisal.

But I'm not sure why you would want to do this.  It seems like a risky strategy to me - what if you conduct your ML5 appraisal and all of the work you've done for OPP and QPM is inadequate?  What if you misunderstood the very specialized content of the HM process areas?  What if your data set is statistically invalid?

If you're trying to do this to save money on the appraisal, I would argue that the cost of the SCAMPI A is insignificant compared to the effort you are putting towards achieving high maturity performance, and in the grand scheme of things it's not worth the extra risk.

An additional consideration lies in the concept of organizational learning.  Your OU doesn't learn all at once; learning takes place incrementally.  Why not achieve ML4, learn from the data, become experts at SPC for software, and then dive into ML5?  Your chances of getting real value from the PAs are far greater if you take this incremental approach.

Tuesday, October 21, 2008

If you only attend ONE CONFERENCE next year, make it this one!

If you're in the CMMI business, or interested in CMMI in general, you should plan on attending the SEI's SEPG 2009 in San Jose, CA in March of 2009.

The theme is "Perform at a Higher Level"

Upcoming conferences - come hear Jeff speak!

Dear Readers,

Come join me at some of these great conferences where I'll be speaking in the next few months.
Great Lakes Software Excellence Conference

"Encapsulated Process Objects"

Grand Rapids, MI

November 5th, 2008


Agile Development Practices

"Agile CMMI"

Orlando, FL

November 12th, 2008


National Defense Industry Association (NDIA) CMMI User Group

"Encapsulated Process Objects: How Object Orientation and Agility can supercharge your process improvement program"

Denver, CO

November 17th - 20th


 - 2009 -

SEI's SEPG 2009

"MORE Notes from the Blogosphere"

"Agile and CMMI: Why not embrace both"

"Johnson Controls: an Agile CMMI experience report"

San Jose, CA

March 2009

Should we select the staged or continuous representation of the CMMI?

I am gearing up to introduce CMMI to my company. Up until yesterday, I was convinced that I should select CMMI-DEV +IPPD using the Continuous strategy.  I just recently read another train of thought indicating that for software development improvements only, we should use the Staged strategy and add the RM process area to it.
If I choose continuous, will it create more difficulty for me when I begin improving other processes? Is it difficult to match capability levels to maturity levels?  I just don't want to do extra work and waste my company's time.

There is no "right" answer for which representation you should choose.  I primarily work with clients who select the "staged" representation (which results in a "maturity level"). But there are numerous examples of companies that choose the continuous representation (resulting in one or more capability levels by process area).

The decision to choose one or the other depends on the goals of the organization.  If you want to work on improving a large "swath" of your organization - from project management to requirements to engineering - the staged representation is appropriate.  If you need to achieve a maturity level of CMMI (say ML2 or ML3) then staged is also appropriate.  If you desire to pick and choose amongst improvements, say, only work on metrics now, and requirements later, then the continuous representation is appropriate.

If you achieve Capability Level Two in all of the Maturity Level Two process areas, then "conversion" from a set of capability levels to a single maturity level is simple.
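That conversion rule (often called "equivalent staging") can be sketched in a few lines. This is a simplified illustration for ML2 only, assuming the CMMI-DEV v1.2 set of Maturity Level Two process areas; the sample profile is hypothetical:

```python
# Sketch of the "equivalent staging" idea for ML2: a profile of per-PA
# capability levels maps to Maturity Level 2 when every ML2 process
# area has reached at least capability level 2. Simplified illustration;
# the example profile is hypothetical.

ML2_PAS = {"REQM", "PP", "PMC", "SAM", "MA", "PPQA", "CM"}

def equivalent_to_ml2(capability_levels):
    """capability_levels: dict of PA name -> achieved capability level."""
    return all(capability_levels.get(pa, 0) >= 2 for pa in ML2_PAS)

profile = {"REQM": 3, "PP": 2, "PMC": 2, "SAM": 2, "MA": 2, "PPQA": 2, "CM": 2}
print(equivalent_to_ml2(profile))  # True
```

A profile that works on only a few PAs (say, MA now and REQM later) simply won't map to a maturity level until the whole set is covered, which is exactly the trade-off between the two representations.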

I see the staged representation as a "fork-lift" for bringing a software organization up to a higher level of performance.  I see the continuous representation as a jack (like the one for your car) that raises performance for one or more parts of your organization.

The downside of staged is that the scope is large and is quite a bit of work.  The downside of continuous is that you'll have processes in your organization performing at all different levels and interfaces between groups will be difficult.

You decide!

Oh, by the way, IPPD is optional . . . it's not that it's bad; if you have time it's a great set of practices, but it's an "addition" to the model and is not required.

Could you explain the meaning behind VER SG2: Perform Peer Reviews?

We're not sure we totally understand the meaning behind VER SG2 - Perform Peer Reviews.  Could you shed some light on it for us?

Sure, let me write something up, send it out, and have everyone review it.  After receiving their feedback I'll update the post and release it for everyone to . . . . hey wait!  I'm giving away the answer!!

Peer reviews are a proven method for identifying defects in all types of work products.  One of the common misperceptions of VER SG2 is that it is only performed as part of a linear "VER" process.  I see it as a "stand-alone" utility process that is used throughout a project to identify and drive out defects long before they become part of our product.

SP2.1 - Prepare for peer reviews involves identifying participants, selecting the appropriate "style" of peer review (a Fagan inspection or something lighter, perhaps), sending out materials, setting up the room, identifying roles, and so on.

SP2.2 - Conduct peer reviews is kind of obvious - conduct it PER THE PLAN (from SP2.1).  Identify and record issues, assign action items, and so on.

SP2.3 - Analyze Peer Review data involves understanding what was discovered during the peer review, both at a project and organizational level and taking appropriate action.

I would recommend you consider peer reviewing all plans, requirements, designs, code, and test results at a minimum.  The life you save may be your own!

What is considered good evidence for IPM SP1.3?

What are some direct evidence work products for IPM SP1.3 (Establish the project’s work environment)?  What is a good  source for examples of direct and indirect evidence for CMMI Version 1.2 Process Areas and Practices?
There are a number of sources that provide examples of work products that will result when you perform a process based on CMMI practices.  The first one that comes to mind is the CMMI book, or the CMMI technical report (a free download from the SEI).
Another source is a SCAMPI PIID worksheet, of which there are many on the web for you to download.  These usually have some suggested work products, although keep in mind that you get what you pay for.
Either way, perhaps you should consider that you're thinking about this in reverse.  The artifacts don't drive the process, the process drives the artifacts.  If you've developed a useful process, it will have useful work products.
For IPM SP1.3, we often see safety and security procedures being implemented, hardware and software being delivered and installed, and contractor or new employee on-boarding being performed.  This usually results in a large amount of documentation that can be used as direct and indirect artifacts.

And now for something completely different - the CMMI Song!

This has been around for awhile and I've been meaning to post the link.  If you want to have some fun, take a listen to the CMMI song.

Tuesday, October 7, 2008

Are sub-practices required for a CMMI Appraisal?

Are the sub-practices located under specific practices expected during CMMI Appraisals? (e.g., SAM SP2.4 has seven sub-practices.  Are appraisers expecting seven artifacts, or artifacts that prove all seven?)
The CMMI defines Goals as required, Practices as expected, and everything else as information. So no, you do not need to produce evidence at the sub-practice level - only at the Specific Practice level.
However, as "informative material," sub-practices embody the intention of the authors of the CMMI and further describe and clarify the meaning of the practice.
It's good to read them and understand what the authors meant by each practice - but it doesn't mean you need to produce evidence for every one.

Can I get a list of questions we'll be asked during a SCAMPI Appraisal?

I'm driving my company to CMMI Level 3 certification. We are a very different company in the sense that we are a product development company and the final product is the combined output of electronics, mechanical, and software.  Can you please give me some example questions the LA will ask in the appraisal - especially about mechanical?

People often ask me this - some even complain to the SEI that there isn't a "question bank" available that every LA is supposed to ask.  Unfortunately, this betrays a lack of understanding about how the CMMI, and SCAMPI, are used.

In SCAMPI, process performance is verified through a number of means, with interviews being one of them.  There are also other means - documents, presentations, surveys, and others.  SCAMPI finds practices to be strong or weak based on the combination of available evidence.  Depending on the organization, we may not even ask a question about a certain practice if other evidence is available.

The questions we ask also are best if they're tailored to the organization's situation. For instance, if it's a company that makes microprocessors, questions about software may be invalid.  

Another instance is the need to tailor questions to fit the local culture.  If I'm appraising a small, agile organization I'm not going to fill my questions with references to "Change Control Boards," "governance infrastructure," "SEPGs" and other words or phrases typical at large companies.  If it's a huge DOD contractor, I'm not going to talk about "sprints," "jolt cola" or skateboards either!

I do have a database of questions I ask, and the list is always growing just like I am always learning.

Thursday, October 2, 2008

How do we practically implement GP2.10?

GP2.10 asks us to review status of the process with "higher level management."  Do we really have to review EVERY process with them every week?  I don't want them to micro-manage me that way.

Good question.  It's hard enough to get an upper manager to take a phone call, let alone review "the process" with us. But think of it this way: the process is the ONLY tool management has to proactively run the business - they should be happy to do this! I know that isn't always practical though.

I favor a "rotating strategy" where we cycle through the process at monthly meetings.  So in January we review, say, the processes associated with Project Management.  In February we might look at software development.  And so on.

Upper management doesn't HAVE to provide feedback, but it's much better if they do.  After all, these are the "levers" we need to successfully run this business.

Is CAR applicable to the project or the organization?

Should CAR be applicable only at the organization level and not at the individual project level? I ask this question since CAR is aimed at resolving common causes, which come from the QPM and OPP outputs.

The CMMI specifically assigns usage of CAR to the projects, and then tells us it COULD be an input into OID, the organizational PA used to identify and implement improvements.  CAR can address problems at the project level, and can be an input into OID ("Collect Improvement Proposals").

On the other hand, if the "problem" you're trying to fix is an organizational problem, such as say, the process not working, then there is no reason CAR couldn't be applied to the "process project."

Do you have a list of documents we need to achieve CMMI?

Hey Jeff!!
I need your help. Can you provide me the list of work products needed to achieve CMMI in our organization?
Let me ask you this.  Should you reach your destination in a car by looking at the white line in the road through your rear-view mirror?  
This kind of "reverse engineering" of the CMMI was never really intended by the authors.  The documents you are referring to should be the product produced from performing a process.
So, if you conduct an estimate using "wide-band delphi" the work products, or "documents," you produce may include the three or more estimates developed by the assigned estimators, the list of tasks in a WBS, and the output of the formal estimate.
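To make the wide-band Delphi mechanics above a bit more concrete, here is a minimal sketch of one estimating round.  Everything here is a hypothetical illustration: the estimator values and the 15% convergence threshold are invented for the example, and neither the CMMI nor wide-band Delphi prescribes this particular rule.

```python
# Hypothetical sketch: aggregating one wide-band Delphi round for a WBS task.
# The threshold and the sample estimates are illustrative assumptions.

def delphi_round(estimates, threshold=0.15):
    """Return (mean, converged) for one round of independent estimates (hours).

    "Converged" here means the spread of estimates is within `threshold`
    of the mean, so the group could stop iterating and accept the result.
    """
    mean = sum(estimates) / len(estimates)
    spread = max(estimates) - min(estimates)
    return mean, spread <= threshold * mean

# Three assigned estimators produce independent figures for the same task.
mean_est, converged = delphi_round([40, 44, 42])
print(mean_est, converged)  # 42.0 True - estimates agree, round can close
```

If the estimates had been far apart (say 20, 40, and 60 hours), `converged` would be `False` and the team would discuss assumptions and estimate again - which is the whole point of the technique.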
If I said, go produce an estimate, you might just sit at your desk and type one up without executing the process.  But you would have "the document."  Does that mean you performed the process?  No, it means you typed in your best guess of the estimate.  And it definitely doesn't mean you're CMMI ML2.
The CMMI does expect you to produce a "direct artifact" and an "indirect artifact" (in most cases).  This is not the same as saying that all you need is the artifact.  They are merely a consequence of the process being performed. You must also produce evidence that you actually followed a process.
There is a list of "suggested work products" in the CMMI book that you COULD use as a guide, but this list is highly dependent upon your company, methodology, and culture.

Tuesday, September 30, 2008

Isn't QPM just really great project management?


We're ML3 and we believe we have world-class project management capabilities.  We do all of the things in PP, PMC, RSKM, SAM, and IPM really well and we must be performing "at the next level."  Doesn't that just make us ML4?

Hmmmm.  Uh . . . ahem.  Let's take a look.

You're right in that being ML4 requires you to have a solid foundation with proven performance from the ML2 and ML3 process areas - so you get points for that.  Without that you're not even in the game.

But QPM is different than "advanced project management."  It's not necessarily "better" project management (although I would argue that it would make managing projects "better") because "better" is subjective.  But it is different.

It's different in that it depends upon the use of statistical techniques, as well as the presence of process-performance baselines and models, to be useful.

SG1: Manage the Project Quantitatively

The word "quantitatively" is a special word in the CMMI that implies the use of statistical methods.  The practices supporting this goal all depend on the use of these methods to succeed.

SP1.1 Establish the project's objectives

These objectives are for quality and for process performance.  They need to be based on what can actually be accomplished, and that information is only available if you're performing OPP successfully. In the case of a "mandate" from management (a quality "specification") you ALSO will need OPP to identify the risks associated with the mandate, and the corrective actions that will need to be taken to achieve it.

SP1.2 Compose the defined process

Using the data from OPP as your guide (baselines and models), and within the context of the objectives from SP1.1, select the appropriate sub-processes from the set of standard processes (OPD) that will enable you to achieve your objective.  This is a little like a more granular, data-focused way of performing IPM SP1.1 and SP1.2.

SP1.3 Select sub-processes that will be statistically managed

Which sub-processes will you need to manage / monitor in order to understand if you are going to achieve your objectives?  Again, OPP can help here.

SP1.4 Manage Project Performance

Using the aforementioned monitoring, manage the performance of the project, taking corrective action as needed.  Why do I have data points outside of my control limits ("assignable causes")? Time to find out (you can use CAR for this).

SG2: Statistically manage sub-process performance

Use statistical methods to understand variation and take corrective action when necessary.  Some of these practices will probably be performed along with SG1 (above).

SP2.1: Select measures and analytic techniques

What measures am I going to use to steer the ship, and what techniques (e.g., process performance charts, histograms, XmR charts, etc.) am I going to use to understand variation?

SP2.2 Use statistical methods to understand variation

Use the techniques and methods to identify assignable causes of variation (e.g., points outside of control limits).
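As one concrete illustration of such a technique, here is a minimal XmR (individuals and moving range) chart calculation.  Treat this as a sketch, not a full SPC implementation: the peer-review prep-time data are invented, and the 2.66 constant is the standard XmR scaling factor for two-point moving ranges.

```python
# Minimal XmR chart: compute natural process limits and flag candidate
# assignable causes.  Data and context are illustrative assumptions.

def xmr_limits(values):
    """Return (LCL, center, UCL) natural process limits for individuals data."""
    mean = sum(values) / len(values)
    # Moving ranges: absolute difference between each consecutive pair.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant for n=2 moving ranges.
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

# e.g., preparation hours per peer review (made-up baseline data)
prep_hours = [2.0, 2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9, 2.0, 5.0]
lcl, center, ucl = xmr_limits(prep_hours)

# Points outside the limits are candidate assignable causes worth investigating.
assignable = [x for x in prep_hours if x < lcl or x > ucl]
print(assignable)  # [5.0] - that review's prep time is outside the limits
```

The 5.0-hour review is the kind of data point the practice is asking you to chase down: not "bad," necessarily, but outside what the stable process would naturally produce, so it has a findable cause.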

SP2.3 Monitor performance of selected subprocesses

Understand how the various sub-processes you have selected are performing, so you can understand progress towards achieving the objectives set in SG1.

SP2.4  Record statistical management data

We need to update the baselines created by the OPP capability with actual process-performance data - and this is where we do that.  This way the next poor schmuck that comes along will learn from MY screwups.

Whew!  Well, there is a lot here, and it's way different than PP/PMC.  Is it worth doing?  

Well, let me ask you this.  Do you want to identify and eliminate defects earlier?  Do you want the next project to be better than the current one, and the next one after that even better?  Do you want to spend less time on drudgery and more time on engineering?


I'm not sure I understand OPP - what's it all about?

Hey Jeff,

We're a ML3 company and we'd like to move up to ML4.  There is a lot of argument about the meaning of OPP.  Could you shed some light on this?

Absolutely!  Almost all organizations struggle with the four "High Maturity" Process Areas, and Organizational Process Performance, or OPP, is the one they struggle with the most.

It's impossible to claim you are performing at "high maturity" without OPP.  It's a foundational process area that provides an infrastructure that allows you to use the other HM process areas (as well as perform better with the ML2/3 PAs).  It exists to establish and maintain (sound familiar?) the basic statistical data you need to continuously improve your projects and your organization.

SG1: Establish Performance Baselines and Models

The practices that lead to achieving this goal support the selection of processes to statistically monitor, the establishment of organizational objectives, the establishment of the measures to use, the creation of baselines of process performance, and the creation of process performance "models" to help predict the outcome of a set of processes being performed (and to assist projects in the selection of sub-processes from the organization's standard set of processes - QPM).

So, let's say that our goal is for all projects to have a customer satisfaction rating of 10.  Can we achieve it?  What does history tell us?  Is that realistic?  What were the causes when projects did NOT achieve this?  And what might we do with our projects to achieve this goal specification?

These are the questions that OPP is designed to support.

SP1.1 Select Processes

Exactly which sub-processes (from the standard set of processes) will we be including in our analysis effort?  You often see process elements related to Peer Reviews, Defect discovery, engagement (TeamScore) and the like.

SP1.2 Establish Process Performance Measures

Which process attributes are we going to measure?  If you were to select engagement, you might measure actual participation vs. planned participation.  For peer reviews you might measure preparation time for each peer review.  These are measures of some attributes of the sub-process.

SP1.3 Establish quality and process performance objectives

These objectives come from various sources.  Some come from the business ("we need to increase sales by 5%") or from the engineering process itself ("reduce defects by 12%").  If we have enough data to analyze process-performance variation, and we can calculate the natural bounds of process performance, then those bounds will have to constrain our objective for the process - they are all the process is capable of delivering ("Voice of the Process")!  In other words, the process is telling you "STOP! You can't reach this objective!"

SP1.4 Establish Process Performance Baselines

These baselines are the data that represent actual process performance (and its variation).  We typically plot these data using some type of process performance (control) chart or histogram.  We are trying to establish what the sub-process is capable of - these natural bounds are called the "voice of the process," and they establish that, if the process remains the same, here is what will likely occur (within limits).

So, using engagement as an example, we can measure customer engagement's effect on customer satisfaction by measuring their engagement at different points in the process, and comparing that to historical customer satisfaction results.

SP1.5 Establish Process Performance Models

Models are used to estimate the value of a process performance measure (the "result") if a given set of sub-processes is performed.  

There are many techniques available for this - simulation, for instance - but in the end this is a "what if" analysis using the historical data to estimate an outcome with a high degree of probability.  This is useful for projects as they attempt to predict the outcome, as opposed to just guessing.
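To make the "what if" idea concrete, here is a toy Monte Carlo sketch.  Every number in it (defect injection rate, removal effectiveness) is an invented assumption for illustration; a real process performance model would derive these parameters from the OPP baselines, not from thin air.

```python
# Toy "what if" model: predict delivered defect density under two process
# compositions.  All baseline parameters are made-up illustrative values.
import random

def simulate_defect_density(peer_reviews, n=10_000, seed=1):
    """Monte Carlo estimate of delivered defects/KLOC for a composed process."""
    rng = random.Random(seed)
    injected_mean, injected_sd = 20.0, 4.0   # assumed baseline: injected defects/KLOC
    removal = 0.65 if peer_reviews else 0.35  # assumed removal effectiveness
    samples = [max(rng.gauss(injected_mean, injected_sd), 0.0) * (1 - removal)
               for _ in range(n)]
    return sum(samples) / n

with_reviews = simulate_defect_density(peer_reviews=True)      # ~7 defects/KLOC
without_reviews = simulate_defect_density(peer_reviews=False)  # ~13 defects/KLOC
```

The point isn't the arithmetic - it's that a project can ask "if we compose our process WITH formal peer reviews, what defect density should we expect?" and get a prediction grounded in historical data instead of a guess.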

So what?

You've probably noticed that there is nothing in OPP that actually says "fix the problems."  
That's OK - there are other process areas for this (CAR and OID).  

OPP is a "capability."  It gives us the data we need to make better decisions about the use of the process - and it tells us exactly what we're capable of.

Whether we like it or not!

Sunday, September 28, 2008

Lots of people are moaning about the SEI's new interpretation of High Maturity - what does it all mean?

There's a lot of "noise" being made about the SEI's "High Road" view on Levels 4 and 5, although the model itself hasn't changed much for these 4 PAs.  My company, a past Level 5 achiever, is ready for a CMMI ML5 renewal, and there's lots of moaning and groaning and questions about this "High Road."  What's really the new emphasis (or activities or requirements) that the SEI is imposing at Levels 4 and 5?

This is an AWESOME question.  Thank you!  There certainly has been quite a bit of conversation about this in the past two years, and while the practices have not changed all that much, the informative material has, and that material forms the basis for understanding the meaning of the practices.  So, let's explore it.
The short answer is, too many organizations (along with their Lead Appraisers) were achieving ML4/5 without understanding the basic requirement to monitor and control selected sub-processes using statistical techniques, and using that as a foundation to improve the process and organizational performance.  This includes the introduction of new ideas and innovations - which at ML5 should be, at least in part, based on statistical data about process performance.
Here is my "high-road" interpretation of the PAs.  Keep in mind that there is a lot of detail in these and I am speaking at the 30,000-foot level.
OPP - the foundation of all HM practices is Organizational Process Performance.  It is meant to establish a monitoring and prediction capability using statistical techniques.  Here we select which sub-processes we want to include in our baselines, we set objectives that are within the natural process limits based on historical data ("voice of the process"), and we model "what ifs" based on that voice (and other objectives).  It is expected that you will use a variety of techniques (for example process performance charts, histograms, Pareto analysis, et al.) to understand process performance.  Metrics alone aren't enough.  You need to be identifying "assignable causes" or "special causes" of variation using these techniques.
Think of this as the "single-source of truth" when it comes to process performance.
QPM - Here we task projects with using the data from OPP to set, and hopefully achieve, the objectives for the project.  While OPP is focused on the meta-data (organizational), projects focus on the "micro-data," meaning what is going on at the phase or iteration level of their project.  They compose their defined process from the set of standard processes, using the baseline data from OPP to help them select the sub-processes and set objectives for "what is possible" on their project (again, "voice of the process").  They determine what THEY will statistically monitor, using similar techniques, like process performance charts, to identify variation, and they monitor process performance, taking corrective action if the data appear to tell them they are not going to meet their process and quality objectives.  Actual performance data is then used to update the baseline process-performance data that OPP is monitoring and tracking.
Think of this as the projects benefiting from past projects (using OPP), and then passing that benefit to future projects (QPM to OPP).
CAR - Causal Analysis and Resolution is a (mostly) project-based PA that analyzes statistical data (provided by both OPP and QPM (through OPP)) to identify defects or problems (process and product) and examines possible causes using techniques such as mind-mapping, fish-bone diagramming, and the "five-whys."  Proposals for correction are made, these proposals are implemented, the "correction" is evaluated, and the data is recorded so that OPP can use it by updating their baselines with actual performance data.
Think of this as a way to know which problems you should be attacking in which sequence, and then ensuring that you actually solved them.

OID - Organizational Innovation and Deployment is an organizational-level PA that is similar to OPF in some ways, in that it strives to understand process needs, meet organizational objectives, and implement action plans, but it's far more deliberate AND is based on statistical data.  Proposals are collected, and the feasibility of those proposals is analyzed, using data and modeling provided by way of OPP and CAR.  The deployment of new processes is also more cautious than in OPF: in OID, pilots are used, the effects of the change are evaluated, and deployment is managed, all within the context of meeting the objectives of the business.
Think of OID as a more cautious, deliberate, and focused way to improve the process, with a higher degree of certainty that the improvements will not just be "different" but will enable us to better achieve process and quality objectives.
You can think of High Maturity as a set of techniques for bringing together the "Voice of the Process" (the actual performance results, within limits, of a stable process being performed) and the "Voice of the Customer" (the specification that is set as a goal for us to achieve).
Hope this helps!

Can I lead a CMMI effort without being a certified assessor?

I've been assigned the leadership of our CMMI-based process improvement effort.  Do I need to become a certified Lead Assessor to do this?

Congratulations (I think?)!  This is a great opportunity for you to have a real impact on the direction of your company (or it could be a spectacular failure, which wouldn't be so good)!

There is no certification required to lead this type of effort, so have at it!  If you choose to conduct a formal SCAMPI appraisal you will need to engage a SCAMPI Lead Appraiser.

However . . . .

This type of effort would be more successful if you had at least the basic "Introduction to CMMI" training class, and if possible, the "Intermediate Concepts of CMMI" course as well.

The CMMI is a complex set of interdependent practices, and even though I've taught the "Intro" class over 40 times, I still learn new things every time.

"The Book" won't tell you anything about how to successfully lead a project like this though  - it will merely give you a view into what other successful companies have done with their process (i.e.; "best practices").

Good luck!

Tuesday, September 16, 2008

Hey Jeff! Some of the process areas seem redundant. Why do we have Requirements Management AND Requirements Development?

I have a few questions about understanding the difference between the following Process Areas of CMMI.
1. Difference between Requirement Management and Requirement Development.
2. Difference between Measurement and Analysis and Quantitative Process Management
3. Difference between Organization Process Focus and Organization Innovation and Deployment
Could you please provide me some insight on these process areas to clearly explain the difference. I would be grateful if you can provide me some examples.

A casual reading of the CMMI model specification may leave you with the impression that they are redundant, but a bit of research will reveal that each process area has been carefully designed for a specific reason.

Requirements Management provides us guidance for accepting new (initial or changed) requirements.  We're asked to ensure they "pass the test" for acceptance, that everyone understands what accepting that requirement means to the project team, that traceability is updated (or created), and the coordination with the plan takes place.  It is a Project Management activity (usually).

So for instance, a requirement MUST be testable, traceable, provided by the right person, and accepted by the right person.  That's the first practice in Requirements Management.

Requirements Development, on the other hand, involves asking and understanding the needs of the customer, developing a requirements specification, drilling down to product and/or technical requirements, and validating the agreed upon requirements. RD works WITH REQM, but they are different.

Measurement and Analysis provides guidance for creating an infrastructure for organizational measurement, and for the collection, storage, and analysis of those metrics.  It is foundational for ML2 (and the rest of the CMMI).  Metrics need to tie to the goals and objectives of the organization - indeed, they need to support them.  You might say they need to be traceable to the goals.

Quantitative Project Management (you wrote "process" but I think you meant "project") is a set of practices that guide project managers towards the use of statistical models (generated in OPP) to select and manage their process (by "composing the defined process") in order to meet quantitative ("statistical") goals and objectives.  The statistical implication is always present in ML4 or ML5 process areas.  It is not about creating or using measures, it's about monitoring the project, and then feeding the data back into the baseline.

Organizational Process Focus is a set of practices that guide you through the management of your process improvement project.  It includes setting of needs, assessments, planning, and executing the process project plan. 

Organizational Innovation and Deployment (OID) may seem a bit like OPF, but in OID improvements are selected using the data generated from OPP ("quantitative vs. qualitative") and QPM.  Innovations are therefore much more targeted and granular.  Also, piloting of the proposed process improvement is expected.

Take a detailed read through these process areas and you'll see what I mean.  Good luck!

Monday, September 15, 2008

Can a company go right to Maturity Level Three or Four?

Our company just performed a self-assessment and we believe we can jump right to Maturity Level Three or Four.  Are we allowed to do that?

First of all, congratulations on completing a self-assessment.  That's the first step to improving your company, your products, and the satisfaction of your customers.

There is no specific rule against performing a ML3 appraisal "out of the box" without first performing a ML2, although the SEI cautions you that you "should" not skip levels.  That said, I've performed a number of appraisals for organizations that started at ML3.  So it's possible.

It may not be advisable though.  In order to deploy a useful and sustainable process that actually helps your company (a novel idea!) it's helpful to think in terms of releases and iterations.  People learn incrementally, as do organizations, and your chances of success are dramatically increased if you deploy smaller components of the process over time.

ML2 is designed to cover the basics of running a project-based organization.  It may seem like it's easy, but it can be difficult, and the results can be dramatic.  It would be healthy for your organization to first achieve CMMI ML2, run it for a year or two, and then address ML3.

But, that said, I've seen companies push it through, although the results have not all been successful.  If you're going to do it, consider conducting a less formal SCAMPI-C or SCAMPI-B appraisal prior to your SCAMPI-A to ensure that you really ARE performing at ML3 before you conduct an appraisal that goes into the public record.  Good luck!

Saturday, September 13, 2008

Join us for our October 'Introduction to CMMI' class

Fellow blog'sters,

Fall is a great time to be in the upper Midwest in Michigan and attend one of our fun and informative "Introduction to CMMI" classes!  Not only will we have some great debate and conversation about process improvement, but we'll also be revealing the "Secrets of CMMI Appraisals" and how to deploy process improvement using "AgileCMMI."  Visit for more information!