Published here March 2009.


Comments on Professionalism and Professions

Now and again I receive emails that are both interesting and thoughtful. The following is a reflection on Professionalism and the Professions as applied to project management, edited for readability online.

On 6/23/08, Steve Jenkin[1] wrote by email:

Max,

Thanks for all your excellent writing/material on your web site. I know enough about "Project Management" to not sell myself as a project manager in my field, computing. Not because I can't make things happen or get a team to deliver - I just don't have the "Organizer/Thruster" personality to take it past a small scale. Nor do I have the necessary organizational political skills needed to ramp it up! I feel my talents are better used elsewhere, so everybody wins.

So, I read with interest your letter from 1988: see A Little Bit of PMI History. The only workable definition of Professional that I like is: Paid Practitioner. That pushes everything onto the definition of Professionalism (or Professional Behavior, but I'm not going there - too much and too hard).

The point I've reached with "Professions" is that I postulate two propositions:

  1. You can't have a strong Industry and Profession without a strong Professional Association (e.g. PMI), and
  2. In Strong Professions, practitioners cannot repeat Known Errors/Faults without direct, personal consequences.

This is in addition to the normal attributes of a profession:

  1. Body of Knowledge and Practice
  2. Code of Conduct/Ethics
  3. Barriers to Entry
  4. Disciplinary Process

The implications of my proposition (2) are that "Professions" learn and improve, which in turn implies that "Practitioner Performance" can be measured repeatably and reliably. Further, it implies that there is a well-known repository of "What Works and What Doesn't".

Aviation is a perfect example of a "learning and improving" industry:

  • There are clear metrics published (and publicly available).
  • Massive failures (i.e. crashes) are thoroughly investigated.
  • Effective measures are put in place to avoid recurrence of known faults/problems.
  • Not all accidents/failures are due to individual error; many are system or organizational errors.

The system works because the Accident Investigator (e.g. the ATSB in Australia, the NTSB in the USA) is a separate entity from the Compliance and Enforcement regulator (CASA and the FAA respectively). There is also a pervasive mechanism to communicate findings and new requirements to all parts of the industry, not just manufacturers, operators or pilots. There is a pervasive Safety and Improvement culture across all disciplines and work types - from Refuelers, to Air Traffic Controllers, to Maintenance Engineers, to Pilots, to Cabin Crew. This is supported by an anonymous incident reporting system open to anyone.

At some point governments, as both regulators and law-makers, took an interest in aviation and decided that killing people in crashes was bad. Therefore, as in civil engineering and construction, they introduced criminal penalties for real negligence, incompetence and/or deliberate action. However, this level of legislative involvement, while useful, is not necessarily essential.

There are important counter-examples showing that the professional culture in western aviation really does matter, and that it is not the technology alone. For example:

  • General (private) aviation safety has barely improved in 30-40 years.
  • A number of countries (e.g. Indonesia) have appalling safety records, to the point that carriers from those countries can have their Air Operator Certificates revoked in other countries such as Australia.

As Forsberg, Mooz and Cotterman have observed:

"Of all the project management concepts, Lessons Learned from prior failures and successes is the most neglected."[2]

I think this is linked to my Proposition #2: The Profession has to deliberately learn, at the very least from major (public) failures. That's because professional learning supports two important aspects of practice:

  • Quality/Safety improvement, and
  • High-Performance working.

Beyond some point, simple defect inspection or "correct personal practices" cannot yield higher Quality/Safety. Higher levels of Quality/Safety must be planned - deliberately and intentionally - and require systemic analysis and the addressing of organizational/system errors. Quality has to be designed in.

The side-benefit of all this Error/Failure Analysis is understanding how to work better; it is the same analysis needed to improve performance. So, if people work on reducing errors/failures, they will automatically improve performance and output and, thereby, considerably improve profit/bottom-line results. The most obvious and simplest metric describing this effect is reduced rework.

A unique and powerful statement of the benefits of Lessons Learned appears in the broadcast interview "Minimizing Harm To Patients In Hospital". In 2001, Dr Brent James[3] reported on over ten years of effort at Intermountain Health Care (Utah-Colorado), making these observations, in part:

  • "A better model is do it right the first time. It looks like that could save as much as 15% to 25% of our total cost of operations"
     
  • "In ten years we had 4,155 confirmed human errors. In parallel with that we had 3,996 confirmed moderate or severe adverse drug events. Those [3,996] were injuries. The fascinating thing was the overlap. Among 3,996 confirmed injuries, 138 or 3.5% resulted because of a human error."
     
  • "[What we have done about it is] that we have a pretty good protocol. It requires for pressure sores that we train the nurses to first assess risk on admission. And for high risk patients you assess them every day. The second thing is that we have special bed surfaces and special management techniques that greatly reduce the incidence of new ulcers; in a carefully controlled study we were able to reduce the incidence of new ulcers by more than three-quarters. I recognized something, this isn't just better care. For massive cost savings associated with that, it's far cheaper to prevent pressure ulcers than it is to deal with them after they've occurred."
     
  • "The idea that every time there's an injury we write a rule, that just makes the world so hopelessly complex, it would probably increase injury rates. And of course the third response that Paul [Barach][4] talks about is our learning system. Learning systems look for general patterns, through which you can change systems and take down injury rates.
     
  • "The system learns from its mistakes. This is probably a task that we'll be facing forever, and will root out the big ones, health care will become much more safe. If you start to put your organizational structure in place, make the assignments, identify the high priority areas, and then you build it right into the infrastructure, so it's easy to do it right rather than do it wrong."
     


1. Steve Jenkin is a Systems and Design Specialist. He can be reached at stevej098@gmail.com
2. Forsberg, K., H. Mooz & H. Cotterman, Visualizing Project Management: A Model for Business and Technical Success, 2nd Edition, Wiley, NY, 2000, p5, quoted here: maxwideman.com/papers/pm-models/summary.htm
3. Dr. Brent James, Executive Director, Intermountain Health Care, Salt Lake City, Utah
4. Dr Paul Barach, of the University of Chicago, identifies three potential responses in health organizations to an injury or an error. These are: Pathologic (kill the messenger, punish the physician); Bureaucratic (write a rule); and Learning (a system for self-improvement).
 