
Don’t Let Rigor Become Rigor Mortis


At the height of WWII, Murray Peshkin was drafted into the Army while attending Cornell University as an undergraduate. Thanks to a few undergraduate physics classes, after he completed basic training he was offered the opportunity to work on the Manhattan Project under the theoretical physicist (and future Nobel laureate) Richard Feynman. Though he soon developed a profound respect for Feynman, Peshkin sometimes complained that Feynman’s work lacked rigor. On one of those occasions Feynman responded, “Murray, do you know what rigor mortis means? It means died of too much rigor.” Peshkin took the lesson to heart, realizing that the rigor of any analysis must be in proportion to its desired outcomes and identified risks. Some analyses require much rigor; others require less. The question is, how do you know when enough analysis is enough?

Before embarking on a business analysis endeavor, thought needs to be put into the amount of analysis and rigor that will be required. Too many times I have seen Service Management Offices (SMOs) shuttered due to the perception that too much time was required to accomplish too little. The SMO creates beautiful charts, copious documentation, and lengthy roadmaps; however, so much time is spent on analysis and planning that the organization grows weary of the long wait for results. This condition, known as “analysis paralysis,” happens when so much rigor is applied to analyzing a problem that nothing meaningful gets done and no one is happy.

On the flip side, organizations also need to beware of inadequate analysis. Early in my career as an IT manager, before electronic mail systems were ubiquitous, an executive stopped me in the hallway to express his desire for an “email system.” This was the early 1990s, and at the time everyone knew what email was, right? Email is electronic mail sent through an asynchronous messaging system; sounds simple enough. Lacking an adequate budget for the project, my team used open-source software to create an internal email system complete with an SMTP gateway. It functioned well, was secure and easy to install, and integrated easily with our authentication system. However, when the email system was presented to the executive, he was disappointed. To my bewilderment, he explained that what we had created was not an email system. In his mind, an email system provided calendaring, task management, and resource management in addition to electronic messaging. Because the system we built offered only asynchronous messaging, to him it wasn’t an email system at all. It turned out that what he wanted was more of a groupware system. It was as though he had asked for a chocolate chip cookie but got oatmeal raisin instead.

This experience underscores the fact that business decisions should be made thoughtfully, but the thoughtfulness should be in proportion to the decisions that need to be made. How do you determine whether the focus should be on a quick win or a deep dive? Often there is no easy answer. However, there are analysis planning techniques that can surface clues to help you find it.

When planning an analysis, it’s a good idea to start with a problem statement: a succinct description of the issue or problem to be addressed, or of a condition that requires improvement. Problem statements should also be developed with input from key stakeholders. Had my IT team engaged the executive to discuss the problem he wanted to solve with an internal email system, we could have identified his other requirements for enabling communication within collaborative groups.

The next step in planning an analysis is to identify what success looks like. Is it a high return on investment or a positive net present value? Is it increased customer satisfaction? What about increased efficiency or process improvement? At the end of the project, what must exist? This activity will help identify Critical Success Factors (CSFs), which focus the analysis and give it purpose. CSFs are also important for understanding requirements.

Had we learned from the executive, before our work began, which capabilities the email system had to provide to be successful (i.e., its CSFs), we would have come much closer to meeting his needs. In a relatively short interview, we could have discovered that his system should ideally provide an online calendar, let him message his assistant, email his daughter away at college, and enable him to manage meeting room schedules. With that, we would have had enough input to begin formulating CSFs: the critical capabilities and conditions that must exist at the end of the project, as well as the improvements that must be realized.

Once all CSFs have been identified and approved, it’s time to determine how to measure your team’s success in reaching them. This is where Key Performance Indicators (KPIs) come into play. KPIs are measurable values used to demonstrate how effectively a project is achieving its objectives, both while the work is under way and after it is complete. KPIs also help you right-size your analysis: the amount of rigor required can often be gauged by the number of KPIs. An analysis of a problem with one or two KPIs will generally need less rigor than one that requires nine or ten, and the magnitude of the decisions to be made determines how many KPIs are required. Like all rules, this one has exceptions, but the exceptions prove the rule.

Once you know what success looks like and how to measure it, you can identify the scope and magnitude of the analysis. For example, one KPI for the executive’s email system might have been improved meeting room utilization; others might have been reduced scheduling conflicts or successful exchange of electronic messages with people both inside and outside the organization. When you know the CSFs and what is required to monitor and measure success against them (the KPIs), you have a much better idea of the rigor the analysis needs. In the executive email case, a right-sized analysis would have examined alternatives for improving the use of meeting rooms, reducing the number of concurrently scheduled team meetings, and securely exchanging electronic messages with people inside and outside the organization. Employing KPIs early in the analysis would have allowed us to regularly report our progress against the “as-is” baseline for formative evaluations throughout the project, and to deliver a summative evaluation after the project ended.
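To make that baseline comparison concrete, here is a minimal sketch of how a meeting-room-utilization KPI could be tracked against an “as-is” baseline. The utilization helper and all of the figures are hypothetical illustrations, not data from the original project.

```python
# Hypothetical sketch: tracking a meeting-room-utilization KPI against an "as-is" baseline.
# All figures are illustrative; a real analysis would pull them from scheduling data.

def utilization(hours_booked: float, hours_available: float) -> float:
    """Fraction of available meeting-room hours actually booked."""
    return hours_booked / hours_available

# "As-is" baseline measured before the project began
baseline = utilization(hours_booked=96, hours_available=240)

# Formative check-ins taken while the project is under way
checkins = {
    "month 1": utilization(hours_booked=120, hours_available=240),
    "month 2": utilization(hours_booked=150, hours_available=240),
}

for period, value in checkins.items():
    change = (value - baseline) / baseline
    print(f"{period}: utilization {value:.0%} ({change:+.0%} vs. baseline)")
```

Reporting each check-in as a change against the same baseline is what turns a KPI into a formative evaluation; the final reading after the project ends becomes the summative one.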

The pre-planning work performed before an analysis begins is often as important as the analysis itself. Good planning helps to right-size the analysis so that rigor doesn’t become rigor mortis. Cask’s personnel bring the experience, tools, techniques, and knowledge to help clients avoid analysis paralysis, ensuring that business analysis efforts are adequate and complete without unnecessary effort.
