Tuesday, October 2, 2007

GE's "Work Out" for Process Improvement

Along with Six Sigma, General Electric developed a much simpler approach to process improvement called "Work Out".

There's even an entire book devoted to the subject: "The GE Work-Out" by Dave Ulrich and two co-authors. Whether the topic deserves an entire book is a question I'll leave to the reader, but the approach is straightforward and the results can be quite stunning.

FourThought Partners has used this approach and found that the investment to perform this methodology is quite low, and the return large.

Essentially, the consultant/facilitator performs the classic consultant cliché: borrowing the client's watch to tell him what time it is. In a very structured way, your people are brought together for a couple of brainstorming sessions and some "quick and dirty" ROI calculation sessions. The ideas that survive this process are brought to management for unusually rapid go/no-go decisions, and the team is challenged to start producing results in weeks rather than months or longer.

Both "low hanging fruit" and benefits that take longer to realize are identified and acted upon without a lot of bureaucracy, and the teams involved are encouraged by the process to continue thinking about even more improvement. It's a satisfying experience that even those with short attention spans will appreciate!

If you are interested in learning more about GE Work-Out, and want to determine whether FourThought Partners can help you implement it, please contact us.

Friday, September 28, 2007

Constructive Laziness

What is "constructive laziness"? Is it a good thing?

At FourThought Partners, we consider constructive laziness to be one of the best attributes a thinker can have. In mathematics, it is called the search for elegance. It means not being satisfied with a complicated solution, but rather seeking out a simple and straightforward one.

Too often, business processes, especially old ones, are anything but simple. They represent a long-ago solution (which may once have been elegant) with fixes, changes, and enhancements layered on over the years to solve the problems of the day. By today, the process has no single owner, it is incredibly complicated, and it has legacy features that no one can quite explain (but which no one dares eliminate, for fear that the house of cards may come crashing down).

In such a case, it would be highly desirable to simplify the process, and make the work easier - hence the term "laziness", which in this context means looking for less work and a process that is easier to understand and execute.

Another example of a context in which constructive laziness is a good thing is when looking for a way to display data. The non-constructive lazy solution is to just dump it out and let the reader try to figure out what it says. (How often have you been in a meeting where that is what the presenter did?)

The constructive lazy solution is to think about how the presentation could be short and simple, with conclusions readily apparent to the reader. It's actually more work to do it this way, but the result is much easier to explain and to modify at a future time. In other words, it's much less work for everyone over the long haul.

We at FourThought Partners strive to be constructively lazy in all our projects. It may have been difficult to perform our assignment, but we do not want it to be difficult to get benefit from our work. We strive to find the easiest way to get each problem solved.

Thursday, September 20, 2007

How to "do" Process Improvement

At FourThought Partners, we like to keep things as simple as possible. We've developed a methodology for process improvement that is straightforward and objective. The description I'll provide here covers the basic steps but leaves out the "secret sauce". That secret sauce is creativity, without which the steps do not produce results.

In other words, these steps are necessary, but not sufficient to produce profitable ideas. You need to assign the right person to lead such an effort, either from inside your organization, or from outside (i.e., FourThought Partners) -- then these steps will lead you to success.

Step 1: Assess the Current Situation

Depending on the nature of the process to be improved (manufacturing, information processing, software, etc.), the way you observe it will differ. In many cases, assessment begins with sitting next to the people who actually perform the process. It is also helpful to meet with the managers and supervisors to get their perspective on what the process is and what is good or bad about it.

The output of this step is an appropriately detailed description of the current process, and an agreement from those involved that the description is accurate.

Step 2: Define Improvement Goals

It is critically important to define success before you go any further. Success should be defined clearly and as quantitatively as possible. For example, if the goal is to improve productivity, then that goal should be expressed in terms like "increase output from x per hour to 2x per hour of staff time, while not affecting quality as measured by customer satisfaction survey scores."

By stating the goals in this way, you will also establish metrics by which current and future performance can be evaluated. If these metrics are not already reported on by the current process, then these reports should be added immediately, even if it entails manually collecting data and computing these metrics on a periodic basis.

Step 3: Establish an Operational Advisory Team

Organize a small team of knowledgeable people who can bring real-world experience to the improvement exercise. They may not have to devote a great deal of time to this, but they must be available to vet ideas and suggest improvements of their own.

Step 4: Propose Improvement Ideas, "Socialize" them, and Iterate

The creative people on the team must now apply the secret sauce and develop a list of specific actions that could be taken to meet the improvement goals. As these will not be right the first time, the ideas must be discussed with the operational team, feedback must be incorporated, the list modified, and the process repeated until the team agrees that they have developed a good list of ideas.

Step 5: Prioritize the Ideas

Keeping it simple, assign several dimensions of "quality" to each idea. These dimensions could be a) gut feel priority, b) H/M/L level of difficulty, c) H/M/L impact on quality, etc. Once values for each dimension are assigned to each idea, the priority sort of the list should be fairly simple to perform.
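As a sketch, the sort above is mechanical once the dimensions are assigned. The ideas, dimension names, and scoring rule below are invented for illustration; they are not part of the methodology itself:

```python
# Hypothetical sketch of Step 5: rank improvement ideas by a few
# simple "quality" dimensions. Ideas, dimensions, and the scoring
# rule are invented for illustration.

HML = {"H": 3, "M": 2, "L": 1}  # map H/M/L ratings to numbers

ideas = [
    {"name": "Automate the weekly report", "gut": "H", "difficulty": "L", "impact": "M"},
    {"name": "Rewrite the intake form",    "gut": "M", "difficulty": "H", "impact": "H"},
    {"name": "Consolidate handoffs",       "gut": "H", "difficulty": "M", "impact": "H"},
]

def score(idea):
    # Higher gut-feel priority and impact raise the score; higher
    # difficulty lowers it. The equal weighting is a judgment call.
    return HML[idea["gut"]] + HML[idea["impact"]] - HML[idea["difficulty"]]

ranked = sorted(ideas, key=score, reverse=True)
for idea in ranked:
    print(f"{score(idea):+d}  {idea['name']}")
```

In practice the team will argue over the weights, which is fine; the point is that once each idea has a value on each dimension, the priority sort takes care of itself.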

Step 6: Get Management Buy-in

Review the prioritized list with management and allow them to further edit the list and re-sort it. Get resources assigned to each initiative that you want to implement, and most importantly, assign one person to be accountable for each effort.

Step 7: Start Improving

Establish the baseline metrics for the improvement measure, develop a project plan, and begin improving the process.

Step 8: Measure Improvement Until the Project Is Declared Complete

Using the metrics for measuring improvement, report regularly on the change in those metrics. Be prepared to explain to management why the metrics are or are not changing and regularly re-forecast those metrics. You will know that you have been successful when the chosen metrics have reached acceptable levels.
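A minimal sketch of this kind of regular reporting, with invented metric names and numbers:

```python
# Hypothetical sketch of Step 8: compare each period's metric
# readings against the baseline from Step 7 and report the change.
# Metric names and values are invented for illustration.

baseline = {"output_per_hour": 10.0, "defect_rate": 0.040}

def report(period, readings):
    """Return one report line per metric, showing % change vs baseline."""
    lines = []
    for metric, value in readings.items():
        pct = (value - baseline[metric]) / baseline[metric] * 100
        lines.append(f"{period} {metric}: {value} ({pct:+.1f}% vs baseline)")
    return lines

for line in report("Week 3", {"output_per_hour": 13.0, "defect_rate": 0.035}):
    print(line)
```

Even something this simple gives management a regular, quantitative answer to "is it working?", which is the whole point of the step.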

-----------------

FourThought Partners has performed this process many times with great success. There isn't a process in existence that can't be improved. All that is required is some focus, a committed team, and a creative leader.

Tuesday, September 18, 2007

Outfitting the Mobile Executive

I travel a lot for business and have gathered a lot of "tips" for what to have with me and how to prepare for a trip. If you have ideas to share on this topic, please send them to me and I will distribute them to our readers.

My laptop is indispensable to me. It is my primary computer, and absolutely everything digital that I own or create is on it. I have found that synchronizing between computers is such a pain that the only easy way to make sure I have everything I need with me is for it all to reside on my laptop. So, tip #1 is:

TIP NUMBER 1: Make your laptop your only/primary computer. Keep everything on it and you won't have to remember to synchronize it with other computer(s) that you own.

Because everything digital that I have is on this one computer, backups are essential. It would be a disaster to lose this machine without having everything backed up. I'll be writing a separate post on backups, but suffice it to say that my next tip is to make sure you regularly back up your laptop on to a USB hard drive, online backup service, or even better, both.

TIP NUMBER 2: Back everything up often to at least 1 other place, and better yet, 2.

Cooling my heels at airports is an unfortunate, but seemingly mandatory, part of business travel. As a result, I need to be entertained during delays. One of my favorite ways to use this idle time is via my Slingbox (http://us.slingmedia.com/page/home). This wonderful invention allows me to watch my home DVR from anywhere I can get a broadband connection. I can watch "live" TV and anything I have previously recorded on my DVR. It's not hi-def, but it's good enough for enjoyable viewing.

TIP NUMBER 3: Consider buying a Slingbox and catching up on your TV viewing while waiting at airports or in hotels.

While on the road, I always worry that my laptop or hard drive will fail and that I won't have the files I need to work with. For this reason, I carry a USB thumb drive which I use for small backups of active files. I use a program called SyncBackSE, from 2BrightSparks, that makes keeping the drive up to date very easy. Since I also use an online backup service, I have web access to my complete backup data set as well, and can use it to get quick access to any file.

TIP NUMBER 4: Have available a second copy of your files that you can access even if your laptop crashes.

I also like to watch movies. Though the screen on a video iPod is small, it's pretty good for watching a movie. The iPod touch looks like it's even better. I copy DVDs to my iPod with a relatively low-cost piece of software, "PQ DVD to iPod Video Suite", which you can get at http://www.pqdvd.com/.

TIP NUMBER 5: Convert some movies from DVD to your iPod so that you'll always have something to watch.

If you have additional suggestions, please email them to me at the email address listed on the left.

Tuesday, September 11, 2007

Improving Account Retention

Organizations with large numbers of accounts have an important challenge: how to maintain a low attrition rate by servicing well in a mass-servicing mode.

It's almost a contradiction - servicing well and mass-servicing. We've all had the experience of being mass-serviced -- by our phone companies or credit card processors, for example. In too many cases, we feel ill-served as we perceive our needs are secondary to the supplier's need to keep servicing costs low.

In a recent assignment, I worked with an organization that had thousands of accounts and a service organization that was run by servicing protocols. These protocols governed how often an account should be called and what questions should be asked. Despite the fact that the service organization followed these protocols, their rate of client attrition was growing. This fast-growing organization was finding that attrition was becoming a significant "headwind" for overall growth. They needed to solve this problem. But how?

The Approach

We looked at the client lifecycle and determined that the nature of risk was different during different stages. In this case, it made sense to look at 4 stages of the lifecycle:
  1. During the implementation phase and shortly thereafter
  2. The steady state in which accounts are serviced and hopefully kept satisfied
  3. The period immediately after a notice or threat to cancel
  4. The time after the relationship has ended and during which the account should be re-sold

In each phase of the lifecycle, we examined the unique risk factors of each stage. We identified the quantitative and qualitative characteristics of high risk and low risk accounts, and the proactive vendor behavior that would most likely have the desired retention effect.

To support the institutionalization of what we learned, we designed special reports that would rank accounts in terms of their possession of risky attributes, and developed business practices that would lower risk. It should be noted that we did not really have to develop these items from scratch because (as we have found in most companies) there were islands of best practices already in existence. I say "islands" because these great practices were not generally documented or shared, but I also describe them as "best practices" because they actually worked well. The trick was finding these best practices; once we found them, it was straightforward to document and distribute them.
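As a sketch of the kind of ranking report described here: score each account by the risky attributes it possesses and list the riskiest first. The attribute names and weights below are invented; in practice they come out of the lifecycle analysis:

```python
# Hypothetical sketch of a risk-ranking report. Attribute names and
# weights are invented for illustration; account names are
# anonymized, per the blog's ground rules.

WEIGHTS = {"low_usage": 2, "open_complaints": 3, "late_payments": 1}

accounts = [
    {"name": "Account A", "risk_flags": {"low_usage", "open_complaints"}},
    {"name": "Account B", "risk_flags": set()},
    {"name": "Account C", "risk_flags": {"low_usage"}},
]

def risk_score(account):
    # Sum the weights of every risky attribute the account possesses.
    return sum(WEIGHTS[flag] for flag in account["risk_flags"])

ranked = sorted(accounts, key=risk_score, reverse=True)
for acct in ranked:
    print(acct["name"], risk_score(acct))
```

The report's job is simply to put the service team's attention where the risk is; the weights can start as gut feel and be tuned as the metrics come in.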

Feedback Loops

Once we developed quantitative ways to identify risk, we "automatically" had a way to gauge the effectiveness of the risk-reduction techniques and of the people using them. If, after an intervention, the quantitative measures of risk had not gone down, we could conclude that either the technique or the person using it was not effective. When this happened, the retention process called for an escalation. Because the more expert resources were scarce, this escalation process triaged accounts in a manner that saved the experts for the cases where they were really needed.
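The escalation rule in such a feedback loop can be sketched as a simple threshold check. The 20% required reduction here is an invented number, not the figure from the engagement:

```python
# Hypothetical sketch of the feedback loop's escalation rule: if an
# intervention has not reduced an account's measured risk enough,
# triage it to a scarce expert. The 20% threshold is invented.

def next_step(risk_before, risk_after, required_reduction=0.2):
    """Decide the retention process's next step for one account."""
    if risk_after <= risk_before * (1 - required_reduction):
        return "continue standard servicing"
    return "escalate to expert"

print(next_step(10, 7))   # risk fell well past the threshold
print(next_step(10, 9))   # risk barely moved
```

The same check doubles as a performance measure: techniques (or people) whose accounts keep escalating are the ones to retrain or replace.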

How can you use these techniques?

If you have a need to improve your account retention results, FourThought Partners can help. We will help you identify what the best in your service organization already know, improve it, and roll it out in a controlled way.

We are so certain that these techniques will work that we're even willing to bet part of our fee on the results delivered.

Tuesday, September 4, 2007

The Case for Metrics

"If you want to improve something, measure it. If you really want to improve something, measure it often."

I don't know who said this first, but it is 100% true. So, why don't more people do this more often?
  • It's hard to determine what the right metrics are for a given issue.
  • It's tedious to calculate the values of the metrics, especially if you have to do it manually and often.
  • The metrics that come to mind don't really capture exactly what you have in mind.
  • The metrics might get used against you.

What are the right metrics?

For example, if you want to make software changes that will improve client satisfaction, how will you measure client satisfaction? How will you take into account the fact that different clients have different levels of skill in their organizations? How do you take into account the fact that client satisfaction, number of support calls, and other measures vary so much from month to month even when you aren't trying to improve anything?

The answer to these valid questions is that almost any reasonable metric is better than nothing. And two reasonable metrics are better than one; together, they may be as good as a single better metric.

When starting to measure an important business barometer, start simple. Accept the fact that you may never get the perfect metric for customer satisfaction with your product, but that there are some really good proxies for that metric that will tell you over the long haul whether you're getting the result you hoped for.

By way of example, let's say you are trying to make the installation process better. You're about to release a new version of the software and you want to know if it accomplishes this goal. What do you do?

  1. Choose a couple of metrics. In this case, you can use the number of support calls received during installation and the number of support minutes (or hours or days) required to get a client up and running. You could also choose to use client satisfaction survey results or even a subjective rating by your installer (if one is involved). There is nothing wrong with deciding which metrics you will use based on the degree of difficulty of getting the data - select the most convenient ones.
  2. Start measuring these metrics before you implement the enhancement. The longer before, the better, but anything is better than nothing.
  3. Implement the change and watch the metrics. If possible, measure them more frequently than you had been so that you can get "early returns".
  4. Give the new process some time to take hold. Respond to emergencies, but try to give the new process a reasonable period of time before you make more changes so that you can get some good data.
  5. Review the results, determine what can be further improved, and repeat from step 3.
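For the installation example, steps 2 and 3 above might look like this in miniature. All the numbers are invented for illustration:

```python
# Hypothetical sketch of steps 2-3 for the installation example:
# track support calls per install before and after the release and
# compare the averages. All numbers are invented.

from statistics import mean

calls_before = [6, 5, 7, 6]  # calls per install, old version (baseline)
calls_after = [4, 3, 5, 3]   # "early returns" after the release

change = mean(calls_after) - mean(calls_before)
print(f"avg calls per install: {mean(calls_before)} -> {mean(calls_after)} ({change:+.2f})")
```

With real data you would want more installs in each sample before declaring victory, since (as noted above) these measures vary month to month even when nothing is changing.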

When you've gotten all the improvement you need or all that you can afford to chase, it's time to move on to another opportunity.

How can I make it less tedious?

This is an easy one. Simply choose more readily available measures. Try not to invent something for the purpose of measuring unless it's absolutely necessary. Remember, metrics are a proxy for the real thing and are correlated with it. It should not be necessary to develop the best possible metrics.

How can I keep the metrics from being misused?

You can't. You can get out in front by drawing the right conclusions first. Be honest when something isn't working, and then figure out how to fix it. Get rightful credit for identifying problems and then fixing them.

Conclusion

Metrics are the best way to quantify a problem and evaluate its solution. By making the discussion on a problem quantitative, you are making it objective and less personal. When reasonable people have access to the same data, they more often reach consensus than when they limit their inputs to anecdotes and subjective theories.

Bridging the Gap Between IT and Sales (or other users)

My first gig as a consultant was to help a very large and sophisticated technology services company determine why their terrific new web-based offering was getting such poor reviews from their field sales organization and end users.

The offering was really powerful and for its time, was nearly revolutionary in its comprehensiveness. IT believed that it delivered exactly what their sales organization had requested, and was extremely frustrated that the feedback was so bad. Sales wouldn't admit that IT did what it was asked to do, only that their clients didn't like the offering. Sales was demanding that it be "fixed", but could not say what was broken.

All in all, this was a miserable situation, and no one was willing to reach out to the "other side" to try to figure out what someone could actually do about this problem.

So, what did I do? As the joke goes, I borrowed their watch to tell them what time it was. I hit the phones and the road, and talked to a couple of dozen of their sales people and clients. It took a while to see the forest for the trees, but eventually a clear problem description emerged.

There were actually two problems: 1) response time from remote locations (outside the network) was so bad that work just couldn't get done, and 2) the user interface exposed all the power of the package to every user, even the majority who did not need 75 to 90% of that power and found it overwhelming.

IT hadn't discovered the response time problem because all their testing had been from within their network; and they didn't realize that ease of use was an issue because they thought that all customers needed the full power of the package.

The solution was simple to say, but a bit more difficult to actually perform. The response time problem had to be fixed and the user interface had to be layered in such a way that a simple interface was presented to most users, and a more complex and powerful interface was presented to the minority of power users.

Once these things were done, a much greater level of client satisfaction was achieved.

What are the lessons from this engagement that can be generalized?
  • Listen to the subject matter experts (in this case, Sales and IT), but also talk to the market directly.
  • Listen with an open mind. If you think you already know the answer, you won't hear what's being said.
  • Nearly every user base is segmented, and often one of the results of this segmentation is a need for different views of the solution. Many, many packages have power users and casual users -- make sure you address the needs of both.
  • When testing, test from the perspectives of your end users in as many ways as you can think of: inside and outside the firewall, with and without other applications running, at various times of day, etc.

Why I started this blog

I've been a management consultant, working with technology companies, for nearly 10 years. In that time I've seen a lot of things done right, and more interestingly, a lot of things done wrong. I've also observed that many companies have similar issues, and that almost everyone can learn from the experiences of others.

The purpose of this blog is to share these experiences, and hopefully help a few people address some of their business challenges.

A few ground rules:
  • Never name names: we can learn from others' mistakes, but we don't have to embarrass them in the process.
  • Stay constructive
  • Try to generalize problems and solutions for the widest applicability

I hope you'll get some value from these writings and that you'll send me your comments, whether you agree or disagree with my postings.

Thanks.