Tag: Standard Work

Integrated Waste: Lather, Rinse, Repeat


Admittedly, it has been a while since I checked a shampoo bottle for directions; however, I do recall a time in my life reading: Lather, Rinse, Repeat. Curiously, they don’t say when the process needs to be repeated, or how many times.

Perhaps someone can educate me as to why it is necessary to repeat the process at all – other than “daily”.  I also note that this is the only domestic “washing” process that requires repeating the exact same steps.  Hands, bodies, dishes, cars, laundry, floors, and even pets are typically washed only once per occasion.

The intent of this post is not to debate the effectiveness of shampoo or to determine whether this is just a marketing scheme to sell more product.  The point of the example is this:  simply following the process as defined is, in my opinion, inherently wasteful of product, water, and time – literally, money down the drain.

Some shampoo companies may have changed the final step in the process to “repeat as necessary”, but that still presents a degree of uncertainty and ensures that exceptions to the new standard process of “Lather, Rinse, and Repeat as Necessary” are likely to occur.

In the spirit of continuous improvement, new 2-in-1 and even 3-in-1 products are available on the market today that serve as the complete “shower solution” in one bottle.  As these are also my products of choice, I can advise that these products do not include directions for use.

Scratching the Surface

As lean practitioners, we need to position ourselves to think outside of the box and challenge the status quo.  This includes the manner in which processes and tasks are executed.  In other words, we not only need to assess what is happening, we also need to understand why and how.

One of the reasons I am concerned with process audits is that conformance to the prescribed systems, procedures, or “Standard Work” somehow suggests that operations are efficient and effective.  In my opinion, nothing could be further from the truth.

To compound matters, in cases where non-conformances are identified, the team is often too eager to fix (“patch”) the immediate process without considering the implications for the system as a whole. I present an example of this in the next section.

The only hint of encouragement that satisfactory audits offer is this: “People will perform the tasks as directed by the standard work – whether it is correct or not.”  Of course this assumes that procedures were based on people performing the work as designed or intended as opposed to documenting existing habits and behaviors to assure conformance.

Examining current systems and procedures at the process level only serves to scratch the surface. First-hand process reviews are an absolute necessity to identify opportunities for improvement and must consider the system or process as a whole, as you will see in the following example.

Manufacturing – Another Example

On one occasion, I was facilitating a preparatory “process walk” with the management team of a parts manufacturer.  As we visited each step of the process, we observed the team members while they worked and listened intently as they described what they do.

As we were nearing the end of the walk-through, I noted that one of the last process steps was “Certification”, where parts are subject to 100% inspection and rework / repair as required. After being certified, the parts were placed into a container marked “100% Certified”, then sent to the warehouse – ready for shipping to the customer.

When I asked about the certification process, I was advised that:  “We’ve always had problems with these parts and, whenever the customer complained, we had to certify them all 100% … ‘technical debate and more process intensive discussions followed here’ … so we moved the inspection into the line to make sure everything was good before it went in the box.”

Sadly, when I asked how long they’ve been running like this, the answer was no different from the ones I’ve heard so many times before:  “Years”.  So, because of past customer problems and the failure to identify true root causes and implement permanent corrective actions to resolve the issues, this manufacturer decided to absorb the “waste” into the “normal” production process and make it an integral part of the “standard operating procedure.”

To be clear, just when you thought I picked an easy one, the real problem is not the certification process. To the contrary, the real problem is in the “… ‘technical debate and more process intensive discussions followed here’ …” portion of the response. Simply asking about the certification requirement was scratching the surface. We need to …

Get Below the Surface

I have always said that the quality of a product is only as good as the process that makes it.  So, as expected, the process is usually where we find the real opportunities to improve.  From the manufacturing example above, we clearly had a bigger problem to contend with than simply “sorting and certifying” parts.  On a broader scale, the problems I personally faced were two-fold:

  1. The actual manufacturing processes with their inherent quality issues and,
  2. The Team’s seemingly firm stance that the processes couldn’t be improved.

After some discussion and more debate, we agreed to develop a process improvement strategy.  Working with the team, we created a detailed process flow and Value Stream Map of the current process.  We then developed a Value Stream Map of the Ideal State process.  Although we did identify other opportunities to improve, it is important to note that the ideal state did not include “certification”.

I worked with the team to facilitate a series of problem solving workshops where we identified and confirmed root causes, conducted experiments, performed statistical analyses, developed / verified solutions, implemented permanent corrective actions, completed detailed process reviews and conducted time studies.  Over the course of 6 months, progressive / incremental process improvements were made and ultimately the “certification” step was eliminated from the process.

We continued to review and improve other aspects of the process, supporting systems, and infrastructure as well, including but not limited to: materials planning and logistics, purchasing, scheduling, inventory controls, part storage, preventive maintenance, and redefined and refined process controls, all supported by documented work instructions as required. We also evaluated key performance indicators. Some were eliminated while new ones, such as Overall Equipment Effectiveness, were introduced.


Some of the tooling changes required to achieve the planned / desired results were extensive. One new tool was required, and others needed major or minor changes. The real tangible cost savings were very significant and offset the investment / expense many times over. In this case, we were fortunate that new jobs being launched at the plant could absorb the displaced labor resulting from the improvements made.

Every aspect of the process demonstrated improved performance and ultimately increased throughput.  The final proof of success was also reflected on the bottom line.  In time, other key performance indicators reflected major improvements as well, including quality (low single digit defective parts per million, significantly reduced scrap and rework), increased Overall Equipment Effectiveness (Availability, Performance, and Quality), increased inventory turns, improved delivery performance (100% on time – in full), reduced overtime,  and more importantly – improved morale.


I have managed many successful turnarounds in manufacturing over the course of my career and, although the problems we face are often unique, the challenge remains the same:  to continually improve throughput by eliminating non-value added waste.  Of course, none of this is possible without the support of senior management and full cooperation of the team.

While it is great to see plants that are clean and organized, be forewarned that looks can be deceiving.  What we perceive may be far from efficient or effective.  In the end, the proof of wisdom is in the result.

Until Next Time – STAY lean!

Vergence Analytics
Twitter:  @Versalytics

22 Seconds to Burn – Excel VBA Teaches Lean Execution



VBA for Excel has once again provided the opportunity to demonstrate some basic lean tenets.  The methods used to produce the required product or solution can yield significant savings in time and ultimately money.  The current practice is not necessarily the best practice in your industry.  In manufacturing, trivial or minute differences in methods deployed become more apparent during mass production or as volume and demand increases.  The same is true for software solutions and both are subject to continual improvement and the relentless pursuit to eliminate waste.

Using Excel to demonstrate certain aspects of Lean is ideal. Numbers are the raw materials and formulas represent the processes or methods used to produce the final solution (or product). Furthermore, most businesses use Excel to manage many of their daily tasks, so any extended learning can only help users to better understand the Excel environment.

The Model:

We recently created a perpetual Holiday calendar for one of our applications and needed an algorithm or procedure to calculate the date for Easter Sunday and Good Friday.  We adopted an algorithm found on Wikipedia at http://en.wikipedia.org/wiki/Computus that produces the correct date for Easter Sunday.
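For reference, the published algorithm translates almost line for line into code. Here is a sketch in Python of our reading of that Wikipedia computus, with floor division (`//`) standing in for the INT conversions the VBA version uses; this is an illustration, not the production routine:

```python
def easter_sunday(year):
    """Anonymous Gregorian computus: returns (month, day) of Easter Sunday."""
    a = year % 19                       # position in the 19-year Metonic cycle
    b, c = year // 100, year % 100      # century and year within the century
    d, e = b // 4, b % 4
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30  # epact-related term
    i, k = c // 4, c % 4
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

print(easter_sunday(2024))  # (3, 31) -> March 31, 2024
```

Every intermediate value in the formula is a non-negative integer, which matters later when we discuss the choice of division operator.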

In our search for the Easter Algorithm, we found another algorithm that uses a different method of calculation and provides the correct results too.  Pleased to have two working solutions, we initially did not spend too much time thinking about the differences between them.  If both routines produce the same results then we should choose the one with the faster execution time.  We performed a simple time study to determine the most efficient formula.  For a single calculation, or iteration, the time differences are virtually negligible; however, when subjected to 5,000,000 iterations the time differences were significant.

This number of cycles may seem grossly overstated; however, when we consider how many automobiles and components are produced each year, 5,000,000 is only a fraction of the total volume. Taken further, Excel performs thousands of calculations a day, and perhaps many times that number as data are entered on a spreadsheet. When we consider the number of calculations being performed at any given moment, the number quickly grows beyond comprehension.


As a relatively new student to John Walkenbach’s book, “Excel 2003 Power Programming with VBA“, speed of execution, efficiency, and “Declaring your Variables” have entered into our world of Lean.  We originally created two (2) routines called EasterDay and EasterDate.  We then created a simple procedure to run each function through 5,000,000 cycles.  Again, this may sound like a lot of iterations but computers work at remarkable speeds and we wanted enough resolution to discern any time differences between the routines.

The difference in the time required to execute 5,000,000 cycles by each of the routines was surprising.  We recorded the test times (measured in seconds) for three separate studies as follows:

  • Original EasterDay:  31.34,  32.69,  30.94
  • Original EasterDate:  22.17,  22.28,  22.25

The differences between the two methods ranged from 8.69 to 10.41 seconds. Expressed in different terms, the EasterDay routine took 1.39 to 1.47 times as long as EasterDate. Clearly the original EasterDate function has the better execution speed. What we perceive as virtually identical systems or processes at low volumes can yield significant differences that are often only revealed by increased volume or the passage of time.
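The timing procedure itself is easy to reproduce. A minimal sketch in Python (the original study was done in VBA; `easter_a` and `easter_b` below are hypothetical stand-ins for any two routines you want to compare):

```python
import time

# Hypothetical stand-ins for the two competing routines;
# substitute any pair of implementations under test.
def easter_a(year):
    return (year % 19, year // 100, year % 100)

def easter_b(year):
    return (year % 19, year // 100, year % 100)

def time_routine(func, iterations=1_000_000):
    """Call func repeatedly and return the total elapsed time in seconds."""
    start = time.perf_counter()
    for _ in range(iterations):
        func(2024)
    return time.perf_counter() - start

for name, func in (("easter_a", easter_a), ("easter_b", easter_b)):
    print(f"{name}: {time_routine(func):.2f} s")
```

As in our study, a single call reveals nothing; only a large iteration count gives enough resolution to separate the two methods.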

In the Canadian automotive industry there are at least 5 major OEM manufacturers (Toyota, Honda, Ford, GM, and Chrysler), each producing millions of vehicles a year.  All appear to produce similar products and perform similar tasks; however, the performance ratios for each of these companies are starkly different.  We recognize Toyota as the high velocity, lean, front running company.  We contend that Toyota’s success is partly driven by the inherent attention to detail of processes and product lines at all levels of the company.


We decided to revisit the Easter Day calculations or procedures to see what could be done to improve the execution speed. We created a new procedure called “EasterSunday” using the original EasterDay procedure as our baseline. Note that the original Wikipedia code was only slightly modified to work in VBA for Excel: we replaced the FLOOR function with the INT function. Otherwise, the procedure is presented without further revision.

To create the final EasterSunday procedure, we made two revisions to the original code without changing the algorithm structure or the essence of the formulas themselves.  The changes resulted in significant performance improvements as summarized as follows:

  1. For integer division, we replaced the INT (n / d) statements with a less commonly used (or known) “\” integer division operator.  In other words, we used “n \ d” in place of “INT( n / d)” wherever an integer result is required.  This change alone resulted in a gain of 11 seconds.  One word of caution if you plan to use the “\” division operator:  The “n” and “d”  are converted to integers before doing the division.
  2. We declared each of the variables used in the subsequent formulas and gained yet another remarkable 11 seconds.  Although John Walkenbach and certainly many other authors stress declaring variables, it is surprising how few published VBA procedures actually put this into practice.
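The first change has a direct analogue in other languages. In Python, for example, `int(n / d)` divides as floating point and then truncates, while `n // d` performs integer division in one step; a small sketch of the substitution (sample values only):

```python
# Two routes to an integer quotient, mirroring INT(n / d) vs. n \ d in VBA
n, d = 19 * 10 + 20, 30   # sample values in the style of the computus terms

via_float = int(n / d)    # divide as floats, then truncate
via_floor = n // d        # integer (floor) division in one step

assert via_float == via_floor == 7

# Caution, as with VBA's \ operator: the two routes only agree for
# non-negative operands, and VBA additionally rounds fractional
# operands to whole numbers before dividing.
```

The substitution is safe here because every term in the Easter formulas is a non-negative integer.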


The results of our Time Tests appear in the table below.  Note that we ran several timed iterations for each change knowing that some variations in process time can occur.

Test times, in seconds, for 5,000,000 iterations of each routine:

Run   EasterDay    EasterSunday   EasterDate    EasterSunday revision in effect
1     31.343750    20.828125      22.281250     1. INT(n / d) replaced with n \ d
2     30.937500    20.921875      22.250000     1. INT(n / d) replaced with n \ d
3     30.906250    21.265625      22.250000     1. INT(n / d) replaced with n \ d
4     31.078125    9.171875       22.187500     2. Variables declared (in addition to 1)
5     31.109375    9.171875       22.171875     2. Variables declared (in addition to 1)

EasterDay ran the original code, using INT(n / d) to convert division results, throughout; EasterDate ran the original alternate calculation method throughout.

The EasterSunday procedure contains the changes described above. We achieved a total savings of approximately 22 seconds. The two integer division methods yield the same result; however, one is clearly faster than the other.

The gains made by declaring variables were just as significant. In VBA, undeclared variables default to the “variant” type. Although variant types are more flexible by definition, performance diminishes significantly. We saved at least an additional 11 seconds simply by declaring variables. Variable declarations are to VBA as policies are to your company: they define the “size and scope” of the working environment. Undefined policies or vague specifications create ambiguity and generate waste.

Lessons Learned:

In manufacturing, a 70% improvement is significant, worthy of awards, accolades, and public recognition. The lessons learned from this example are eight-fold:

  1. For manufacturing, do not assume the current working process is the “best practice”.  There is always room for improvement.  Make time to understand and learn from your existing processes.  Look for solutions outside of your current business or industry.
  2. Benchmarking a current practice against another existing practice is just the incentive required to make changes.  Why is one method better than another?  What can we do to improve?
  3. Policy statements can influence the work environment and execution of procedures or methods.  Ambiguity and lack of clarity create waste by expending resources that are not required.
  4. Improvements to an existing process are possible, with results that outperform the nearest known competitor.  We anticipated, at best, having the two routines run at similar speeds.  We did not anticipate that the final EasterSunday routine would run more than 50% faster than our simulated competitive benchmark (EasterDate).
  5. The greatest opportunities are found where you least expect them.  Learning to see problems is one of the greatest challenges that most companies face.  The example presented in this simple analogy completely shatters the expression, “If it ain’t broke, don’t fix it.”
  6. Current practices are not necessarily best practices and best practices can always be improved.  Focusing on the weaknesses of your current systems or processes can result in a significant competitive edge.
  7. Accelerated modeling can highlight opportunities for improvement that would otherwise not be revealed until full high volume production occurs.  Many companies are already using process simulation software to emulate accelerated production to identify opportunities for improvement.
  8. The most important lesson of all is this:

Speed of Execution is Important >> Thoughtful Speed of Execution is CRITICAL.

We wish you all the best of this holiday season!

Until Next Time – STAY Lean!

Vergence Analytics

At the outset of the Holiday project, the task seemed relatively simple, until we discovered that the rules for Easter Sunday did not follow the simple rules that apply to other holidays throughout the year. As a result, we learned more about history, astronomy, and the tracking of time than we ever would have thought possible.

We also learned that Excel’s spreadsheet MOD formula is subject to precision errors and the VBA version of MOD can yield a different result than the spreadsheet version.
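One well-known difference, separate from the precision issue, is the sign convention: Excel’s spreadsheet MOD returns a result with the sign of the divisor, while VBA’s Mod takes the sign of the dividend. A sketch in Python (whose `%` happens to follow Excel’s convention) illustrates the disagreement:

```python
def excel_mod(n, d):
    # Excel's spreadsheet MOD: result takes the sign of the divisor
    # (Python's % operator follows the same convention)
    return n % d

def vba_mod(n, d):
    # VBA's Mod: result takes the sign of the dividend (truncated division)
    return n - d * int(n / d)

print(excel_mod(-3, 2), vba_mod(-3, 2))  # 1 -1
```

For non-negative operands, as in the holiday calculations, the two agree; the disagreement only surfaces when a sign sneaks in.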

We also rediscovered Excel’s Leap Year bug (29-Feb-1900).   1900 was not a leap year.  The leap year bug resides in the spreadsheet version of the date functions.  The VBA date function recognizes that 29-Feb-1900 is not a valid date.
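The Gregorian leap-year rule itself is simple to express; a short sketch confirming that 1900 fails the rule (while 2000 passes, being divisible by 400):

```python
def is_leap(year):
    # Gregorian rule: every 4th year is a leap year,
    # except century years, unless divisible by 400
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(1900), is_leap(2000))  # False True
```

Python’s standard library exposes the same rule as `calendar.isleap`, which agrees that 1900 was not a leap year.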

Time Studies with your BlackBerry

Performing a time study is relatively easy compared to only a few years ago. The technologies available today allow studies to be conducted quite readily.

Time Studies and OEE (Overall Equipment Effectiveness)

The Performance factor for OEE is based on the Ideal Cycle Time of the process. For fixed-rate processes, the name-plate rate may suffice but should still be confirmed. For other processes, such as labour-intensive operations, a time study is the only way to determine the true or ideal cycle time.
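For reference, the standard OEE arithmetic can be sketched as follows; the figures are purely illustrative, not from any study described in this post:

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Classic OEE: Availability x Performance x Quality."""
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability, performance, quality, availability * performance * quality

# Illustrative shift: 480 min planned, 432 min running,
# 1.0 min ideal cycle, 389 parts produced, 370 good
a, p, q, overall = oee(480, 432, 1.0, 389, 370)
print(f"A={a:.1%}  P={p:.1%}  Q={q:.1%}  OEE={overall:.1%}")
```

Note how the Ideal Cycle Time sits in the numerator of the Performance factor: an inflated “ideal” flatters the result, which is why the time study matters.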

When measuring the cycle time, we typically use “button to button” timing to mark a complete cycle.  It can be argued that an operator may lose time to retrieve or pack parts or move containers.  Including these events in the gross cycle time will hide these opportunities.  It is better to exclude any events that are not considered to be part of the actual production cycle.

When calculating the Performance factor for Overall Equipment Effectiveness (OEE), efficiency shortfalls will show up as a Performance of less than 100%. The reasons for this less-than-optimal level of performance are the activities the operator is required to perform other than actually operating the machine or producing parts.

All operator activities and actions should be documented using a standardized operating procedure or standardized work methodology.  This will allow all activities to be captured as opposed to absorbed into the job function.

The BlackBerry Clock – Stopwatch

One of the tools we have used on the “fly” is the BlackBerry Clock’s Stopwatch function.  The stopwatch feature is very simple to use and provides lap time recording as well.

When performing time studies with a traditional stopwatch, keeping track of individual cycle times can be difficult. With the stopwatch function, the history for each “lap” time is retained. To determine the average cycle time, we recommend dividing the total elapsed time by the number of completed cycles (or laps).
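The calculation itself is trivial; a sketch with hypothetical lap times:

```python
# Hypothetical lap times recorded on the stopwatch, in seconds
laps = [12.4, 11.9, 12.7, 12.1, 12.3]

# average cycle time = total elapsed time / completed cycles
average_cycle = sum(laps) / len(laps)
print(round(average_cycle, 2))  # 12.28
```

Averaging over many completed cycles also dilutes the button-press reaction error discussed below.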

The individual lap times are subject to a certain degree of uncertainty or error as there will always be a lead or lag time associated with the pushing of the button on the BlackBerry to signal the completion of a cycle.  Although this margin of error may be relatively small, even with this level of technology, the human element is still a factor for consideration.

Once the time study is complete you can immediately send the results by forwarding them as an E-mail, PIN, or SMS.

The BlackBerry Camera – Video Camera

Another useful tool is the video camera.  Using video to record operations and processes allows for a detailed “step by step” analysis at any time.  This is particularly useful when establishing Standard Operating Procedures or Standardized Work.

Uploading videos and pictures to your computer is as easy as connecting the device to an available USB port.  In a matter of minutes, the data is ready to be used.

Video can also be used to analyze work methods, sequences, and also serves as a valuable problem solving tool.

Until Next Time – STAY Lean!

We are not affiliated with Research In Motion (RIM).  The intent of this post is to simply demonstrate how the technology can be used in the context described and presented.

OEE and Standardized Work

Overall Equipment Effectiveness (OEE):  Standardized Work

After you start collecting OEE data for your processes, you may notice significant variance between departments, shifts, and even employees performing the work.  Of the many aspects that you will be inclined to investigate, standardized work should be one of them.

Making sure that all employees are executing a process or sequence of processes correctly and exactly the same way every time is the topic of standardized work.  The OEE data may also direct you to review how the processes are being executed by some of the top performers to determine if they are truly demonstrating best practices or simply cutting corners.

Lean practices are founded on learning by observing. We cannot stress enough the importance of observing an operation to see first-hand what opportunities for improvement (waste elimination) are available. OEE data is a compass that directs you where to look; however, the destination for improvements is the process itself, the very source from which the data originated.

Establishing Standard Cycle Times

One of the first questions we usually ask is, “How were the standard cycle times determined?”  Was the standard based on best practices, quoted rates, time studies, name plate ratings, or published machine cycle times?

We recommend conducting an actual time study using a stop watch and calculating part to part (button to button) cycle times accordingly.  We have used the stop watch capability of the BlackBerry many times.  Results for lap times and total elapsed time are easily recorded and can be e-mailed as soon as the study is complete.

The sample size of course will depend on the actual rate of the machine and should be statistically relevant.  One or two cycles is not sufficient for an effective time study.

For operator “controlled” processes, we recommend involving the employees who normally perform the work when conducting the time study.  It doesn’t make sense to have the “office experts” run the equipment for a short burst to set a rate that cannot be sustained or is just simply unreasonable.

Many processes, whether dependent on human effort or automated, are controlled by PLCs that are also capable of reporting the machine cycle time. At a minimum, we recommend validating these cycle times to at least satisfy yourself that they are part-to-part or “button to button” cycle times.

For automated operations, PLCs can typically be relied upon to provide a reasonable cycle time. Without going too far into process design and development, you will need to understand the elements that control the process sequences. Some processes are driven by time controls (an event occurs after a predetermined period of time) while others are event-driven (an event occurs when a dependent “sensor on-off” condition or similar “event signal” mechanism is satisfied).

The real key to understanding the process being studied is to develop a flow chart clearly defining each of the process steps.  It is of equal importance to observe the differences that may be occurring between employees performing the work.  Either the instructions lack clarity or habits (good or bad) have been developed over time.  Although templates exist to aid in the development of standardized work, don’t wait to find the right tool.

Using Video – Record it Live

We highly recommend using a video recorder to capture the process in action.  With the technology available today, video is readily available and a very cost-effective method of documenting your processes.  Video presents several advantages:

  1. Captures activities in real-time.
  2. Provides instant replay.
  3. Establishes process or sequence event timing in real-time.
  4. Eliminates the need for “stop watches” to capture multiple event timings.
  5. Can be used as a training aid for new employees to demonstrate “standardized work practices”.
  6. Can be used to develop “best practices”.
  7. Reduces or minimizes potential for time measurement error.

We have successfully used video to not only develop standardized work for production processes, but also for documenting and recording best practices for tool changes, set up, and checking or inspection procedures.

Standardized work eliminates any questions regarding the proper or correct way of performing the work required.  Standardized work procedures allow additional development work to be completed “offline” without further disruption to the production process.


Of course, Standardized Operating / Work procedures are required to establish effective and meaningful value stream maps but even more importantly, they become an effective tool to understand the opportunity for variances in your OEE data, certainly where manual or “human” controlled operations are concerned.

It has been argued that OEE data in and of itself is not statistically relevant and we are inclined to agree with this statement.  The simple reason is that the processes being measured are subject to significant internal and external variances or influences.  Examples may include reduced volumes, product mix changes, tool change frequency, employee turnover, and economic conditions.

As mentioned in many of our posts, it is important to understand “WHAT and WHY” we are measuring.  Understanding the results is more important than the result itself.  A company looking to increase inventory turns may resort to smaller production runs and more frequent tool changes.  This will reduce Availability and, in turn, will result in a lower OEE.  The objective may then be to find a way to further reduce tool change times to “improve” the Availability.
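The Availability trade-off described above is easy to sketch with hypothetical numbers:

```python
planned_time = 480   # minutes available in the shift
changeover = 30      # minutes lost per tool change

# One long production run vs. four smaller runs in the same shift:
availability_one_run = (planned_time - 1 * changeover) / planned_time
availability_four_runs = (planned_time - 4 * changeover) / planned_time

print(availability_one_run, availability_four_runs)  # 0.9375 0.75
```

The “lower” OEE of the four-run shift is not necessarily worse performance; it reflects a deliberate inventory-turns strategy, which is exactly why the result must be understood before it is judged.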

The use of OEE data can vary in scope, ranging from part-specific performance to plant-wide operations. As the scope of measurement changes, so do the influences that impact the net result. So once again, we urge you to use caution when comparing data between personnel, shifts, departments, and production facilities. Typically, first or day shift operations have greater access to resources that are not available on the “off” shifts.

Perhaps the greatest “external” influence on current manufacturing operations is the rapid collapse of the automotive industry in the midst of our current economic “melt down”.  The changes in operating strategy to respond to this new crisis are bound to have an effect on OEE among other business metrics.

The ultimate purpose of Lean practices is to reduce or eliminate waste, and doing so requires a rigorous “document and review” process. The ability to show evidence of current versus proposed practices will reduce or eliminate the roadblocks that may impede your continuous improvement objectives.

While the post is brief today, hopefully the message is helpful.

Until Next Time – STAY Lean!