Our article on “Lean Code” strongly suggests that the principles of lean also apply to software development and, more specifically, to programming.
Python has evolved to become a very popular and powerful programming language. However, as mentioned in “Lean Code“, the performance of your application or program is as dependent on the skills of the programmer as it is on the capabilities of the programming language itself.
It’s essential to realize that pandas is built on top of NumPy. As a result, every task you perform using pandas also goes through NumPy. The convenience of pandas comes at a performance penalty that some benchmarks put at up to 100 times slower than NumPy for a similar task.
The functionality offered by pandas makes writing code faster and easier for the programmer; however, the performance trade-off is borne by the end user. Knowing when to use one library over the other depends on the programmer’s understanding of the language itself, not simply on which one provides a specific functionality.
Python for Data Science provides sufficient information to decide which of pandas or NumPy best fits a given use case. The relevance of sharing this is to stress the importance of continually reading, learning, and understanding as much as possible about your language of choice for a given application.
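As a rough illustration of the trade-off, the snippet below times the same reduction (summing a million floats) in NumPy and in pandas. The sizes, iteration count, and resulting ratio are illustrative only; actual numbers vary widely by machine, data shape, and library version, and will rarely match the “100 times” figure quoted above.

```python
# A rough micro-benchmark: the same sum performed directly in NumPy
# and via a pandas column. Treat the printed ratio as illustrative.
import timeit

import numpy as np
import pandas as pd

data = np.random.rand(1_000_000)
frame = pd.DataFrame({"values": data})

numpy_time = timeit.timeit(lambda: data.sum(), number=100)
pandas_time = timeit.timeit(lambda: frame["values"].sum(), number=100)

print(f"NumPy:  {numpy_time:.4f}s")
print(f"pandas: {pandas_time:.4f}s")
print(f"pandas/NumPy ratio: {pandas_time / numpy_time:.1f}x")
```

Both calls return the same answer; the difference is purely the overhead of the extra layers pandas adds on top of NumPy.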
From the end user’s perspective, performance matters and everyone wants it “yesterday”. So, the question is, “Do we code quickly and sacrifice performance, or sacrifice quick delivery for performant code?” What would you do?
The new year is upon us and, as is typical for this time of year, resolutions are one of the primary topics of conversation. With just over a week into the new year, it is very likely that the discussions of resolutions and goals have already begun to subside.
Unfortunately, for the many who do make resolutions, very few ever manage to achieve them. The reasons for failure are many but, more often than not, we either set the wrong goals or we fail to identify intermediate performance goals for the range of activities required to reach the final goal.
The diagram suggests that goals are determined by reviewing our needs and desires. However, what we desire most is often what we need least. For business leaders, strategy, goals, and objectives stem from a vision statement that reflects our purpose for being, our WHY. We are, in essence, Driven by Dreams and Powered by Goals.
What do the “right goals” look like? The John Whitmore model offers the following three acronyms to help us discern the value and sustainability of our goals:
SMART: Specific, Measurable, Attainable, Realistic, and Time Phased.
PURE: Positively Stated, Understood, Relevant, and Ethical.
CLEAR: Challenging, Legal, Environmentally Sound, Agreed, and Recorded.
To be successful, resolutions, much like goals and objectives, require more than a simple statement of intent. We need a plan that describes how we’re actually going to achieve them. In other words, we need to define “the means to an end.” As suggested by the Whitmore model, the expression, “Fail to Plan – Plan to Fail”, is only partially true when we consider that our success also requires us to be sufficiently motivated and challenged to embark on, and endure, the journey.
What if …
Clearly, not everything goes as planned. There are risks and obstacles that must be considered and, where possible, addressed as part of the planning process. Contingency plans are as much a part of planning as the “master” plan itself.
While it seems impossible to “expect the unexpected”, black swan events do occur. How we respond to these events is often the “make or break” point of our journey. During this time, our commitment to our goals and perhaps even our vision will be tested. For this reason, our core purpose or “why” must be of sufficient value to sustain our efforts and give cause to overcome the distractions and setbacks that are sure to occur.
Goals without dates are merely dreams and, likewise, goals without a means to achieve them are meaningless. Motivate your team by instilling a vested interest through the development of a detailed plan that will be sure to inspire the team to not only follow up but to follow through on their commitments.
The scope and scale of a plan is dependent on the goals we are striving to achieve. We tend to underestimate the resources and effort required to accomplish the tasks at hand. The ability to identify detailed actions or tasks, required resources, responsibilities, and realistic timing will help to create a plan that leads to a successful conclusion, avoiding much of the confusion and frustration that poor planning can bring.
After all is said and written – it must be done. Execution of the plan – putting words into action – is how our goals become a reality. A variety of tools are at our disposal to manage our activities and progress, ranging from simple whiteboards to professional project management software. However these activities are managed, we must ensure that we don’t get caught up in the management “process” itself and instead focus on the immediate tasks or actions at hand.
Additional learning occurs with every change or transformation process. As such, I prefer to use an “agile” approach that offers flexibility to change or evolve our “means” or “methods” without compromising the goal we originally set out to achieve.
Practice proves theory every time and the real proof of wisdom is in the results. We wish you all the best of successes to achieve the goals that you may have set for yourself and your team in 2013.
The time between Christmas and New Year’s Eve is one of transition as we consider the events of the past year and prepare for the new year ahead. Experts are sure to present their annual summaries and will also attempt to “predict” what may be in store for us in the year to come. As lean leaders, we also recognize the necessity to make and take the time for introspection and hansei (reflection).
Lean is by definition a perpetual transition from the current state to an ideal future state as we understand it. As our culture and technologies evolve, we continue to open doors to more opportunities and perhaps an even greater potential than first imagined. As such, we seek to advance our understanding as we pursue our vision of lean and its scope of application.
Lean is often described as a journey. While the vision is clearly defined, the means for achieving it continue to evolve and, as we’ve stated many times before, “There’s always a better way and more than one solution.” From a lean perspective, the Plan-Do-Check-Act (PDCA) cycle challenges us to consider every change as a temporary state where each subsequent iteration ultimately brings us closer to realizing our vision.
Recognizing that we are in a continual state of transition should give us cause to embrace the ideology that the nature of change can only be viewed as a temporary condition. True resistance to change should only occur when the vision itself is compromised. Similarly, the absence of a clear vision is also cause for resistance. We contend that where the purpose or vision remains constant, the means or the methods of achieving it – incremental or disruptive – are more readily adopted.
The “Change Curve” presented in the diagram above clearly suggests that the commitment to change progresses from Leadership to Change Agents and finally to the End Users with each “group” requiring an increasing span of time to absorb and embrace the change accordingly. A potential for frustration and resistance to change occurs when the next iteration is introduced before the change that precedes it has been adopted and “experienced”. For this same reason and as suggested in our post, “Apple’s Best Kept Secrets … May Be Their Worst Enemy“, companies (including Apple) must be careful to manage the frequency at which change occurs to avoid frustrating employees and potential customers in the process.
The absence of change or lack of evidence that change is coming is and should be cause for concern. Research In Motion’s (RIM) continued delays in releasing the BlackBerry 10 (BB10) resulted in lost confidence from investors, and share prices dropped sharply in return. RIM’s attempts to “talk” through the company’s strategy and the future of the BlackBerry could not sustain their one-time dominance of the smartphone market. Thankfully for RIM, the BB10, slated to launch on January 30, 2013, is receiving rave reviews as a high-quality next-generation smartphone. Only time will tell if too much time has passed to win people over.
Lean leaders recognize that real change begins in the hearts and minds of every stakeholder and is a pre-requisite before any physical changes can occur. A learning organization embraces the concept of “transitional” thinking where each change represents the current level of knowledge and understanding. Where perpetual learning occurs, transitional thinking ensues, and subsequent changes mark our progress along the journey.
As we look forward to 2013, we thank you for your continued support and wish you the best of successes in the New Year ahead.
How is it that some leaders have a way to bring calm to crisis, chaos, and conflict, weeding out fact from fiction, and somehow setting the path straight for others to follow? The answer is quite simple, they have the tools and ability to make effective decisions efficiently.
I recognize that very few, if any, problems can truly be solved by searching for answers in a book. “The Decision Book” by Mikael Krogerus and Roman Tschappeler presents 50 models for strategic thinking where the objective is not to necessarily find the answers but to understand various models or methods that can be used to help discover them.
The models presented may be used to simplify problems or opportunities enabling you to make the best decisions possible. Deciding which model to use is simply a matter of reviewing the matrix presented on the inside covers of the book itself. The scope of application of each model is specifically targeted to one of four “How To” categories:
How to improve yourself
How to understand yourself better
How to understand others better
How to improve others
Concisely written, the models are presented in a manner that makes them immediately practical. Each model is typically presented in a single written page followed by an illustration to demonstrate how the model may be applied.
At 173 pages, “The Decision Book” is a quick read from cover to cover, however, it also makes for a perfect handbook as each model is unique unto itself. Where correlations between models exist, they are also indicated in the text.
The Decision Book is not all inclusive though it does present many of the best known models for strategic thinking and is certainly one to add to your library. Just remember that making a decision is only the first step. Execution is the key to making it a reality.
When Toyota arrived on the North American manufacturing scene, automakers were introduced to many of Toyota’s best practices including the Toyota Production System (TPS) and the well-known “Toyota Way”. Since that time, Toyota’s best practices have been introduced to numerous other industries and service providers with varying degrees of success.
In simple terms, Toyota’s elusive goal of single piece flow implicitly demands that parts be processed one piece at a time and only as required by the customer. The practice of batch processing was successfully challenged and proven to be inefficient as the practice inherently implies a certain degree of fragmentation of processes, higher inventories, longer lead times, and higher costs.
To the contrary, over-specialization can lead to excessive process fragmentation, evidenced by decreased efficiency, higher labour costs, and increased lead times. In other words, we must concern ourselves with ensuring that process tasks are optimized to the extent that maximum flow is achieved in the shortest amount of time.
An example of excessive specialization can be found in the healthcare system here in Ontario, Canada. Patients visit their family doctor only to be sent to a specialist who in turn prescribes a series of tests to be completed by yet another layer of “specialists”. To complicate matters even more, each of these specialized services is inconveniently separated geographically as well.
Excessive fragmentation can be found by conducting a thorough review of the entire process. The review must consider the time required to perform “real value added” tasks versus non-value added tasks as well as the time-lapse that may be incurred between tasks. Although individual “steps” may be performed efficiently and within seconds, minutes, or hours, having to wait several days, weeks, or even months between tasks clearly undermines the efficiency of the process as a whole.
In the case of healthcare, the time lapse between visits or “tasks” is borne by the patient and since the facilities are managed independently, wait times are inherently extended. Manufacturers suffer a similar fate where outside services are concerned. Localization of services is certainly worthy of consideration when attempting to reduce lead times and ultimately cost.
Computers use defragmentation software to “relocate” data in a manner that facilitates improved file storage and retrieval. If only we could “defrag” our processes in a similar way to improve our manufacturing and service industries. “Made In China” labels continue to appear on far too many items that could be manufactured here at home!
It seems that Lean Healthcare is getting a lot of exposure here as of late. I will qualify this by saying “in practice” rather than “name”. The Toronto Star published yet another article, Sunnybrook cuts wait for prostate diagnosis down to 72 hours, that once again demonstrates that improvements can be made if we put our minds to it.
The Need to Change
The need to change is premised on this excerpt from the article:
“But after the needle biopsy . . . it was like my future was hanging from a thread. It was hell.”
And later …
“Men have waited too long,” says Dr. Robert Nam, a Sunnybrook uro-oncologist who is spearheading the accelerated prostate protocol.
“They wait two to three weeks. And two to three weeks knowing that they could have a life-altering disease is something to me that is not acceptable.”
Why – Beyond Reducing Wait Time
Aside from the emotional strain, hidden from view or otherwise, cancers are always best treated when they are detected early:
While many prostate cancers are slow-growing – some are left completely alone — others are aggressive and benefit from immediate treatment.
“There is a big misconception that prostate cancer is such a slow-growing disease that we don’t need to rush into anything,” Nam says.
How did they do it?
The good news is that they already had a model to work from:
In a new program that mirrors one launched two years ago for rapid breast tumour diagnoses, Toronto’s Sunnybrook Health Sciences Centre has now pledged to give men the results of prostate cancer biopsies within three days.
They also procured new equipment and found efficiencies in the way that results were processed:
The diagnostic acceleration will be accomplished mainly by “finding efficiencies” among hospital pathologists who examine the biopsied tissues and determine the presence and severity of the ailment. Nam says any priority shift in the hospital’s pathology department – which expects no staff increase — will not mean other forms of cancer get shorter shrift.
Room to Improve
As mentioned earlier, Sunnybrook had a surrogate model to follow but there is still room to improve:
Men will still have to wait three times longer for their results than women, who are promised a breast cancer diagnosis within a day of being biopsied.
It’s NOT about the money!
I share this information on the premise that we are continually reminded, at least here in Ontario, that we simply don’t have the resources or the funds to improve health care. I become increasingly frustrated by the misconception of our government that we are already as efficient as we possibly can be.
“We made it cost neutral and . . . we did not jeopardize any other program within the pathology department,” he says.
I am thankful that Sunnybrook Hospital staff have demonstrated yet again that real opportunities for improvement can be made without incurring additional expense to the system.
It’s the Culture
The significance of the effort here is not just the idea itself but the culture that allows these ideas to flourish. Sunnybrook Hospital clearly supports improvements from within and outside the hospital and is also quite eager to share them as evidenced in our previous post, Lean – Sunnybrook Doctors Benefit from Gaming Technology.
I am currently reading “Toyota Under Fire” by Jeffrey K. Liker and Timothy N. Ogden where once again it is confirmed that Toyota’s culture is at the very core of its resilience and ability to adapt and change to meet the current crisis at hand. Clearly, the economic crisis we still find ourselves having to contend with is cause to pause and reflect on how we can indeed adapt and change to meet our everyday challenges in our personal lives, business, industry, and governments alike.
There is much to be learned and so much more to be gained. We must learn to watch and listen and at the very least acknowledge that there is always a better way.
In my article “Waste: The Devil is in the Details“, I discussed the importance of paying attention to the details. From a company or personal perspective, the underlying theme to identify waste (or opportunity) is to be continually cognizant of what it is we’re doing and asking “Why?”
I have continually stressed the importance of conducting process reviews right where the action is. It seems we’re not alone in this thinking and I thought it was quite fitting to share an e-mail I received from John Shook:
Decompressing now from last week’s Lean Transformation Summit in Dallas, there is much to reflect upon. We heard from four companies and experienced six learning sessions to explore the frontiers and fundamentals of lean transformation. And it is always exciting to get together with 440 like-minded, lean-thinking individuals.
Apologies again to the many of you who weren’t able to attend since the event sold out so early. You should know, however, we do not plan to expand the size of the event in the future. We want to continue to limit it to a relatively intimate size to enable and encourage interaction, dialogue, debate, networking, and casual socializing.
I do have good news for those of you who missed the event. One highlight was the debut of Jim Womack’s new book, Gemba Walks, which is now available to you.
Many have asked what Jim has been up to since stepping down as CEO of LEI. The answer is that Jim has remained as busy as ever and, what’s more, now his letters are back, in different form. In Gemba Walks, Jim compiles many of his eLetters, written between 2001 and 2011. Gemba Walks is more than a mere compilation, however, with some new content and new commentary for each letter, edited and grouped by topic. As a reader, I can tell you that the experience of reading the letters in this new context is surprising, refreshing, enlightening, and, well, fun. It’s always an enjoyable romp to join Jim on a walk through a gemba and Gemba Walks provides the next best thing to being there.
These three principles of lean leadership are well-known: Go see, ask why, and show respect. You know that to “go see” is fundamental to all lean thinking and acting. But, what does that actually mean? How do we go see?
Gemba Walks reveals how Jim’s thinking has evolved as a result of observing what happens as lean has taken root in companies around the world over time. New successes lead inevitably to new, and better, problems for lean practitioners. This book documents how companies are continuing to press forward.
In my foreword, I recall the first time I had a chance to visit a gemba with Jim, when I was still a Toyota employee:
“The first time I walked a gemba with Jim was on the plant floor of a Toyota supplier. Jim was already famous as the lead author of The Machine That Changed the World; I was the senior American manager at the Toyota Supplier Support Center. My Toyota colleagues and I were a bit nervous about showing our early efforts of implementing TPS at North American companies to “Dr. James P. Womack.” We had no idea of what to expect from this famous academic researcher.
“My boss was one of Toyota’s top TPS experts, Mr. Hajime Ohba. We rented a small airplane for the week so we could make the most of our time, walking the gemba of as many worksites as possible. As we entered the first supplier, walking through the shipping area, Mr. Ohba and I were taken aback as Jim immediately observed a work action that spurred a probing question. The supplier was producing components for several Toyota factories. They were preparing to ship the exact same component to two different destinations. Jim immediately noticed something curious. Furrowing his brow while confirming that the component in question was indeed exactly the same in each container, Jim asked why parts headed to Ontario were packed in small returnable containers, yet the same components to be shipped to California were in a large corrugated box. This was not the type of observation we expected of an academic visitor in 1993.
“Container size and configuration was the kind of simple (and seemingly trivial) matter that usually eluded scrutiny, but that could in reality cause unintended and highly unwanted consequences. It was exactly the kind of detail that we were encouraging our suppliers to focus on. In fact, at this supplier in particular, the different container configurations had recently been highlighted as a problem. And, in this case, the fault of the problem was not with the supplier but with the customer – Toyota! Different requirements from different worksites caused the supplier to pack off the production line in varying quantities (causing unnecessary variations in production runs), to prepare and hold varying packaging materials (costing money and floor space), and ultimately resulted in fluctuations in shipping and, therefore, production requirements. The trivial matter wasn’t as trivial as it seemed.
“We had not been on the floor two minutes when Jim raised this question. Most visitors would have been focused on the product, the technology, the scale of the operation, etc. Ohba-san looked at me and smiled, to say, ‘This might be fun.'” (Click here for a free pdf of the complete foreword.)
Fun it has been. Challenging it has been, too, but always full of learning. Fun and challenging learning it will no doubt continue to be.
I am often asked what book to recommend to start someone down the lean path. From now on, Gemba Walks will be that book. With an overview of tools and theory told through stories and explorations of real events, Gemba Walks invites readers to tackle problems on an immediate and personal level. In so doing, it gives courage for beginners to get started. And for veterans to keep going.
Chairman and CEO
Lean Enterprise Institute, Inc.
Again it is worth noting the attention to detail. I recall a number of occasions where I have challenged customers to address operational differences between facilities (not much different from the situation above). I can say that Toyota was one of the few companies that listened and actually did something about it.
I recognize that benchmarking is not a new concept. In business, we have learned to appreciate the value of benchmarking at the “macro level” through our deliberate attempts to establish a relative measure of performance, improvement, and even for competitor analysis. Advertisers often use benchmarking as an integral component of their marketing strategy.
The discussion that follows will focus on the significance of benchmarking at the “micro level” – the application of benchmarking in our everyday decision processes. In this context, “micro benchmarking” is a skill that we all possess and often take for granted – it is second nature to us. I would even go so far as to suggest that some decisions are autonomous.
With this in mind, I intend to take a slightly different, although general, approach to introduce the concept of “micro benchmarking”. I also contend that “micro benchmarking” can be used to introduce a new level of accountability to your organization.
Human Resources – The Art of Deception: Interviews and Border Crossings
Micro benchmarking can literally occur “in the moment.” The interview process is one example where “micro benchmarking” frequently occurs. I recently read an article titled, “Reading people: Signs border guards look for to spot deception“, and made particular note of the following advice to border crossing agents (emphasis added):
Find out about the person and establish their base-line behavior by asking about their commute in, their travel interests, etc. Note their body language during this stage as it is their norm against which all ensuing body language will be compared.
The interview process, whether for a job or crossing the border, represents one example where major (even life changing) decisions are made on the basis of very limited information. As suggested in the article, one of the criteria is “relative change in behavior” from the norm established at the first greeting. Although the person conducting a job interview may have more than just “body language” to work with, one of the objectives of the interview is to discern the truth – facts from fiction.
Obviously, the decision to permit entry into the country, or to hire someone, may have dire consequences, not only for the applicant, but also for you, your company, and even the country. Our ability to benchmark at the micro level may be one of the more significant discriminating factors whereby our decisions are formulated.
Decisions – For Better or Worse
Every decision we make in our lives is accompanied by some form of benchmarking. While this statement may seem to be an over-generalization, let’s consider how decisions are actually made. It is a common practice to “weigh our options” before making the final decision. I suggest that every decision we make is rooted against some form of benchmarking exercise. The decision process itself considers available inputs and potential outcomes (consequences):
Better – Worse
Pros – Cons
Advantages – Disadvantages
Life – Death
Success – Failure
Safe – Risk
Decisions are usually intended to yield the best of all possible outcomes and, as suggested by the very short list above, they are based on “relative advantage” or “consequential” thinking processes. At the heart of each of these decisions is a base line reference or “benchmark” whereby a good or presumably “correct” decision can be made.
We have been conditioned to believe (religion / teachings) and think (parents / education / social media / music) certain thoughts. These “belief systems” or perceived “truths” serve as filters, in essence forming the base line or “benchmark” by which our thoughts, and hence our decisions, are processed. Every word we read or hear is filtered against these “micro level” benchmarks.
I recognize that many other influences and factors exist but, suffice it to say, they are still based on a relative benchmark. Unpopular decisions are just one example where social influences are heavily considered and weighed. How many times have we heard, “The best decisions are not always popular ones.” Politicians are known to make the tough and not so popular decisions early on in their term and rely on a waning public memory as the next election approaches – time heals all wounds but the scars remain.
Decisions – Measuring Outcomes
As alluded to in the last paragraph, our decision process may be biased as we consider the potential “reactions” or responses that may result. Politics is rife with “poll” data that somehow sway the decisions that are made. In a similar manner, substantially fewer issues of value are resolved in an election year for fear of a negative voter response.
In essence, there are two primary outcomes to every decision: Reactions and Results. The results of a decision are self-explanatory but may be classified as summarized below.
If you are still with me, I suggest that at least two levels of accountability exist:
The process used to arrive at the decision
The results of the decision
In corporations, large and small, executives are often held to account for worse than expected (negative) performance, where results are the primary – and seemingly only – focus of discussion. I contend that positive results that exceed expectations should be subject to the same, if not higher, level of scrutiny.
Better and worse than expected results are both indicative of a lack of understanding or full comprehension of the process or system and as such present an opportunity for greater learning. Predicting outcomes or results is a fundamental requirement and best practice where accountability is an inherent characteristic of company culture.
Toyota is renowned for continually deferring to the most basic measurement model: Planned versus Actual. Although positive (better than expected) results are more readily accepted than negative (worse than expected) results, both impact the business:
Better than expected:
Other potential investments may have been deferred based on the planned return on investment.
Financial statements are understated, which affects other business aspects and transactions.
Decision model / process does not fully describe / consider all aspects to formulate planned / predictable results
Decision process to yield actual results cannot be duplicated unless lessons learned are pursued, understood, and the model is updated.
Worse than expected:
Poor / lower than expected return on investment
Extended financial obligations
Negative impact to cash flow / available cash
Lower stakeholder confidence for future investments
Decision model / process does not fully describe / consider all aspects to formulate planned / predictable results
Decision process will be duplicated unless lessons learned are pursued, understood, and the model is updated.
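The planned-versus-actual discipline described above can be sketched in a few lines of code. This is my own illustrative sketch, not a Toyota method: the tolerance, function name, and messages are assumptions. The point it captures is that deviations in either direction trigger a review of the decision model.

```python
# A minimal "planned versus actual" check. Both better- and
# worse-than-expected results are flagged for review, since either
# gap signals an incomplete decision model.
def review_variance(planned: float, actual: float, tolerance: float = 0.05) -> str:
    """Flag any result that deviates from plan by more than the tolerance."""
    if planned == 0:
        raise ValueError("planned value must be non-zero")
    gap = (actual - planned) / planned
    if gap > tolerance:
        return "better than expected: review the model"
    if gap < -tolerance:
        return "worse than expected: review the model"
    return "within plan"

print(review_variance(planned=100, actual=130))  # better than expected: review the model
print(review_variance(planned=100, actual=98))   # within plan
```

Note that a result 30% above plan gets exactly the same scrutiny as one 30% below plan, which is the discipline the lists above argue for.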
The second level of accountability and perhaps the most important concerns the process or decision model used to arrive at the decision. In either case we want to discern between informed decisions, “educated guesses”, “wishful thinking”, or willful neglect. We can see that individual and system / process level accountabilities exist.
The ultimate objective is to understand “what we were thinking” so we can repeat our successes without repeating our mistakes. This seems to be a reasonable expectation and is a best practice for learning organizations.
Some companies are very quick to assign “blame” to individuals regardless of the reason for failure. These situations can become very volatile and once again are best exemplified in the realm of politics. There tends to be more leniency for individuals where policies or protocols have been followed. If the system is broken, it is difficult to hold individuals to account.
The Accountability Solution – Show Your Work!
So, who is accountable? Before you answer that, consider a person who used a decision model and the results were worse than the model predicted. From a system point of view the person followed standard company protocol. Now consider a person who did not use the model, knowing it was flawed, and the results were better than expected. Both “failures” have their root in the same fundamental decision model.
The accountabilities introduced here however are somewhat different. The person following protocol has a traceable failure path. In the latter case, the person introduced a new “untraceable” method – unless of course the person noted and advised of the flawed model before and not after the fact.
Toyota is one of the few companies I have worked with where documentation and attention to detail are paramount. As another example, standardized work is not intended to serve as a rigid set of instructions that can never be changed. To the contrary, changes are permissible, however, the current state is the benchmark by which future performance is measured and proven. The documentation serves as a tangible record to account for any changes made, for better or worse.
Throughout high school and college, we were always encouraged to “show our work”. Some courses offered partial marks for the method, even though the final answer may have been wrong. The opportunities for learning here, however, are greater than simply determining the student’s comprehension of the subject material. Beyond that, it also offers an opportunity for the teacher to understand why the student failed to comprehend the subject matter and to determine whether the method used to teach the material could be improved.
Showing the work also demonstrates where the process breakdown occurred. A wrong answer could have been due to a complete misunderstanding of the material or the result of a simple mis-entry on a calculator. Why and how we make our decisions is just as important as the decisions themselves to understanding our expectations.
While the latter situations may be more typical of a macro level benchmark, I suggest that similar checks and balances occur even at the micro level. As mentioned in the premise, some decisions may even be autonomous (snap decisions). Examples of these decisions are public statements that all too often require an apology after the fact. The sentiments for doing so usually include, “I’m sorry, I didn’t know what I was thinking.” I am always amazed to learn that, sometimes, we fail to keep even ourselves informed of what we’re thinking.
Admittedly, it has been a while since I checked a shampoo bottle for directions, however, I do recall a time in my life reading: Lather, Rinse, Repeat. Curiously, they don’t say when or how many times the process needs to be repeated.
Perhaps someone can educate me as to why it is necessary to repeat the process at all – other than “daily”. I also note that this is the only domestic “washing” process that requires repeating the exact same steps. Hands, bodies, dishes, cars, laundry, floors, and even pets are typically washed only once per occasion.
The intent of this post is not to debate the effectiveness of shampoo or to determine whether this is just a marketing scheme to sell more product. The point of the example is this: simply following the process as defined is, in my opinion, inherently wasteful of product, water, and time – literally, money down the drain.
Some shampoo companies may have changed the final step in the process to “repeat as necessary” but that still presents a degree of uncertainty and assures that exceptions to the new standard process of “Lather, Rinse, and Repeat as Necessary” are likely to occur.
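Programmers will recognize the flaw immediately: taken literally, “Lather, Rinse, Repeat” is a loop with no exit condition. A minimal Python sketch of the revised “repeat as necessary” process (the `lather`, `rinse`, and `is_clean` functions are hypothetical stand-ins, and the assumption that one wash is enough is purely for illustration):

```python
def lather():
    pass  # stand-in for the actual step

def rinse():
    pass  # stand-in for the actual step

state = {"washes": 0}

def is_clean():
    # Illustrative assumption: hair is clean after a single wash.
    return state["washes"] >= 1

def wash_hair_as_necessary(max_washes=3):
    """'Repeat as necessary' gives the loop an explicit exit condition.

    Taken literally, 'Lather, Rinse, Repeat' would instead be
    'while True: lather(); rinse()' -- a loop that never terminates.
    """
    while not is_clean() and state["washes"] < max_washes:
        lather()
        rinse()
        state["washes"] += 1
    return state["washes"]

print(wash_hair_as_necessary())  # 1 wash was "necessary"
```

The `max_washes` guard is the kind of explicit bound a well-defined standard process would specify, rather than leaving termination to the judgment of each person following it.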
In the spirit of continuous improvement, new 2-in-1 and even 3-in-1 products are available on the market today that serve as the complete “shower solution” in one bottle. As these are also my products of choice, I can advise that these products do not include directions for use.
Scratching the Surface
As lean practitioners, we need to position ourselves to think outside of the box and challenge the status quo. This includes the manner in which processes and tasks are executed. In other words, we not only need to assess what is happening, we also need to understand why and how.
One of the reasons I am concerned with process audits is that conformance to the prescribed systems, procedures, or “Standard Work” somehow suggests that operations are efficient and effective. In my opinion, nothing could be further from the truth.
To compound matters, in cases where non-conformances are identified, oftentimes the team is too eager to fix (“patch”) the immediate process without considering the implications to the system as a whole. I present an example of this in the next section.
The only hint of encouragement that satisfactory audits offer is this: “People will perform the tasks as directed by the standard work – whether it is correct or not.” Of course this assumes that procedures were based on people performing the work as designed or intended as opposed to documenting existing habits and behaviors to assure conformance.
Examining current systems and procedures at the process level only serves to scratch the surface. First-hand process reviews are an absolute necessity to identify opportunities for improvement and must consider the system or process as a whole, as you will see in the following example.
Manufacturing – Another Example
On one occasion, I was facilitating a preparatory “process walk” with the management team of a parts manufacturer. As we visited each step of the process, we observed the team members while they worked and listened intently as they described what they do.
As we were nearing the end of the walk through, I noted that one of the last process steps was “Certification”, where parts are subject to 100% inspection and rework / repair as required. After being certified, the parts were placed into a container marked “100% Certified” then sent to the warehouse – ready for shipping to the customer.
When I asked about the certification process, I was advised that: “We’ve always had problems with these parts and, whenever the customer complained, we had to certify them all 100% … ‘technical debate and more process intensive discussions followed here’ … so we moved the inspection into the line to make sure everything was good before it went in the box.”
Sadly, when I asked how long they’d been running like this, the answer was no different from the ones I’ve heard so many times before: “Years”. So, because of past customer problems and the failure to identify true root causes and implement permanent corrective actions to resolve the issues, this manufacturer decided to absorb the “waste” into the “normal” production process and make it an integral part of the “standard operating procedure.”
To be clear, just when you thought I picked an easy one, the real problem is not the certification process. To the contrary, the real problem is in the “… ‘technical debate and more process intensive discussions followed here’ …” portion of the response. Simply asking about the certification requirement was scratching the surface. We need to …
Get Below the Surface
I have always said that the quality of a product is only as good as the process that makes it. So, as expected, the process is usually where we find the real opportunities to improve. From the manufacturing example above, we clearly had a bigger problem to contend with than simply “sorting and certifying” parts. On a broader scale, the problems I personally faced were two-fold:
The actual manufacturing processes with their inherent quality issues and,
The Team’s seemingly firm stance that the processes couldn’t be improved.
After some discussion and more debate, we agreed to develop a process improvement strategy. Working with the team, we created a detailed process flow and Value Stream Map of the current process. We then developed a Value Stream Map of the Ideal State process. Although we did identify other opportunities to improve, it is important to note that the ideal state did not include “certification”.
I worked with the team to facilitate a series of problem solving workshops where we identified and confirmed root causes, conducted experiments, performed statistical analyses, developed / verified solutions, implemented permanent corrective actions, completed detailed process reviews and conducted time studies. Over the course of 6 months, progressive / incremental process improvements were made and ultimately the “certification” step was eliminated from the process.
We continued to review and improve other aspects of the process, supporting systems, and infrastructure as well, including, but not limited to: materials planning and logistics, purchasing, scheduling, inventory controls, part storage, preventive maintenance, and redefined and refined process controls, all supported by documented work instructions as required. We also evaluated key performance indicators. Some were eliminated while new ones, such as Overall Equipment Effectiveness, were introduced.
Some of the tooling changes to achieve the planned / desired results were extensive. One new tool was required while major and minor changes were required on others. The real tangible cost savings were very significant and offset the investment / expense many times over. In this case, we were fortunate that new jobs being launched at the plant could absorb the displaced labor resulting from the improvements made.
Every aspect of the process demonstrated improved performance and ultimately increased throughput. The final proof of success was also reflected on the bottom line. In time, other key performance indicators reflected major improvements as well, including quality (low single digit defective parts per million, significantly reduced scrap and rework), increased Overall Equipment Effectiveness (Availability, Performance, and Quality), increased inventory turns, improved delivery performance (100% on time – in full), reduced overtime, and more importantly – improved morale.
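Overall Equipment Effectiveness, as referenced above, is conventionally calculated as the product of its three factors: Availability, Performance, and Quality. A minimal Python sketch (the figures are illustrative only, not from the case described):

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness = Availability x Performance x Quality.

    Each factor is expressed as a fraction between 0 and 1:
      - availability: run time / planned production time
      - performance:  actual output rate / ideal output rate
      - quality:      good parts / total parts produced
    """
    return availability * performance * quality

# Illustrative figures: 90% uptime, running at 95% of the ideal
# cycle time, with 99% of parts good on the first pass.
print(round(oee(0.90, 0.95, 0.99), 3))  # 0.846, i.e. roughly 85% OEE
```

Because the three factors multiply, even individually respectable numbers compound into a noticeably lower overall figure, which is one reason OEE is a useful summary indicator of where capacity is being lost.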
I have managed many successful turnarounds in manufacturing over the course of my career and, although the problems we face are often unique, the challenge remains the same: to continually improve throughput by eliminating non-value added waste. Of course, none of this is possible without the support of senior management and full cooperation of the team.
While it is great to see plants that are clean and organized, be forewarned that looks can be deceiving. What we perceive may be far from efficient or effective. In the end, the proof of wisdom is in the result.
I recently published Urgent -> The Cost of Things Gone Wrong, where I expressed concern about dashboards that attempt to do too much. In this regard, they become more of a distraction than a tool serving the intended purpose of helping you manage your business or processes. To be fair, there are at least two (2) levels of data management that are perhaps best differentiated by where and how they are used: Scorecards and Dashboards.
I prefer to think of Dashboards as working with Dynamic Data: data that changes in real time and influences our behaviors, much as the dashboard in our cars communicates with us as we are driving. The fuel gauge, odometer, two trip meters, tachometer, speedometer, digital fuel consumption (L/100 km), and km remaining are just a few examples of the instrumentation available to me in my Mazda 3.
While I appreciate the extra instrumentation, the two that matter first and foremost are the speedometer and the tachometer (since I have a 5 speed manual transmission). The other bells and whistles do serve a purpose but they don’t necessarily cause me to change my driving behavior. Of note here is that all of the gauges are dynamic – reporting data in real time – while I’m driving.
A Scorecard, on the other hand, is a periodic view of summary data and, from our example, may include Average Fuel Consumption, Average Speed, Maximum Speed, Average Trip, Maximum Trip, Total Miles Traveled, and so on. The scorecard may also include other items, such as driving record and vehicle performance data: Parking Tickets, Speeding Tickets, Oil Changes, Flat Tires, and Emergency and Preventive Maintenance.
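The distinction can be made concrete in a few lines of Python: the dashboard reads each trip as it happens, while the scorecard summarizes the accumulated history after the fact. The trip records and metric names below are hypothetical, chosen only to mirror the driving example:

```python
# Hypothetical per-trip records -- the "dynamic data" a dashboard would
# display one trip at a time as it occurs.
trips = [
    {"km": 12.5, "litres": 0.9, "avg_speed": 48.0},
    {"km": 80.0, "litres": 5.2, "avg_speed": 95.0},
    {"km": 5.2,  "litres": 0.5, "avg_speed": 30.0},
]

def scorecard(trips):
    """Periodic summary view of the accumulated trip data."""
    total_km = sum(t["km"] for t in trips)
    total_litres = sum(t["litres"] for t in trips)
    return {
        "total_km": round(total_km, 1),
        "avg_consumption_l_per_100km": round(100 * total_litres / total_km, 2),
        "avg_speed": round(sum(t["avg_speed"] for t in trips) / len(trips), 1),
        "max_trip_km": max(t["km"] for t in trips),
    }
```

The same underlying data feeds both views; what differs is the timing and the level of aggregation, which is exactly why the two serve different management purposes.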
Take some time to review your current metrics. What metrics are truly influencing your behaviors and actions? How are you using your metrics to manage your business? Are you reacting to trends or setting them?
It’s been said that, “What gets measured gets managed.” I would add – “to a point.” It simply isn’t practical or even feasible to measure everything. I say, “Measure to manage what matters most”.
Remember to get your free Excel Templates for OEE by visiting our downloads page or the orange widget in the sidebar. You can follow us on Twitter as well @Versalytics.