User talk:Kolesmic

From Clinfowiki

Welcome to Clinfowiki! We hope you will contribute much and well. You will probably want to read the help pages. Again, welcome and have fun! Vmohan (talk) 00:40, 1 May 2019 (UTC)

Crew/Cockpit Resource Management (CRM)


This terminology and concept come from the aviation industry, where they were developed as safety measures after a number of accidents attributed to the hierarchical culture of the cockpit. That culture includes the Halo Effect: the belief that senior leaders are significantly less capable of making an error, together with an expectation of individual perfection from the person in charge. The hierarchical culture intimidates subordinate crew members and discourages them from speaking up when a less senior crew member detects a time-critical problem. Cockpit Resource Management is a set of measures developed to minimize the Halo Effect and enable junior crew members to speak up.

Historical background

Two aviation incidents prompted the implementation of CRM. The first was a collision of two Boeing 747s on March 27, 1977, at Los Rodeos Airport under very limited visibility conditions. One airplane belonged to Pan American and the other to KLM. The Pan Am Boeing was blocking the runway, and the KLM Boeing initiated takeoff without being officially cleared by the tower. The captain made the decision to take off in part because the crew members were unable to speak up and report that the plane had not been cleared for takeoff by air traffic control. The accident killed 583 people and remains the deadliest in aviation history.

The second accident happened on December 28, 1978, near Portland, Oregon. United Flight 173 was approaching PDX when the crew noticed that the indicator light for the landing-gear-down position did not come on. The captain took the airplane into a holding pattern and attempted to troubleshoot the malfunction. While the crew focused on the landing gear, the airplane circled Portland for over an hour, ran out of fuel, and crash-landed in the suburbs, killing two crew members and eight passengers. The captain had ignored multiple warnings from crew members about the dwindling fuel and noticed the problem only when the engines began flaming out. After investigating the accident, NTSB Air Safety investigator Dr. Alan Diehl, an aviation psychologist, recommended ensuring that flight crew members are properly trained in the principles of CRM.

According to some researchers, 70–80% of aviation accidents prior to the 1980s were attributed to human factors (5), which by definition makes them a preventable type of accident. After the implementation of CRM, the number of aviation accidents related to human factors declined sharply, falling to virtually zero. In his book Why Hospitals Should Fly, John Nance offers the following perspective: “During the same five-year period in which passenger deaths aboard major U.S. airlines hit a total of zero (2001 to 2006), American hospitals killed an estimated 250,000 to 500,000 patients with medical mistakes. That’s the equivalent of crashing approximately 1,400 fully loaded Boeing 747’s with no survivors!” (1)

Why Hospitals Should Fly

After several years of practice and a number of prevented aviation accidents, Cockpit Resource Management was brought to the attention of the medical community by airline pilot John J. Nance, who found striking similarities between hospital culture and aviation culture prior to the implementation of CRM. In 2008, Mr. Nance published the book Why Hospitals Should Fly. One of the questions he asks is, “How can it be that in 2008, a checked bag on an airline flight is still exponentially safer than a patient in an American hospital?” (1) Mr. Nance defines medical mistakes as “human mistakes, which were done within a system inadequately designed to catch and neutralize them.” (1) He believes that by learning from the aviation industry and redesigning the healthcare system, it is possible to achieve in medicine results similar to those achieved in aviation.

Normalizing the error

One of the main principles of CRM is normalizing the error. Prior to CRM, the focus was on preventing the error; CRM shifts the emphasis from “preventing” to “catching” it. In his book, Mr. Nance makes a special effort to show that even the brightest physicians are capable of making life-threatening errors, but a hospital culture designed to anticipate such errors helps catch them before a patient is injured.

Collegial Interactive Team (CIT)

According to CRM, one way to catch and correct errors is to create Collegial Interactive Teams (CITs). This is a cultural intervention that reduces communication barriers so that junior members of the team can approach senior leaders and express their ideas, suggestions, opinions, and even diagnoses without being perceived as challenging the leader’s authority or professional abilities.

Overcoming Barriers in Communications

In his book, Mr. Nance states that problems in physician–nurse communication arise from an absence of collegiality, which contributes to adverse outcomes. One suggested way to remove barriers is to carry on conversations on a first-name basis. This allows nurses to speak more freely and creates a collaborative atmosphere in the team, with equal respect and equal power. The CRM motto directly related to removing communication barriers is “see it, say it, fix it.”

Closed-loop communications

Closed-loop communication is a major component of CRM. It is recommended when entering provider orders into an electronic health record (EHR) system: the entering practitioner (e.g., provider, nurse, pharmacist, nursing assistant) reads back the provider’s orders before releasing them in the EHR, allowing the whole team to verify the order and catch errors. Another application of closed-loop communication is the surgical “time-out” pause, during which crucial patient information, such as name, diagnosis, surgical site, and type of procedure, is communicated to the whole team before the scalpel is applied to the patient.
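The read-back step described above can be sketched as a simple verification check. This is only an illustration of the closed-loop idea; the function and field names are hypothetical and do not correspond to any real EHR API.

```python
# Minimal sketch of closed-loop order verification (hypothetical names,
# not a real EHR interface): the entering practitioner's read-back must
# match the provider's original order before the order is released.

def close_the_loop(provider_order: str, read_back: str) -> bool:
    """Return True (safe to release) only if the read-back matches the order."""
    # Normalize case and whitespace so trivial differences do not block release;
    # any substantive difference (e.g., a changed dose) fails the check.
    def normalize(s: str) -> str:
        return " ".join(s.lower().split())
    return normalize(provider_order) == normalize(read_back)

order = "Metoprolol 25 mg PO twice daily"
print(close_the_loop(order, "metoprolol  25 mg PO twice daily"))   # match: release
print(close_the_loop(order, "Metoprolol 50 mg PO twice daily"))    # dose differs: hold
```

The point of the loop is that release is gated on the sender confirming the receiver heard the same order, not on the receiver's silence.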

CRM recommends a prescribed script for critical communications, which consists of the following elements:

1. Opening statement or attention getter: because too many communications were attempted indirectly or by simply stating the problem aloud, the CRM model requires starting the communication by directly addressing the individual, preferably by first name, such as “Hey Mike” or “Doctor Smith.”

2. Give background information: for example, “Ms. Johnson is a 70-year-old female with a history of coronary artery disease who is coming in for a total knee replacement today. She reports increasing signs of angina over the last two months, now triggered by minimal physical activity.”

3. State your concern: this portion is for the person to say exactly what they see as the problem and should start with the words “I am concerned...” For example: “I am concerned about a high risk of intraoperative complications, such as an MI.”

4. State a solution: in this portion the person is encouraged to propose a solution to the stated problem: “I would like to order a cardiology consult or a stress test before proceeding with the surgery.”

5. Obtain agreement (or buy-in): “Would that be OK with you?” CRM training usually includes acting on the recommendation if the person has asked more than twice and received no answer because the rest of the crew is preoccupied with other critical issues.
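The five elements above can be represented as one structured message, which makes it easy to check that no element was skipped. All class, field, and function names here are illustrative assumptions, not terminology from the CRM literature.

```python
# Sketch of the five-element CRM critical-communication script as a
# structured message (names are illustrative, not from the CRM literature).
from dataclasses import dataclass

@dataclass
class CriticalMessage:
    attention_getter: str  # 1. direct address, preferably by first name
    background: str        # 2. relevant clinical context
    concern: str           # 3. should begin with "I am concerned"
    solution: str          # 4. proposed fix for the stated problem
    buy_in: str            # 5. explicit request for agreement

    def render(self) -> str:
        """Assemble the five elements in the prescribed order."""
        return " ".join([self.attention_getter, self.background,
                         self.concern, self.solution, self.buy_in])

msg = CriticalMessage(
    attention_getter="Doctor Smith,",
    background=("Ms. Johnson is a 70-year-old female with coronary artery "
                "disease here for a total knee replacement today."),
    concern=("I am concerned about a high risk of intraoperative "
             "complications, such as an MI."),
    solution=("I would like to order a cardiology consult or a stress test "
              "before proceeding with the surgery."),
    buy_in="Would that be OK with you?",
)
print(msg.render())
```

Keeping the concern and the proposed solution as separate required fields mirrors the training point: stating a problem aloud is not enough without a direct address up front and an explicit request for agreement at the end.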


References

1. Nance, John J. Why Hospitals Should Fly: The Ultimate Flight Plan to Patient Safety and Quality Care. Second River Healthcare. Kindle Edition.

2. Human Factors Report on the Tenerife Accident. Air Line Pilots Association of the United States (archive).

3. Aviation Safety Network entry for United Airlines Flight 173.

4. “Portland airliner crash in 1978 killed 10, but changed the way crews are trained.” The Oregonian.

5. Shappell, S., and Wiegmann, D. HFACS Analysis of Military and Civilian Aviation Accidents: A North American Comparison. Special report, Civil Aviation Medical Institute and University of Illinois at Urbana-Champaign, USA.

Submitted by Michael Kolesnikov, NP