Tuesday, March 29, 2016

Lessons from Sam Elfassy, Senior Director, Corporate Safety, Air Canada.



This past summer I was fortunate to be invited to participate in the North York Medical Center Medical Staff Leadership Conference. The general theme was risk: to the patient, the team, and the institution. The first talk covered Canadian healthcare privacy and security concerns in the global electronic healthcare era. The next was an excellent talk by the chief of aviation safety at Air Canada on integrating their Safety Management System across many high-risk environments, including healthcare. That was followed by a discussion of several methods for being proactive about team effectiveness, and then a review of legal risks to the institution and to Canadian healthcare, including the lack of jobs for graduating residents and the aging physician population. I then presented management strategies for crises at the bedside, ward, hospital, or institutional level. The final discussion tied everything together through an institutional risk management model assessing the highest risks the entire institution faces. As I said, this leadership conference covered the full spectrum of risk a hospital may face and could certainly serve as a model for others.


During a break in the meeting, I had the pleasure of talking with Samuel Elfassy, Senior Director, Corporate Safety, Environment & Quality at Air Canada.
Two interesting topics were pilot load shedding and systems issues that creep up on you.
Load shedding: Like physicians, pilots learn to offload data that seems extraneous in order to focus on what appears most important (like flying the plane), especially when a crisis develops. This can be good or bad, depending on which data they choose to ignore. As with a surgeon, ignoring the wrong piece of input can result in disaster.
When the team as a whole has a narrow field of view, we can miss important signs of impending doom, so we ought to learn to work together to ensure that, while the operator cannot see everything, someone is making sure we are not missing a bad indication.
Subtle systems issues: We discussed what can happen to an aircraft when jackscrew maintenance routines grow lax under FAA-approved extensions of inspection intervals. As the NTSB report on the 2000 crash of Alaska Airlines Flight 261 into the Pacific Ocean revealed, just because nothing has gone wrong does not necessarily mean that a policy that has been in place for good reason should be changed.


Kenneth A. Lipshy, MD, FACS
www.crisismanagementleadership.com
