Thursday, March 14, 2019

HUMANS TRAPPED BY AUTOMATION


Our minds adapt to our surroundings fairly quickly. Imagine my surprise, then, when the Asiana jumbo jet came crashing down on San Francisco's Runway 28L in pieces, strewing metal and bodies across the runway. It was a shock of disbelief. In this day and age, when autothrottle and autoland systems in airliners are built with redundancy, why would this happen?

The answer was quite simple, horrific as it was: the pilots had relied on automation to the point of dependency. For most of their airline careers they had been managers of systems, and given the redundancy of the equipment and its stellar history, the corporate offices figured it better to let the aircraft do its magic without interference from the pilot. Sure enough, the Instrument Landing System for Runway 28L was out for maintenance, the pilots were instructed to fly a straight-in visual approach to the runway, and that approach proved to be more than they could manage.

The excerpts below on Asiana flight 214 are from the NTSB: https://youtu.be/zTXDalv7kNQ

“The flight was vectored for a visual approach to runway 28L and intercepted the final approach course about 14 nautical miles (nm) from the threshold at an altitude slightly above the desired 3° glidepath. This set the flight crew up for a straight-in visual approach; however, after the flight crew accepted an air traffic control instruction to maintain 180 knots to 5 nm from the runway, the flight crew mismanaged the airplane’s descent, which resulted in the airplane being well above the desired 3° glidepath when it reached the 5 nm point. The flight crew’s difficulty in managing the airplane’s descent continued as the approach continued. In an attempt to increase the airplane’s descent rate and capture the desired glidepath, the pilot flying (PF) selected an autopilot (A/P) mode (flight level change speed [FLCH SPD]) that instead resulted in the autoflight system initiating a climb because the airplane was below the selected altitude. The PF disconnected the A/P and moved the thrust levers to idle, which caused the autothrottle (A/T) to change to the HOLD mode, a mode in which the A/T does not control airspeed. The PF then pitched the airplane down and increased the descent rate. Neither the PF, the pilot monitoring (PM), nor the observer noted the change in A/T mode to HOLD.

As the airplane reached 500 ft above airport elevation, the point at which Asiana’s procedures dictated that the approach must be stabilized, the precision approach path indicator (PAPI) would have shown the flight crew that the airplane was slightly above the desired glidepath. Also, the airspeed, which had been decreasing rapidly, had just reached the proper approach speed of 137 knots. However, the thrust levers were still at idle, and the descent rate was about 1,200 ft per minute, well above the descent rate of about 700 fpm needed to maintain the desired glidepath; these were two indications that the approach was not stabilized. Based on these two indications, the flight crew should have determined that the approach was unstabilized and initiated a go-around, but they did not do so. As the approach continued, it became increasingly unstabilized as the airplane descended below the desired glidepath; the PAPI displayed three and then four red lights, indicating the continuing descent below the glidepath. The decreasing trend in airspeed continued, and about 200 ft, the flight crew became aware of the low airspeed and low path conditions but did not initiate a go-around until the airplane was below 100 ft, at which point the airplane did not have the performance capability to accomplish a go-around. The flight crew’s insufficient monitoring of airspeed indications during the approach resulted from expectancy, increased workload, fatigue, and automation reliance.”
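The 500 ft stabilization gate the NTSB describes is, at heart, a short checklist of numbers: airspeed near the target, descent rate near the nominal value for the glidepath, thrust not at idle. Here is a minimal sketch of such a gate, assuming illustrative thresholds drawn from the excerpt above; it is not any airline's actual stabilized-approach criteria, and the names and tolerances are my own.

```python
# Hypothetical sketch of a 500 ft stabilized-approach gate.
# Thresholds are illustrative assumptions taken from the NTSB excerpt,
# not any airline's actual criteria.

from dataclasses import dataclass

@dataclass
class ApproachState:
    airspeed_kt: float        # indicated airspeed
    descent_rate_fpm: float   # positive means descending
    thrust_at_idle: bool      # thrust levers still at idle?

def approach_is_stabilized(state: ApproachState,
                           target_speed_kt: float = 137.0,
                           nominal_descent_fpm: float = 700.0) -> bool:
    """Every criterion must pass; any single failure means go around."""
    speed_ok = abs(state.airspeed_kt - target_speed_kt) <= 10.0
    descent_ok = state.descent_rate_fpm <= nominal_descent_fpm + 300.0
    thrust_ok = not state.thrust_at_idle
    return speed_ok and descent_ok and thrust_ok

# Roughly the conditions the NTSB describes at the 500 ft gate:
gate = ApproachState(airspeed_kt=137, descent_rate_fpm=1200, thrust_at_idle=True)

if not approach_is_stabilized(gate):
    print("Unstabilized at 500 ft -- go around.")
```

The point of the gate is that the decision is binary and mechanical: on the numbers quoted, the check fails on descent rate and thrust alone, and the only correct output is a go-around.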

On the other side are the Lion Air and the more recent Ethiopian Airlines flights, both struck by tragedy. Both appear tied to a new automated system: the Maneuvering Characteristics Augmentation System (MCAS), an automated safety feature on the 737 MAX 8 designed to prevent the plane from entering a stall, or losing lift.
https://phys.org/news/2019-03-ethiopian-airlines-mcas-boeing-max.html

The new automation seems to have played a part, at least on the Lion Air evidence, and the radar flight paths and behavior of the two aircraft appear nearly identical, suggesting that the ghost of MCAS within the silicon chips created maladies that neither crew was trained or able to handle. It is important to note that on the flight just before the Lion Air tragedy, a crew had successfully interrupted the MCAS automation.

Lion Air:  
https://www.bbc.com/news/world-asia-46373125
Ethiopian Airline: 
https://www.cnn.com/2019/03/10/africa/ethiopian-airlines-crash-boeing-max-8-intl/index.html
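To make that failure mode concrete, here is a deliberately simplified, hypothetical sketch of a single-sensor nose-down automation with a crew cutout. It is not Boeing's actual MCAS implementation; the threshold, trim step, and function names are assumptions for illustration only.

```python
# Deliberately simplified, hypothetical sketch of a single-sensor nose-down
# automation with a crew cutout -- not Boeing's actual MCAS implementation.
# Threshold and trim step are assumed values for illustration only.

AOA_STALL_THRESHOLD_DEG = 14.0   # assumed angle-of-attack limit
NOSE_DOWN_TRIM_STEP_DEG = 0.6    # assumed trim increment per activation

def trim_command(aoa_sensor_deg: float, cutout_engaged: bool) -> float:
    """Return a nose-down trim increment, or 0.0 if the crew has cut the automation out."""
    if cutout_engaged:
        return 0.0                        # the grey matter overrides the silicon
    if aoa_sensor_deg > AOA_STALL_THRESHOLD_DEG:
        return -NOSE_DOWN_TRIM_STEP_DEG   # push the nose down
    return 0.0

# A sensor stuck at a falsely high reading keeps commanding nose-down trim
# on every cycle until the crew engages the cutout:
faulty_reading = 22.5
print(trim_command(faulty_reading, cutout_engaged=False))  # -0.6, over and over
print(trim_command(faulty_reading, cutout_engaged=True))   #  0.0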

“Click, click and click, click” is what the chief pilot of American Airlines demonstrates in the great video “Children of the Magenta Line” (available on Vimeo). It is worth a watch, and it explains just how deep our dependence on automation has grown.


Tesla, on the ground, seems to be having some trouble on the roadways as well. The cohabitation of robotics and humans has to be synchronized to the degree that both sides, the silicon chips and the grey matter, accept that their understanding is imperfect, and the grey-matter side had better be prepared to reconcile any anomaly that arises in the chip's algorithmic commands. “Click, click and click, click”: disconnect the autopilot and fly by hand while resolving the anomaly.

The other side of this automation, keeping personal data safe, is also wreaking havoc; routinely we hear of data breaches in the digital medical record-keeping business. Millions have their personal data exposed to those who will sell it for nefarious reasons. Increasingly, the younger generation, by a 55% margin, seems to think that letting giant digital corporations know everything about their personal lives is fine, since they do “nothing wrong.” The fallacy of this thinking is blatant if one reasons it through: innocence is determined by the current vogue in thinking, and what is acceptable in the 2010s may be considered lunacy in the 2020s. So imagine your chagrin when all the “clouds” are combed over and, voila, your innocent mischief is now considered a crime against humanity. The #MeToo scenario, with the advent of past and present digital records, is playing out even against those who initiated this form of societal justice. Real-time video manipulation and other assorted nefarious tricks are creating animosity among the best of friends. Sad realities from such actions abound in the world today.

1. Protection comes not necessarily from being a Luddite but from common-sense use of the grand-scale digital universe. Keep your personal information to yourself.

2. Personal Information in today’s world is equivalent to your life’s work and acquired assets.

3. Monitor Automation but don’t let it rule you.

4. Keep a healthy respect for all automation, be it in aircraft, cars or even medical devices.

5. Remember the Software was written by humans.

6. When in doubt, OVERRIDE the automation.

7. Just as a single nucleotide misplaced, inserted, or deleted can cause a calamity of cancerous proportions, a single digit inserted, deleted, or misapplied can send an algorithm astray or stop a robotic maneuver dead in its tracks (a minimal sketch of this follows the list).
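
To illustrate point 7, here is a small, purely illustrative sketch of how a single extra character in an input should stop an automated routine rather than let it act on a bad number. The function name and the runway-heading example are hypothetical.

```python
# Purely illustrative: a single inserted or mistyped character in an input
# should stop an automated routine dead in its tracks rather than be acted on.
# The function name and the runway-heading example are hypothetical.

def parse_runway_heading(raw: str) -> int:
    """Parse a magnetic heading in degrees; refuse anything malformed or out of range."""
    if not raw.isdigit():
        raise ValueError(f"refusing to act on malformed heading {raw!r}")
    heading = int(raw)
    if not 0 <= heading <= 360:
        raise ValueError(f"refusing to act on out-of-range heading {heading}")
    return heading

print(parse_runway_heading("280"))        # fine: 280

try:
    parse_runway_heading("2800")          # one extra digit
except ValueError as err:
    print(f"Automation halted: {err}")    # stop, do not fly a bad number
```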

Above all learn the discipline of Reason and Critical Thinking. Life is short and Learning is fun. Read, write and spread that honest joy around.
