I plan to write a few articles on this platform about the concepts of the Black Swan, Antifragility, and having Skin in the Game from a software engineering standpoint. All of these concepts originated in the work of Nassim Taleb (a self-described non-specialist and non-intellectual). After all, over the last decades we have seen a number of highly educated media experts and intellectuals making decisions, or even influencing people, without taking any risk for those decisions or worrying about the consequences of their statements.
Perhaps that is why Nassim Taleb (the author of Skin in the Game) is uncomfortable being labeled an intellectual or an expert.
But as I said, the point here is to reflect on the concepts mentioned above and their relationship to software engineering.
See this image:
This is a Boeing 737 MAX 8 after a crash linked to the MCAS software.
I have been reflecting on the fact that software was never just code. This image should make explicit to any developer why code matters. Software means bringing the real world, the physical world, into the virtual world and then merging them.
You may have code that runs on a console (Xbox or PS) on which you fly an airplane or drive a car. That software provides entertainment, works your emotions, and perhaps awakens a talent.
But you also have code running on critical products like … commercial aircraft.
Code and software are never written to be appreciated as art in a museum; they usually interact with people.
In the book Skin in the Game, Nassim Taleb gives an example of why we trust our lives to the hands and professional competence of an airplane pilot: while traveling for vacation or work, we can usually think only of the vacation or the work during the flight.
The pilot is on the same plane as the passengers. He literally has his Skin in the Game.
If you have read or heard about Antifragility, you can imagine that it is not easy to write an article criticizing the civil aviation industry.
This industry is rigorous, serious, and efficient, and is an example of how to learn from mistakes and become better.
But in this case, not even the pilots, who were on the same flight and lost their lives, could have avoided it (the evidence points this way, and the model has been grounded): the central problem with these aircraft was software.
This is hard. Obviously, those who made the software, tested it, and worked hard to do their best in a critical software industry never imagined that lives would be lost because of it. It's just code!
I believe the simple fact of working on critical software projects like this one is evidence of the extreme ability of the people involved, but…
Investigations into the accidents involving the 737 MAX 8 are still ongoing, so, assuming the investigations can reach the concrete reality of events, new facts may yet be revealed.
However, based on journalistic investigations of the incident, some things should be noted:
The spokesman said F.A.A. employees were following agency rules when they didn’t review the change. “The change to MCAS didn’t trigger an additional safety assessment because it did not affect the most critical phase of flight, considered to be higher cruise speeds,” an agency spokesman said.
Source: The New York Times
- The design of the aircraft obviously involves not just software, but software is a critical part of the design;
- The software underwent unforeseen changes (sorry to the extremists who misrepresent the concepts of Agile: sometimes unforeseen changes are bad for everyone involved and deliver no value), and for some of those involved the changes were not major enough to influence "important" things;
- They did not give due importance to the need for further revisions and testing after the software change;
- Aircraft pilots (Skin in the Game) had not been informed of the changes;
- One of the premises of the MCAS software was to ensure greater flight safety in extreme situations; in practice, it took control away from the pilots.
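The single-sensor dependency reported in those investigations can be contrasted with a cross-checking design in a short sketch. To be clear: this is an illustrative simplification, not the real MCAS logic; the function names and the threshold values are my own assumptions, chosen only to show why a system that trusts one sensor is a single point of failure, while one that compares redundant sensors can defer to the pilot on disagreement.

```python
# Illustrative sketch only -- NOT the actual MCAS implementation.
# It contrasts a design that trusts a single angle-of-attack (AoA) sensor
# with one that cross-checks two sensors and leaves the pilot in control
# when they disagree. All names and thresholds are hypothetical.

DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical sensor-disagreement limit
STALL_THRESHOLD_DEG = 15.0    # hypothetical AoA above which trim would engage

def single_sensor_command(aoa_left: float) -> bool:
    """Single point of failure: one faulty reading triggers nose-down trim."""
    return aoa_left > STALL_THRESHOLD_DEG

def cross_checked_command(aoa_left: float, aoa_right: float) -> bool:
    """Engage automatic trim only when both sensors agree the AoA is high;
    on disagreement, do nothing and defer to the human in the loop."""
    if abs(aoa_left - aoa_right) > DISAGREE_THRESHOLD_DEG:
        return False  # sensors disagree: pilot keeps control
    return min(aoa_left, aoa_right) > STALL_THRESHOLD_DEG

# A faulty left sensor reads 25 degrees while the right reads a normal 3:
faulty, normal = 25.0, 3.0
print(single_sensor_command(faulty))          # erroneous nose-down command
print(cross_checked_command(faulty, normal))  # no command; pilot in control
```

The point of the sketch is the design principle, not the numbers: redundancy plus a disagreement check turns a silent single-sensor failure into a visible condition where the final decision stays with the person who has Skin in the Game.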
The conclusion we can draw, assuming this NYT article is reliable, is that people with greater decision-making power, perhaps experts, ignored the fact that software can cause real problems. Big problems.
“In creating MCAS, they violated a longstanding principle at Boeing to always have pilots ultimately in control of the aircraft,” said Chesley B. Sullenberger III, the retired pilot who landed a jet in the Hudson River. “In mitigating one risk, they created another, greater risk.”
The civil aviation industry (in this case) did the opposite of what was expected, allowing a single point of failure in the design to literally bring down the system, with human lives aboard.
Technology, through software engineering, is increasingly bringing the real world into the virtual one, and the goal should be to provide reliable tools so that the final decision can, wherever possible, remain in the hands of those who have their "Skin in the Game".
Software development is already important, and the more the infrastructure around us runs on code and software, the more our responsibility towards society grows.
This responsibility falls not only on technical people (developers, software engineers, software architects…), but also on the bureaucrats involved in the whole process of developing technology, those who often replace technical reason with the "art" of politics.